APTS: University of Warwick

10/01/2023 4-minute read

Introduction

I started my PhD journey in Data Science at Maynooth University in September 2022. I am funded by the SFI Centre for Research Training (CRT) in Foundations of Data Science. During the first year I have some mandatory modules and assignments within the CRT program. Moreover, the CRT program supports students in attending high-level training related to their specific background.

One of the most important training programs in statistics is the Academy for PhD Training in Statistics (APTS). APTS is a collaboration between major UK statistics research groups to organize courses for first-year PhD students in statistics. Each year there are four intensive weeks hosted at different universities. According to their description:

The intention of APTS is to provide courses which will be attractive and relevant to the research preparation and background education of all statistics and probability PhD students.

Besides the high-level courses taught by renowned professors, another advantage of attending the APTS modules is the opportunity to meet first-year PhD students from different universities across the UK. This is fantastic!

This post aims to describe the modules I took during the first week, held at the University of Warwick during 13-16 December 2022. I will give an overview of the topics covered, and in the next post I will provide the solution to one assignment.

Courses

The first week was organized by the Department of Statistics at the University of Warwick. We had an intensive week with two modules, namely Statistical Computing and Statistical Inference, delivered by professors Darren Wilkinson from Durham University and Simon Shaw from the University of Bath, respectively.

Statistical Inference

One interesting feature of the APTS courses is the preliminary notes, which help students prepare for the module. The preliminary notes for the Statistical Inference course are available here and cover the following:

  • Introduce the idea of statistical models
  • Motivate some principles of statistical inference
  • Review the two statistical inference paradigms (the Classical and Bayesian approaches)

The course's main topics were: principles of inference, decision theory, and confidence sets and hypothesis testing. Various key statistical concepts were discussed during the course, and I would say that David Cox's two books, Theoretical Statistics and Principles of Statistical Inference, cover most of them in full detail.

The second half of the course, which framed the inference problem from a decision-theoretic viewpoint, was particularly attractive to me. Generally speaking, the decision \(d\) acts as an estimate of the parameter \(\theta\), and the problem-specific loss function \(L(\theta, d)\) measures the quality of \(d\) when \(\theta\) is known. We saw that decision theory provides (i) a link between Bayesian and classical procedures, and (ii) Bayesian explanations for inference questions addressed in the Classical approach.
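As a concrete illustration of this idea: under squared-error loss \(L(\theta, d) = (\theta - d)^2\), the Bayes decision is the posterior mean. A minimal numerical sketch in Python (the posterior here is a made-up discrete distribution, purely for illustration):

```python
import numpy as np

# A toy discrete posterior for theta (hypothetical numbers, for illustration only).
theta = np.array([0.0, 1.0, 2.0, 3.0])
posterior = np.array([0.1, 0.4, 0.3, 0.2])

def expected_loss(d):
    """Posterior expected squared-error loss E[(theta - d)^2] for decision d."""
    return np.sum(posterior * (theta - d) ** 2)

# Minimise the expected loss over a grid of candidate decisions.
candidates = np.linspace(0.0, 3.0, 301)
best = candidates[np.argmin([expected_loss(d) for d in candidates])]

# The minimiser coincides with the posterior mean.
posterior_mean = np.sum(posterior * theta)
print(best, posterior_mean)
```

Under other losses the Bayes decision changes: absolute-error loss gives the posterior median, for example.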

In general, the course was very dense, covered several key theoretical concepts, and gave me an interesting introduction to decision theory, which I had not studied before.

The main notes and course assignment can be found here and here, respectively.

Statistical Computing

This was my favorite course: I could review and learn many important concepts and methods which I will use during my thesis. We mainly used R to perform the computations, although the main emphasis was on the concepts rather than on R programming. The emphasis given to matrix theory was brilliant, and I realized its importance for compactly writing and working with statistical models.
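For instance, the linear model \(y = X\beta + \epsilon\) can be written and fitted entirely in matrix terms. A minimal sketch (in Python/NumPy rather than the R used in the course, with simulated data and made-up coefficients), solving the least squares problem via the QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate from a linear model y = X beta + eps (coefficients are made up).
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Least squares via the QR decomposition: X = QR, so R beta_hat = Q^T y.
Q, R = np.linalg.qr(X)
beta_hat = np.linalg.solve(R, Q.T @ y)
```

Solving via QR is numerically more stable than forming the normal equations \(X^\top X \beta = X^\top y\) directly.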

The preliminary notes were great, with some exercises that helped review the statistics and mathematics used during the module.

The three main topics of the course were: (i) matrix computing, (ii) optimization, and (iii) calculus by computer (differentiation and integration). The main notes can be found here. Darren Wilkinson's teaching is unique: surprisingly, he manages to present difficult concepts in a very understandable and accessible manner for the student.
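To give a flavor of the "calculus by computer" topic: a derivative can be approximated numerically by central finite differences. A small sketch (again in Python for illustration, though the course itself used R), checked against a derivative known in closed form:

```python
import numpy as np

def fd_derivative(f, x, h=1e-6):
    """Central finite-difference approximation to f'(x), with O(h^2) error."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Check against a known closed-form derivative: d/dx sin(x) = cos(x).
x0 = 0.7
approx = fd_derivative(np.sin, x0)
exact = np.cos(x0)
print(approx, exact)
```

The step size \(h\) trades truncation error against floating-point rounding error, which is exactly the kind of consideration the module emphasized.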

We had two practical laboratory sessions. The first covered matrix computing exercises; in one of them we had to show the equivalence of performing principal component analysis (PCA) using the singular value decomposition (SVD) and using the eigendecomposition. The second lab session was primarily about optimization, and in one interesting exercise we had to implement the maximum likelihood estimators of a linear mixed model.
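The PCA equivalence from the first lab can be sketched quickly: the eigendecomposition of the sample covariance matrix and the SVD of the centred data matrix give the same variances, and the same loadings up to sign. An illustration on simulated data (Python/NumPy here, though the lab itself was in R):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))          # toy data matrix
Xc = X - X.mean(axis=0)                # column-centre the data

# Route 1: eigendecomposition of the sample covariance matrix.
S = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]      # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Route 2: SVD of the centred data matrix, Xc = U D V^T.
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
svd_vals = d ** 2 / (Xc.shape[0] - 1)  # squared singular values give the variances

# svd_vals matches eigvals, and the columns of V match eigvecs up to sign.
print(svd_vals)
print(eigvals)
```

The SVD route is generally preferred in practice because it avoids forming \(X^\top X\), which squares the condition number.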

To sum up, in the next post I will provide the solution to the Statistical Computing assignment, whose exercises can be found here. The assignment deals with most of the concepts discussed during the module.