Academic Year 2018/19, Term 2
School of Mathematics, The University of Manchester

Teaching staff:

Lecturer: Korbinian Strimmer (Office hour: Friday 3-4pm, ATB 2.221)
Academic tutors: Georgi Boshnakov and Robert Gaunt
Student tutors: Rajenki Das, Bindu Vekaria, Jack McKenzie, Chunyu Wang and Zili Zhang
Student assistant: Beatriz Costa Gomes

Frequently asked questions, feedback and email:

If you have any suggestions, comments or corrections (e.g. typos in the notes), you are most welcome to contact the lecturer directly by email. However, please remember that this is a very large class (approx. 300 students!), so please do check the MATH20802 FAQ first to see whether your question has already been asked and answered!

Please also note that the size of the class does not allow for personal tuition via email. Therefore, to get feedback please attend the tutorial sessions and ask your tutor questions in person. This will also benefit the other students in your class! For further questions the lecturer is available at the end of each lecture and during the office hour on Friday afternoon.

Finally, this course encourages good practice for mental health and work-life balance. In particular, note that no email will be answered on weekends.

Overview and syllabus:

For an outline of this course unit see MATH20802: Statistical Methods or download the course description as PDF.

Dates and location:

The course starts on 31 January 2019 and runs until 10 May 2019. The tutorials start in week 3, on 12 February 2019.

The course takes place at the following dates and locations:

Lectures (term weeks 1-11):
  Thursday 5pm-6pm (Crawford House TH1)
  Friday 12noon-1pm (Stopford TH1)

Example classes (term weeks 3, 4, 6, 8, 9 and 11; Alan Turing G209):
  Tuesday 3pm-4pm (Boshnakov, Vekaria)
  Tuesday 5pm-6pm (Boshnakov, Zhang)
  Thursday 11am-12noon (Strimmer, Das)
  Thursday 4pm-5pm (Strimmer, McKenzie)
  Friday 9am-10am (Gaunt, Wang)

Computer labs (term weeks 5 and 10): groups and times as above for the example classes (Alan Turing G105).

In-class test (term week 7): groups and times as above for the example classes (Alan Turing G105). The test will be an online assessment on Blackboard (40 minutes).

Revision week (term week 12): revision lectures and Q & A classes, times as above.

In-class test and exam:

The in-class test (worth 20%) is an online assessment on Blackboard and will take place in term week 7 in Alan Turing G105, during the usual example class / computer lab hours. The written exam (2 hours) is worth the remaining 80%.

In-class test (20%): 12 March 2019 to 15 March 2019 (40 minutes), in term week 7.
Written exam (80%): date tba (2 hours), during the exam period (13 May to 31 May 2019).

Course material:

Course material can be retrieved from Blackboard. This includes i) the scanned handwritten slides from the lectures, ii) the lecture notes, iii) the example sheets and iv) the instructions for the computer labs. Furthermore, the automated lecture capture system is active for this module, so videos of the lectures can be revisited online.

In addition to the above, it is essential to study the material using a textbook. The following are recommended to accompany this module (all can be downloaded as PDF):

  1. Cox, D.R. 2006. Principles of Statistical Inference. Cambridge University Press.
  2. Shalizi, C.R. 2019. Advanced Data Analysis from an Elementary Point of View. Cambridge University Press (to appear).
  3. Wood, S. 2014. Core Statistics. Cambridge University Press.

Lecture contents:

There will be 11 weeks of lectures and 1 week of revision. Below you can find the topics discussed in each week to facilitate further study (this table is updated at the end of each week):

Term week 1

Lecture 1: Introduction to the module content: information and likelihood, the linear model (regression), Bayesian learning, application in R. Overview of data science: probabilistic inference vs. other schools (machine learning), the difference between probability and statistics (randomness vs. uncertainty, intrinsic property vs. description, ontology vs. epistemology), overview of probabilistic modelling, model fit by minimising the KL divergence.
 
Lecture 2: Shannon entropy, application to the discrete uniform model, definition of the Kullback-Leibler (KL) divergence (relative entropy), its properties, KL divergence between discrete distributions and the link to the chi-squared statistic, application to two univariate normals, the likelihood function, maximum likelihood as the large-sample limit of minimising the KL divergence / cross entropy (a short R illustration follows at the end of this week's entry).
See scanned slides for lectures 1 and 2 (available on Blackboard).
 
Examinable topics: randomness, uncertainty, entropy (information theory), differential entropy, Kullback-Leibler divergence, cross entropy, likelihood function.
 
Not relevant for exam but still interesting: the book "The Master Algorithm", epistemology, Bregman divergence, f-divergence.
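As a small illustration of this week's material, here is a minimal R sketch (not part of the official course notes; the distribution p is made up for illustration) that computes the KL divergence D(p || q) between a discrete distribution and the discrete uniform distribution:

    # Kullback-Leibler divergence between two discrete distributions (in nats),
    # D(p || q) = sum_i p_i * log(p_i / q_i); assumes all entries are > 0
    kl_div <- function(p, q) {
      stopifnot(isTRUE(all.equal(sum(p), 1)), isTRUE(all.equal(sum(q), 1)))
      sum(p * log(p / q))
    }

    p <- c(0.5, 0.3, 0.2)   # hypothetical distribution (values made up)
    q <- rep(1/3, 3)        # discrete uniform, as in Lecture 2
    kl_div(p, q)            # about 0.069 nats; non-negative, zero only if p == q

Swapping p and q gives a different value, which illustrates that the KL divergence is not symmetric.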

Term week 2

Lecture 3: Maximum likelihood point estimates, bias, variance and MSE (mean squared error), log-likelihood function, score function (for scalar and vector-valued parameters), MLE for the Binomial, exponential and normal models, the MLE of the variance in the normal model is biased (illustrated in the R sketch at the end of this week's entry), properties of MLEs, invariance, relationship to the least squares (LS) estimator in the normal case.
 
Lecture 4: Optimality properties, consistency, Cramér-Rao bound, MLE as a minimal sufficient statistic, MLE as an optimal summariser of the information in the data about a model, observed Fisher information matrix (for vector-valued parameters), quadratic approximation of the log-likelihood function around the MLE, relationship of the observed information to the inverse variance.
See scanned slides for lectures 3 and 4 (available on Blackboard).
 
Examinable topics: Mean squared error, maximum likelihood estimation, score function, observed Fisher information.
 
Not relevant for exam but still interesting: Cramér-Rao bound, sufficient statistic.
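To complement this week's entry, here is a small simulation sketch (again not from the official notes; the sample size, seed and true variance are chosen arbitrarily) showing the downward bias of the ML variance estimator in the normal model:

    # Simulation: the ML estimator of the variance in a normal model
    # (dividing by n rather than n - 1) is biased downwards
    set.seed(1)                          # seed chosen arbitrarily
    n <- 10                              # small sample size for illustration
    mle.var <- replicate(10000, {
      x <- rnorm(n, mean = 0, sd = 2)    # true variance is 4
      mean((x - mean(x))^2)              # MLE of the variance: divides by n
    })
    mean(mle.var)                        # roughly (n - 1)/n * 4 = 3.6, not 4

Replacing mean((x - mean(x))^2) with var(x), which divides by n - 1, removes the bias.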

Term week 3

Lecture 5: Observed Fisher information for the estimated proportion (Binomial model), observed Fisher information matrix for the normal model, asymptotic normal distribution of MLEs, construction of the corresponding symmetric normal confidence intervals (see the R sketch at the end of this week's entry), expected Fisher information, expected Fisher information as a local approximation of the KL divergence, expected Fisher information of the normal model.
 
Lecture 6: The (squared) Wald statistic and its asymptotic distribution, normal example, a counterexample (uniform distribution) with a non-regular likelihood function (not differentiable at the MLE, hence no observed Fisher information and no asymptotics), likelihood-based confidence intervals.
See scanned slides for lectures 5 and 6 (available on Blackboard).
 
Examinable topics: expected Fisher information, confidence intervals using normal approximation and based on likelihood function.
 
Not relevant for exam but still interesting: Information geometry, higher order likelihood inference.
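As a brief illustration of this week's entry (a sketch only, using made-up data), the following R code constructs the symmetric normal (Wald) confidence interval for a binomial proportion from the MLE and the observed Fisher information J(p.hat) = n / (p.hat * (1 - p.hat)):

    # Symmetric normal (Wald) confidence interval for a proportion, using the
    # observed Fisher information J(p.hat) = n / (p.hat * (1 - p.hat))
    k <- 35; n <- 100                      # made-up data: 35 successes in 100 trials
    p.hat <- k / n                         # maximum likelihood estimate
    se <- sqrt(p.hat * (1 - p.hat) / n)    # standard error = 1 / sqrt(J(p.hat))
    p.hat + c(-1, 1) * qnorm(0.975) * se   # approximate 95% CI, about (0.26, 0.44)

Note that this normal approximation deteriorates for small n or for proportions close to 0 or 1, which is one motivation for the likelihood-based intervals of Lecture 6.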

Term weeks 4-11: content to be added at the end of each week.

Term week 12: Revision lectures.

Please note that the links to Wikipedia given above are provided for convenience and should not be considered definitive resources! For further study please revisit the notes and read the suggested textbooks!

Computer labs timetable and contents:

There will be two computer labs; the dates and topics are listed below. The instructions for the computer labs will be available online on Blackboard.

Term week 5: Lab #1, topic and work material tba.
Term week 10: Lab #2, topic and work material tba.

Example class timetable:

There are six example classes, taking place in term weeks 3, 4, 6, 8, 9 and 11. The corresponding example sheets will be available on Blackboard by Tuesday noon, and the solutions will be published by Friday noon. In term week 12 (revision week) the tutorials will be Q & A sessions for the exam.