Department of Information Technology

Statistical Estimation Theory and Its Applications 9+3 hp

September 2018


Description

Estimation lies at the heart of many problems in machine learning, data science, statistics, signal processing, and system identification. It encompasses learning models, inferring quantities, and predicting variables from data. The aims of this course are i) to understand the fundamental principles of estimation, and ii) to develop knowledge of how to apply estimators to practical problems.

The participants will learn how to:

  1. formulate statistical model classes in descriptive, predictive and causal inference
  2. analyze fundamental trade-offs and limits in estimation
  3. derive estimators based on different data models
  4. detect model misspecifications
  5. design and implement practical estimators

Contents

  • Descriptive inference: Nonparametric and parametric approaches. Exponential family, latent-variable, and moment-based models. M-estimators, belief distributions and Z-estimators. Generalized Cramér-Rao bounds.
  • Predictive inference: Regression decomposition. Dense and sparse regression models. Conformal prediction.
  • Causal inference: Counterfactual outcome and structural causal models. Causal effect estimation.

Course Structure

The course gives 9 hp (you can receive an additional 3 hp by carrying out a project).

  • Lectures: 11
  • Project: Students are encouraged to carry out a project that applies the tools from this course to a problem (possibly) related to their field of research. The project can be done individually or in collaboration with another student, and is reported as a 3-page paper and presented in class.

Examination

Weekly homework assignments

Course literature

The course will be based on

  • Dave Zachariah. Lecture Notes on Statistical Estimation Theory.

Supplementary reading

  • C.R. Rao. Linear Statistical Inference and its Applications, Wiley, [1973] 2002.
  • S.M. Kay. Fundamentals of Statistical Signal Processing: Vol. 1 Estimation Theory, Prentice Hall, 1993.
  • C. Bishop. Pattern Recognition and Machine Learning, Springer, 2007.
  • J. Peters et al. Elements of Causal Inference: Foundations and Learning Algorithms, MIT Press, 2016.

Schedule

The schedule is available in TimeEdit.

Prerequisites

Undergraduate courses in linear algebra and statistics.

Related Courses

Probabilistic Machine Learning is a complementary course.

Contact Person

Dave Zachariah
