ELE382: Probabilistic Systems and Information Processing

Yuxin Chen, Princeton University, Fall 2018
  1. Review of discrete and continuous probability

    1. Random variables and random vectors

    2. Conditional probability, Bayes’ rule, and independence

    3. Expectations, moments, and moment generating functions

    4. Gaussian random variables and vectors
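As a small illustration of topics 1.3-1.4, the sketch below checks Gaussian moments against the moment generating function by numerical differentiation. The parameters mu = 2, sigma = 3 are arbitrary toy values, not from the course.

```python
import math

# Sketch: the Gaussian MGF M(t) = exp(mu*t + sigma^2 * t^2 / 2) recovers
# moments by differentiation at t = 0. Here we approximate the first two
# derivatives with finite differences (toy parameters mu = 2, sigma = 3).

def mgf(t, mu=2.0, sigma=3.0):
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

h = 1e-5
first = (mgf(h) - mgf(-h)) / (2 * h)             # ~ E[X]   = mu        = 2
second = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ~ E[X^2] = mu^2 + sigma^2 = 13
print(round(first, 3), round(second, 2))
```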

  2. Hypothesis testing, detection, and classification

    1. Maximum likelihood (ML) rule

    2. Maximum a posteriori (MAP) rule

    3. Optimal detection in Gaussian noise and matched filtering

    4. Likelihood ratio test
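A minimal sketch tying topics 2.1, 2.2, and 2.4 together: for two Gaussian hypotheses the MAP rule is a likelihood ratio test, and with equal priors it reduces to the ML rule. The mean mu = 1 and unit variance are hypothetical choices for illustration.

```python
import math

# Binary hypothesis test: H0: X ~ N(0, 1)  vs  H1: X ~ N(mu, 1).
# MAP decides H1 iff p1(x) * prior1 > p0(x) * (1 - prior1), i.e. iff the
# likelihood ratio p1(x)/p0(x) exceeds the threshold (1 - prior1)/prior1.

def gaussian_pdf(x, mean, var=1.0):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def lrt_decide(x, mu=1.0, prior1=0.5):
    ratio = gaussian_pdf(x, mu) / gaussian_pdf(x, 0.0)
    threshold = (1 - prior1) / prior1   # equals 1 for equal priors (ML rule)
    return 1 if ratio > threshold else 0

# With equal priors the decision boundary is x = mu / 2 = 0.5:
print(lrt_decide(0.4))   # below the boundary -> decides H0 (0)
print(lrt_decide(0.6))   # above the boundary -> decides H1 (1)
```

A skewed prior shifts the boundary: with `prior1 = 0.9`, even x = 0.4 is decided as H1.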

  3. Random processes

    1. Correlation and covariance functions

    2. Spectral density and cross-spectral density

    3. Stationarity

    4. Principal component analysis (PCA) and Karhunen-Loève (KL) decomposition

    5. Gaussian processes

    6. Poisson processes
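For topics 3.1 and 3.3, here is a minimal sketch of estimating a correlation function from a single realization, under the (assumed) wide-sense-stationarity that lets a time average stand in for the ensemble average. The alternating signal is toy data.

```python
# Sample autocorrelation estimate R_hat(k) = (1/(N-k)) * sum_n x[n] * x[n+k],
# valid as an estimate of R(k) when the process is wide-sense stationary.

def sample_autocorr(x, lag):
    n = len(x) - lag
    return sum(x[i] * x[i + lag] for i in range(n)) / n

x = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]   # alternating toy signal
print(sample_autocorr(x, 0))   # -> 1.0  (average power)
print(sample_autocorr(x, 1))   # -> -1.0 (perfect anti-correlation at lag 1)
```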

  4. Linear regression and estimation

    1. Least squares estimation

    2. Maximum likelihood estimation (MLE)

    3. Minimum mean square estimation (MMSE) and Bayesian estimators

    4. Optimal filtering for random processes

      1. Wiener filter

      2. Kalman filter
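As a sketch of topic 4.4.2, the scalar Kalman filter below tracks a (nearly) constant state from noisy measurements. The model, noise variances, and data values are all hypothetical toy choices.

```python
# 1-D Kalman filter: state model x_k = x_{k-1} + w_k (process noise var q),
# observation y_k = x_k + v_k (measurement noise var r). A large p0 encodes
# a diffuse prior on the initial state. All numbers here are made up.

def kalman_1d(ys, q=0.01, r=1.0, x0=0.0, p0=100.0):
    x, p = x0, p0
    estimates = []
    for y in ys:
        # Predict: state is unchanged, uncertainty grows by process noise.
        p = p + q
        # Update: Kalman gain blends the prediction with the measurement.
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy observations of a true state near 5.0 (toy data):
est = kalman_1d([4.8, 5.3, 4.9, 5.1, 5.0])
print(round(est[-1], 2))   # final estimate settles close to 5.0
```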

  5. Inference in graphical models

    1. Viterbi algorithm

    2. Hidden Markov model (HMM)

    3. Message passing for trees
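A minimal sketch combining topics 5.1 and 5.2: Viterbi decoding of the most likely state sequence in a toy two-state HMM. The state names, observation alphabet, and all probabilities are hypothetical.

```python
# Viterbi decoding for a toy HMM. States: 0 = "Rainy", 1 = "Sunny";
# observations: 0 = "walk", 1 = "shop". Probabilities kept unscaled
# (no logs) since the sequence is short.

def viterbi(obs, pi, A, B):
    n = len(pi)
    # delta[s] = probability of the most likely path ending in state s.
    delta = [pi[s] * B[s][obs[0]] for s in range(n)]
    back = []
    for o in obs[1:]:
        prev, new = [], []
        for s in range(n):
            best = max(range(n), key=lambda r: delta[r] * A[r][s])
            prev.append(best)
            new.append(delta[best] * A[best][s] * B[s][o])
        back.append(prev)
        delta = new
    # Backtrack from the best final state.
    path = [max(range(n), key=lambda s: delta[s])]
    for prev in reversed(back):
        path.append(prev[path[-1]])
    return list(reversed(path))

pi = [0.6, 0.4]                 # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]    # transition probabilities A[from][to]
B = [[0.1, 0.9], [0.8, 0.2]]    # emission probabilities B[state][obs]
print(viterbi([0, 0, 1], pi, A, B))   # -> [1, 1, 0]
```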