# Mathematical & Computational Courses

# Computational Probability and Statistics (APMA 1690, Fall, Harrison):

Probability and statistics are increasingly computational fields. Students will be exposed to several topics that use computers to solve challenging problems in probability and statistics. They will also use computers to develop intuitions about classical analytic results in probability and statistics. Topics include: (1) simulating randomness (pseudo-random number generation, transformation of random variables, rejection sampling), (2) stochastic approximation (law of large numbers, central limit theorem, Monte Carlo integration, importance sampling), (3) random walks (recurrence properties, exit times), (4) graphical models (Markov random fields, Bayes nets, hidden Markov models, dynamic programming), (5) dimensionality reduction (principal components analysis, independent components analysis). Students will learn Matlab as part of the course. Prior exposure to calculus-based probability is required. Graduate, Undergraduate
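As a flavor of topic (2), Monte Carlo integration rests on the law of large numbers: the average of a function evaluated at random points converges to the corresponding integral. The course uses Matlab; the minimal sketch below uses Python, with a made-up integrand chosen for illustration:

```python
import random

random.seed(0)

# Monte Carlo integration via the law of large numbers:
# the average of g(U) for U ~ Uniform(0, 1) converges to the
# integral of g over [0, 1].  Here g(x) = x^2, whose integral is 1/3.
n = 100_000
estimate = sum(random.random() ** 2 for _ in range(n)) / n
```

The standard error shrinks like 1/sqrt(n), so with n = 100,000 draws the estimate typically lands within a few thousandths of 1/3.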

# Recent Applications of Probability and Statistics (APMA 2610, Spring, Harrison):

This course explores a few topics in probability and statistics that have had a fundamental influence on diverse fields and that are not often covered in the core probability and statistics courses. The emphasis is on depth rather than breadth. The main topics are (1) the maximum entropy principle for large systems and large deviations, (2) the bias-variance dilemma for nonparametric classification, (3) computation and inference for graphical models. Topic (1) will touch on ideas from statistical physics, large deviations, and information theory. Topic (2) will introduce some of the key concepts from classical statistics and then focus on more modern techniques like kernel methods and support vector machines. Topic (3) will introduce graphical models and highlight some important tools like dynamic programming, MCMC, and EM. Requires programming skills and prior exposure to probability, statistics, advanced calculus, and linear algebra. Measure theory not required. Graduate, Undergraduate
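To illustrate the dynamic-programming tool mentioned under topic (3): the forward algorithm computes the likelihood of an observation sequence under a hidden Markov model in time linear in the sequence length, rather than enumerating all state paths. A minimal Python sketch with hypothetical two-state parameters:

```python
# Forward algorithm for a toy two-state HMM (hypothetical parameters).
init = [0.6, 0.4]                  # initial state probabilities
trans = [[0.7, 0.3], [0.4, 0.6]]   # trans[i][j] = P(next = j | current = i)
emit = [[0.9, 0.1], [0.2, 0.8]]    # emit[s][o]  = P(obs = o | state = s)
obs = [0, 1, 0]                    # observed sequence

# alpha[s] = P(obs so far, current state = s), updated by dynamic programming
alpha = [init[s] * emit[s][obs[0]] for s in range(2)]
for o in obs[1:]:
    alpha = [sum(alpha[i] * trans[i][s] for i in range(2)) * emit[s][o]
             for s in range(2)]

likelihood = sum(alpha)  # P(obs), marginalizing over all state paths
```

The same recursion with a max in place of the sum gives the Viterbi algorithm for the most probable state path.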

# Statistical Inference I,II (PHP 2520, Fall, Ombao; PHP 2580, Spring, Gatsonis):

A comprehensive introduction to the theory of modern statistical inference. Semester I presents a survey of fundamental ideas and methods, including sufficiency, likelihood-based inference, hypothesis testing, asymptotic theory, and Bayesian inference. Semester II covers such topics as nonparametric statistics, quasi-likelihood, resampling techniques, statistical learning, and methods for high-dimensional bioinformatics data. Measure theory not required. Graduate, Undergraduate
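Among the Semester II topics, resampling can be illustrated with the nonparametric bootstrap: resample the data with replacement many times and use the spread of the recomputed statistic to form an interval. A minimal Python sketch with made-up data:

```python
import random

random.seed(0)

# Percentile bootstrap for the mean of a small made-up sample.
data = [2.1, 3.4, 1.8, 5.0, 2.7, 4.2, 3.9, 2.5]
n = len(data)

boot_means = []
for _ in range(2000):
    resample = [random.choice(data) for _ in range(n)]  # draw with replacement
    boot_means.append(sum(resample) / n)
boot_means.sort()

# Approximate 95% percentile interval for the mean
lo, hi = boot_means[49], boot_means[1949]
```

The same recipe applies to any statistic (median, correlation, regression coefficient) by swapping out the quantity recomputed on each resample.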

# Bayesian Statistical Methods (PHP 2530, Fall, Gutman):

Surveys the state of the art in Bayesian methods and their applications. Discussion of the fundamentals is followed by more advanced topics, including hierarchical models, Markov chain Monte Carlo and other methods for sampling from the posterior distribution, robustness, sensitivity analysis, and approaches to model selection and diagnostics. Features nontrivial applications of Bayesian methods from diverse scientific fields, with emphasis on biomedical research. Graduate
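As a taste of the Markov chain Monte Carlo methods covered, a random-walk Metropolis sampler draws from a distribution known only up to a normalizing constant by proposing local moves and accepting them with the Metropolis ratio. A minimal Python sketch targeting a standard normal, with hypothetical tuning settings:

```python
import math
import random

random.seed(1)

def metropolis(logp, x0, steps, scale=1.0):
    """Random-walk Metropolis: propose x + Normal(0, scale), accept with
    probability min(1, p(proposal) / p(current))."""
    x = x0
    lp = logp(x)
    samples = []
    for _ in range(steps):
        proposal = x + random.gauss(0.0, scale)
        lp_prop = logp(proposal)
        if math.log(random.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop   # accept; otherwise keep current x
        samples.append(x)
    return samples

# Target: standard normal log-density, up to an additive constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=20_000)
kept = samples[5_000:]            # discard burn-in
mean = sum(kept) / len(kept)      # should be near the target mean, 0
```

In a real posterior-sampling problem, `logp` would be the unnormalized log posterior, which is exactly what makes MCMC attractive: the intractable normalizing constant cancels in the acceptance ratio.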

# Linear and Generalized Linear Models (PHP 2601, Fall, Ombao):

Generalized linear models provide a unifying framework for regression. Important examples include linear regression, log-linear models, and logistic regression. GLMs for continuous, binary, ordinal, nominal, and count data. Topics include model parameterization, parametric and semiparametric estimation, and model diagnostics. Methods for incomplete data are introduced. Computing with modern software is emphasized. Graduate
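For instance, logistic regression (a binomial GLM with logit link) is fit by maximizing the log-likelihood. The sketch below uses plain gradient ascent in Python on made-up one-predictor data; standard software would instead use iteratively reweighted least squares, but the maximizer is the same:

```python
import math

# Toy logistic regression (binomial GLM, logit link) fit by gradient
# ascent on the log-likelihood; made-up one-predictor data.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 1, 0, 1, 1]

b0, b1 = 0.0, 0.0   # intercept and slope
lr = 0.1            # step size
for _ in range(5000):
    g0 = g1 = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # fitted probability
        g0 += y - p            # score equation for the intercept
        g1 += (y - p) * x      # score equation for the slope
    b0 += lr * g0
    b1 += lr * g1
```

At the maximum likelihood estimate both score sums are zero, mirroring the normal equations of ordinary linear regression.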

# Essential Statistics (APMA 0650, Spring, Geman):

A first course in statistics emphasizing statistical reasoning and basic concepts. Comprehensive treatment of most commonly used statistical methods. Elementary probability and the role of randomness. Data analysis and statistical computing using Excel. Examples and applications from the popular press and the life, social and physical sciences. No mathematical prerequisites beyond high school algebra. Extra Credit. Graduate, Undergraduate

# Introduction to Machine Learning (CSCI 1950F, Spring, Sudderth):

Topics include parameter estimation, probabilistic graphical models, approximate inference, and kernel and nonparametric methods. Applications to regression, categorization, and clustering problems are illustrated by examples from vision, language, communications, and bioinformatics. Graduate, Undergraduate
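As an example of the nonparametric methods covered, a nearest-neighbor classifier predicts the label of the closest training point, with no parametric model fit at all. A minimal Python sketch on made-up 2-D data:

```python
# 1-nearest-neighbor classification, a simple nonparametric method:
# predict the label of the closest training point (made-up 2-D data).
def nn_classify(train, query):
    """train: list of ((x, y), label) pairs; query: an (x, y) point."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(train, key=lambda item: dist2(item[0], query))[1]

train = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((6, 5), "b")]
label = nn_classify(train, (1, 1))   # nearest training point has label "a"
```

Nearest-neighbor methods trade the bias of a parametric model for variance that depends on the local density of training points, a tension the course's bias-variance discussions make precise.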

# Special Topics in Machine Learning (CSCI 2950P, Fall, Sudderth):

This course explores current research topics in statistical machine learning. Focus varies by year, and may include Bayesian nonparametrics; models for spatial, temporal, or structured data; and variational or Monte Carlo approximations. Course meetings combine lectures with presentation and discussion of classical and contemporary research papers. Students will apply some of this material to a project, ideally drawn from their own research interests. Graduate, Undergraduate

# Doing Bayesian Data Analysis (CLPS 2910, Spring, Frank):

A tutorial introduction to doing Bayesian statistics for data analysis, starting from the basics of probabilities and Bayes’ theorem. Part 1 of the course will work through contemporary Monte Carlo methods in the context of simple analyses, building up to simple linear regression and Bayesian versions of single-factor ANOVA. In Part 2, null hypothesis significance testing will be contrasted with Bayesian approaches to null value assessment and Bayesian approaches to power. A variety of more complicated realistic applications will be tackled, covering Bayesian versions of multiple linear regression, logistic regression, analysis of variance, etc., including consideration of repeated measures designs.
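The starting point, Bayes' theorem applied to a simple analysis, can be shown with a conjugate Beta-Binomial update, sketched here in Python with made-up coin-flip data:

```python
# Bayes' theorem in conjugate form: a Beta(a, b) prior on a coin's
# heads probability, combined with k heads in n flips, yields a
# Beta(a + k, b + n - k) posterior.  Made-up data below.
a, b = 1, 1              # uniform prior over the heads probability
heads, flips = 7, 10     # hypothetical observations

a_post = a + heads               # posterior shape parameters
b_post = b + flips - heads
posterior_mean = a_post / (a_post + b_post)
```

Conjugacy makes this example exact; the Monte Carlo methods in Part 1 of the course exist precisely for the many models where no such closed-form update is available.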