Machine Learning with Probabilistic Programming
Fall 2018 | Columbia University
Day and Time: Wednesdays, 4:10 p.m. to 6:00 p.m.
Location: 413 Kent Hall
The world is full of noise and uncertainty. To make sense of it, we collect data and ask questions. Is there a tumor in this x-ray scan? What affects quality at my manufacturing plant? How old is the planet I see through the telescope? Does this drug actually work? To pose and answer such questions, data scientists must iterate through a cycle: probabilistically model a system, infer hidden patterns from data, and evaluate how well the model describes reality.
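As a toy illustration of this cycle, consider estimating the bias of a coin with a conjugate Beta-Binomial model; this sketch and its variable names are illustrative only, not part of the course material, and conjugacy stands in for the general-purpose inference algorithms covered later in the course.

```python
# One pass through the cycle: model, infer, criticize.
# Model -- prior: theta ~ Beta(1, 1); likelihood: each flip ~ Bernoulli(theta).
# Conjugacy gives the posterior in closed form: Beta(1 + heads, 1 + tails).

flips = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # observed data (1 = heads)
heads = sum(flips)
tails = len(flips) - heads

# Inference: update the Beta prior with the observed counts.
a_post, b_post = 1 + heads, 1 + tails
posterior_mean = a_post / (a_post + b_post)  # E[theta | data]

# Criticism: compare the model's predicted rate of heads
# against the empirical rate in the data.
empirical_rate = heads / len(flips)
print(posterior_mean, empirical_rate)
```

In models without conjugate structure, the inference step is replaced by an approximate algorithm such as variational inference or Markov chain Monte Carlo, both of which appear on the schedule below.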
By the end of this course, you will learn how to use probabilistic programming to iterate through this cycle effectively. Specifically, you will master:
You will learn to use (and perhaps even contribute to) Edward throughout this course.
This is a graduate-level course. You should be comfortable with probability and statistics, calculus and linear algebra, and basic numerical optimization. You should be familiar with probabilistic machine learning (for example, you took a class that used Bishop or Murphy). You must be proficient at writing robust software in Python to analyze data. You do not need to be familiar with Edward.
The textbook for this course is Essentials of Statistical Inference. In addition, I will distribute course notes and point to other readings as needed.
There will be three problem sets, each on a two-week schedule. The problem sets are more theoretical in nature and involve minimal programming; they are meant to complement your final project. You are expected to complete all assigned questions. While the problem sets are a relatively small component of your total grade, working through them, and often struggling with them at length, is a crucial part of the learning process and will have a major impact on your understanding of the material.
You must submit your problem sets by the end of the class in which they are due.
Moderate collaboration on the problem sets in the form of joint problem solving with one or two classmates is permitted, provided your writeup is your own. Please prepare all written work using LaTeX; I will distribute a template.
The focus of this course is the final project. The goal is for you to choose a real-world problem and to loop through the probabilistic modeling cycle using probabilistic programming. You will be expected to write, document, and report your analysis and findings. This will involve a significant amount of programming. Depending on the number of students taking the course for credit, you will work in groups of two or three. I will provide some suggestions; however, you are encouraged to find a problem in a field that excites you.
You will produce an 8-page final report and present your findings to the class in a short presentation at the end of the term. This project will measure your cumulative understanding of the material while providing you with a supportive environment to try out your new skills. Each student within a group will receive an individual grade reflecting their involvement in the project.
During the semester you will be expected to attend a seminar given in any department at the university. The speaker in the talk you select should be using probabilistic modeling and some sort of statistical inference in their research. Please check with me beforehand to confirm that a particular talk is acceptable. After you have attended the talk, please write a two-page summary and submit it no later than one week after the talk. This summary must include: a review of the talk, a discussion of how probabilistic modeling was used in the research, a critical evaluation of the talk, and suggestions for improvements using probabilistic programming.
Your course grade will be calculated as follows.
| Component | Weight |
|---|---|
| Problem Sets | 36% |
| Final Project | 50% |
| Seminar Summary | 10% |
| Date | Topic | Readings | Assignments |
|---|---|---|---|
| September 5 | Probabilistic programming; statistical inference review | Ch. 1, 2.1, 3, 5 | PSET 1 out |
| September 12 | Python; Edward | | PSET 1 due |
| September 19 | Statistical modeling | | PSET 2 out |
| September 26 | Point estimation; optimization review | | |
| October 3 | Variational inference | | PSET 3 out; PSET 2 due |
| October 10 | Markov chain Monte Carlo | | Project proposal due |
| October 17 | Predictive inference and evaluation | | PSET 3 due |
| October 24 | Model criticism | | |
| October 31 | Pitfalls of probabilistic programming | | |
| November 7 | Advanced topics (TBD) | | |
| November 14 | Guest lecture by Yunhao Tang | | |
| November 21 | University holiday; no class | | |
| November 28 | Final project presentations | | |
| December 5 | Final project presentations | | Seminar summary due |