Instructors: Kilian Q. Weinberger and Jennifer J. Sun
Office hours:
Kilian Weinberger: Tuesdays 10:00 - 11:00 am (Booking Link) in 410 Gates Hall
Jennifer Sun: Mondays 4:00 - 5:00 pm, starting Feb 3 (Booking Link) in 450 Gates Hall
Lectures: Tuesday and Thursday from 2:55 to 4:10 pm.
Course staff office hours: Calendar Link
Course overview: This class is an introductory course to deep learning. It covers the fundamental principles behind training and inference of deep networks, deep reinforcement learning, architecture design choices for different data modalities and for discriminative and generative settings, and the ethical and societal implications of such models.
Prerequisites: Fundamentals of Machine Learning (CS 4780, ECE 4200, STSCI 4740), Python fluency (CS 1110), and programming proficiency (e.g. CS 2110).
Course logistics: The companion Canvas page serves as a hub for access to Ed Discussions (the course forum) and Gradescope (for homework submission). If you are enrolled in the course, you should automatically have access to the site; please let us know if you are unable to access it.
Your grade in this course comprises four components: homework, a midterm exam, a project, and participation.
There will be a number of homework assignments throughout the course, typically released one to two weeks before their due dates. The homeworks include both theoretical and programming questions.
To provide hands-on experience with the methods we discuss in class and with common ML frameworks, there will be a project. For the project, students will implement the method proposed in an existing research paper and attempt to reproduce its results.
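As a purely illustrative sketch (not a course template or starter code), the programming assignments and project involve framework code along the lines of the minimal PyTorch training loop below; the data, architecture, and hyperparameters are assumptions for the example only.

```python
# Minimal sketch: training a small MLP in PyTorch on synthetic data.
# Everything here (sizes, learning rate, epochs) is illustrative, not course-provided.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression data: y = sum of the inputs plus a little noise
X = torch.randn(256, 10)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)

model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()        # clear accumulated gradients
    loss = loss_fn(model(X), y)  # forward pass and loss
    loss.backward()              # backpropagation
    optimizer.step()             # gradient descent update

print(f"final training loss: {loss.item():.4f}")
```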
There will be one midterm exam, based on the material covered in the lectures.
Given that this is the pilot offering of the course, students are expected to play an active role in providing constructive feedback. The participation grade will be based on feedback provided for lectures and assignments. There will also be daily quizzes at the start of class; these are graded for participation only (your score does not affect your grade).
Final grades are based on homework assignments, project, exam and participation.
For students enrolled in CS 4782, your final grade consists of:

A tentative schedule is given below. The specific topics covered on a given day may shift slightly; this schedule will be updated as necessary.
| Week | Topic | Date | Lecture | References | Notes/assignments |
|---|---|---|---|---|---|
| Week 1 | Basics | Jan 21 | Logistics + History | | slides |
| | | Jan 23 | Multi-Layer Perceptrons (MLPs); Backpropagation | DiDL (Ch. 4-5); CS 4780 (Sp2023); Backprop; Tensorflow Playground | slides (in-class); slides (complete) |
| Week 2 | Training Neural Networks | Jan 28 | Optimization | DiDL (Ch. 12); Optimization Demo Application | slides (in-class); slides (complete) |
| | | Jan 30 | Regularization | DiDL (Ch. 3.7, 5.6, 8.5) | slides (in-class); HW1 Released |
| Week 3 | Computer Vision | Feb 4 | Convolutional Neural Networks | DiDL (Ch. 7); CNN Visualization WebApp | slides (in-class); Quiz 1 Released |
| | | Feb 6 | Convolutional Neural Networks (continued) | DiDL (Ch. 7) | slides (in-class); slides (continued); Quiz 1 Due |
| Week 4 | | Feb 11 | Modern ConvNets | DiDL (Ch. 8) | slides (in-class); slides (complete) |
| | Natural Language Processing | Feb 13 | Word Embeddings | DiDL (Ch. 9); Word Embedding BlogPost | slides (in-class); HW1 Due; HW2 Released |
| Week 5 | | Feb 18 | FEB BREAK (No Class) | | |
| | | Feb 20 | Recurrent Neural Networks (RNNs) + Long Short-Term Memory (LSTM) | | slides (in-class) |
| Week 6 | Natural Language Processing | Feb 25 | Attention; Transformers | DiDL (Ch. 11) | slides (in-class); A2 Released |
| | | Feb 27 | Transformers (continued); Large Language Models (LLMs) | Speech and Language Processing (Ch. 10-11) | slides (in-class) |
| Week 7 | Modern Vision Networks | Mar 4 | Vision Pre-Training (Supervised, Self-Supervised) | | slides (in-class) |
| | | Mar 6 | Vision-Language Models | | slides (in-class); HW2 Due; A2 Due; HW3 Released |
| Week 8 | Generative Models | Mar 11 | Discriminators; Generative Adversarial Networks (GANs) | | slides (in-class) |
| | | Mar 13 | U-Nets; Variational Autoencoders (VAEs) | | slides (in-class); HW3 Due; A3 Due; HW4 Released |
| Week 9 | | Mar 18 | Diffusion Models | | slides (in-class) |
| | | Mar 20 | Diffusion II | What are Diffusion Models?; Understanding Diffusion Models: A Unified Perspective; Inception Labs: Language Generation with Diffusion | slides (in-class) |
| Week 10 | Midterm | Mar 24 | Review Recitation | Fundamental Topics; Deep Learning Topics | |
| | | Mar 25 | Midterm Jeopardy | | HW4 Due (last late day) |
| | | Mar 27 | Midterm | | |
While this course does not follow a specific textbook, useful references exist for many of the topics covered. Pointers to references will be provided here.
Cornell University provides a comprehensive set of mental health resources, and the student group Body Positive Cornell has put together a flyer outlining the resources available.
You are encouraged to actively participate in class. This can take the form of asking questions in class, responding to questions posed to the class, and actively asking and answering questions on the online discussion board.
Students are free to share code and ideas within their stated project/homework group for a given assignment, but should not discuss details of an assignment with individuals outside their group. Exams are individual assessments and must be completed on your own.
The Cornell Code of Academic Integrity applies to this course.
In compliance with Cornell University policy and equal access laws, we are available to discuss appropriate academic accommodations that may be required for students with disabilities. Requests for academic accommodations are to be made during the first three weeks of the semester, except in unusual circumstances, so arrangements can be made. Students are encouraged to register with Student Disability Services to verify their eligibility for appropriate accommodations.