Course Syllabus

Deep Learning

Instructor: Dr. Ellick Chan; TA: Mark Harmon

Contact: Ellick.chan@northwestern.edu, MarkHarmon2012@u.northwestern.edu

Office hours: E-mail instructor to set up phone meeting/Skype

Description: Many challenging problems in diverse areas such as computer vision, speech recognition, and machine translation have recently seen great progress thanks to an emerging technology called deep learning. At its core, deep learning is inspired by a simplified model of how the human brain works: it builds effective hierarchical representations of complex data. This course will explore applications and theory relevant to problem solving with deep learning. By the end of this course, students will gain intuition about how to apply various techniques judiciously and how to evaluate success. Students will also gain deeper insight into why certain techniques may work or fail for particular kinds of problems.

Logistics: Class will be held in the Krebs room on the following schedule, with a poster session after class on June 2nd.

1. Friday, March 31: 4-7 PM

2. Monday, April 3: 10:30 AM-1:30 PM

3. Friday, April 14: 4-7 PM

4. Monday, April 17: 10:30 AM-1:30 PM

5. Friday, April 28: 4-7 PM

6. Monday, May 1: 10:30 AM-1:30 PM

7. Thursday, May 11: 5:30-8:30 PM

8. Monday, May 15: 10:30 AM-1:30 PM

9. Friday, June 2: 9 AM-12 PM and 4:30-7:30 PM

•        Tentative: Poster session 5-8 PM

Grading: 30% project, 50% assignments, 20% in-class quizzes, plus up to 15% bonus for competitions

Late policy: 10% deduction per day late, up to a maximum of 3 days.

Honor policy: Students are encouraged to work in pairs for homework, labs and the project, but all students are expected to fully understand the work and contribute fairly to the projects/assignments.

 

Class Objectives: Upon successful completion of the course, students should be able to:

  • Understand key concepts related to Deep Learning
    • Derive a simple feedforward deep neural network (DNN)
    • Understand DNN architecture and parameters
    • Gain an intuitive understanding of the theory behind why DNNs work
    • Be able to compare DNNs to other Machine Learning techniques
  • Apply DNNs to real-life problems
    • Work with TensorFlow/Python for DNN programming
      • Computer-based problems, project
    • Understand how to frame problems in the NN framework
    • Know where to look for papers and help on DNNs

 

This course is organized as follows:

  1. Introduction: Friday, March 31st, 4-7 PM

This lecture covers the basics of deep neural networks and previews the topics the course will explore.

  2. Software tutorial: Monday, April 3rd, 10:30 AM-1:30 PM

In this lecture, we will introduce software relevant to deep learning such as NumPy, Matplotlib, and TensorFlow. At the conclusion of this lecture, we will build a simple classifier in Python.
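As a taste of what that classifier might look like, here is a minimal sketch (the dataset, model, and layer sizes are illustrative assumptions, not the tutorial's actual lab):

    import numpy as np
    import tensorflow as tf
    import matplotlib.pyplot as plt

    # Synthetic two-class data: points drawn around two different centers.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, (200, 2)),
                        rng.normal(2.0, 1.0, (200, 2))])
    y = np.concatenate([np.zeros(200), np.ones(200)]).astype("int32")

    # A single dense softmax layer: a logistic-regression-style classifier.
    model = tf.keras.Sequential([tf.keras.layers.Dense(2, activation="softmax")])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x, y, epochs=10, verbose=0)

    # Matplotlib: plot the points colored by their true label.
    plt.scatter(x[:, 0], x[:, 1], c=y)
    plt.show()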

  3. NN in “depth”: Friday, April 14th, 4-7 PM

Now we are ready to investigate neural networks in depth. This lecture will explore the creation of a basic deep neural network and how to train it.
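For concreteness, here is a minimal sketch of that idea (an assumed example, not the lecture's own code): a two-layer network written in plain NumPy and trained by hand with gradient descent.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 2))                     # inputs
    y = (x[:, :1] * x[:, 1:2] > 0).astype(float)      # XOR-like target

    w1, b1 = rng.normal(size=(2, 8)), np.zeros(8)     # hidden layer
    w2, b2 = rng.normal(size=(8, 1)), np.zeros(1)     # output layer
    lr = 0.1

    for step in range(2000):
        # Forward pass: tanh hidden layer, sigmoid output.
        h = np.tanh(x @ w1 + b1)
        p = 1 / (1 + np.exp(-(h @ w2 + b2)))
        # Backward pass: gradients of the mean cross-entropy loss.
        dp = (p - y) / len(x)
        dw2, db2 = h.T @ dp, dp.sum(0)
        dh = dp @ w2.T * (1 - h ** 2)
        dw1, db1 = x.T @ dh, dh.sum(0)
        # Gradient descent update.
        w1 -= lr * dw1; b1 -= lr * db1
        w2 -= lr * dw2; b2 -= lr * db2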

  4. Optimization: Monday, April 17th, 10:30 AM-1:30 PM

This lecture explores optimization techniques to help train neural networks quickly and successfully.
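As an illustrative sketch (an assumption, not the lecture's own material): in TensorFlow/Keras the optimizer is a single argument, so techniques such as SGD with momentum and Adam can be compared by swapping it.

    import tensorflow as tf

    def make_model():
        return tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])

    sgd = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)
    adam = tf.keras.optimizers.Adam(learning_rate=0.001)

    for name, opt in [("sgd+momentum", sgd), ("adam", adam)]:
        model = make_model()
        model.compile(optimizer=opt, loss="sparse_categorical_crossentropy")
        # model.fit(...) would then run on the same data to compare training curves.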

  5. Generalization: Friday, April 28th, 4-7 PM

This lecture explores how neural networks may fail to generalize to new data and discusses techniques that help, including dropout and convolutional networks.
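A sketch of those two techniques in TensorFlow/Keras (the layer sizes and the 28x28 grayscale input shape are assumptions) might look like:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dropout(0.5),   # randomly zero half the activations during training
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])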

  6. Representations: Monday, May 1st, 10:30 AM-1:30 PM

This lecture goes into depth on how data is represented and introduces representation/feature learning.
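One way to make this concrete (an assumed illustration, not the lecture's code) is to treat a trained network's hidden layer as a learned representation and read it out as a feature extractor in Keras:

    import tensorflow as tf

    inputs = tf.keras.Input(shape=(784,))
    h = tf.keras.layers.Dense(128, activation="relu", name="features")(inputs)
    outputs = tf.keras.layers.Dense(10, activation="softmax")(h)
    model = tf.keras.Model(inputs, outputs)

    # After training `model`, the hidden activations serve as learned features.
    feature_extractor = tf.keras.Model(inputs, model.get_layer("features").output)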

  7. Generative: Thursday, May 11th, 5:30-8:30 PM

This lecture discusses how to build a model of the data and generate new examples from it. We discuss how being able to generate new data helps with understanding models.
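As a deliberately simple sketch of that idea (an assumption, far simpler than the deep generative models the lecture covers), one can fit a model to data and then sample new examples from it:

    import numpy as np

    # Observed data drawn from some unknown process.
    data = np.random.default_rng(0).normal(loc=3.0, scale=0.5, size=1000)

    # "Model" the data with a Gaussian, then generate new examples from the model.
    mu, sigma = data.mean(), data.std()
    new_examples = np.random.default_rng(1).normal(mu, sigma, size=5)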

  8. Recurrent: Monday, May 15th, 10:30 AM-1:30 PM

This lecture explores how neural networks can model sequence data using RNNs, GRUs, and LSTMs.
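For example, an LSTM-based sequence classifier in TensorFlow/Keras might look like the following sketch (the layer sizes and input shape are assumptions); GRU or simple RNN layers drop in the same way:

    import tensorflow as tf

    model = tf.keras.Sequential([
        # Variable-length sequences of 8-dimensional steps.
        tf.keras.layers.LSTM(32, input_shape=(None, 8)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")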

  9. Applications and Poster Session: Friday, June 2nd, 9 AM-12 PM and 4:30-7:30 PM

This lecture surveys recent applications of neural networks to natural language processing, speech recognition, and other areas of deep learning.

 

We conclude the course by reviewing some exciting new prospects in deep learning and how students can get involved. Following this short lecture, we will hold a poster session for the class projects.
