ORIE 6730

Course information provided by the Courses of Study 2024-2025.

Empirical observation of deep neural networks reveals surprising phenomena that classical statistical theory fails to fully describe. For example, classical bounds on generalization error grow with the flexibility of the model class, yet neural networks often exhibit low generalization error despite being extremely flexible. Likewise, training deep neural networks with stochastic gradient descent produces accurate models despite the non-convexity of the loss landscape. A recently emerged literature is developing new theory to explain these and other mysteries. After presenting relevant theoretical results from classical statistics and a brief refresher on deep learning, the course presents theoretical results from recent research articles, complementing them with empirical evidence and emphasizing what is unknown along with what is known.
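As an illustrative aside (not part of the official course description): the tension between model flexibility and generalization error can be seen even in a toy random-features regression. The minimal sketch below assumes a hypothetical noisy linear target and fits minimum-norm least squares via the pseudoinverse, printing train and test error as the number of random ReLU features grows past the number of training points. With suitable noise and scaling, the test error often peaks near the interpolation threshold and then improves again as the model becomes more flexible, a "double descent" shape that classical capacity-based bounds do not predict.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: scalar target generated by a noisy linear map.
d, n_train, n_test = 20, 100, 1000
w_star = rng.normal(size=d) / np.sqrt(d)
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ w_star + 0.1 * rng.normal(size=n_train)
y_test = X_test @ w_star  # compare against the noiseless signal

def relu_features(X, W):
    """Random ReLU features: phi(x) = max(0, W^T x), computed row-wise."""
    return np.maximum(X @ W, 0.0)

for p in [10, 50, 90, 100, 110, 200, 500, 2000]:
    W = rng.normal(size=(d, p)) / np.sqrt(d)
    Phi_train = relu_features(X_train, W)
    Phi_test = relu_features(X_test, W)
    # Minimum-norm least-squares fit; it interpolates once p >= n_train.
    theta = np.linalg.pinv(Phi_train) @ y_train
    train_mse = np.mean((Phi_train @ theta - y_train) ** 2)
    test_mse = np.mean((Phi_test @ theta - y_test) ** 2)
    print(f"p={p:5d}  train MSE={train_mse:.4f}  test MSE={test_mse:.4f}")
```

The exact shape of the test-error curve depends on the noise level, feature scaling, and random seed; the sketch is meant only to make the phenomenon concrete, not to reproduce any specific result from the course.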

When Offered: Fall.

Prerequisites: CS 4782 or CS 5787, and ORIE 6500.

Comments: Familiarity with deep learning and graduate-level familiarity with probability are encouraged.

Outcomes
  • Identify the main ideas in classical statistical theory and the phenomena in deep learning that they fail to explain.
  • Identify the main hypotheses in the recent literature for the emergence of these phenomena and the evidence for and against these hypotheses.
  • Demonstrate an understanding of the main ideas in recent theoretical results explaining why deep learning works so well.

Enrollment Information

Syllabi: none
  • Regular Academic Session.
  • 3 Credits (Student Option grading)
  • Class #19576: ORIE 6730 LEC 001
  • Instruction Mode: In Person

Syllabi: none
  • Regular Academic Session.
  • 3 Credits (Student Option grading)
  • Class #20783: ORIE 6730 LEC 030
    • Tuesdays and Thursdays (TR), Cornell Tech
    • Aug 26 - Dec 9, 2024
    • Frazier, P
  • Instruction Mode: In Person
  • Enrollment limited to: Cornell Tech Doctor of Philosophy (PhD) students.