Jia (Kevin) Liu,

Assistant Professor of Electrical and Computer Engineering, The Ohio State University


COM S 578X: Optimization for Machine Learning
(Fall 2019)


Personnel

Instructor: Jia (Kevin) Liu, Assistant Professor, Dept. of Computer Science
Contact: 209 Atanasoff Hall, jialiu@iastate.edu
Time & Location: Tue/Thu 8:00am -- 9:20am, Sweeney Hall 1126
Office Hours: Wed 5pm -- 6pm or by appointment
TA: Menglu Yu (mengluy@iastate.edu)
TA Hours: Thu 10am -- 11am

Course Description [Syllabus]

Since its inception as a discipline, machine learning has made extensive use of optimization formulations and algorithms. Likewise, machine learning has contributed substantially to optimization theory, driving the development of new optimization approaches that address the significant challenges posed by machine learning applications. This course is geared toward this intersection of the two fields. Besides covering established optimization theory in machine learning contexts (first-order methods, stochastic approximation, convex optimization, interior-point methods, proximal methods, etc.), the course devotes significant attention to newer themes in machine learning, such as regularized optimization, robust optimization, and a variety of gradient descent acceleration methods based on momentum and second-order information.
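To give a flavor of the algorithms covered, here is a minimal illustrative sketch (not course material) of gradient descent with heavy-ball momentum on a least-squares objective; the problem data, step size, and momentum coefficient below are assumptions chosen for illustration, not values prescribed by the course.

```python
import numpy as np

# Illustrative least-squares objective: f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)

def grad(x):
    """Gradient of f at x: A^T (A x - b)."""
    return A.T @ (A @ x - b)

# Heavy-ball momentum update:
#   x_{k+1} = x_k - alpha * grad(x_k) + beta * (x_k - x_{k-1})
alpha = 1.0 / np.linalg.norm(A, 2) ** 2  # step size <= 1/L, L = ||A||_2^2 (Lipschitz constant of grad)
beta = 0.9                               # momentum coefficient (illustrative choice)

x = x_prev = np.zeros(5)
for _ in range(500):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

# Compare against the closed-form least-squares solution
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))  # distance to the optimum after 500 iterations
```

The momentum term `beta * (x - x_prev)` reuses the previous displacement to damp oscillations and speed up convergence relative to plain gradient descent; this is the basic idea behind the acceleration methods discussed later in the course.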

Course Materials

There is no required textbook. Lectures are developed based on the following references:

    • [BV] S. Boyd and L. Vandenberghe, "Convex Optimization," Cambridge University Press, 2004;
    • [BSS] M. Bazaraa, H.D. Sherali, and C.M. Shetty, "Nonlinear Programming: Theory and Algorithms," John Wiley & Sons, 2006;
    • [Nesterov] Y. Nesterov, "Introductory Lectures on Convex Optimization: A Basic Course," Springer, 2004;
    • [NW] J. Nocedal and S. J. Wright, "Numerical Optimization," Springer, 2006;
    • Classic and trending papers in the field, important monographs;
    • And many excellent online teaching resources by colleagues (including, among others, Profs. Stephen Boyd, Ryan Tibshirani, Stephen Wright, Constantine Caramanis, and Yuejie Chi), as well as my own class notes from Prof. Hanif D. Sherali's courses at Virginia Tech.

Homework

    • There will be five homework assignments, assigned roughly biweekly. Homework must be typeset in LaTeX (see the homework LaTeX template here).

Midterm

The in-class midterm exam is closed-book and closed-notes, but you may bring a one-page cheat sheet. The midterm covers all material from the lectures completed before the exam.

Final Project

You may complete the project individually or in a team of at most two. Project proposals are due shortly after the midterm. Final reports are due by the beginning of final exam week (Dec. 11) and should follow the NeurIPS format. Each project will give a 30-minute in-class presentation at the end of the semester. Potential project ideas include, but are not limited to: i) a nontrivial extension of results introduced in class; ii) a novel application in your own research area; iii) a new theoretical analysis of an existing algorithm. Each project should contain something new, and it is important that you justify its novelty.

Grading Policy

    • Homework: 30%; Midterm: 30%; Final project: 40%.

Late Policy

Without the instructor's consent, late homework assignments or final reports will not be accepted and will receive a grade of zero. For a foreseeable conflict such as a conference deadline, a written extension request is required at least five days in advance. In the case of an emergency (sudden illness, family problems, etc.), an after-the-fact notice is acceptable, but we emphasize that this is reserved for true emergencies.

Schedule

Below is an estimated class schedule, which is subject to change depending on lecture progress and class interests. Please check back for the latest adjustments.

Class | Date  | Topics                                                                      | Lecture Notes | Reading Assignments
1     | 8/27  | Course Info & Introduction                                                  | Lecture 1     |
2     | 8/29  | Math Background Review: Basic Analysis and Linear Algebra                   | Lecture 2     | Appendices in [BV, BSS]
3     | 9/3   | (cont'd)                                                                    |               |
4     | 9/5   | I. Foundations of Convex Analysis: Convex Sets and Convex Functions         | Lecture 3     | [BV, Ch. 2, 3]
5     | 9/10  | Duality                                                                     | Lecture 4     | [BV, Ch. 5]
6     | 9/12  | (cont'd)                                                                    |               |
7     | 9/17  | Optimality Conditions                                                       | Lecture 5     | [BSS, Ch. 4, 5]
8     | 9/19  | (cont'd)                                                                    |               |
9     | 9/24  | II. First-Order Methods: Gradient Descent                                   | Lecture 6     | [Nesterov, Ch. 1]
10    | 9/26  | (cont'd)                                                                    |               |
11    | 10/1  | (cont'd)                                                                    |               | [Nesterov, Ch. 2]
12    | 10/3  | Accelerated First-Order Methods                                             | Lecture 7     | [BSS, Ch. 8.8]
13    | 10/8  | (cont'd)                                                                    |               |
14    | 10/10 | (cont'd)                                                                    |               |
15    | 10/15 | (cont'd)                                                                    |               | [Nesterov, Ch. 2.2]
16    | 10/17 | (cont'd)                                                                    |               |
17    | 10/22 | (cont'd)                                                                    |               |
--    | 10/24 | Midterm Exam (in class, 8:00am -- 9:20am)                                   |               |
18    | 10/29 | Subgradient Method                                                          | Lecture 8     |
19    | 10/31 | (cont'd)                                                                    |               |
20    | 11/5  | III. Stochastic First-Order Methods: Stochastic (Sub)Gradient Method        | Lecture 9     |
21    | 11/7  | (cont'd)                                                                    |               |
22    | 11/12 | (cont'd)                                                                    |               |
23    | 11/14 | (cont'd)                                                                    |               |
24    | 11/19 | IV. Sparse/Regularized Optimization: Compressed Sensing, Matrix Completion  | Lecture 10    |
25    | 11/21 | (cont'd)                                                                    |               |
--    | 11/25 -- 11/29 | Thanksgiving Break                                                 |               |
26    | 12/3  | V. Proximal and Operator Splitting: ADMM, Coordinate Descent                | Lecture 11    |
27    | 12/5  | (cont'd)                                                                    |               |
28    | 12/10 | In-Class Project Presentations                                              |               |
29    | 12/12 | In-Class Project Presentations (Final Exam Week)                            |               |

Academic Integrity

This course will follow ISU's Code of Academic Conduct. Discussions of homework assignments and final projects are encouraged. However, what you turn in must be your own. You should not directly copy solutions from others. Any reference (including online resources) used in your solution must be clearly cited.

 
Copyright © 2004- Jia (Kevin) Liu. All rights reserved.