Convex and Conic Optimisation

Lecturer

Dr James Saunderson, Monash University

Synopsis

This course is a mathematical introduction to continuous optimisation through the lens of convexity. It introduces the basic theory of convex sets and functions, develops the skill of recognising and reformulating problems as structured convex optimisation problems, and introduces some key algorithmic ideas used to solve them.

Applications will be used throughout to illustrate the material. They will be drawn from areas such as control and dynamical systems, signal processing, machine learning, game theory, and information theory. No prior knowledge of these topics is required; the applications will be presented in a self-contained way.

Course overview

Week 1: After a general introduction to the course, this week covers the basic language of convexity.

  • Convex sets I: definition, examples, operations preserving convexity, convex hull, extreme points (see the short computational sketch after this list);
  • Convex sets II: supporting hyperplanes, separation theorem;
  • Convex functions I: equivalent definitions, examples, operations preserving convexity;
  • Convex functions II: more equivalent definitions, (sub-)gradients, conjugate functions;
  • Quasi-convexity, log-convexity.
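
To make some of these notions concrete, the following minimal Python sketch (assuming NumPy and SciPy are available; the random point cloud is purely illustrative) computes the convex hull of a finite point set and reads off its extreme points:

    import numpy as np
    from scipy.spatial import ConvexHull

    # Illustrative data: 30 random points in the plane.
    rng = np.random.default_rng(0)
    points = rng.random((30, 2))

    # The convex hull of a finite point set is a polytope whose
    # extreme points form a subset of the original points.
    hull = ConvexHull(points)
    extreme_points = points[hull.vertices]
    print(f"{len(hull.vertices)} of the 30 points are extreme points")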

Week 2: This week will introduce convex optimisation problems and the reduction to conic form, and then discuss three major classes of conic optimisation problems, as well as some applications of each.

  • Convex optimisation problems. Basic properties. Reduction to conic form.
  • Linear programming: basic form and properties
  • Applications of linear programming (possibilities include: two-player zero-sum games, network flows, regression/approximation with respect to L-1 and L-infinity norms, optimal control; one such reformulation is sketched after this list)
  • Quadratic programs, second-order cone programs: basic forms, properties, recognition
  • Applications (possibilities include: sparse regression, support vector machines, robust linear programming)
  • Relative entropy programs and applications (possibilities include: channel capacity in information theory, geometric design problems)
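
As a taste of the reformulation skills this week develops, here is a minimal sketch (assuming Python with the CVXPY modelling package; the data A and b are illustrative placeholders) of regression in the L-infinity norm, which the modelling layer reduces to a linear program:

    import cvxpy as cp
    import numpy as np

    # Placeholder data: any A (m x n) and b (m,) would do.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

    # Minimise the L-infinity norm of the residual Ax - b.
    # Introducing t with -t <= (Ax - b)_i <= t for all i yields an LP.
    x = cp.Variable(5)
    problem = cp.Problem(cp.Minimize(cp.norm(A @ x - b, "inf")))
    problem.solve()
    print("optimal value:", problem.value)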

Week 3: This week we introduce semidefinite programs and sample some applications, before discussing duality and optimality conditions for different common ways to write convex optimisation problems.

  • Semidefinite programs: basic forms, properties, recognition
  • Applications (possibilities include: optimisation of matrix norms, estimation of rotations, certifying stability of linear dynamical systems; the last of these is sketched after this list)
  • Conic duality, optimality conditions for conic optimisation, and the special cases of LP and SDP
  • Classical NLP form of convex optimisation problems: Lagrange duality and the KKT conditions
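
As one concrete instance of the applications listed above, here is a hedged sketch (again assuming Python with CVXPY; the matrix A is an arbitrary stable example) that certifies stability of the linear dynamical system x' = Ax by searching for a Lyapunov certificate P, with P positive definite and A^T P + P A negative definite:

    import cvxpy as cp
    import numpy as np

    # Illustrative stable matrix (eigenvalues -1 and -2).
    A = np.array([[-1.0, 1.0],
                  [0.0, -2.0]])

    # Feasibility SDP: find symmetric P with P >= I and A'P + PA <= -I.
    # (Bounding away from zero by I rules out the trivial P = 0.)
    P = cp.Variable((2, 2), symmetric=True)
    constraints = [P >> np.eye(2), A.T @ P + P @ A << -np.eye(2)]
    cp.Problem(cp.Minimize(0), constraints).solve(solver=cp.SCS)
    print("stability certificate found:", P.value is not None)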

Week 4: This week we briefly introduce some important algorithms for convex optimisation associated with the two flavours of optimality conditions introduced in week 3.

  • Gradient descent, smoothness and strong convexity (a minimal sketch follows this list).
  • (Projected) (sub)gradient methods.
  • Newton’s method and self-concordance for unconstrained problems
  • Interior-point methods I: self-concordant barriers
  • Interior-point methods II: primal path-following method
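
As a preview of the simplest of these methods, here is a minimal gradient descent sketch in Python (the least-squares objective and the constant step size 1/L, where L is the smoothness constant, are illustrative choices, not prescribed course material):

    import numpy as np

    def gradient_descent(grad, x0, step_size, num_iters):
        # Plain gradient descent with a constant step size.
        x = x0
        for _ in range(num_iters):
            x = x - step_size * grad(x)
        return x

    # Smooth, strongly convex example: f(x) = 0.5 * ||Ax - b||^2,
    # with gradient A^T (Ax - b) and smoothness constant
    # L = lambda_max(A^T A), giving the classical step size 1/L.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    L = np.linalg.eigvalsh(A.T @ A).max()
    x_hat = gradient_descent(lambda x: A.T @ (A @ x - b),
                             np.zeros(5), 1.0 / L, 500)
    print("residual norm:", np.linalg.norm(A @ x_hat - b))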

Prerequisites

  • (Advanced) linear algebra: finite-dimensional vector spaces, inner products, eigenvalue decomposition, (ideally) singular value decomposition
  • Multivariate calculus: gradients and Hessians, Taylor’s theorem
  • Basic analysis: convergence of sequences, open and closed sets, compactness
  • Probability: basic facts and language about discrete and continuous random variables will be useful at some points.
  • Programming: some basic experience is required (an introductory course or equivalent experience in C++, Julia, Python, MATLAB, or similar should suffice). Computational exercises will focus on high-level modelling tools in MATLAB, Julia, or Python; a small example of such a tool appears after this list.
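
For a sense of what these high-level modelling tools look like, here is a minimal sketch assuming Python with the CVXPY package (one such tool; the toy problem is purely illustrative):

    import cvxpy as cp

    # A toy linear program: maximise x + y over a triangle.
    x, y = cp.Variable(), cp.Variable()
    constraints = [x >= 0, y >= 0, x + 2 * y <= 3]
    problem = cp.Problem(cp.Maximize(x + y), constraints)
    problem.solve()
    print("optimal value:", problem.value)  # 3.0, attained at (x, y) = (3, 0)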

This is intended to be a foundational course.

Assessment

  • Three assignments (20% each) due at the end of weeks 2, 3, and 4.
  • Take home exam (40%)

(subject to change)

Attendance requirements

Participation in all lectures and tutorials is expected.

For those completing the subject for their own knowledge/interest, evidence of at least 80% attendance at lectures and tutorials is required to receive a certificate of attendance.

Resources/pre-reading

The following text will be a useful reference for the course:

  • S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.

As pre-reading, it is recommended that students read through Appendix A of Boyd and Vandenberghe.

Not sure if you should sign up for this course?

Take this QUIZ to self-evaluate your command of the key foundational knowledge required.

Dr James Saunderson, Monash University

James Saunderson is a Senior Lecturer and ARC DECRA Fellow in the Department of Electrical and Computer Systems Engineering at Monash University. He received a PhD in Electrical Engineering and Computer Science from MIT in 2015 and held postdoctoral positions at Caltech and the University of Washington before joining Monash. In 2020 he was awarded (with Hamza Fawzi and Pablo Parrilo) the SIAM Activity Group on Optimization Best Paper Prize.