Large-Scale Optimization

Course (Veranstaltung): 042531 (Übung: 042532)
Day: Monday (Übung: Wednesday)
Time: 10.15 - 12.00
Location: Otto-Hahn-Str. 12, Campus Nord; R 1.055 (Mon), R 1.056 (Wed)

Lecture Information

  • Module description: pdf
  • Language: English
  • Lecturer: Dr. Sangkyun Lee
  • Attendance at the practice sessions (Übung) is REQUIRED.
  • Office hour (lecturer): Thursday 13:00-14:00, LS8, OH 12, Room 4.023 (or by appointment)


This course focuses on optimization techniques for solving large-scale problems that typically appear in statistical learning and data analysis tasks with big data. Topics include methods widely adopted in the modern research literature, such as projected gradient methods, accelerated first-order algorithms, conjugate gradient methods, quasi-Newton methods, block coordinate descent, proximal point methods, stochastic sub-gradient algorithms, the alternating direction method of multipliers, and semi-definite programming. The efficiency of these methods, in terms of convergence rate and overall time complexity, will be discussed, so that one can see how to choose a suitable method for one's own research problem. We will also discuss separable (approximate) reformulations of optimization problems that are better suited to parallel computation than their original formulations. Discussions in the course will be problem-oriented, so that methods are presented in the context of specific problem instances whenever possible. Homework assignments will be given to provide background knowledge for the course or to check understanding of the techniques.
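To give a flavor of the methods listed above, here is a minimal sketch of one of them, the proximal gradient method (ISTA), applied to a small lasso problem. The data, function names, and parameter choices below are hypothetical and only for illustration, not course material.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient method for: minimize 0.5*||Ax - b||^2 + lam*||x||_1
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Hypothetical synthetic instance with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

With noiseless data and a small regularization weight, the recovered `x_hat` is close to the sparse ground truth; the same prox-plus-gradient-step structure reappears in several of the accelerated and stochastic variants covered in the course.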


The aim of this course is to provide students with an understanding of modern optimization techniques suitable for large-scale, big-data problems, so that they can choose, apply, and/or modify appropriate methods for their own research problems.


  • Lectures: for each method, we study the structure of its target problem, the algorithm itself, and its theoretical convergence properties.
  • Practices: we will see real-world examples where the optimization methods discussed in the lectures are used.


This lecture is based on recent publications, which will be assigned as reading homework; no textbook is required. Another lecture of mine may be helpful if you need background knowledge. Some of my lecture materials are adapted from:
  • EE236C (UCLA) by L. Vandenberghe, link
  • 10-725 (CMU) by G. Gordon and R. Tibshirani, link
  • EE364b (Stanford) by S. Boyd, link
The following is a tentative schedule and may be subject to change.

Extra material:
  • Optimality conditions for nonsmooth convex optimization [link]
  • Google[x], Project Wing [youtube]
  • Class projects from Stanford EE364b 2014 [link]
  • L1-L2 Optimization in Signal and Image Processing (This provides a nice overview of some applications/methods we discuss. PDF is accessible within TU Dortmund)
  • CVXPY: Python-based modeling language for convex optimization problems link

Final Exam: Mini-Project

  • Proposal (due by 05.02): a successful proposal is required to qualify for the final exam. Check the requirements discussed on 14.01: pdf
  • Presentation (19-20.03): oral presentation of the mini-project (20 min for a 1-person group, 30 min for a 2-person group)
  • Final report (due by 25.03): up to 7 pages (extra pages allowed only for large tables and plots)
  • Final grade will be based on the presentation and the final report.
  • LaTeX template for the proposal and the final report: link


There is no restriction on the choice of topic, but it should be motivating for you and fit within the time frame of the course. You can focus on either a method or an application, but you are not restricted to these:

  • Method focus: e.g., modify a method to improve its speed on a certain problem. You need to implement the method; adapting existing code is allowed. Benchmarking several algorithms on one problem setting is also possible.
  • Application focus: e.g., apply an optimization method to your own research problem. You do not need to implement the method yourself if you can find off-the-shelf software.


These data sets can be useful for your project.

  • Collections of "standard" data sets:
    > UCI Machine Learning Repository: a common source of data sets link
    > LIBSVM datasets: link
  • Computer vision and image datasets:
    > A collection at UT link
    > Face recognition databases link
  • Movie rating data sets:
    > NF dataset (right click -> download link as a file; password required to unzip; about 500 MB): link
    > MovieLens dataset (smaller than NF) link
  • Bioinformatics:
    > Gene expression omnibus (GEO): link
    > The cancer genome atlas (TCGA): link

Grading Criteria:

  • Motivation: why is this an interesting problem for you and/or for the audience?
  • Understanding: the method(s) of choice should be explained in a compact but precise manner
  • Creativity: is there a good reason behind the choice of the method(s), the choice of the problem(s), etc.?
  • Evaluation: how well is the hypothesis (e.g., that a modification makes an algorithm faster) validated by theoretical analysis or experiments?
  • Presentation: how well is the material organized and delivered in the presentation?
  • Repeatability: are scripts and executables provided to reproduce (a part of) the experiments and results?

Suggested Topics:

Some suggestions for mini-projects will be added.

Lecture Schedule:

The schedule may change, so please check often.

Bonus Point Summary