
Large-Scale Optimization

Course no.: 042531 (exercise/Übung: 042532)
Weekday:    Monday (exercise: Wednesday)
Time:       10:15 - 12:00
Location:   Otto-Hahn-Str. 12, R 1.056, Campus Nord

Lecture Information

  • Module description: pdf
  • Language: English
  • Lecturer: Dr. Sangkyun Lee
  • Attendance at the practice sessions (Übung) is REQUIRED.
  • Office hour (lecturer): Thursday 13:00-14:00, LS8, OH 12, Room 4.023 (or by appointment)

Content:

This course focuses on optimization techniques for solving the large-scale problems that typically appear in statistical learning and data analysis tasks with big data. Topics include methods widely adopted in the modern research literature, such as projected gradient methods, accelerated first-order algorithms, conjugate gradient methods, quasi-Newton methods, block coordinate descent, proximal point methods, stochastic sub-gradient algorithms, the alternating direction method of multipliers, and semi-definite programming. The efficiency of these methods, in terms of convergence rate and overall time complexity, will be discussed so that one can see how to choose a suitable method for one's own research problems. Separable (approximate) reformulations of optimization problems, which are better suited for parallel computation than their original formulations, will also be discussed. Discussions in the course will be problem-oriented, so methods will be presented in the context of specific problem instances whenever possible. Homework assignments will be given to provide background knowledge for the course or to check understanding of the techniques.
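
To give a concrete flavor of the methods listed above, below is a minimal, illustrative sketch (not course material) of a proximal gradient iteration for the lasso problem min_x 0.5*||Ax - b||^2 + lambda*||x||_1, written in Python with NumPy; the random data, regularization weight, and iteration count are arbitrary assumptions chosen only for illustration.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t*||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def proximal_gradient_lasso(A, b, lam, n_iter=500):
        # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with fixed step size 1/L,
        # where L = ||A||_2^2 is a Lipschitz constant of the smooth part's gradient.
        L = np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)                   # gradient of the smooth term
            x = soft_threshold(x - grad / L, lam / L)  # proximal (shrinkage) step
        return x

    # Illustrative usage on random data (not a course dataset).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.0, 0.5]
    b = A @ x_true + 0.1 * rng.standard_normal(100)
    x_hat = proximal_gradient_lasso(A, b, lam=0.5)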


Aims:

The aim of this course is to provide students with understanding of modern optimization techniques suitable for large-scale/big-data problems, so that students see how they can choose, apply, and/or modify appropriate methods for their own research problems.


Organization:

  • Lectures: for each method, we focus on the structure of its target problem, the algorithm itself, and its convergence properties in theory.
  • Practice sessions: we will see real-world examples where the optimization methods discussed in the lectures are used.


Lectures:

This lecture will be based on recent publications, which will be assigned as reading homework. No textbook is required. Another lecture of mine will be helpful if you need background knowledge:

Some of my lecture materials are from:
  • EE236C (UCLA) by L. Vandenberghe, link
  • 10-725 (CMU) by G. Gordon and R. Tibshirani, link
  • EE364b (Stanford) by S. Boyd, link
The following is a tentative schedule and may be subject to change.


Reading assignments:

Read the following papers as your homework.


Final Exam: Mini-Project

  • Proposal (due by 17.07): a successful proposal is required to qualify for the final exam. Check the requirements discussed on 14.01: pdf
  • Presentation (18.09): oral presentation of the mini-project (20 min for a 1-person group, 30 min for a 2-person group)
  • Final report (due by 24.09): up to 7 pages (extra pages allowed only for large tables and plots)
  • The final grade will be based on the presentation (50%) and the final report (50%).
  • LaTeX template for the proposal and the final report: link

Topics:

There is no restriction on the choice of topic, but it should be motivating for the student and should fit within the time frame of the lecture. You can focus on either a method or an application, but you are not restricted to these two options.

  • Method focus: e.g., modifying a method for better speed on a certain problem. You need to implement the method; adapting existing code is allowed. Benchmarking several algorithms on one problem setting is also possible (see the sketch after this list).
  • Application focus: e.g., applying an optimization method to your own research problem. You do not need to implement the method yourself if you can find off-the-shelf software.
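
As a rough idea of what the "benchmarking" option above could look like, the following is a minimal sketch (not a project requirement) comparing plain gradient descent with Nesterov's accelerated gradient on a least-squares objective; the random problem instance and iteration counts are arbitrary assumptions.

    import numpy as np

    def gradient_descent(A, b, n_iter=200):
        # Plain gradient descent on f(x) = 0.5*||Ax - b||^2 with step size 1/L.
        L = np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        obj = []
        for _ in range(n_iter):
            x = x - A.T @ (A @ x - b) / L
            obj.append(0.5 * np.linalg.norm(A @ x - b) ** 2)
        return x, obj

    def nesterov_accelerated(A, b, n_iter=200):
        # Nesterov's accelerated gradient method on the same objective.
        L = np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        y = x.copy()
        t = 1.0
        obj = []
        for _ in range(n_iter):
            x_next = y - A.T @ (A @ y - b) / L          # gradient step at the extrapolated point
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum/extrapolation step
            x, t = x_next, t_next
            obj.append(0.5 * np.linalg.norm(A @ x - b) ** 2)
        return x, obj

    # Compare the objective decrease of both methods on a random instance.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    _, obj_gd = gradient_descent(A, b)
    _, obj_agd = nesterov_accelerated(A, b)
    print("final objective  GD: %.4e   AGD: %.4e" % (obj_gd[-1], obj_agd[-1]))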

Datasets:

These data sets can be useful for your project.

  • Collections of "standard" data sets:
    > UCI Machine Learning Repository: a common source of data sets link
    > LIBSVM datasets: link
  • Computer vision and image datasets:
    > A collection at UT link
    > Face recognition databases link
    > Visual SLAM for driverless cars link
  • Movie rating data sets:
    > NF dataset (right click -> download link as a file. Password required to unzip. About 500 MB): link
    > MovieLens dataset (smaller than NF) link
  • Bioinformatics:
    > Gene expression omnibus (GEO): link
    > The cancer genome atlas (TCGA): link
  • ECML/PKDD 2015 Discovery Challenge: link

Grading Criteria:

  • Motivation: why is this an interesting problem for you and/or for the audience?
  • Understanding: the method(s) of choice should be explained in a compact but precise manner
  • Creativity: is there a good reason behind the choice of method(s), the choice of problem(s), etc.?
  • Evaluation: how well is a hypothesis (e.g., that a modification makes an algorithm faster) validated by theoretical analysis or experiments?
  • Presentation: how well are things organized and delivered in the presentation?
  • Repeatability: does the project provide a script and executables to reproduce (a part of) the experiments and results?



Lecture Schedule:

The schedule may change, so please check often.




Bonus Point Summary