Prof. Dr. Sebastian Pokutta
Vice President and Division Head
Mathematical Algorithmic Intelligence
AI in Society, Science, and Technology (AIS²T)
Zuse Institute Berlin (ZIB)
Professor of Optimization and Machine Learning
Institute of Mathematics
Electrical Engineering and Computer Science (courtesy)
Technische Universität Berlin
Research Lab. My lab works on Artificial Intelligence, Optimization, and Machine Learning. We develop new methodologies (e.g., new optimization and learning algorithms), combine learning with decision-making, and design AI systems for real-world deployment in various application contexts.
- Combettes, C. W., and Pokutta, S. (2020). Boosting Frank-Wolfe by Chasing Gradients. Preprint. [arXiv] [summary] [code]
- Carderera, A., and Pokutta, S. (2020). Second-order Conditional Gradients. Preprint. [arXiv] [code]
- Pfetsch, M., and Pokutta, S. (2020). IPBoost – Non-Convex Boosting via Integer Programming. Preprint. [arXiv] [summary]
- Pokutta, S., Singh, M., and Torrico, A. (2020). On the Unreasonable Effectiveness of the Greedy Algorithm: Greedy Adapts to Sharpness. Preprint. [arXiv] [poster]
- Combettes, C. W., and Pokutta, S. (2019). Revisiting the Approximate Carathéodory Problem via the Frank-Wolfe Algorithm. Preprint. [arXiv] [slides] [code]
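Several of the preprints above revolve around the Frank-Wolfe (conditional gradient) method. As a purely illustrative sketch (not code from any of these papers), a minimal textbook Frank-Wolfe loop over the probability simplex might look like:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_steps=200):
    """Minimal Frank-Wolfe (conditional gradient) loop over the
    probability simplex: the linear minimization oracle returns the
    vertex e_i minimizing <grad(x), e_i>, and the iterate moves toward
    it with the standard agnostic step size 2/(t+2)."""
    x = x0.copy()
    for t in range(num_steps):
        g = grad(x)
        i = int(np.argmin(g))        # LMO over the simplex: best vertex
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (t + 2.0)      # agnostic step-size rule
        x = (1.0 - gamma) * x + gamma * v
    return x

# Toy example: minimize ||x - b||^2 over the simplex; since b already
# lies in the simplex, the iterates approach b itself.
b = np.array([0.2, 0.5, 0.3])
x = frank_wolfe_simplex(lambda x: 2.0 * (x - b),
                        np.array([1.0, 0.0, 0.0]),
                        num_steps=2000)
```

Because each iterate is a convex combination of simplex vertices, it remains feasible at every step; the O(1/t) convergence rate follows from the standard analysis for smooth convex objectives.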
Select Recent Talks and Teaching.
- 03/2020: (general) “Künstliche Intelligenz im Arbeitsalltag” (Artificial Intelligence in Everyday Work). Keynote at Künstliche Intelligenz => verständlich @ TH Wildau (Wildau, Germany). [slides]
- 09/2019: (technical) “Smooth Constrained Convex Minimization via Conditional Gradients”. Plenary at 19th French-German-Swiss conference on Optimization (Nice, France). [slides]
- 09/2019: (technical) “Mirror Descent and related methods in Linear and Discrete Optimization”. Talk at Cargèse Workshop on Combinatorial Optimization (Cargèse, France). [slides]
- 07/2019: (technical) “Locally Accelerated Conditional Gradients”. Talk at Research Institute for Mathematical Sciences Seminar, Kyoto University (Kyoto, Japan). [slides]
- 01/2019: (technical) “Smooth Constrained Convex Minimization via Conditional Gradients”. Plenary at INFORMS Computing Society Conference (Knoxville, TN). [slides]
- SoSe/2020: Discrete Optimization and Machine Learning (seminar)
Recent Blog Posts.
- 03/2020: Boosting Frank-Wolfe by Chasing Gradients
- 02/2020: Non-Convex Boosting via Integer Programming
- 11/2019: Approximate Carathéodory via Frank-Wolfe
- 09/2019: SCIP x Raspberry Pi: SCIP on Edge
- 08/2019: Universal Portfolios: how to (not) get rich
News.
- 03/2020: Preliminary COVID-19 forecast model
- 03/2020: The third conference on “Discrete Optimization and Machine Learning” is cancelled due to COVID-19.
- 09/2019: We are organizing the third conference on “Discrete Optimization and Machine Learning” in May 2020 at Kyoto University (Kyoto, Japan).
- 09/2019: TU Berlin and the Berlin School of Mathematics, with the support of MATH+, are organizing the “Combinatorial Optimization at Work (Co@Work) 2020” summer school on September 14–26, 2020 at ZIB in Berlin. Application deadline: June 14, 2020. Intended audience: master's/PhD students and postdocs.
- 09/2019: Sanjeena Dang, Antoine Deza, Swati Gupta, Paul McNicholas, Masashi Sugiyama, and I are organizing a focus program on Data Science and Optimization in November 2019 at the Fields Institute in Toronto.