Dynamic Machine Learning Algorithms


In this seminar, we will discuss recent results in algorithms research, with a focus on the relationship between dynamic algorithms and machine learning. We will study basic algorithms for supervised and unsupervised machine learning tasks in which the input data is constantly changing and the algorithm must quickly adapt its output after each such change. Further topics of interest include the use of novel dynamic matrix data structures in (static) machine learning algorithms, as well as dynamic algorithms for data science tasks. In addition to talks given by students, researchers of the Big Data Algorithms group will present their current research as well as potential topics for master's theses.


Course Type: Seminar (SE)
Time and date: Tuesday, 14:30-16:00, SR T05 (Jakob-Haringer-Straße 2)
ECTS: 5 credit points
Instructor: Sebastian Forster
Language: English
Prerequisites: This is a master-level course. Students are expected to have knowledge of computer science, mathematics, and scientific writing and conduct, as typically covered in a computer science bachelor's program or a related program.

Tentative Guidelines

  • Registration via PLUSonline
  • Reading and understanding a scientific paper
  • Talk of 45 minutes (plus 15 minutes for questions)
  • Companion report of 5–10 pages

Tentative List of Papers

  • [BEFH+23] MohammadHossein Bateni, Hossein Esfandiari, Hendrik Fichtenberger, Monika Henzinger, Rajesh Jayaram, Vahab Mirrokni, and Andreas Wiese. “Optimal Fully Dynamic 𝑘-Center Clustering for Adaptive and Oblivious Adversaries”. In: Proceedings of the 2023 ACM-SIAM Symposium on Discrete Algorithms. SODA 2023 (Florence, Italy, Jan. 22–25, 2023). Ed. by Nikhil Bansal and Viswanath Nagarajan. SIAM, 2023, pp. 2677–2727. doi:  10.1137/1.9781611977554.ch101.
  • [BPSW21] Jan van den Brand, Binghui Peng, Zhao Song, and Omri Weinstein. “Training (Overparametrized) Neural Networks in Near-Linear Time”. In: 12th Innovations in Theoretical Computer Science Conference. ITCS 2021 (Virtual Conference, Jan. 6–8, 2021). Ed. by James R. Lee. Vol. 185. LIPIcs. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2021, 63:1–63:15. doi:  10.4230/LIPIcs.ITCS.2021.63.
  • [BS23] Marco Bressan and Mauro Sozio. “Fully-Dynamic Approximate Decision Trees With Worst-Case Update Time Guarantees”. In: CoRR abs/2302.03994 (2023). doi:  10.48550/arXiv.2302.03994. arXiv:  2302.03994.
  • [DMVW23] Sami Davies, Benjamin Moseley, Sergei Vassilvitskii, and Yuyan Wang. “Predictive Flows for Faster Ford-Fulkerson”. In: Proceedings of the 40th International Conference on Machine Learning. ICML 2023 (Honolulu, HI, USA, July 23–29, 2023). Ed. by Andreas Krause, Emma Brunskill, Kyunghyun Cho, Barbara Engelhardt, Sivan Sabato, and Jonathan Scarlett. Vol. 202. Proceedings of Machine Learning Research. PMLR, 2023, pp. 7231–7248.
  • [HK20] Monika Henzinger and Sagar Kale. “Fully-Dynamic Coresets”. In: 28th Annual European Symposium on Algorithms. ESA 2020 (Pisa, Italy (Virtual Conference), Sept. 7–9, 2020). Ed. by Fabrizio Grandoni, Grzegorz Herman, and Peter Sanders. Vol. 173. LIPIcs. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, 2020, 57:1–57:21. doi:  10.4230/LIPIcs.ESA.2020.57.
  • [HLSS+23] Monika Henzinger, Andrea Lincoln, Barna Saha, Martin P. Seybold, and Christopher Ye. “On the Complexity of Algorithms with Predictions for Dynamic Graph Problems”. In: CoRR abs/2307.16771 (2023). doi:  10.48550/arXiv.2307.16771. arXiv:  2307.16771.
  • [JPW22] Shunhua Jiang, Binghui Peng, and Omri Weinstein. “Dynamic Least-Squares Regression”. In: CoRR abs/2201.00228 (2022). arXiv:  2201.00228.
  • [LS23] Quanquan C. Liu and Vaidehi Srinivas. “The Predicted-Deletion Dynamic Model: Taking Advantage of ML Predictions, for Free”. In: CoRR abs/2307.08890 (2023). doi:  10.48550/arXiv.2307.08890. arXiv:  2307.08890.
  • [SW20] Saurabh Sawlani and Junxing Wang. “Near-Optimal Fully Dynamic Densest Subgraph”. In: Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing. STOC 2020 (Chicago, IL, USA (Virtual Conference), June 22–26, 2020). Ed. by Konstantin Makarychev, Yury Makarychev, Madhur Tulsiani, Gautam Kamath, and Julia Chuzhoy. ACM, 2020, pp. 181–193. doi:  10.1145/3357713.3384327.
  • [TDS22] Tom Tseng, Laxman Dhulipala, and Julian Shun. “Parallel Batch-Dynamic Minimum Spanning Forest and the Efficiency of Dynamic Agglomerative Graph Clustering”. In: Proceedings of the 34th ACM Symposium on Parallelism in Algorithms and Architectures. SPAA 2022 (Philadelphia, PA, USA, July 11–14, 2022). Ed. by Kunal Agrawal and I-Ting Angelina Lee. ACM, 2022, pp. 233–245. doi:  10.1145/3490148.3538584.