News

Published on November 18, 2025
Last update: November 19, 2025

Dimension Reduction Beyond Euclidean Geometry

Chengyuan Deng gave a talk on Tuesday, November 18, 2025, as part of the Colloquia on Algorithms and Learning series:

Speaker: Chengyuan Deng (Rutgers University, USA)

Topic: Dimension Reduction Beyond Euclidean Geometry

Date and Time: Tuesday, November 18, 2025, at 11:00 a.m.

Location: T02, Jakob-Haringer-Straße 2

Host: Univ.-Prof. Sebastian Forster

Abstract: Dimension reduction aims to represent data with significantly shorter vectors while preserving certain inherent properties of the data. Under Euclidean geometry, a few classical techniques have been well studied, such as Principal Component Analysis (PCA), Multidimensional Scaling (MDS), and the Johnson-Lindenstrauss (JL) Lemma. However, modern applications handle data with various dissimilarity measures, which may not lie in Euclidean space. In this talk, I will share our recent work on extending the JL Lemma and MDS to a general non-Euclidean setting: we only require the dissimilarity measure to be symmetric and reflexive. Note that this setting allows violation of the triangle inequality and therefore also covers non-metric cases. Indeed, the geometry can be very different from Euclidean, but our results show that the extension is possible and that certain optimality can be achieved. This direction is wide open; I will end with a slightly longer discussion of open questions.
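For readers unfamiliar with the JL Lemma mentioned in the abstract, the following minimal sketch (illustrative only, not drawn from the speaker's results; the dimensions n, d, and k are arbitrary choices) shows the classical Euclidean case: a scaled Gaussian random projection approximately preserves pairwise distances between high-dimensional points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n points in d dimensions (values chosen only for illustration).
n, d, k = 500, 1000, 100
X = rng.normal(size=(n, d))

# JL-style random projection: a Gaussian matrix scaled by 1/sqrt(k)
# maps points to k dimensions while roughly preserving Euclidean distances.
P = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ P

# Compare one pairwise distance before and after projection.
i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
print(f"original: {orig:.3f}, projected: {proj:.3f}, ratio: {proj / orig:.3f}")
```

The talk concerns the harder setting where the dissimilarity measure is only symmetric and reflexive, so a Euclidean projection like the one above no longer applies directly.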

About Chengyuan Deng: Chengyuan Deng is currently a fifth-year PhD student at Rutgers University, advised by Jie Gao, and a member of the theory group. He mainly works on graph algorithms, geometry-related problems, learning theory, and trustworthy AI. Before Rutgers, he completed his undergraduate studies at Tongji University, China.

Guest lecture by Chengyuan Deng