One World Seminar Series on the
Mathematics of Machine Learning
The One World Seminar Series on the Mathematics of Machine Learning is an online platform for research seminars, workshops and seasonal schools in theoretical machine learning. The series focuses on theoretical advances in machine learning and deep learning, complementing the One World seminars on probability, on information, signals and data (MINDS), on methods for arbitrary data sources (MADS), and on imaging and inverse problems (IMAGINE).
The series was started during the Covid-19 pandemic in 2020 to bring together researchers from all over the world for presentations and discussions in a virtual environment. It follows in the footsteps of other community projects under the One World umbrella, which originated around the same time.
We welcome suggestions for speakers on new and exciting developments and are committed to providing a platform for junior researchers as well. We recognize the flexibility that online seminars provide, and we are experimenting with different formats. Feedback on any of our events is welcome.
Next Event
Wed Dec. 4
Optimization, Sampling, and Generative Modeling in Non-Euclidean Spaces
Machine learning in non-Euclidean spaces has been rapidly attracting attention in recent years, and this talk will give some examples of progress on its mathematical and algorithmic foundations. A sequence of developments that eventually leads to non-Euclidean generative modeling will be reported.
More precisely, I will begin with variational optimization, which, together with delicate interplays between continuous- and discrete-time dynamics, enables the construction of momentum-accelerated algorithms that optimize functions defined on manifolds. Selected applications, namely a generic improvement of Transformers and a low-dimensional approximation of high-dimensional optimal transport distances, will be described. Then I will turn the optimization dynamics into an algorithm that samples from probability distributions on Lie groups. This sampler provably converges, even without a log-concavity condition or its common relaxations. Finally, I will describe how this sampler leads to a structurally pleasant diffusion generative model that allows users, given training data following any latent statistical distribution on a Lie group, to generate more data on exactly the same manifold following the same distribution. If time permits, applications such as molecule design and the generative innovation of quantum processes will be briefly discussed.
Zoom link: https://uofglasgow.zoom.us/j/85342467510
Mailing List and Google Calendar
Sign up here to join our mailing list and receive announcements. If your browser automatically signs you into a Google account, it may be easiest to join with a university account by going through an incognito window. For other concerns, please reach out to one of the organizers.
Sign up here for our Google calendar with all seminars.
Format
Seminars are held online on Zoom. The presentations are recorded, and the videos are made available on our YouTube channel. A list of past seminars can be found here. All seminars, unless otherwise stated, are held on Wednesdays at 12 noon ET. The invitation will be shared on this site before the talk and distributed via email.
Board
Wuyang Chen (Simon Fraser University)
Bin Dong (Peking University)
Boumediene Hamzi (Caltech)
Issa Karambal (Quantum Leap Africa)
Qianxiao Li (National University of Singapore)
Matthew Thorpe (University of Warwick)
Tiffany Vlaar (University of Glasgow)
Stephan Wojtowytsch (University of Pittsburgh)
Zhiqin Xu (Shanghai Jiao Tong University)
Former Board Members
Simon Shaolei Du (University of Washington)
Franca Hoffmann (Caltech)
Surbhi Goel (Microsoft Research NY)