On Neural Optimization: Uncertainty, Scarcity and Geometry
-
Speaker: Andres Guzman Cordero
-
Location: Tinbergen, Amsterdam
-
Date and time: June 16, 2025, 16:00 - 17:30
Optimization problems never get simpler; our solutions just get smarter. Modern optimization challenges increasingly involve high-dimensional search spaces, complex constraints, and inherently uncertain environments. Traditional methods struggle with the combinatorial explosion of possibilities and the cost of high-fidelity simulations in fields ranging from molecular design to supply chain management. Neural networks, as universal function approximators, offer a novel way to address these bottlenecks: they let us learn directly from data using stochastic gradient descent. Marvelous as they are, they come with their own set of challenges. This thesis explores both sides of that coin, first presenting a solution to a common problem that is uniquely enabled by neural networks, and then tackling the difficulty of training these models.

In the first part, we explore the role of the exponential family in variational inference for flow-based generative modeling, proposing a novel way to jointly learn probability measures over continuous and discrete data. The approach gives access to an arbitrary distribution function, allowing efficient generation of synthetic heterogeneous data such as patient records.

In the second part, we tackle the cost of training neural networks to solve a problem ubiquitous in science: partial differential equations. Borrowing from the variational Monte Carlo literature, we show that by combining the Woodbury matrix identity, momentum on the curvature of the loss landscape, and randomized low-rank approximations, we can train neural networks up to 75 times faster. Together, these contributions highlight how modern, data-driven approaches are reshaping conventional optimization, bridging theoretical rigor with the demands of real-world applications.
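The speed-ups in the second part rest on a classical linear-algebra fact: if the curvature matrix is approximated as a damped identity plus a low-rank term, the Woodbury identity (A + U U^T)^-1 = A^-1 - A^-1 U (I + U^T A^-1 U)^-1 U^T A^-1 turns an n-by-n inverse into a k-by-k one. The NumPy sketch below illustrates only this building block; the shapes, the damping value, and the function name woodbury_solve are illustrative assumptions, not the implementation presented in the talk.

```python
import numpy as np

def woodbury_solve(damping, U, grad):
    """Solve (damping * I + U @ U.T) x = grad via the Woodbury identity.

    U is an (n, k) low-rank factor of the curvature estimate, with k << n.
    Cost is O(n k^2 + k^3) rather than the O(n^3) of a dense solve.
    """
    n, k = U.shape
    # (d I + U U^T)^-1 g = g/d - U (d I_k + U^T U)^-1 (U^T g) / d
    small = damping * np.eye(k) + U.T @ U       # (k, k) system
    tmp = np.linalg.solve(small, U.T @ grad)    # k-dimensional solve
    return (grad - U @ tmp) / damping

# Illustrative check against the dense solve (all numbers are made up).
rng = np.random.default_rng(0)
n, k, damping = 2000, 20, 1e-3
U = rng.standard_normal((n, k)) / np.sqrt(n)    # low-rank curvature factor
grad = rng.standard_normal(n)

fast = woodbury_solve(damping, U, grad)
dense = np.linalg.solve(damping * np.eye(n) + U @ U.T, grad)
print(np.allclose(fast, dense))                 # True
```

For a rank-k factor, the dominant costs are (n, k) matrix products and one k-dimensional solve, which is what makes randomized low-rank curvature approximations practical as preconditioners when training large networks.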