Research Master Pre-Defense
On Neural Optimization: Uncertainty, Scarcity and Geometry
-
Speaker: Andres Guzman Cordero
-
Date and time
June 04, 2025
09:00 - 17:00
Modern optimization challenges increasingly involve high-dimensional search spaces, complex constraints, and inherently uncertain environments. In fields ranging from molecular design to supply chain management, traditional methods struggle to cope with the combinatorial explosion of possibilities and the cost of high-fidelity simulations. Neural optimization approaches offer novel avenues to address these bottlenecks: using deep generative models, Bayesian inference, and advanced linear algebra techniques, researchers can learn compact latent representations, accelerate expensive simulations, and navigate uncertainty robustly. Furthermore, geometry-aware parameterizations and second-order methods can drastically improve convergence rates and solution quality for large-scale scientific problems, illustrating the synergy between cutting-edge machine learning tools and classic operations research principles.

This thesis explores five complementary perspectives on neural optimization, focusing on uncertainty, complexity, efficiency, scarcity, and geometry. First, we examine how Bayesian optimization in latent spaces can reduce dimensional complexity while maintaining accuracy for graph-valued tasks. Second, we incorporate generative models adept at handling mixed-type data, enabling more realistic synthetic data generation. Third, we turn to neural partial differential equation solvers that employ randomized linear algebra, offering faster and more scalable simulation for iterative optimization loops. Fourth, we demonstrate how orthonormal parameterizations and second-order optimizers improve complex deterministic calculations, providing deeper insight and speed for high-dimensional systems. Finally, we propose a framework that uses Ricci flows and geometric surgery to learn geodesic mappings during probability flows on different manifolds, facilitating the production of complex, high-fidelity samples in demanding generative tasks.
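The latent-space Bayesian optimization idea can be illustrated with a minimal sketch: a Gaussian-process surrogate with an expected-improvement acquisition, searching over a one-dimensional latent coordinate. Everything here (the toy objective standing in for a learned decoder plus graph-valued score, the RBF kernel, the candidate grid) is an illustrative assumption, not the thesis's actual method.

```python
import math
import numpy as np

def rbf(A, B, ls=0.2):
    """Squared-exponential kernel between 1-D point sets A and B."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xq, noise=1e-6):
    """Exact GP posterior mean and std at query points Xq, given data (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Kq = rbf(Xq, X)
    mu = Kq @ alpha
    v = np.linalg.solve(L, Kq.T)
    var = np.clip(1.0 - (v**2).sum(axis=0), 1e-12, None)
    return mu, np.sqrt(var)

norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def expected_improvement(mu, sigma, best):
    """EI acquisition for maximization."""
    u = (mu - best) / sigma
    pdf = np.exp(-0.5 * u**2) / math.sqrt(2.0 * math.pi)
    return (mu - best) * norm_cdf(u) + sigma * pdf

# Hypothetical stand-in for "decode latent z, then score the decoded structure".
objective = lambda z: -(z - 0.3) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=3)            # initial latent designs
y = objective(X)
candidates = np.linspace(0, 1, 201)

for _ in range(10):                      # BO loop: refit surrogate, pick by EI
    mu, sigma = gp_posterior(X, y, candidates)
    z_next = candidates[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, z_next)
    y = np.append(y, objective(z_next))

best_z = X[np.argmax(y)]                 # ends up near the true optimum 0.3
```

The point of the latent-space formulation is that the GP operates on a compact coordinate, while the expensive decoding and evaluation happen only at the handful of points the acquisition actually selects.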
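The randomized linear algebra ingredient can be sketched independently of any PDE solver: a randomized range finder (in the spirit of Halko, Martinsson, and Tropp) compresses a large matrix into a low-rank factorization using only a few matrix products and a small dense SVD. The matrix sizes and rank below are illustrative assumptions.

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, seed=0):
    """Rank-`rank` truncated SVD of A via a randomized range finder."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + n_oversample))
    Y = A @ Omega                        # sample the range of A
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for that sample
    B = Q.T @ A                          # small (rank + p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

# Demo: an exactly rank-40 matrix is recovered to machine precision.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 40)) @ rng.standard_normal((40, 500))
U, s, Vt = randomized_svd(A, rank=40)
rel_err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

Inside an iterative optimization loop, this kind of sketch-then-solve step replaces a full decomposition whose cost would otherwise dominate each iteration.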
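Similarly, one way to picture an orthonormal parameterization is as a smooth map from an unconstrained matrix onto the Stiefel manifold, here via a sign-fixed QR decomposition: an optimizer updates the free matrix W however it likes, while downstream computations always see exactly orthonormal columns. The shapes are illustrative, and this is only one of several such parameterizations.

```python
import numpy as np

def orthonormal(W):
    """Map an unconstrained matrix to one with orthonormal columns via QR.
    Fixing the signs of R's diagonal makes the map unique wherever R is nonsingular."""
    Q, R = np.linalg.qr(W)
    return Q * np.sign(np.diag(R))

rng = np.random.default_rng(0)
W = rng.standard_normal((6, 3))   # free parameters an optimizer would update
Q = orthonormal(W)                # columns are exactly orthonormal
```

Because the constraint is satisfied by construction rather than by penalty, curvature information from a second-order optimizer can be used without the iterates ever leaving the manifold.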
Together, these contributions highlight how modern, data-driven approaches are reshaping conventional optimization, bridging theoretical rigor with the demands of real-world applications.