Highly constrained, high-dimensional, and nonlinear optimizations are found at the root of many of today’s forefront problems in statistics, quantitative finance, risk, operations research, business analytics, and other predictive sciences. Tools for optimization, however, have not changed much in the past 40 years -- until very recently. The abundance of parallel computing resources has stimulated the development of new optimization methods that exploit that parallelism.
In this talk I’ll begin with traditional optimization methods, then show how to extend these methods to solve high-dimensional non-convex optimization problems with highly nonlinear constraints using the mystic framework.
I’ll start by introducing the cost function, and its use in local and global optimization. I’ll then address how to monitor and diagnose optimization convergence and results, tune an optimizer, and customize optimizer stop conditions. Traditional optimizers are limited in their ability to constrain solutions: certain algorithms provide box constraints, while others provide built-in penalty functions. I’ll discuss how to apply custom penalty functions and symbolic constraint equations, and then demonstrate new methods to efficiently reduce the search space through spatial filtering.
Real-world inverse problems are expensive to evaluate, so I’ll show several ways to accelerate optimization through parallel computing. Large-scale optimizations can also benefit greatly from efficient solver restarts and the saving of solver state. I’ll cover using asynchronous computing for global caching and archiving, reactive optimization daemon processes, and rigorous automated dimensional reduction.
Next I’ll discuss new optimization algorithms that leverage object-based design and parallel computing to perform fast, scalable global optimizations and n-dimensional global searches. Can your favorite optimizer find all the critical points on a potential energy surface in a single invocation? mystic can.
Throughout, I’ll discuss applications of global optimization in statistics and quantitative finance, and illustrate the use of modern optimization techniques in Bayesian inference, machine learning, mixed integer programming, nonlinear regression, and parameter estimation.
The audience should have an interest in solving hard real-world optimization problems, but need not have prior experience with optimization. Attendees should walk away with a working knowledge of how to use modern constrained optimization tools, how to enable optimizers to leverage high-performance parallel computing, and how to utilize legacy data and surrogate models in statistical and predictive risk modeling.