From October 24–27, the INFORMS (Institute for Operations Research and the Management Sciences) annual meeting will be held both in person (Anaheim, CA) and online. SigOpt last attended this meeting in person in 2019, and we will be attending in person again this year with the research team in full force: Harvey Cheng, Gustavo Malkomes, Michael McCourt, and Eric Lee. This is SigOpt’s first in-person conference since the start of the pandemic, and we are thrilled to be able to see everyone after so long!
SigOpt will present two papers at INFORMS. Both were previously published in PMLR (Proceedings of Machine Learning Research) through ICML (the International Conference on Machine Learning), and we are excited to finally have the opportunity to present them in person.
Authors: Ryan Turner, David Eriksson, Michael McCourt, Juha Kiili, Eero Laaksonen, Zhen Xu, Isabelle Guyon
Speakers: Michael McCourt and Ryan Turner
Time: Tuesday, October 26, 7:45 am - 9:15 am
Place: TB09. In Person: Optimization and Surrogate Methods for Black-Box Systems
We present results and insights from the black-box optimization (BBO) challenge at NeurIPS 2020, which ran from July to October 2020. The challenge emphasized the importance of evaluating derivative-free optimizers for tuning the hyperparameters of machine learning models. This was the first black-box optimization challenge with a machine learning emphasis, and it has widespread impact because black-box optimization (e.g., Bayesian optimization) is relevant for hyperparameter tuning in almost every machine learning project, as well as for many applications outside of machine learning.
Authors: Gustavo Malkomes, Bolong Cheng, Eric H Lee, Michael McCourt
Speaker: Eric Lee
Time: Wednesday, October 27, 7:45 am - 9:15 am
Place: VWB50. Virtual: Learning and Optimizing with Structure
We discuss constraint active search (CAS), which attempts to sample a diverse set of points from an unknown feasible region implicitly defined by a set of black-box constraints. CAS applies to many problems in engineering design and simulation that require balancing competing objectives in the presence of uncertainty. Sample-efficient multiobjective optimization methods focus on the objective function values in metric space and ignore the sampling behavior of the design configurations in parameter space; consequently, they may provide little insight for choosing designs given metric uncertainty or limited precision. CAS accounts for the importance of the parameter space and is thus better suited to multiobjective design problems.
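To make the idea above concrete, here is a minimal, hypothetical sketch of diversity-driven sampling from a black-box feasible region. This is an illustration of the general concept, not SigOpt's CAS implementation: the toy constraint, the candidate-proposal scheme, and all function names are assumptions made for the example.

```python
import math
import random

def feasible(x, y):
    # Toy black-box constraint (hypothetical): points inside an
    # annulus of radii 0.5 and 1.0 are feasible. In practice this
    # would be an expensive simulation or experiment.
    return 0.5 <= math.hypot(x, y) <= 1.0

def min_distance(p, accepted):
    # Distance from candidate p to the nearest already-accepted point;
    # infinite when nothing has been accepted yet.
    return min((math.dist(p, q) for q in accepted), default=float("inf"))

def diverse_feasible_search(n_iters=200, n_candidates=32, seed=0):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_iters):
        # Propose random candidates in the design space [-1, 1]^2 and
        # pick the one farthest from points we have already accepted --
        # a crude stand-in for an acquisition step that promotes
        # coverage of the feasible region in parameter space.
        candidates = [(rng.uniform(-1, 1), rng.uniform(-1, 1))
                      for _ in range(n_candidates)]
        best = max(candidates, key=lambda p: min_distance(p, accepted))
        if feasible(*best):
            accepted.append(best)
    return accepted

points = diverse_feasible_search()
```

The key contrast with standard multiobjective methods is that the selection rule here operates on distances between design configurations in parameter space, rather than on objective values in metric space.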
SigOpt will also be tabling at the INFORMS career fair. We know that many outstanding students will be attending and presenting at this meeting, and we encourage anyone who wants to learn more about SigOpt research and our goals to stop by our booth. If you cannot make the career fair, please reach out to someone on the research team to set up a separate meeting.
Along with our Academic and Internship Programs, conferences like INFORMS are exciting ways in which SigOpt’s research team continues to invest in and collaborate with the broader machine learning and Bayesian optimization communities.
Update: SigOpt presented at two virtual sessions and one in-person session, participated in the career fair, and hosted a wonderful dinner for Bayesian optimization researchers. Thank you to everyone who stopped by; anyone who missed us can reach out directly. Below are some pictures from these events.