Meta Optimization
Recently Published Documents


Total documents: 54 (five years: 20)
H-index: 8 (five years: 3)

2021, Vol. 2021 (12), pp. 124016
Author(s): Samuel S Schoenholz, Ekin D Cubuk

Abstract We introduce JAX MD, a software package for performing differentiable physics simulations with a focus on molecular dynamics. JAX MD includes a number of physics simulation environments, as well as interaction potentials and neural networks that can be integrated into these environments without writing any additional code. Since the simulations themselves are differentiable functions, entire trajectories can be differentiated to perform meta-optimization. These features are built on primitive operations, such as spatial partitioning, that allow simulations to scale to hundreds of thousands of particles on a single GPU. These primitives are flexible enough to scale up workloads outside of molecular dynamics. We present several examples that highlight the features of JAX MD, including integration of graph neural networks into traditional simulations, meta-optimization through minimization of particle packings, and a multi-agent flocking simulation. JAX MD is available at https://www.github.com/google/jax-md.
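The trajectory-differentiation idea can be illustrated outside of JAX MD. Below is a minimal sketch in plain Python, assuming a toy one-particle simulation and a finite-difference gradient standing in for the library's automatic differentiation (none of the names below come from the jax_md API):

```python
# Toy meta-optimization by differentiating through an entire trajectory.
# The "simulation" is overdamped relaxation in a harmonic well of
# stiffness k; we tune k so the particle's final position hits a target.
# A differentiable-simulation library would backpropagate through the
# trajectory with autodiff; here a finite-difference gradient stands in.

def simulate(k, x0=5.0, dt=0.1, steps=50):
    """Integrate dx/dt = -k*x with explicit Euler; return final position."""
    x = x0
    for _ in range(steps):
        x -= dt * k * x
    return x

def loss(k, target=1.0):
    """Squared error between the trajectory endpoint and the target."""
    return (simulate(k) - target) ** 2

# Outer (meta-) optimization loop: gradient descent on the physical
# parameter k, where each gradient evaluation runs the whole simulation.
k, lr, eps = 0.5, 0.01, 1e-5
for _ in range(200):
    grad = (loss(k + eps) - loss(k - eps)) / (2 * eps)
    k -= lr * grad

print(round(simulate(k), 3))  # endpoint should now sit near the target 1.0
```

In JAX MD itself the same pattern applies at scale: because the simulation is a pure differentiable function, the outer loop can use exact gradients instead of finite differences.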


2021, Vol. 2131 (2), pp. 022005
Author(s): L V Enikeeva, E N Shvareva, D A Dubovtsev, I M Gubaydullin

Abstract The paper considers the selection of optimal parameters for a heuristic algorithm used in the mathematical modeling of a chemical-technological process; this tuning of algorithm parameters is called meta-optimization. The gravitational search algorithm serves as the heuristic algorithm, and the meta-optimization is performed by a genetic algorithm. The approach is tested on the problem of modeling the process of propane pre-reforming. It is shown that tuning even a single parameter of a heuristic algorithm is a time-consuming operation. The article presents the numerical results of the meta-optimization.
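A hypothetical sketch of this nested setup in Python: the inner heuristic below is a plain random local search whose step size plays the role of a tunable parameter (standing in for the gravitational search algorithm's parameters), and the outer loop is a deliberately minimal genetic algorithm with truncation selection and Gaussian mutation; nothing here reproduces the authors' algorithms:

```python
import random

random.seed(0)  # make one run of this stochastic sketch reproducible

def inner_search(s, steps=100):
    """Inner heuristic: minimize f(x) = x**2 by random local search
    with step size s; return the final objective value."""
    x = 10.0
    for _ in range(steps):
        cand = x + random.uniform(-s, s)
        if cand * cand < x * x:   # greedy acceptance of improvements
            x = cand
    return x * x

def fitness(s):
    """Meta-objective: average inner result over a few independent runs."""
    return sum(inner_search(s) for _ in range(5)) / 5

# Outer genetic algorithm over the step size s (selection + mutation only).
pop = [random.uniform(0.01, 5.0) for _ in range(10)]
for _ in range(15):
    pop.sort(key=fitness)
    parents = pop[:5]                                    # truncation selection
    children = [max(0.01, p + random.gauss(0, 0.2)) for p in parents]
    pop = parents + children                             # elitist replacement

best = min(pop, key=fitness)
```

Even this toy version makes the paper's point about cost: every fitness evaluation reruns the inner heuristic several times, so the outer search multiplies the inner algorithm's runtime by the population size times the number of generations.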


2021
Author(s): Veronika Lesch, Tanja Noack, Johannes Hefter, Samuel Kounev, Christian Krupitzer

CATENA, 2021, Vol. 199, pp. 105114
Author(s): Mahdi Panahi, Esmaeel Dodangeh, Fatemeh Rezaie, Khabat Khosravi, Hiep Van Le, ...

IEEE Access, 2021, pp. 1-1
Author(s): Takumi Aotani, Taisuke Kobayashi, Kenji Sugimoto

2021, pp. 659-670
Author(s): Enrique de la Cal, Alberto Gallucci, Jose Ramón Villar, Kaori Yoshida, Mario Koeppen

2020
Author(s): Bart P. G. Van Parys, Peyman Mohajerin Esfahani, Daniel Kuhn

We study stochastic programs where the decision maker cannot observe the distribution of the exogenous uncertainties but has access to a finite set of independent samples from this distribution. In this setting, the goal is to find a procedure that transforms the data to an estimate of the expected cost function under the unknown data-generating distribution, that is, a predictor, and an optimizer of the estimated cost function that serves as a near-optimal candidate decision, that is, a prescriptor. As functions of the data, predictors and prescriptors constitute statistical estimators. We propose a meta-optimization problem to find the least conservative predictors and prescriptors subject to constraints on their out-of-sample disappointment. The out-of-sample disappointment quantifies the probability that the actual expected cost of the candidate decision under the unknown true distribution exceeds its predicted cost. Leveraging tools from large deviations theory, we prove that this meta-optimization problem admits a unique solution: The best predictor-prescriptor-pair is obtained by solving a distributionally robust optimization problem over all distributions within a given relative entropy distance from the empirical distribution of the data. This paper was accepted by Chung Piaw Teo, optimization.
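As an illustration of the relative-entropy connection (a sketch under simplifying assumptions, not the authors' predictor-prescriptor construction): for one fixed candidate decision with sample costs c_i, the worst-case expected cost over all distributions within KL radius r of the empirical distribution reduces, by standard convex duality, to a one-dimensional problem over a multiplier alpha, which even a crude grid search can solve:

```python
import math

def dro_cost(costs, r):
    """Worst-case mean of the sample `costs` over all distributions Q with
    KL(Q || empirical) <= r, computed via the standard scalar dual:
        inf_{alpha > 0}  alpha * log( mean_i exp(c_i / alpha) ) + alpha * r
    """
    n, cmax = len(costs), max(costs)

    def dual(alpha):
        # log-sum-exp shifted by cmax for numerical stability
        lse = math.log(sum(math.exp((c - cmax) / alpha) for c in costs) / n)
        return cmax + alpha * lse + alpha * r

    # Crude geometric grid over alpha > 0; a real implementation would use
    # a proper one-dimensional convex solver.
    return min(dual(0.01 * 1.3 ** k) for k in range(40))

costs = [1.0, 2.0, 3.0, 4.0]   # hypothetical sample costs of one decision
# dro_cost(costs, 0.0) lies close to the empirical mean 2.5; as r grows the
# estimate rises toward max(costs), i.e. the predictor becomes more
# conservative, matching the trade-off the meta-optimization controls.
```

Here larger r buys lower out-of-sample disappointment at the price of a more conservative cost prediction, which is exactly the tension the paper's meta-optimization resolves.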


2020, Vol. 96, pp. 106619
Author(s): Hui Li, Zhiguo Huang, Xiao Liu, Chenbo Zeng, Peng Zou
