Design of Multifunctional Truss-Like Periodic Materials Using a Global-Local Optimization Method

2011 · Vol. 312-315 · pp. 1073-1078
Author(s): Pablo A. Muñoz-Rojas, M.A. Luersen, T.A. Carniel, E. Bertoti

Porous materials have gained wide use in high-performance engineering structures due to their high stiffness-to-weight ratio, good energy-absorption behavior, and related properties. Frequently, thermal behavior is also a concern, and optimized multifunctional thermo-mechanical responses are sought. This paper applies a hybrid two-stage method to optimize the layout of periodic truss-like structures so as to obtain a good compromise between thermal and mechanical elastic properties. The first stage employs a derivative-free optimization method that explores the design space without becoming trapped in local minima. The second stage uses a derivative-based optimization algorithm to refine the solution obtained in the first stage.
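The abstract names only the two-stage structure, not the specific algorithms, so the pairing below (SciPy's differential_evolution for the derivative-free global stage, L-BFGS-B for the derivative-based refinement) and the toy multimodal objective are assumptions chosen to make the sketch runnable; they stand in for, and are not, the authors' method and thermo-mechanical homogenization model.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Hypothetical stand-in objective: a weighted compromise between a
# "mechanical" and a "thermal" response of a periodic cell.  The cosine
# term makes it multimodal, which is what motivates the global stage.
def objective(x, w=0.5):
    mechanical = np.sum((x - 1.0) ** 2)        # stiffness-like term (assumed)
    thermal = np.sum(np.cos(3.0 * np.pi * x))  # conductivity-like term (assumed)
    return w * mechanical + (1.0 - w) * thermal

bounds = [(-2.0, 2.0)] * 4  # bounds on the (hypothetical) layout variables

# Stage 1: derivative-free global exploration, robust to local minima.
stage1 = differential_evolution(objective, bounds, seed=0, maxiter=200)

# Stage 2: derivative-based local refinement started from the stage-1 result.
stage2 = minimize(objective, stage1.x, method="L-BFGS-B", bounds=bounds)

print("stage 1 (global):  ", stage1.x, stage1.fun)
print("stage 2 (refined): ", stage2.x, stage2.fun)
```

The design point of the hybrid is visible here: the global stage only needs to land in the right basin of attraction, after which the cheap gradient-based stage converges to the basin's minimum with far fewer function evaluations than the global method would need on its own.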

2021
Author(s): Faruk Alpak, Yixuan Wang, Guohua Gao, Vivek Jain

Abstract

Recently, a novel distributed quasi-Newton (DQN) derivative-free optimization (DFO) method was developed for generic reservoir performance optimization problems, including well-location optimization (WLO) and well-control optimization (WCO). DQN is designed to effectively locate multiple local optima of highly nonlinear optimization problems. However, its performance has neither been validated on realistic applications nor compared against other DFO methods. We have integrated DQN into a versatile field-development optimization platform designed specifically for iterative workflows enabled through distributed-parallel flow simulations. DQN is benchmarked against alternative DFO techniques, namely the Broyden–Fletcher–Goldfarb–Shanno method hybridized with Direct Pattern Search (BFGS-DPS), Mesh Adaptive Direct Search (MADS), Particle Swarm Optimization (PSO), and the Genetic Algorithm (GA).

DQN is a multi-thread optimization method that distributes an ensemble of optimization tasks among multiple high-performance-computing nodes; it can therefore locate multiple optima of the objective function in parallel within a single run. Simulation results computed by one DQN optimization thread are shared with the others by updating a unified set of training data points composed of the responses (implicit variables) of all successful simulation jobs. The sensitivity matrix at the current best solution of each optimization thread is approximated by a linear-interpolation technique using all, or a subset of, the training data points. The gradient of the objective function is then computed analytically from the estimated sensitivities of the implicit variables with respect to the explicit variables, the Hessian matrix is updated using a quasi-Newton formula, and a new search point for each thread is obtained by solving a trust-region subproblem at each iteration. In contrast, the other DFO methods follow a single-thread optimization paradigm that locates only a single optimum per run; finding multiple optima requires repeating the optimization from different initial guesses, and simulation results generated by one single-thread task cannot be shared with other tasks.

Benchmarking results are presented for synthetic yet challenging WLO and WCO problems, and the DQN method is then field-tested on two realistic applications. On a synthetic problem with a known solution, DQN identifies the global optimum with the fewest simulations and the shortest run time. On the benchmarking problems without a known solution, DQN identifies comparable local optima with considerably fewer simulations than the alternative techniques. The field-testing results confirm these favorable computational characteristics. Overall, the results indicate that DQN is a novel and effective parallel algorithm for field-scale development optimization problems.
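The abstract lists the DQN ingredients (shared training pool, interpolation-based sensitivities, quasi-Newton Hessian, trust-region step, multiple concurrent threads) without implementation detail. Below is a heavily simplified single-process sketch of those ingredients under stated assumptions: the Rosenbrock function stands in for an expensive reservoir simulation, a k-nearest-point linear regression stands in for the sensitivity estimate, a Newton step clipped to the trust radius stands in for the trust-region subproblem solver, and a round-robin loop stands in for distributed HPC nodes. None of this is the authors' implementation.

```python
import numpy as np

def simulate(x):
    """Stand-in for one expensive reservoir simulation (assumption:
    the Rosenbrock function, used only to make the sketch runnable)."""
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def estimate_gradient(x, pool, k=12):
    """Approximate the gradient at x by a linear least-squares fit through
    the k training points nearest to x, in the spirit of DQN's
    linear-interpolation sensitivity estimate over the shared pool."""
    X = np.array([p for p, _ in pool])
    f = np.array([v for _, v in pool])
    idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    A = np.hstack([X[idx] - x, np.ones((len(idx), 1))])   # local linear model
    coef, *_ = np.linalg.lstsq(A, f[idx], rcond=None)
    return coef[:-1]                                      # slope = gradient estimate

def bfgs_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B."""
    sy, Bs = s @ y, B @ s
    if sy > 1e-12 and s @ Bs > 1e-12:                     # skip ill-posed pairs
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy
    return B

def trust_region_step(g, B, delta):
    """Newton step clipped to the trust radius: a crude stand-in for the
    trust-region subproblem solver the abstract describes."""
    try:
        s = -np.linalg.solve(B, g)
    except np.linalg.LinAlgError:
        s = -g
    if g @ s > 0.0:                                       # ensure descent
        s = -g
    n = np.linalg.norm(s)
    return s if n <= delta else s * (delta / n)

dim, n_threads, n_iters = 4, 3, 50
rng = np.random.default_rng(0)

# Shared training data: every evaluation from every thread joins this pool.
pool = [(x, simulate(x)) for x in rng.uniform(-2.0, 2.0, (3 * dim, dim))]

threads = []
for _ in range(n_threads):
    x = rng.uniform(-2.0, 2.0, dim)
    pool.append((x, simulate(x)))
    threads.append({"x": x, "f": pool[-1][1], "B": np.eye(dim),
                    "delta": 0.5, "pair": None})

for _ in range(n_iters):              # round-robin stands in for parallel nodes
    for t in threads:
        g = estimate_gradient(t["x"], pool)
        if t["pair"] is not None:     # quasi-Newton curvature update
            s_prev, g_prev = t["pair"]
            t["B"] = bfgs_update(t["B"], s_prev, g - g_prev)
            t["pair"] = None
        s = trust_region_step(g, t["B"], t["delta"])
        x_new = t["x"] + s
        f_new = simulate(x_new)
        pool.append((x_new, f_new))   # share the result with all threads
        if f_new < t["f"]:            # accept step, expand trust region
            t["pair"] = (s, g)
            t["x"], t["f"] = x_new, f_new
            t["delta"] = min(2.0 * t["delta"], 1.0)
        else:                         # reject step, shrink trust region
            t["delta"] *= 0.5

for i, t in enumerate(threads):       # threads may settle in different optima
    print(f"thread {i}: f = {t['f']:.4f} at x = {np.round(t['x'], 3)}")
```

The sketch illustrates the two properties the abstract emphasizes: each thread runs its own quasi-Newton/trust-region iteration and can converge to a different optimum, while every simulated point enters one shared pool, so each thread's gradient estimate improves from evaluations it did not pay for itself.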


2014 · Vol. 63 (14) · pp. 149203
Author(s): Huang Qi-Can, Hu Shu-Juan, Qiu Chun-Yu, Li Kuan, Yu Hai-Peng, ...
