Machine learning inversion design and application verification of a broadband acoustic filtering structure

2022 · Vol 187 · pp. 108522
Author(s): BaoZhu Cheng, Mou Wang, Nansha Gao, Hong Hou
Author(s): Diana Benavides-Prado

Increasing amounts of data have made the use of machine learning techniques much more widespread. Much research in machine learning has been dedicated to designing and applying effective and efficient algorithms to explain or predict facts. The development of intelligent machines that can learn over extended periods of time, and that improve their abilities as they execute more tasks, remains a pending contribution from computer science to the world. This weakness has been recognised for some decades, and interest in solving it appears to be increasing, as demonstrated by recent leading work and broader discussions at major events in the field [Chen and Liu, 2015; Chen et al., 2016]. Our research is intended to help fill that gap.


2020 · Vol 43
Author(s): Myrthe Faber

Abstract: Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.
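To make the data-augmentation analogy concrete, the sketch below shows what augmentation does in machine learning: a single training example is expanded into several label-preserving variants, and that variability helps a learner form an abstraction that holds across surface forms. The specific transformations (random flips, Gaussian noise), array shapes, and the function name `augment` are illustrative assumptions, not taken from the commentary.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray, n_variants: int = 4) -> list[np.ndarray]:
    """Generate label-preserving variants of one training example.

    Each variant perturbs the surface form (orientation, noise level)
    while keeping the underlying content, so a learner sees the same
    "concept" in many different contexts.
    """
    variants = []
    for _ in range(n_variants):
        x = image.copy()
        if rng.random() < 0.5:                        # random horizontal flip
            x = np.fliplr(x)
        x = x + rng.normal(0.0, 0.05, size=x.shape)   # small Gaussian noise
        variants.append(np.clip(x, 0.0, 1.0))
    return variants

# One "experience" becomes several varied views of the same content.
example = rng.random((8, 8))   # toy 8x8 grayscale image
views = augment(example)
print(len(views), views[0].shape)   # 4 (8, 8)
```

In the commentary's terms, mind wandering would play a role analogous to `augment`: it spontaneously re-presents mental content in varied contexts, and that variability is hypothesised to support the formation of abstractions.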


2020
Author(s): Mohammed J. Zaki, Wagner Meira, Jr

2020
Author(s): Marc Peter Deisenroth, A. Aldo Faisal, Cheng Soon Ong

Author(s): Lorenza Saitta, Attilio Giordana, Antoine Cornuejols
