Co-design Center for Exascale Machine Learning Technologies (ExaLearn)

Author(s):  
Francis J Alexander ◽  
James Ang ◽  
Jenna A Bilbrey ◽  
Jan Balewski ◽  
Tiernan Casey ◽  
...  

Rapid growth in data, computational methods, and computing power is driving a remarkable revolution in what variously is termed machine learning (ML), statistical learning, computational learning, and artificial intelligence. In addition to highly visible successes in machine-based natural language translation, playing the game Go, and self-driving cars, these new technologies also have profound implications for computational and experimental science and engineering, as well as for the exascale computing systems that the Department of Energy (DOE) is developing to support those disciplines. Not only do these learning technologies open up exciting opportunities for scientific discovery on exascale systems, they also appear poised to have important implications for the design and use of exascale computers themselves, including high-performance computing (HPC) for ML and ML for HPC. The overarching goal of the ExaLearn co-design project is to provide exascale ML software for use by Exascale Computing Project (ECP) applications, other ECP co-design centers, and DOE experimental facilities and leadership class computing facilities.

Sensors ◽  
2018 ◽  
Vol 18 (8) ◽  
pp. 2674 ◽  
Author(s):  
Konstantinos Liakos ◽  
Patrizia Busato ◽  
Dimitrios Moshou ◽  
Simon Pearson ◽  
Dionysis Bochtis

Machine learning has emerged with big data technologies and high-performance computing to create new opportunities for data-intensive science in the multi-disciplinary agri-technologies domain. In this paper, we present a comprehensive review of research dedicated to applications of machine learning in agricultural production systems. The works analyzed were categorized into (a) crop management, including applications on yield prediction, disease detection, weed detection, crop quality, and species recognition; (b) livestock management, including applications on animal welfare and livestock production; (c) water management; and (d) soil management. The filtering and classification of the presented articles demonstrate how agriculture will benefit from machine learning technologies. By applying machine learning to sensor data, farm management systems are evolving into real-time, artificial-intelligence-enabled programs that provide rich recommendations and insights for farmer decision support and action.
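
A minimal, illustrative sketch (not taken from the reviewed papers) of the kind of sensor-to-prediction pipeline described above: a random-forest regressor trained on hypothetical field-sensor features to predict yield. The feature names, synthetic data, and model settings are assumptions chosen purely for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical per-field sensor readings: soil moisture, temperature, NDVI, rainfall.
X = rng.uniform([0.1, 10.0, 0.2, 0.0], [0.5, 35.0, 0.9, 300.0], size=(n, 4))
# Synthetic yield (t/ha) as a noisy function of the sensor readings.
y = 2.0 + 6.0 * X[:, 2] + 0.01 * X[:, 3] - 0.05 * (X[:, 1] - 22.0) ** 2 + rng.normal(0, 0.3, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out fields:", r2_score(y_test, model.predict(X_test)))
```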


10.6036/10007 ◽  
2021 ◽  
Vol 96 (5) ◽  
pp. 528-533
Author(s):  
Xavier Larriva Novo ◽  
Mario Vega Barbas ◽  
Victor Villagra ◽  
Julio Berrocal

Cybersecurity has stood out in recent years with the aim of protecting information systems. Attackers use different methods, techniques and tools to exploit the existing vulnerabilities in these systems. It is therefore essential to develop and improve new technologies, as well as intrusion detection systems that allow possible threats to be detected. However, the use of these technologies requires highly qualified cybersecurity personnel to analyze the results and reduce the large number of false positives that these technologies present. This generates the need to research and develop new high-performance cybersecurity systems that allow efficient analysis and resolution of these results. This research presents the application of machine learning techniques to classify real traffic in order to identify possible attacks. The study has been carried out using machine learning tools applying deep learning algorithms such as the multi-layer perceptron and long short-term memory. Additionally, this document presents a comparison between the results obtained by applying the aforementioned algorithms and non-deep-learning algorithms such as random forest and decision tree. Finally, the results obtained are presented, showing that the long short-term memory algorithm provides the best results in terms of precision and logarithmic loss.
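
A minimal sketch (not the study's code) of this kind of comparison: a multi-layer perceptron and a random forest classify synthetic "traffic" features, reporting precision and logarithmic loss as in the paper. The features, labels, and model settings are illustrative assumptions, and the long short-term memory model used in the study is omitted to keep the example dependency-free.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, log_loss

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 10))  # e.g. packet sizes, durations, flag counts (synthetic)
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n) > 0).astype(int)  # 1 = attack

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
models = [
    ("MLP", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=1)),
    ("Random forest", RandomForestClassifier(n_estimators=200, random_state=1)),
]
for name, clf in models:
    clf.fit(X_tr, y_tr)
    print(name,
          "precision:", precision_score(y_te, clf.predict(X_te)),
          "log loss:", log_loss(y_te, clf.predict_proba(X_te)))
```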


Author(s):  
Francis Alexander ◽  
Ann Almgren ◽  
John Bell ◽  
Amitava Bhattacharjee ◽  
Jacqueline Chen ◽  
...  

As noted in Wikipedia, skin in the game refers to having ‘incurred risk by being involved in achieving a goal’, where ‘skin is a synecdoche for the person involved, and game is the metaphor for actions on the field of play under discussion’. For exascale applications under development in the US Department of Energy Exascale Computing Project, nothing could be more apt, with the skin being exascale applications and the game being delivering comprehensive science-based computational applications that effectively exploit exascale high-performance computing technologies to provide breakthrough modelling and simulation and data science solutions. These solutions will yield high-confidence insights and answers to the most critical problems and challenges for the USA in scientific discovery, national security, energy assurance, economic competitiveness and advanced healthcare. This article is part of a discussion meeting issue ‘Numerical algorithms for high-performance computational science’.


Author(s):  
Hartwig Anzt ◽  
Erik Boman ◽  
Rob Falgout ◽  
Pieter Ghysels ◽  
Michael Heroux ◽  
...  

Sparse solvers provide essential functionality for a wide variety of scientific applications. Highly parallel sparse solvers are essential for continuing advances in high-fidelity, multi-physics and multi-scale simulations, especially as we target exascale platforms. This paper describes the challenges, strategies and progress of the US Department of Energy Exascale Computing Project towards providing sparse solvers for exascale computing platforms. We address the demands of systems with thousands of high-performance node devices where exposing concurrency, hiding latency and creating alternative algorithms become essential. The efforts described here are works in progress, highlighting current success and upcoming challenges. This article is part of a discussion meeting issue ‘Numerical algorithms for high-performance computational science’.
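
A minimal, illustrative sketch of the kind of kernel such solvers target: a conjugate-gradient solve of a one-dimensional Poisson system. This uses SciPy on a single node for clarity; the ECP libraries discussed in the paper target distributed, GPU-accelerated platforms and are not shown here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 1000
# Tridiagonal Poisson matrix: 2 on the diagonal, -1 off the diagonal (symmetric positive definite).
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = cg(A, b)  # info == 0 means the iteration converged
print("converged:", info == 0, "residual norm:", np.linalg.norm(A @ x - b))
```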


2021 ◽  
Vol 26 (2) ◽  
pp. 78-89
Author(s):  
Anastasia V. Kolmogorova ◽  

The article explores ways of making the semantic description of emotional lexemes consistent with the interpretative intuition of ordinary language speakers. The research novelty is determined by the fact that it is based on data retrieved from the emotional assessment of 3920 internet texts in Russian, made by informants via a specially designed computer interface. Using this interface, we can aggregate the weight of 8 emotions (distress, enjoyment, anger, surprise, shame, excitement, disgust, fear) in a text. The data used for this publication thus comprise two sets of 150 internet texts, assessed by 2000 informants, with the highest scores for the emotions of distress or anger. The scope of the study covers the semantics of the two lexemes mentioned above (grust’ and gnev), analyzed through the prism of the collective introspection of informants. The purpose of the article is to discuss the case in which a semantic description of emotives is given by an expert who draws largely on “the best texts” of the corresponding emotions, according to the collective opinion of informants. Our methods include a psycholinguistic experiment, corpus analysis and semantic analysis. The research led us to three main conclusions. Firstly, the semantic descriptions of the emotives grust’ and gnev obtained in the proposed way represent prototypical scenarios of experiencing an emotion in a social context and take into account not only the introspective sensations of an expert linguist but also the interpretative strategies of language users. Secondly, such a semantic explanation provides keys for explaining why machine learning technologies are better at detecting anger than sadness in text. Finally, it creates a precedent in using new technologies to produce an ecological semantic description of emotive vocabulary. The research results can find application in emotiology, lexicographic practice and didactics.
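
A minimal sketch of the aggregation step described above, using hypothetical data rather than the study's interface: each informant assigns a weight to each of the 8 emotions per text, the weights are averaged per text, and the texts with the highest anger (or distress) scores are selected. The number of informants per text and the random ratings are assumptions made only for illustration.

```python
import numpy as np

EMOTIONS = ["distress", "enjoyment", "anger", "surprise", "shame", "excitement", "disgust", "fear"]
rng = np.random.default_rng(2)

n_texts, n_informants = 3920, 50  # informants per text is an assumption
ratings = rng.random((n_texts, n_informants, len(EMOTIONS)))  # each weight in [0, 1]

mean_weights = ratings.mean(axis=1)  # aggregate weight of each emotion per text
anger_idx = EMOTIONS.index("anger")
top_anger = np.argsort(mean_weights[:, anger_idx])[::-1][:150]  # 150 "best" anger texts
print("highest mean anger weight:", mean_weights[top_anger[0], anger_idx])
```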


Author(s):  
Thomas M Evans ◽  
Julia C White

Multiphysics coupling presents a significant challenge in terms of both computational accuracy and performance. Achieving high performance in coupled simulations can be particularly challenging in a high-performance computing context. The US Department of Energy Exascale Computing Project has the mission of preparing mission-relevant applications for the exascale computers being delivered starting in 2023. Many of these applications require multiphysics coupling, and the implementations must be performant on exascale hardware. In this special issue we feature six articles on advanced multiphysics coupling that span the computational science domains of the Exascale Computing Project.
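
A minimal, generic sketch of operator-split multiphysics coupling, not drawn from any ECP code: two toy "physics" components exchange state once per time step. Real exascale coupling involves distributed meshes, field interpolation, and implicit iteration, none of which is shown; the component names and equations are invented for illustration.

```python
def advance_thermal(T, u, dt):
    return T + dt * (-0.1 * T + 0.5 * u)  # toy heat update driven by the flow state u

def advance_flow(u, T, dt):
    return u + dt * (-0.2 * u + 0.1 * T)  # toy flow update driven by the temperature T

T, u, dt = 1.0, 0.0, 0.01
for step in range(1000):
    T = advance_thermal(T, u, dt)  # physics 1 uses the latest flow state
    u = advance_flow(u, T, dt)     # physics 2 uses the just-updated temperature
print("final coupled state:", T, u)
```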


Author(s):  
Prarthana Dutta ◽  
Naresh Babu Muppalaneni ◽  
Ripon Patgiri

The world has been evolving with new technologies and advances day by day. With the advent of various learning technologies in every field, the research community is able to provide solutions in every aspect of life through applications of Artificial Intelligence, Machine Learning, Deep Learning, Computer Vision, etc. However, despite such high achievements, these technologies lag behind in their ability to explain their predictions. The current situation is such that these modern technologies are able to predict and decide upon various cases more accurately and quickly than a human, but fail to provide an answer when asked why their predictions should be trusted. In order to attain a deeper understanding of this rising trend, we explore a very recent and widely discussed contribution that provides rich insight into a prediction being made: “Explainability.” The main premise of this survey is to provide an overview of the research explored in the domain and to convey the current scenario along with the advancements published to date in this field. This survey is intended to provide a comprehensive background on the broad spectrum of Explainability.
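
A minimal sketch of one explainability technique, permutation feature importance, applied to a toy classifier. The data and model are illustrative assumptions; the survey itself covers a much broader range of explanation methods than this single example.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] - 2 * X[:, 2] > 0).astype(int)  # only features 0 and 2 actually matter

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=3)
clf = GradientBoostingClassifier(random_state=3).fit(X_tr, y_tr)

# Shuffle each feature and measure the drop in accuracy: a large drop marks an important feature.
result = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=3)
for i, imp in enumerate(result.importances_mean):
    print(f"feature {i}: importance {imp:.3f}")
```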


MRS Bulletin ◽  
1997 ◽  
Vol 22 (10) ◽  
pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have raised concern among scientists and the general public about a crisis or a lack of leadership in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.


2017 ◽  
Vol 3 (10) ◽  
Author(s):  
Dr. A. Joycilin Shermila

Times have changed and teachers have evolved. New technologies have opened up the classroom to the outside world. Teachers who were once seen with textbooks and a blackboard are now using varied technological tools to empower learners to publish their work and to engage them with live audiences in real contexts. In this digital era an ever-expanding array of powerful software has been made available. The flipped classroom is a shift from passive to active learning that focuses on higher-order thinking skills such as analysis, synthesis and evaluation. This model of teaching combines pedagogy and learning technologies. Significant learning happens when active learning is facilitated through engaged learners. In this approach learning materials are provided through text, video, audio and multimedia. Students take responsibility for their learning. They work together, applying course concepts with guidance from the instructor. This increased interaction helps to create a learning community that encourages them to build knowledge inside and outside the classroom.

