Compositionally-Restricted Attention-Based Network for Materials Property Prediction

2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Anthony Yu-Tung Wang ◽  
Steven K. Kauwe ◽  
Ryan J. Murdock ◽  
Taylor D. Sparks

In this paper, we demonstrate an application of the Transformer self-attention mechanism in the context of materials science. Our network, the Compositionally Restricted Attention-Based network (CrabNet), explores the area of structure-agnostic materials property predictions when only a chemical formula is provided. Our results show that CrabNet's performance matches or exceeds current best-practice methods on nearly all of 28 total benchmark datasets. We also demonstrate how CrabNet's architecture lends itself towards model interpretability by showing different visualization approaches that are made possible by its design. We feel confident that CrabNet and its attention-based framework will be of keen interest to future materials informatics researchers. For trained model weights, please see: http://doi.org/10.5281/zenodo.4633866
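CrabNet consumes only a chemical formula, so its input pipeline must reduce a formula string to elements and normalized fractions. A minimal sketch of that preprocessing step (the function name and parsing details are illustrative assumptions, not taken from the paper):

```python
import re

def parse_formula(formula):
    """Split a simple chemical formula into normalized element fractions.

    This composition-only view is the structure-agnostic input a model
    like CrabNet operates on: each element token plus its fractional
    amount, with no crystal-structure information.
    """
    tokens = re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula)
    counts = {}
    for element, amount in tokens:
        counts[element] = counts.get(element, 0.0) + float(amount or 1)
    total = sum(counts.values())
    return {el: n / total for el, n in counts.items()}
```

Real parsers (e.g. in materials informatics libraries) also handle nested parentheses and hydrates; this sketch covers only flat formulas such as "SiO2".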







2020 ◽  
Vol 10 (24) ◽  
pp. 9132
Author(s):  
Liguo Weng ◽  
Xiaodong Zhang ◽  
Junhao Qian ◽  
Min Xia ◽  
Yiqing Xu ◽  
...  

Non-intrusive load disaggregation (NILD) is of great significance to the development of smart grids. Current energy disaggregation methods extract features from sequences, a process that easily loses load features and hinders detection, resulting in a low recognition rate for infrequently used electrical appliances. To solve this problem, a non-intrusive sequential energy disaggregation method based on a multi-scale attention residual network is proposed. Multi-scale convolutions are used to learn features, and the attention mechanism is used to enhance the learning of load features. Residual learning further improves the performance of the algorithm, avoids network degradation, and improves the precision of load decomposition. Experimental results on two benchmark datasets show that the proposed algorithm outperforms existing algorithms in load disaggregation accuracy and in judging the on/off state, and that the attention mechanism further improves the disaggregation accuracy for low-frequency electrical appliances.
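The multi-scale idea above can be sketched in plain NumPy: convolve the aggregate power signal with kernels of several widths and stack the results. Fixed averaging filters stand in for the learned convolutions here (an illustrative assumption, not the paper's architecture):

```python
import numpy as np

def multi_scale_features(signal, kernel_sizes=(3, 5, 7)):
    # Convolve the 1-D power signal at several scales and stack the
    # results; a trained network would learn these kernels instead.
    features = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k          # fixed averaging filter
        features.append(np.convolve(signal, kernel, mode="same"))
    return np.stack(features)            # shape: (n_scales, len(signal))
```

Wider kernels capture slow appliance cycles, narrower ones capture switching transients; the attention and residual stages would then weight and combine these feature maps.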


Author(s):  
Kun Zhang ◽  
Guangyi Lv ◽  
Linyuan Wang ◽  
Le Wu ◽  
Enhong Chen ◽  
...  

Sentence semantic matching requires an agent to determine the semantic relation between two sentences, and is widely used in natural language tasks such as Natural Language Inference (NLI) and Paraphrase Identification (PI). Among matching methods, the attention mechanism plays an important role in capturing semantic relations and properly aligning the elements of two sentences. Previous methods used attention to select the important parts of sentences all at once. However, the important parts of a sentence change dynamically as understanding deepens, so a single selection may be insufficient for semantic understanding. To this end, we propose the Dynamic Re-read Network (DRr-Net) for sentence semantic matching, which pays close attention to a small region of the sentences at each step and re-reads the important words for better semantic understanding. Specifically, we first employ an Attention Stack-GRU (ASG) unit to model the original sentence repeatedly, preserving all information from the bottom-most word embedding input to the top-most recurrent output. Second, we use a Dynamic Re-read (DRr) unit to attend closely to one important word at a time, taking the learned information into account, and to re-read the important words for better sentence semantic understanding. Extensive experiments on three sentence matching benchmark datasets demonstrate that DRr-Net models sentence semantics more precisely and significantly improves sentence semantic matching performance. Interestingly, some of the findings in our experiments are consistent with findings from psychological research.
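As a rough, hypothetical sketch of the re-reading idea (not the authors' implementation): at each step, attend over the word vectors, re-read the single most relevant word, and fold it into a running state.

```python
import numpy as np

def dynamic_reread(word_vecs, state, steps=3):
    # word_vecs: (n_words, d); state: (d,) running sentence representation
    for _ in range(steps):
        scores = word_vecs @ state                   # relevance of each word
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                     # softmax attention
        chosen = word_vecs[int(np.argmax(weights))]  # re-read the top word
        state = 0.5 * state + 0.5 * chosen           # simple state update
    return state
```

A real DRr unit uses learned projections and gating rather than the fixed 0.5 mixing assumed here; the point is only that the attended word can change from step to step as the state evolves.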


Author(s):  
Andreia F. Paiva ◽  
Adam Nolan ◽  
Charlotte Thumser ◽  
Flavia H. Santos

Abstract: Background and Aims: Screening and assessment of cognitive changes in adults with Intellectual Disabilities (ID), mainly Down Syndrome (DS), is crucial to offering services appropriate to their needs. We present a systematic review of existing instruments for assessing dementia, aiming to support researchers' and clinicians' best practice. Methods: Searches were carried out in the Web of Science, PubMed, and PsycINFO databases in March 2019 and updated in May 2020. Studies were selected and examined if they: (1) focused on assessing age-related cognitive changes in persons with ID; (2) included adults and/or older adults; (3) included scales and batteries for cognitive assessment. Results: Forty-eight cross-sectional studies and twenty-six longitudinal studies were selected, representing a total sample of 5,851 participants (4,089 with DS and 1,801 with other ID). In these studies, we found 38 scales, questionnaires, and inventories, and 13 batteries for assessing cognitive and behavioural changes in adults with DS and other ID. Conclusion: The most used informant- or carer-completed instrument was the Dementia Questionnaire for Learning Disabilities (DLD) and its previous versions. We discuss the strengths and limitations of the instruments and outline recommendations for future use.


2016 ◽  
Vol 43 (7) ◽  
pp. 599 ◽  
Author(s):  
Jordan O. Hampton ◽  
Timothy H. Hyndman ◽  
Michael Laurence ◽  
Andrew L. Perry ◽  
Peter Adams ◽  
...  

Increased scrutiny of animal welfare in wildlife management has seen a recent proliferation in the use of procedural documents (standard operating procedures, codes of practice etc.). Some procedural documents are presumed to represent ‘best practice’ methods, whereby adherence to prescribed inputs is explicitly purported to generate humane outcomes. However, the relationship between what is done to animals (inputs) and what they experience (outputs), as assessed by animal-based measures, has received little attention. Procedural documents are commonly developed in the absence of empirical animal-based measures, creating uncertainty in animal welfare outcomes. Prescribed procedures are valuable as guidelines for standardising methodology, but the development of ‘welfare standards’ that focus on desired thresholds for animal-based measures offers many advantages for improving animal welfare. Refinement of the use of procedural documents in wildlife management is required to ensure they generate desirable outcomes for animals, and do not preclude the development of improved methods.


2021 ◽  
Author(s):  
Jamie Dorey ◽  
Georgy Rassadkin ◽  
Douglas Ridgway

Abstract Field experience in the continental US suggests that approximately 33% of plug and abandonment operations are non-routine, and 5% require re-entry (Greer C.R., 2018). In some scenarios, the most cost-efficient option for intervention is drilling an intercept well to re-enter the target well, or multiple wells, externally using advanced survey management and magnetic ranging techniques. This paper presents the application of relief well methodologies, from planning through execution, to a complex multiple-well abandonment project. Advances in active magnetic ranging sensor design have improved the availability of highly precise tools for locating and intercepting wellbores where access is not possible. These instruments were once confined to relief well interventions but have found a new application in solving one of the major issues facing the oil and gas industry. Subsurface abandonments are a complex task that requires a robust methodology. In this paper, we describe techniques built upon best practices from industry experience (ISCWSA WISC eBook). We also illustrate how the combination of advanced survey management, gyro surveying, and magnetic ranging can be used, following best industry practices, for fast and cost-efficient non-routine plug and abandonment. Case studies of several abandonment projects are presented, showing the various technical challenges common on idle and legacy wells. The projects include wells currently under the ownership of an operator as well as orphaned wells that were insufficiently abandoned and left idle for decades. The case studies outline how applying relief well methodologies to complex subsurface interventions led to successful outcomes in meeting environmental and government regulations for wellbore abandonment.
This includes performing multiple zonal isolations between reservoirs and water zones and preventing oil and gas seepage to the surface. The projects and their outcomes demonstrate economically viable strategies for tackling the growing global issue of idle and orphaned wells in a fiscally responsible manner. Combining industry best-practice methods for relief well drilling with technological advancements in magnetic ranging systems offers a solution to one of the largest dilemmas facing the oil and gas industry in relation to idle and orphaned wellbores. These applications allow abandonments previously considered impossible to be completed with a high probability of long-term success in permanent abandonment.


2022 ◽  
Vol 22 (3) ◽  
pp. 1-21
Author(s):  
Prayag Tiwari ◽  
Amit Kumar Jaiswal ◽  
Sahil Garg ◽  
Ilsun You

Self-attention mechanisms have recently been embraced for a broad range of text-matching applications. A self-attention model takes only one sentence as input with no extra information; one can utilize the final hidden state or pooling. However, text-matching problems can be interpreted in either symmetrical or asymmetrical scopes. For instance, paraphrase detection is a symmetrical task, while textual entailment classification and question-answer matching are considered asymmetrical tasks. In this article, we leverage attractive properties of the self-attention mechanism and propose an attention-based network that incorporates three key components for inter-sequence attention: global pointwise features, preceding attentive features, and contextual features, while updating the remaining components. We evaluate our model on two benchmark datasets covering textual entailment and question-answer matching. The proposed efficient Self-attention-driven Network for Text Matching outperforms the state of the art on the Stanford Natural Language Inference and WikiQA datasets with far fewer parameters.
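The mechanism these papers build on can be shown in a few lines of NumPy: a toy scaled dot-product self-attention in which queries, keys, and values are the input itself (real models add learned projections and multiple heads):

```python
import numpy as np

def self_attention(X):
    # X: (n_tokens, d). Score every token against every other token,
    # then return attention-weighted mixtures of the tokens.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax per row
    return weights @ X
```

Each output row is a convex combination of the input rows, which is what makes the attention weights themselves inspectable for the kind of interpretability analyses the papers above describe.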

