Navigating through the Maze of Homogeneous Catalyst Design with Machine Learning

2021 ◽  
Vol 3 (2) ◽  
pp. 96-110 ◽  
Author(s):  
Gabriel dos Passos Gomes ◽  
Robert Pollice ◽  
Alán Aspuru-Guzik

The ability to forge difficult chemical bonds through catalysis has transformed society on all fronts, from feeding our ever-growing populations to increasing our life expectancies through the synthesis of new drugs. However, developing new chemical reactions and catalytic systems is a tedious task that requires tremendous discovery and optimization efforts. Over the past decade, advances in machine learning have opened up a whole new way to approach data-intensive problems, and many of these developments have started to enter chemistry. However, similar progress in the field of homogeneous catalysis is only in its infancy. In this article, we outline our vision for the future of catalyst design and the role of machine learning in navigating this maze.
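As a rough illustration of the data-driven design loop envisioned here, the sketch below trains a surrogate model on hypothetical catalyst descriptors and uses it to rank unseen candidates. The descriptor names, the synthetic data, and the target property are all invented for illustration; nothing here is the authors' own method.

```python
# A minimal sketch of a surrogate-model workflow for catalyst design:
# learn a mapping from catalyst descriptors to a target property, then
# rank unseen candidates. All descriptors and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical descriptors for 200 known catalysts:
# [buried volume, electronic parameter, bite angle] (all scaled to 0..1).
X = rng.uniform(0.0, 1.0, size=(200, 3))
# Synthetic "activation barrier" (kcal/mol) with noise, standing in for
# computed or measured values.
y = 25.0 - 8.0 * X[:, 0] + 5.0 * X[:, 1] ** 2 - 3.0 * X[:, 2] + rng.normal(0, 0.5, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")

# Rank a virtual library of unseen candidates by predicted barrier (lower is better).
candidates = rng.uniform(0.0, 1.0, size=(1000, 3))
best = np.argsort(model.predict(candidates))[:5]
print("top candidate descriptors:\n", candidates[best])
```

In a real campaign the synthetic target would be replaced by computed or experimental data, and the top-ranked candidates would feed back into new calculations or experiments.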


2018 ◽  
Vol 18 (2) ◽  
pp. 174-192 ◽  
Author(s):  
Chiara Bonacchi ◽  
Mark Altaweel ◽  
Marta Krzyzanska

This article assesses the role of the pre-modern past in the construction of political identities relating to the UK’s membership in the European Union by examining how materials and ideas from Iron Age to Early Medieval Britain and Europe were leveraged by those who discussed the topic of Brexit in over 1.4 million messages posted on dedicated Facebook pages. Through a combination of data-intensive and qualitative investigations of textual data, we identify the ‘heritages’ invoked in support of pro- or anti-Brexit sentiments. We show how these heritages are centred around myths of origins, resistance and collapse that incorporate tensions and binary divisions. We highlight the strong influence of past expert practices in shaping such deeply entrenched dualistic thinking and reflect on the longue durée agency of heritage expertise. This is the first systematic study of public perceptions and experiences of the past in contemporary society undertaken through digital heritage research fuelled by big data. As such, the article contributes novel methodological approaches and substantially advances theory in cultural heritage studies. It is also the first published work to analyse the role of heritage in the construction of political identities in relation to Brexit via extensive social research.
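Purely as a toy illustration of the data-intensive step such a study might start from (the abstract does not describe the authors' actual pipeline), the snippet below flags which historical periods a message invokes before any qualitative coding; the keyword lists and messages are invented.

```python
# Toy period-tagging of social media messages via keyword lookup.
# Substring matching is deliberately crude (e.g. "roman" also matches
# "romance"); a real pipeline would use proper tokenization and NLP.
from collections import Counter

PERIOD_KEYWORDS = {
    "Iron Age": ["iron age", "celt", "boudicca"],
    "Roman": ["roman", "hadrian"],
    "Early Medieval": ["anglo-saxon", "saxon", "viking", "norman"],
}

def periods_invoked(message: str) -> set:
    """Return the set of periods whose keywords appear in the message."""
    text = message.lower()
    return {period for period, words in PERIOD_KEYWORDS.items()
            if any(w in text for w in words)}

messages = [
    "The Romans never conquered all of Britain, and Boudicca resisted them.",
    "We survived the Vikings, we will survive Brexit.",
]
counts = Counter(p for m in messages for p in periods_invoked(m))
print(counts)  # each period counted once per message that invokes it
```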


2020 ◽  
Vol 2 (11) ◽  
Author(s):  
Petar Radanliev ◽  
David De Roure ◽  
Rob Walton ◽  
Max Van Kleek ◽  
Rafael Mantilla Montalvo ◽  
...  

We explore the potential and practical challenges in the use of artificial intelligence (AI) in cyber risk analytics, for improving organisational resilience and understanding cyber risk. The research focuses on identifying the role of AI in connected devices such as Internet of Things (IoT) devices. Through a literature review, we identify wide-ranging and creative methodologies for cyber analytics and explore the risks of deliberately influencing or disrupting the behaviours of socio-technical systems. This resulted in the modelling of the connections and interdependencies between a system's edge components and both external and internal services and systems. We focus on proposals for models, infrastructures and frameworks of IoT systems found in both business reports and technical papers. We analyse this juxtaposition of related systems and technologies in academic and industry papers published in the past 10 years. Then, we report the results of a qualitative empirical study that correlates the academic literature with key technological advances in connected devices. The work is based on grouping present and future techniques and presenting the results through a new conceptual framework. Applying grounded theory from the social sciences, the framework details a new process for a prototype of AI-enabled dynamic cyber risk analytics at the edge.
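A minimal sketch, with invented component names, of the kind of edge-component dependency modelling described above: an IoT deployment is represented as a directed graph, and we ask which internal and external services a compromised edge device can reach.

```python
# Model connections and interdependencies between edge components and
# internal/external services as a directed dependency graph.
import networkx as nx

g = nx.DiGraph()
# Hypothetical edge components and the services they feed into.
g.add_edges_from([
    ("sensor-01", "edge-gateway"),
    ("sensor-02", "edge-gateway"),
    ("edge-gateway", "mqtt-broker"),
    ("mqtt-broker", "analytics-service"),
    ("analytics-service", "external-cloud-api"),
])

compromised = "sensor-01"
# Everything downstream of the compromised device is potentially exposed.
reachable = nx.descendants(g, compromised)
print(f"services exposed if {compromised} is compromised: {sorted(reachable)}")
```

This kind of reachability analysis is one building block; a risk analytics prototype would additionally weight edges by likelihood and impact.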


AI Magazine ◽  
2015 ◽  
Vol 36 (1) ◽  
pp. 5-14 ◽  
Author(s):  
Krzysztof Janowicz ◽  
Frank Van Harmelen ◽  
James A. Hendler ◽  
Pascal Hitzler

While catchphrases such as big data, smart data, data-intensive science, or smart dust highlight different aspects, they share a common theme: namely, a shift towards a data-centric perspective in which the synthesis and analysis of data at an ever-increasing spatial, temporal, and thematic resolution promises new insights while, at the same time, reducing the need for strong domain theories as starting points. In terms of the envisioned methodologies, these catchphrases tend to emphasize the role of predictive analytics, that is, statistical techniques including data mining and machine learning, as well as supercomputing. Interestingly, however, while this perspective takes the availability of data as a given, it does not answer the questions of how one would discover the required data in today’s chaotic information universe, how one would understand which datasets can be meaningfully integrated, and how to communicate the results to humans and machines alike. The semantic web addresses these questions. In the following, we argue why the data train needs semantic rails. We point out that making sense of data and gaining new insights works best if inductive and deductive techniques go hand in hand instead of competing over the prerogative of interpretation.
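To make the deductive half of this argument concrete, here is a small hypothetical example of semantic data discovery: an RDF graph describing two datasets is queried with SPARQL for those meeting an integration requirement, and the results could then feed an inductive (statistical) model. The vocabulary and data are invented for illustration.

```python
# Deductive retrieval over semantic metadata: describe datasets as RDF
# triples, then use SPARQL to discover those fit for integration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")  # hypothetical vocabulary
g = Graph()
g.add((EX.dataset1, RDF.type, EX.ClimateDataset))
g.add((EX.dataset1, EX.spatialResolution, Literal(10)))    # metres
g.add((EX.dataset2, RDF.type, EX.ClimateDataset))
g.add((EX.dataset2, EX.spatialResolution, Literal(250)))   # metres

# Discover climate datasets fine-grained enough to integrate (<= 50 m).
query = """
PREFIX ex: <http://example.org/>
SELECT ?d WHERE {
    ?d a ex:ClimateDataset ;
       ex:spatialResolution ?r .
    FILTER(?r <= 50)
}
"""
for row in g.query(query):
    print("candidate for integration:", row.d)
```

The point of the "semantic rails" metaphor is that the statistical machinery only runs well once such discovery and integration questions are answered.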


2021 ◽  
Vol 22 (19) ◽  
pp. 10808
Author(s):  
Elena G. Varlamova ◽  
Egor A. Turovsky ◽  
Ekaterina V. Blinova

This review presents the latest data on the importance of selenium nanoparticles in human health, their use in medicine, and the main known methods of their production. In recent years, the multifaceted study of nanoscale complexes in medicine, including selenium nanoparticles, has become very important in view of a number of positive features that make it possible to create new drugs based on them or to significantly improve the properties of existing drugs. Selenium is an essential trace element that is part of key antioxidant enzymes. In mammals, there are 25 selenoproteins, in which selenium is a key component of the active site. The important role of selenium in human health has been repeatedly demonstrated by several hundred studies over the past few decades, and in recent years the study of selenium nanocomplexes has become a focus for researchers. The large amount of accumulated data requires generalization and systematization in order to improve understanding of the key mechanisms and prospects for the use of selenium nanoparticles in medicine, which is the purpose of this review.


2018 ◽  
Vol 5 (2) ◽  
pp. 205395171880855 ◽  
Author(s):  
Thomas Birtchnell

Since the inception of recorded music there has been a need for standards and reliability across sound formats and listening environments. The role of the audio mastering engineer is prestigious and akin to that of a craft expert, combining scientific knowledge, musical learning, manual precision and skill, and an awareness of cultural fashions and creative labour. With the advent of algorithms, big data and machine learning, loosely termed artificial intelligence in this creative sector, there is now the possibility of automating human audio mastering processes and radically disrupting mastering careers. The emergence of dedicated products and services in artificial intelligence-driven audio mastering poses profound questions for the future of the music industry, which has already faced significant challenges from the digitalization of music over the past decades. The research reports on qualitative and ethnographic inquiry with audio mastering engineers about the automation of their expertise and the potential for artificial intelligence to augment or replace aspects of their workflows. Investigating audio mastering engineers' awareness of artificial intelligence, the research probes the importance of criticality in their labour. It identifies intuitive performance and critical listening as areas where human ingenuity and communication pose problems for simulation. Affective labour disrupts speculation about algorithmic domination by highlighting the pragmatic strategies available for humans to adapt to and augment digital technologies.
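As a deliberately naive sketch of one processing step an automated mastering tool might perform (no specific product's method is implied, and real systems use perceptual loudness measures such as LUFS, multiband processing, and learned models), consider simple RMS-based loudness normalization:

```python
# Naive automated "mastering" step: gain adjustment toward a target RMS
# level, with a hard clip standing in for a real limiter.
import numpy as np

def normalize_rms(audio: np.ndarray, target_dbfs: float = -14.0) -> np.ndarray:
    """Scale a float audio signal (range -1..1) to a target RMS in dBFS."""
    rms = np.sqrt(np.mean(audio ** 2))
    gain = 10 ** (target_dbfs / 20) / max(rms, 1e-9)
    return np.clip(audio * gain, -1.0, 1.0)

# One second of a 440 Hz test tone at 44.1 kHz as stand-in material.
t = np.linspace(0, 1, 44100, endpoint=False)
tone = 0.1 * np.sin(2 * np.pi * 440 * t)
mastered = normalize_rms(tone)
print(f"RMS before: {np.sqrt(np.mean(tone**2)):.3f}, "
      f"after: {np.sqrt(np.mean(mastered**2)):.3f}")
```

The gap between this mechanical step and the critical listening described above is precisely where the article locates the limits of automation.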


2019 ◽  
pp. 216847901986455
Author(s):  
Paul Baldrick

An examination for potential direct or indirect adverse effects on the immune system (immunotoxicity) is an established component of nonclinical testing to support the safe use of new drugs. Testing recommendations appear in various regulatory guidance documents, especially ICH S8, and these are presented here. Key evaluation usually occurs in toxicology studies, with further investigative work a consideration if a positive signal is seen. Expectations around whether findings may occur are related to the type of compound being developed, including a chemically synthesized small molecule, a small molecule oncology drug, a biopharmaceutical, an oligonucleotide, a gene therapy/stem cell product, a vaccine, or a reformulation of drugs in liposomes or depots. Examples of immunotoxicity/immunogenicity findings are discussed for all of these compound types. Overall, it can be concluded that the main tool for evaluating potential immunotoxicity/immunogenicity of a new drug remains standard toxicology study testing, with key assessment of effects on clinical pathology and lymphoid organs/tissues (weights and cellularity). Additional evaluation from studies using a T cell–dependent antibody response (TDAR) and lymphocyte phenotyping is also valuable, if needed. Thus, using these established tools, it is the role of toxicologists to work with clinical teams, now and in the future, to relate findings from nonclinical testing to possible adverse effects in humans.


2020 ◽  
Vol 19 (3) ◽  
pp. 253-259
Author(s):  
L. Ramírez-Vázquez ◽  
A. Negrón-Mendoza

Life originated on Earth possibly as a physicochemical process; thus, geological environments and their hypothetical characteristics on the early Earth are essential for chemical evolution studies. It is also necessary to consider the energy sources that were available in the past and the components that could have promoted chemical reactions; it has been proposed that mineral surfaces were among these components. The aim of this work is to determine the possible role of mineral surfaces in chemical evolution and to study the stability of molecules relevant to metabolism, such as α-ketoglutaric acid (an α-keto acid and Krebs cycle participant), using ionizing radiation and thermal energy as energy sources and mineral surfaces to promote chemical reactions. Preliminary results show that α-ketoglutaric acid can be relatively stable under the simulated conditions of an impact-generated hydrothermal system; thus, such systems might have been plausible environments for chemical evolution on Earth.


2018 ◽  
Vol 20 (5) ◽  
pp. 1878-1912 ◽  
Author(s):  
Ahmet Sureyya Rifaioglu ◽  
Heval Atas ◽  
Maria Jesus Martin ◽  
Rengul Cetin-Atalay ◽  
Volkan Atalay ◽  
...  

The identification of interactions between drugs/compounds and their targets is crucial for the development of new drugs. In vitro screening experiments (i.e. bioassays) are frequently used for this purpose; however, experimental approaches are insufficient to explore novel drug-target interactions, mainly because of feasibility problems, as they are labour-intensive, costly and time-consuming. A computational field known as ‘virtual screening’ (VS) has emerged in the past decades to aid experimental drug discovery studies by statistically estimating unknown bio-interactions between compounds and biological targets. These methods use the physico-chemical and structural properties of compounds and/or target proteins along with experimentally verified bio-interaction information to generate predictive models. Lately, sophisticated machine learning techniques have been applied in VS to improve predictive performance. The objective of this study is to examine and discuss the recent applications of machine learning techniques in VS, including deep learning, which became highly popular after giving rise to epochal developments in the fields of computer vision and natural language processing. The past 3 years have witnessed an unprecedented amount of research on the application of deep learning in biomedicine, including computational drug discovery. In this review, we first describe the main instruments of VS methods, including compound and protein features (i.e. representations and descriptors), frequently used libraries and toolkits for VS, bioactivity databases and gold-standard data sets for system training and benchmarking. We subsequently review recent VS studies with a strong emphasis on deep learning applications. Finally, we discuss the present state of the field, including the current challenges, and suggest future directions. We believe that this survey will help researchers working in computational drug discovery to understand and develop novel bio-prediction methods.
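As a compact sketch of the ligand-based VS setup this review surveys, the example below featurizes compounds with Morgan (ECFP-like) fingerprints via RDKit and trains a classifier on invented activity labels. It is illustrative only, not any specific method from the review; in practice, models are trained on curated bioactivity databases and used to screen millions of compounds.

```python
# Ligand-based virtual screening sketch: circular fingerprints as
# compound features, random forest as the bioactivity classifier.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles: str) -> np.ndarray:
    """Morgan fingerprint (radius 2) as a 2048-bit numpy vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
    return np.array(fp)

# Invented training compounds and activity labels (1 = active).
train_smiles = ["CCO", "CCN", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCCC", "CCCCCC"]
train_labels = [0, 0, 1, 1, 0, 0]

X = np.stack([featurize(s) for s in train_smiles])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, train_labels)

# Score an unseen compound by its predicted probability of activity.
query = "c1ccccc1N"
print("P(active) =", clf.predict_proba(featurize(query).reshape(1, -1))[0, 1])
```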

