Correction to: Deep Learning for Drug Design: an Artificial Intelligence Paradigm for Drug Discovery in the Big Data Era

2018 ◽  
Vol 20 (4) ◽  
Author(s):  
Yankang Jing ◽  
Yuemin Bian ◽  
Ziheng Hu ◽  
Lirong Wang ◽  
Xiang-Qun Xie
2018 ◽  
Vol 20 (3) ◽  
Author(s):  
Yankang Jing ◽  
Yuemin Bian ◽  
Ziheng Hu ◽  
Lirong Wang ◽  
Xiang-Qun Sean Xie

Author(s):  
Manish Kumar Tripathi ◽  
Abhigyan Nath ◽  
Tej P. Singh ◽  
A. S. Ethayathulla ◽  
Punit Kaur

Author(s):  
Reza Yogaswara

Artificial Intelligence (AI) has become the driving force of the Industry 4.0 revolution, promising many conveniences for both the government and industry sectors. The Internet of Things (IoT) and big data are examples of areas where AI can be implemented; these technologies, widely adopted in the Industry 4.0 era, can connect every device, so that a person can automate all of their devices without having to be on site. Beyond that, many machines can now interpret a given condition or event with the help of AI, such as smart cameras that detect vehicle density on highways using deep learning neural networks, which have already been deployed by several regency and city governments to support their Smart City programs. In the industrial sector, many companies have likewise automated their production and manufacturing machinery with robots and artificial intelligence, so Industry 4.0 will raise competitiveness through smart devices, and the entities that master this technology gain the competitive advantage. However, amid the rapid development of Industry 4.0, governments must move quickly to adopt this platform; if they do not, the efficiency of the business processes that keep public services stable will decline. Governments therefore need sound knowledge and understanding to face the Industry 4.0 era, and the Chief Information Officer (CIO) can play an important role by providing support grounded in their knowledge of Industry 4.0 technology trends, especially AI, which has already been adopted across many sectors.


2019 ◽  
Vol 4 (4) ◽  
pp. 206-213 ◽  
Author(s):  
Benquan Liu ◽  
Huiqin He ◽  
Hongyi Luo ◽  
Tingting Zhang ◽  
Jingwei Jiang

The many kinds of publicly available biological databases now provide us with a goldmine of multidisciplinary big data. The Cancer Genome Atlas is a cancer database containing detailed information on many cancer patients. DrugBank is a database with detailed information on approved, investigational, and withdrawn drugs, as well as nutraceutical and metabolite structures. PubChem is a chemical compound database covering all commercially available compounds as well as other synthesizable compounds. The Protein Data Bank is a structural database of protein three-dimensional structures determined by X-ray crystallography, cryo-EM, and nuclear magnetic resonance, together with their ligands. At the same time, artificial intelligence (AI) is playing an important role in the drug discovery process. The integration of such big data with AI is making a great difference in the discovery of novel targeted drugs. In this review, we focus on the currently available advanced methods for discovering highly effective lead compounds with favorable absorption, distribution, metabolism, excretion, and toxicity properties.
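As a minimal illustration of how such public resources can be accessed programmatically, the Python sketch below queries PubChem's PUG REST interface for basic compound properties. The compound name and property list are only examples chosen for this sketch, not taken from the review.

import requests

PUG_REST = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def fetch_compound_properties(name, properties=("CanonicalSMILES", "MolecularWeight")):
    """Look up a compound by name in PubChem and return the selected properties."""
    url = f"{PUG_REST}/compound/name/{name}/property/{','.join(properties)}/JSON"
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # The PropertyTable block holds one record per matched compound (CID).
    return response.json()["PropertyTable"]["Properties"]

if __name__ == "__main__":
    for record in fetch_compound_properties("aspirin"):
        print(record)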


2021 ◽  
Vol 22 (18) ◽  
pp. 9983
Author(s):  
Jintae Kim ◽  
Sera Park ◽  
Dongbo Min ◽  
Wankyu Kim

Drug discovery based on artificial intelligence has been in the spotlight recently as it significantly reduces the time and cost required for developing novel drugs. With the advancement of deep learning (DL) technology and the growth of drug-related data, numerous deep-learning-based methodologies are emerging at all steps of the drug development process. In particular, pharmaceutical chemists face significant issues when selecting and designing potential drugs for a target of interest to enter preclinical testing. The two major challenges are the prediction of interactions between drugs and druggable targets and the generation of novel molecular structures suitable for a target of interest. We therefore review recent deep-learning applications in drug–target interaction (DTI) prediction and de novo drug design. In addition, we provide a comprehensive summary of a variety of drug and protein representations, DL models, and commonly used benchmark datasets and tools for model training and testing. Finally, we present the remaining challenges for the promising future of DL-based DTI prediction and de novo drug design.
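To make the DTI prediction task concrete, here is a minimal, hypothetical PyTorch sketch: drugs and proteins are featurized with simple character histograms (real systems use learned embeddings, graph neural networks, or pretrained sequence models), and a small feed-forward network scores the likelihood of interaction. All names, vocabularies, and dimensions are illustrative, not drawn from the reviewed methods.

import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
SMILES_CHARS = "CNOSPFIBrcl()[]=#@+-0123456789"

def char_histogram(text, vocab):
    # Bag-of-characters featurization; a crude stand-in for learned representations.
    counts = torch.zeros(len(vocab))
    for ch in text:
        if ch in vocab:
            counts[vocab.index(ch)] += 1
    return counts / max(len(text), 1)

class SimpleDTIModel(nn.Module):
    """Concatenate drug and protein features and predict an interaction probability."""
    def __init__(self, drug_dim, prot_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(drug_dim + prot_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, drug_feat, prot_feat):
        return torch.sigmoid(self.net(torch.cat([drug_feat, prot_feat], dim=-1)))

model = SimpleDTIModel(len(SMILES_CHARS), len(AMINO_ACIDS))
drug = char_histogram("CC(=O)Oc1ccccc1C(=O)O", SMILES_CHARS)            # aspirin SMILES
protein = char_histogram("MTEYKLVVVGAGGVGKSALTIQLIQNHF", AMINO_ACIDS)    # truncated sequence
print(model(drug.unsqueeze(0), protein.unsqueeze(0)))                    # interaction score in (0, 1)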


2020 ◽  
Vol 13 (9) ◽  
pp. 253
Author(s):  
Mattia Bernetti ◽  
Martina Bertazzo ◽  
Matteo Masetti

The big data concept is currently revolutionizing several fields of science including drug discovery and development. While opening up new perspectives for better drug design and related strategies, big data analysis strongly challenges our current ability to manage and exploit an extraordinarily large and possibly diverse amount of information. The recent renewal of machine learning (ML)-based algorithms is key in providing the proper framework for addressing this issue. In this respect, the impact on the exploitation of molecular dynamics (MD) simulations, which have recently reached mainstream status in computational drug discovery, can be remarkable. Here, we review the recent progress in the use of ML methods coupled to biomolecular simulations with potentially relevant implications for drug design. Specifically, we show how different ML-based strategies can be applied to the outcome of MD simulations for gaining knowledge and enhancing sampling. Finally, we discuss how intrinsic limitations of MD in accurately modeling biomolecular systems can be alleviated by including information coming from experimental data.
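As a toy illustration of the "gaining knowledge" use case, the sketch below applies unsupervised ML (PCA followed by k-means) to synthetic features standing in for quantities extracted from an MD trajectory. The data are simulated, and production workflows typically prefer time-aware methods such as tICA or VAMPnets; this is only a sketch of the general idea.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Toy stand-in for an MD trajectory: 1,000 frames described by 20 structural
# features (e.g. interatomic distances or dihedral angles).
rng = np.random.default_rng(0)
state_a = rng.normal(loc=0.0, scale=0.3, size=(600, 20))
state_b = rng.normal(loc=1.5, scale=0.3, size=(400, 20))
trajectory_features = np.vstack([state_a, state_b])

# Dimensionality reduction gives a low-dimensional view of the sampled landscape;
# clustering on that view suggests candidate metastable states.
projection = PCA(n_components=2).fit_transform(trajectory_features)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(projection)

print("frames assigned to each putative state:", np.bincount(labels))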


2020 ◽  
Vol 237 (12) ◽  
pp. 1438-1441
Author(s):  
Soenke Langner ◽  
Ebba Beller ◽  
Felix Streckenbach

Medical images play an important role in ophthalmology and radiology. Medical image analysis has greatly benefited from the application of “deep learning” techniques in clinical and experimental radiology. Clinical applications and their relevance for radiological imaging in ophthalmology are presented.


2021 ◽  
Vol 11 (4) ◽  
pp. 280
Author(s):  
Andrea Termine ◽  
Carlo Fabrizio ◽  
Claudia Strafella ◽  
Valerio Caputo ◽  
Laura Petrosini ◽  
...  

In the big data era, artificial intelligence techniques have been applied to tackle traditional issues in the study of neurodegenerative diseases. Despite the progress made in understanding the complex (epi)genetic signatures underlying neurodegenerative disorders, early diagnosis and drug repurposing remain serious challenges for such conditions. In this context, the integration of multi-omics, neuroimaging, and electronic health record data can be exploited using deep learning methods to provide the most accurate representation of patients possible. Deep learning allows researchers to find multi-modal biomarkers and to develop more effective and personalized treatments, early diagnosis tools, and useful information for drug discovery and repurposing in neurodegenerative pathologies. In this review, we describe how relevant studies have demonstrated the potential of deep learning to enhance knowledge of neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases through the integration of all sources of biomedical data.
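A minimal sketch of the multi-modal integration idea, assuming simple dense encoders and random stand-in data for the omics, imaging-derived, and EHR features; the dimensions, class count, and fusion-by-concatenation design are illustrative rather than taken from any of the reviewed studies.

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Map one data modality (omics, imaging features, EHR features) to a shared-size embedding."""
    def __init__(self, in_dim, emb_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))

    def forward(self, x):
        return self.net(x)

class MultiModalPatientModel(nn.Module):
    """Fuse per-modality embeddings into one patient representation used for classification."""
    def __init__(self, omics_dim, imaging_dim, ehr_dim, n_classes=2):
        super().__init__()
        self.omics = ModalityEncoder(omics_dim)
        self.imaging = ModalityEncoder(imaging_dim)
        self.ehr = ModalityEncoder(ehr_dim)
        self.classifier = nn.Linear(3 * 32, n_classes)

    def forward(self, omics, imaging, ehr):
        fused = torch.cat([self.omics(omics), self.imaging(imaging), self.ehr(ehr)], dim=-1)
        return self.classifier(fused)

model = MultiModalPatientModel(omics_dim=1000, imaging_dim=128, ehr_dim=50)
logits = model(torch.randn(4, 1000), torch.randn(4, 128), torch.randn(4, 50))
print(logits.shape)  # (4, 2): class scores for a batch of 4 synthetic patients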


With the evolution of artificial intelligence into deep learning, an age of perceptive machines has emerged that can even mimic humans. A conversational software agent, commonly known as a chatbot and driven by natural language processing, is one of the best examples of such intuitive machines. The paper lists some existing popular chatbots along with their details, technical specifications, and functionalities. Research shows that most customers have experienced poor service from them, and generating meaningful, informative responses remains a demanding task because most deployed chatbots rely on templates and hand-written rules. Current chatbot models therefore fall short of producing the required responses and undermine conversation quality. Incorporating deep learning into these models can fill this gap with deep neural networks. The deep neural networks used for this so far include stacked auto-encoders, sparse auto-encoders, and predictive sparse and denoising auto-encoders, but these DNNs are unable to handle big data involving large amounts of heterogeneous data, while the tensor auto-encoder, which overcomes this drawback, is time-consuming. This paper proposes a chatbot that handles big data in a manageable time.
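As a concrete, hypothetical illustration of one of the building blocks named above, the following PyTorch sketch trains a small denoising auto-encoder on random stand-in utterance vectors; the feature size, noise level, and training loop are illustrative only and do not reproduce the paper's proposed model.

import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    """Reconstruct a clean utterance vector from a noise-corrupted copy."""
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, in_dim)

    def forward(self, x, noise_std=0.1):
        corrupted = x + noise_std * torch.randn_like(x)  # inject noise during training
        return self.decoder(self.encoder(corrupted))

# Toy training loop on random "utterance" vectors (e.g. bag-of-words features).
model = DenoisingAutoencoder(in_dim=300)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
data = torch.rand(256, 300)
for _ in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(data), data)  # reconstruct the uncorrupted input
    loss.backward()
    optimizer.step()
print(f"final reconstruction loss: {loss.item():.4f}")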

