Gene Teller: an extensible Alexa Skill for gene-relevant databases

Author(s):  
Jon D Hill

Abstract: Voice assistants have become increasingly embedded in consumer electronics, as the quality of their interaction improves and the cost of hardware continues to drop. Despite their ubiquity, these assistants remain underutilized as a means of accessing biological research data. Gene Teller is a voice assistant service based on the Alexa Skills Kit and Amazon Lambda functions that enables scientists to query for gene-centric information in an intuitive manner. It includes several features, such as synonym disambiguation and short-term memory, that enable a natural conversational interaction, and is extensible to include new resources. The underlying architecture, based on Simple Storage Service and Amazon Web Services Lambda, is cost efficient and scalable. Availability and implementation: A publicly accessible version of Gene Teller is available as an Alexa Skill from the Amazon Marketplace at https://www.amazon.com/dp/B08BRD8SS8. The source code is freely available on GitHub at https://github.com/solinvicta/geneTeller.
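An Alexa Skill backed by an AWS Lambda function follows a standard request/response shape. The sketch below is purely illustrative and is not Gene Teller's actual code (which is on GitHub); the intent name, slot name, and gene lookup table are all hypothetical, but the Lambda handler signature and the Alexa Skills Kit response envelope are the standard ones.

```python
# Minimal sketch of an Alexa Skill backend as an AWS Lambda handler.
# Illustrative only: the intent name "GeneLookupIntent", the "gene"
# slot, and GENE_SUMMARIES are hypothetical stand-ins, not Gene
# Teller's real data sources.

GENE_SUMMARIES = {
    "TP53": "TP53 encodes the tumor suppressor protein p53.",
}

def build_response(speech_text, end_session=True):
    """Wrap speech text in the Alexa Skills Kit response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda for each Alexa request."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        slots = request.get("intent", {}).get("slots", {})
        gene = slots.get("gene", {}).get("value", "").upper()
        summary = GENE_SUMMARIES.get(gene)
        if summary:
            return build_response(summary)
        return build_response("I could not find information on " + gene + ".")
    # Launch requests and anything else get a prompt and stay open.
    return build_response("Ask me about a gene.", end_session=False)
```

In a real deployment the lookup would query an external gene database rather than an in-memory dictionary, and the handler would also implement the synonym disambiguation and short-term conversational memory the abstract describes.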

2020 ◽  
Vol 224 (1) ◽  
pp. 669-681
Author(s):  
Sihong Wu ◽  
Qinghua Huang ◽  
Li Zhao

SUMMARY Late-time transient electromagnetic (TEM) data contain deep subsurface information and are important for resolving deeper electrical structures. However, due to their relatively small signal amplitudes, TEM responses later in time are often dominated by ambient noise. Therefore, noise removal is critical to the application of TEM data in imaging electrical structures at depth. De-noising techniques for TEM data have developed rapidly in recent years. Although strong efforts have been made to improve the quality of the TEM responses, it is still a challenge to effectively extract the signals in the presence of unpredictable and irregular noise. In this study, we develop a new type of neural network architecture by combining the long short-term memory (LSTM) network with the autoencoder structure to suppress noise in TEM signals. The resulting LSTM-autoencoders yield excellent performance on synthetic data sets including horizontal components of the electric field and the vertical component of the magnetic field generated by different sources such as dipole, loop and grounded line sources. The relative errors between the de-noised data sets and the corresponding noise-free transients are below 1% for most of the sampling points. Notable improvement in the resistivity structure inversion result is achieved using the TEM data de-noised by the LSTM-autoencoder in comparison with several widely used neural networks, especially for later-arriving signals that are important for constraining deeper structures. We demonstrate the effectiveness and general applicability of the LSTM-autoencoder by de-noising experiments using synthetic 1-D and 3-D TEM signals as well as field data sets. The field data from a fixed loop survey using multiple receivers are greatly improved after de-noising by the LSTM-autoencoder, resulting in more consistent inversion models with significantly increased exploration depth. The LSTM-autoencoder is capable of enhancing the quality of the TEM signals at later times, which enables us to better resolve deeper electrical structures.
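The abstract's headline metric is that the relative errors between de-noised and noise-free transients stay below 1% at most sampling points. One plausible per-sample reading of that criterion can be sketched as follows (the exact error definition used by the authors is an assumption here):

```python
# Per-sample relative error between a de-noised transient and its
# noise-free reference, as one plausible reading of the "<1% at most
# sampling points" criterion; the precise definition is an assumption.

def relative_errors(denoised, reference):
    """Return |denoised - reference| / |reference| at each sample."""
    return [abs(d - r) / abs(r) for d, r in zip(denoised, reference)]

def fraction_below(errors, threshold=0.01):
    """Fraction of sampling points with relative error below threshold."""
    return sum(e < threshold for e in errors) / len(errors)
```

With this definition, a de-noised transient `[1.001, 0.499]` against a reference `[1.0, 0.5]` has relative errors of 0.1% and 0.2%, i.e., every sampling point meets the 1% criterion.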


2020 ◽  
Vol 20 ◽  
pp. 101164 ◽  
Author(s):  
Amit Kumar ◽  
Saurabh Mishra ◽  
A.K. Taxak ◽  
Rajiv Pandey ◽  
Zhi-Guo Yu

VLSI Design ◽  
2000 ◽  
Vol 11 (3) ◽  
pp. 259-283 ◽  
Author(s):  
Shawki Areibi ◽  
Anthony Vannelli

The main goal of the paper is to explore the effectiveness of a new method called Tabu Search [1] on partitioning and compare it, in terms of running time and quality of solution, with two techniques widely used in CAD tools for circuit partitioning, i.e., the Sanchis interchange method and Simulated Annealing. The proposed method integrates the well-known iterative multi-way interchange method with Tabu Search and leads to a very powerful network partitioning heuristic. By using a short-term memory of recent solutions, it is able to escape the local optima that usually cause simple descent algorithms to terminate. Moreover, Tabu Search permits backtracking to previous solutions, which allows exploring different directions and generating better partitions. The quality of the test results on MCNC benchmark circuits is very promising in most cases. Tabu Search yields netlist partitions that contain 20%–67% fewer cut nets and are generated 2/3 to 1/2 times faster than the best netlist partitions obtained by using an interchange method. Partitions comparable to those obtained by Simulated Annealing are obtained 5 to 20 times faster.
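The key mechanism the abstract describes, a short-term memory of recent moves that lets the search climb out of local optima, can be sketched for the simplest case of 2-way partitioning. This is a minimal illustration under assumed parameters (tabu tenure, balance tolerance, iteration count), not the authors' multi-way interchange integration:

```python
# Minimal 2-way tabu search for graph partitioning: one node moves per
# step, recently moved nodes are tabu for `tenure` steps, and the best
# roughly balanced partition seen is kept. Parameters are illustrative.
from collections import deque

def cut_size(edges, side):
    """Number of edges crossing the partition (side maps node -> 0/1)."""
    return sum(side[u] != side[v] for u, v in edges)

def tabu_partition(nodes, edges, iters=100, tenure=3):
    side = {n: i % 2 for i, n in enumerate(nodes)}
    best, best_cut = dict(side), cut_size(edges, side)
    tabu = deque(maxlen=tenure)  # short-term memory of recent moves
    for _ in range(iters):
        candidates = []
        for n in nodes:
            if n in tabu:
                continue
            side[n] ^= 1  # tentatively move n to the other block
            if abs(2 * sum(side.values()) - len(nodes)) <= 2:  # balance
                candidates.append((cut_size(edges, side), n))
            side[n] ^= 1  # undo the tentative move
        if not candidates:
            tabu.clear()
            continue
        cut, n = min(candidates)  # best admissible move, even if worse
        side[n] ^= 1
        tabu.append(n)  # forbid immediately reversing this move
        if cut < best_cut:
            best_cut, best = cut, dict(side)
    return best, best_cut
```

Because the best admissible move is taken even when it worsens the cut, the search keeps moving after a simple descent algorithm would have stopped; the tabu list prevents it from immediately undoing that move and cycling.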


Medicinus ◽  
2020 ◽  
Vol 7 (7) ◽  
pp. 216
Author(s):  
Stevanie Budianto ◽  
Yusak M.T Siahaan

<p><strong>Background:</strong> Memory is where information from the learning process or experience is stored. There are several types of memory, one of which is short-term memory. Declining sleep quality is directly proportional to a decline in short-term memory. Poor sleep quality is often associated with medical students due to exams or a vast amount of coursework. Therefore, the researchers wanted to see whether there is a significant correlation between sleep quality and short-term memory function in students.</p><p><strong>Aim:</strong> To assess the association between sleep quality and short-term memory function in medical students of Pelita Harapan University.</p><p><strong>Methods:</strong> This study used a cross-sectional design with simple random sampling. A total of 90 respondents at the University of Pelita Harapan were recruited, and the collected data were sorted according to the inclusion and exclusion criteria. Sleep quality was assessed with the PSQI questionnaire, while short-term memory was assessed with the digit span backward test. Results were processed with SPSS version 24 and tested with the chi-square test.</p><p><strong>Results:</strong> Chi-square analysis showed that 33 students (58.9%) had both poor sleep quality and poor short-term memory function. There was also a significant association between sleep quality and short-term memory function (p value = 0.026).</p><p><strong>Conclusion:</strong> There is a significant association between sleep quality and short-term memory function among medical students of Pelita Harapan University.</p>
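The chi-square test on a 2×2 table (sleep quality × memory function) that this study applies can be computed directly. The sketch below shows the Pearson statistic and its p-value for one degree of freedom; the example counts are hypothetical, not the study's data:

```python
# Pearson chi-square test for a 2x2 contingency table [[a, b], [c, d]],
# e.g. rows = sleep quality (poor/good), columns = short-term memory
# (poor/good). Example tables below are hypothetical, not study data.
import math

def chi_square_2x2(table):
    """Return (statistic, p_value) for a 2x2 table, df = 1."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    # For one degree of freedom, the chi-square survival function
    # reduces to erfc(sqrt(x / 2)).
    p_value = math.erfc(math.sqrt(stat / 2))
    return stat, p_value
```

A p-value below 0.05, as with the study's reported p = 0.026, rejects independence between the two variables.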


2021 ◽  
Vol 6 (4) ◽  
pp. 40-49
Author(s):  
Nur Natasya Mohd Anuar ◽  
Nur Fatihah Fauzi ◽  
Huda Zuhrah Ab Halim ◽  
Nur Izzati Khairudin ◽  
Nurizatul Syarfinas Ahmad Bakhtiar ◽  
...  

Predictions of future events must be factored into decision-making. Predictions of water quality are critical to assist authorities in making operational, management, and strategic decisions to keep the quality of the water supply monitored under specific criteria. Taking advantage of the good performance of long short-term memory (LSTM) deep neural networks in time-series prediction, the purpose of this paper is to develop and train an LSTM neural network to predict water quality parameters in the Selangor River. The primary goal of this study is to predict five (5) water quality parameters in the Selangor River, namely Biochemical Oxygen Demand (BOD), Ammonia Nitrogen (NH3-N), Chemical Oxygen Demand (COD), pH, and Dissolved Oxygen (DO), using secondary data from different monitoring stations along the river basin. The accuracy of this method was then measured using RMSE as the forecast measure. The results show that the pH dataset yielded the lowest RMSE values, with a minimum of 0.2106 at station 004 and a maximum of 1.2587 at station 001. The results of the study indicate that the predicted values of the model and the actual values were in good agreement and revealed the future developing trend of the water quality parameters, showing the feasibility and effectiveness of using LSTM deep neural networks to predict the quality of water parameters.
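The forecast measure used here, root-mean-square error (RMSE), is a standard definition and can be stated in a few lines (the example values are illustrative, not the station data):

```python
# Root-mean-square error, the forecast measure used to compare each
# station's predicted and observed water quality series.
import math

def rmse(predicted, actual):
    """RMSE between two equal-length series; lower is better."""
    if len(predicted) != len(actual):
        raise ValueError("series must have equal length")
    squared = sum((p - a) ** 2 for p, a in zip(predicted, actual))
    return math.sqrt(squared / len(actual))
```

An RMSE of 0.2106, the paper's best pH result, means the predictions deviate from observations by about 0.21 pH units on a root-mean-square basis.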


Author(s):  
Rajib L. Saha ◽  
Sumanta Singha ◽  
Subodha Kumar

Many firms buy cloud services from cloud vendors, such as Amazon Web Services, to serve end users. One of the key factors that affect the quality of cloud services is congestion. Congestion leads to a potential loss of end users, resulting in lower demand for cloud services. Although a discount can stimulate demand, its effect under congestion is ambiguous; a higher discount leads to higher demand, but it can further lead to higher congestion, thereby lowering demand. We explore how congestion moderates both the cloud vendor's pricing and the buyer's fulfillment decisions. We seek to answer how the congestion sensitivity of the end users and the cost of technology impact buyer profitability and the cloud vendor's choice of discount. We also examine how the cost of technology determines the buyer's willingness to pass on savings to end users. Our results show that the buyer is not necessarily worse off even when the end users are more intolerant to congestion. In fact, when end users are more congestion sensitive, the demand for cloud services can sometimes increase, and the discount offered by the vendor can decrease. We also observe that a lower cost of technology can sometimes hurt the buyer, and the buyer can pass on lower benefits to end users.


2006 ◽  
Vol 27 (4) ◽  
pp. 552-556 ◽  
Author(s):  
Shula Chiat

In line with the original presentation of nonword repetition as a measure of phonological short-term memory (Gathercole & Baddeley, 1989), the theoretical account Gathercole (2006) puts forward in her Keynote Article focuses on phonological storage as the key capacity common to nonword repetition and vocabulary acquisition. However, evidence that nonword repetition is influenced by a variety of factors other than item length has led Gathercole to qualify this account. In line with arguments put forward by Snowling, Chiat, and Hulme (1991), one of Gathercole's current claims is that nonword repetition and word learning are constrained by “the quality of temporary storage of phonological representations, and this quality is multiply determined.” Phonological storage is not just a quantity-limited capacity.


2006 ◽  
Vol 27 (4) ◽  
pp. 581-584 ◽  
Author(s):  
Elisabet Service

The first report of a connection between vocabulary learning and phonological short-term memory was published in 1988 (Baddeley, Papagno, & Vallar, 1988). At that time, both Susan Gathercole and I were involved in longitudinal studies investigating the relation between nonword repetition and language learning. We both found a connection. Now, almost 20 years later, in her Keynote Article, Gathercole (2006) reviews a multitude of data bearing on the interpretation of this often replicated connection. Her main conclusions are three. First, both nonword repetition and word learning are constrained by the quality of temporary storage. She sees this storage as multiply determined, that is, affected by factors like perceptual analysis and phonological awareness (the ability to identify and reflect on the speech sounds that make up words). Second, both nonword repetition and word learning are also affected by sensory, cognitive, and motor processes. Third, an impairment of phonological storage is typically associated with specific language impairment (SLI) but may not be a sole causal factor.


Information ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 224
Author(s):  
Amirmohammad Pasdar ◽  
Young Choon Lee ◽  
Tahereh Hassanzadeh ◽  
Khaled Almi’ani

The interaction between artificial intelligence (AI), edge, and cloud is a fast-evolving realm in which pushing computation close to the data sources is increasingly adopted. Captured data may be processed locally (i.e., on the edge) or remotely in the clouds where abundant resources are available. While many emerging applications are processed in situ due primarily to their data intensiveness and short-latency requirements, the capacity of edge resources remains limited. As a result, the collaborative use of edge and cloud resources is of great practical importance. Such collaborative use should take into account data privacy, high latency and high bandwidth consumption, and the cost of cloud usage. In this paper, we address the problem of resource allocation for data processing jobs in the edge-cloud environment to optimize cost efficiency. To this end, we develop the Cost Efficient Cloud Bursting Scheduler and Recommender (CECBS-R) as an AI-assisted resource allocation framework. In particular, CECBS-R incorporates machine learning techniques such as multi-layer perceptron (MLP) and long short-term memory (LSTM) neural networks. In addition to preserving privacy by employing edge resources, the edge utility cost plus public cloud billing cycles are adopted for scheduling, and jobs are profiled in the cloud-edge environment to facilitate scheduling through resource recommendations. These recommendations are produced by the MLP network for runtime estimation and by the LSTM for resource recommendation. CECBS-R is trained with the scheduling outputs of Facebook and grid workload traces. The experimental results based on unseen workloads show that CECBS-R recommendations achieve a ∼65% cost saving in comparison to an online cost-efficient scheduler (BOS), a resource management service (RMS), and an adaptive scheduling algorithm with QoS satisfaction (AsQ).
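The core cost trade-off the abstract describes, metered edge utility cost versus public cloud billing cycles, can be illustrated with a simple placement comparison. This is not CECBS-R's actual scheduling policy (which is driven by learned runtime estimates and recommendations); the rates, billing-cycle length, and decision rule below are all hypothetical:

```python
# Illustrative edge-vs-cloud cost comparison for a single job, given a
# predicted runtime. Edge cost is metered per second; cloud cost is
# charged per rounded-up billing cycle. All rates are hypothetical and
# this is not CECBS-R's actual policy.
import math

def cheapest_placement(predicted_runtime_s, edge_rate_per_s,
                       cloud_rate_per_cycle, cycle_s=3600):
    """Return ('edge' | 'cloud', cost) for the cheaper placement."""
    edge_cost = predicted_runtime_s * edge_rate_per_s
    cloud_cost = math.ceil(predicted_runtime_s / cycle_s) * cloud_rate_per_cycle
    if edge_cost <= cloud_cost:
        return "edge", edge_cost
    return "cloud", cloud_cost
```

The rounded-up billing cycle is what makes short jobs relatively expensive to burst to the cloud, which is one reason accurate runtime estimation (the MLP's role in CECBS-R) matters for cost-efficient scheduling.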

