Predictive Warp Scheduling for Efficient Execution in GPGPU

Author(s): Abhinish Anand, Winnie Thomas, Suryakant Toraskar, Virendra Singh
Author(s): Mohammad Mahdi Javanmard, Zafar Ahmad, Jaroslaw Zola, Louis-Noel Pouchet, Rezaul Chowdhury, ...

2013, Vol 17 (5), pp. 921-947
Author(s): José Losada, Juan Raposo, Alberto Pan, Paula Montoto

2021, Vol 15 (1), pp. 98-111
Author(s): Dong He, Maureen Daum, Walter Cai, Magdalena Balazinska

We design, implement, and evaluate DeepEverest, a system for the efficient execution of interpretation-by-example queries over the activation values of a deep neural network. DeepEverest consists of an efficient indexing technique and a query execution algorithm with various optimizations. We prove that the proposed query execution algorithm is instance optimal. Experiments with our prototype show that DeepEverest, using less than 20% of the storage of full materialization, significantly accelerates individual queries by up to 63X and consistently outperforms other methods on multi-query workloads that simulate DNN interpretation processes.
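DeepEverest's index and instance-optimal algorithm are beyond the scope of the abstract, but the query class it accelerates can be illustrated with a naive full-scan sketch: given the activation vector of one input, find the k inputs whose activations are most similar. All names and values below are hypothetical; DeepEverest's contribution is answering such queries without scanning or fully materializing all activations.

```python
import math

def topk_similar(activations, query, k):
    """Return indices of the k examples whose activation vectors are
    closest (by Euclidean distance) to the query's activations.
    Naive full scan; shown only to illustrate the query semantics."""
    dists = sorted((math.dist(a, query), i) for i, a in enumerate(activations))
    return [i for _, i in dists[:k]]

# Toy activation vectors for four inputs (hypothetical values).
acts = [[0.0, 0.0], [1.0, 1.0], [0.1, 0.0], [0.9, 1.2]]
print(topk_similar(acts, [0.0, 0.0], 2))  # → [0, 2]
```

A system like DeepEverest answers the same query from a compact index over the activations rather than from this O(n·d) scan.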


2015, Vol 39 (4-5), pp. 271-285
Author(s): Ying Zhang, Lide Duan, Bin Li, Lu Peng, Srinivasan Sadagopan

Author(s): Dr. C. K. Gomathy

Abstract: Apache Sqoop is mainly used to efficiently transfer large volumes of data between Apache Hadoop and relational databases. It supports tasks such as ETL (extract, transform, load) processing from an enterprise data warehouse into Hadoop, where the work can be executed efficiently at much lower cost. We first import a table from a MySQL database using Sqoop's command-line interface. Because new rows may later be inserted or existing rows updated, the import query would normally have to be executed again. Our project removes the need to re-run these queries manually: we define a Sqoop job that encapsulates the complete import command. After the import, we retrieve the data from Hive using Java JDBC and convert it to JSON format, which organizes the data in an easy-to-access manner, using the GSON library.
Keywords: Sqoop, JSON, GSON, Maven, JDBC
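The last step of the pipeline described above, turning retrieved rows into JSON, can be sketched as follows. The paper does the retrieval in Java over a Hive JDBC connection and the conversion with GSON; this minimal sketch uses Python's standard `json` module instead, with hypothetical table contents, and the Sqoop command shown in the comment uses an illustrative, made-up connection string.

```python
import json

# A reusable Sqoop job capturing the incremental import might look like
# (illustrative; hypothetical database and table names):
#   sqoop job --create import_job -- import \
#     --connect jdbc:mysql://localhost/demo --table employees \
#     --incremental append --check-column id --last-value 0
#
# Rows as they might come back from Hive via JDBC after the import.
rows = [
    {"id": 1, "name": "Alice", "dept": "Sales"},
    {"id": 2, "name": "Bob", "dept": "HR"},
]

# Serialize to an organized, easy-to-access JSON document
# (the role GSON plays in the paper's Java pipeline).
doc = json.dumps(rows, indent=2)
print(doc)
```

Re-running the saved Sqoop job picks up rows added or updated since the last import, so the downstream JSON stays current without re-issuing the full import command by hand.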

