Adaptive Modeling and Dynamic Targeting for Real Time Analytics in Mobile Advertising

Author(s):  
Donald Kridel ◽  
Dan Dolk ◽  
David Castillo

Mobile marketing campaigns are now largely deployed through demand-side platforms (DSPs), which provide dynamic customer targeting and a performance-intensive, real-time bidding (RTB) version of predictive analytics as a service. Matching users with the campaigns they are most likely to engage with in extreme real-time environments requires adaptive model management, advanced parallel processing hardware/software, and the integration of multiple very large databases. The authors present (1) an adaptive modeling strategy to satisfy the 40-100 ms performance thresholds within which DSPs must decide whether and how much to bid for a potential client to receive a particular advertisement via their mobile device, and (2) a dynamic customer profiling technique that maps mobile devices to specific lattices (geographic locations) and tracks user behavior via device histories. In this "big data" decision environment, analytic model management is automated via model feedback loops which adjust the models dynamically as real-time data streams in.
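
To make the latency constraint concrete, here is a minimal Python sketch of a bid decision bounded by the 40 ms end of the DSP response window, with a simple online feedback update; the feature names, weights, and learning rule are illustrative assumptions, not the authors' production models.

```python
"""Minimal sketch of a latency-bounded RTB bid decision with a feedback loop.
All feature names, weights, and the update rule are illustrative assumptions."""

import math
import time

# Hypothetical logistic-regression weights mapping features to click propensity.
WEIGHTS = {"device_in_target_lattice": 1.2, "prior_engagements": 0.4, "hour_match": 0.3}
BIAS = -3.0
LEARNING_RATE = 0.05
LATENCY_BUDGET_S = 0.040  # 40 ms lower bound of the DSP response window


def predict_ctr(features):
    """Logistic score: estimated probability the user engages with the ad."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


def decide_bid(features, value_per_click=0.50):
    """Return a bid price (or None) while respecting the latency budget."""
    start = time.perf_counter()
    ctr = predict_ctr(features)
    if time.perf_counter() - start > LATENCY_BUDGET_S:
        return None  # too slow: abstain rather than miss the exchange deadline
    return round(ctr * value_per_click, 4) if ctr > 0.01 else None


def feedback_update(features, clicked):
    """Online gradient step so the model adapts as real-time outcomes stream in."""
    error = (1.0 if clicked else 0.0) - predict_ctr(features)
    for k in WEIGHTS:
        WEIGHTS[k] += LEARNING_RATE * error * features.get(k, 0.0)


if __name__ == "__main__":
    request = {"device_in_target_lattice": 1.0, "prior_engagements": 2.0, "hour_match": 1.0}
    print("bid:", decide_bid(request))
    feedback_update(request, clicked=True)
    print("bid after feedback:", decide_bid(request))
```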

2021 ◽  
Author(s):  
Paulinus Abhyudaya Bimastianto ◽  
Shreepad Purushottam Khambete ◽  
Hamdan Mohamed Alsaadi ◽  
Suhail Mohammed Al Ameri ◽  
Erwan Couzigou ◽  
...  

Abstract This project used predictive analytics and machine learning-based modeling to detect drilling anomalies, namely stuck pipe events. Analysis focused on historical drilling data and real-time operational data to address the limitations of physics-based modeling. This project was designed to enable drilling crews to minimize downtime and non-productive time through real-time anomaly management. The solution used data science techniques to overcome data consistency/quality issues and flag drilling anomalies leading to a stuck pipe event. Predictive machine learning models were deployed across seven wells in different fields. The models analyzed both historical and real-time data across various data channels to identify anomalies (difficulties that impact non-productive time). The modeling approach mimicked the behavior of drillers using surface parameters. Small deviations from normal behavior were identified based on combinations of surface parameters, and automated machine learning was used to accelerate and optimize the modeling process. The output was a risk score that flags deviations in rig surface parameters. During the development phase, multiple data science approaches were attempted to monitor the overall health of the drilling process, analyzing both historical and real-time data from torque, hole depth and deviation, standpipe pressure, and various other data channels. The models detected drilling anomalies with a harmonic model accuracy of 80% and produced valid alerts on 96% of stuck pipe and tight hole events. The average forewarning was two hours, giving personnel ample time to make corrections before a stuck pipe event could occur and saving the operator upwards of millions of dollars in drilling costs and downtime. This project introduced novel data aggregation and deep learning-based normal behavior modeling methods, and it demonstrates the benefits of adopting predictive analytics and machine learning in drilling operations. The approach enabled operators to mitigate data issues and deliver real-time, high-frequency, high-accuracy predictions. As a result, the operator was able to significantly reduce non-productive time.
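
As a rough illustration of how a deviation-based risk score over rig surface parameters might work, the following Python sketch computes rolling z-scores per channel and collapses them into a single 0-1 score; the channel names, window size, and thresholds are assumptions for illustration, not values from the project.

```python
"""Illustrative deviation-based risk score over drilling surface parameters.
Channel names, window size, and thresholds are assumptions, not project values."""

from collections import deque
import statistics

WINDOW = 120  # number of recent samples defining "normal" behaviour
history = {"torque": deque(maxlen=WINDOW),
           "standpipe_pressure": deque(maxlen=WINDOW),
           "hook_load": deque(maxlen=WINDOW)}


def risk_score(sample):
    """Combine per-channel z-scores into a single 0-1 risk score."""
    deviations = []
    for channel, value in sample.items():
        hist = history[channel]
        if len(hist) >= 30:  # need enough data to estimate normal behaviour
            mean = statistics.fmean(hist)
            std = statistics.pstdev(hist) or 1e-6
            deviations.append(abs(value - mean) / std)
        hist.append(value)
    if not deviations:
        return 0.0
    # Map the mean z-score onto 0-1; deviations of 3 sigma or more saturate at 1.
    return min(statistics.fmean(deviations) / 3.0, 1.0)


if __name__ == "__main__":
    import random
    for _ in range(200):  # warm up the windows with synthetic normal behaviour
        risk_score({"torque": random.gauss(15, 1),
                    "standpipe_pressure": random.gauss(2500, 50),
                    "hook_load": random.gauss(180, 5)})
    anomalous = {"torque": 25, "standpipe_pressure": 3100, "hook_load": 150}
    print(f"risk = {risk_score(anomalous):.2f}")  # expected to be elevated
```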


Author(s):  
Arka Ghosh ◽  
M. Reza Hosseini ◽  
Riyadh Al-Ameri ◽  
Gintaris Kaklauskas ◽  
Bahareh Nikmehr

Concreting is generally a manual, labour-intensive and time-consuming process, putting additional burden on constrained resources. Current practices of concreting are wasteful and non-sustainable, and the end products usually lack proper quality conformance. This paper, as the first outcome of an ongoing research project, proposes concreting as an area ripe for disruption by new technological developments and the wave of automation. It puts forward arguments to show that the Internet of Things (IoT), as an emerging concept, has the potential to revolutionize concreting operations, resulting in substantial time savings, greater confidence in durability, and enhanced quality conformance. A conceptual framework for digital concrete quality control (DCQC) drawing upon IoT is outlined; DCQC facilitates automated lifecycle monitoring of concrete members through embedded sensors that track, in real time, parameters such as surface humidity, temperature variance, moisture content, vibration level, and crack occurrence and propagation. Drawing upon an analytical approach, the discussion provides evidence for the advantages of adopting DCQC. The proposed system is of particular appeal to practitioners as a workable solution for reducing water and energy consumption and the man-hours required for concreting procedures, as well as providing an interface for access to real-time data, site progress monitoring, benchmarking, and predictive analytics.
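
A minimal sketch, assuming hypothetical sensor fields and acceptance limits, of the kind of rule-based check a DCQC-style system could run on embedded-sensor readings during curing; the paper describes the framework only conceptually, so none of these values come from it.

```python
"""Hedged sketch of a rule-based quality check over embedded-sensor readings.
Field names and acceptance limits are illustrative assumptions."""

from dataclasses import dataclass


@dataclass
class SensorReading:
    member_id: str
    surface_humidity_pct: float
    temperature_c: float
    moisture_content_pct: float
    vibration_mm_s: float
    crack_width_mm: float


# Illustrative acceptance limits for a curing concrete member.
LIMITS = {
    "surface_humidity_pct": (80.0, 100.0),
    "temperature_c": (5.0, 35.0),
    "moisture_content_pct": (4.0, 12.0),
    "vibration_mm_s": (0.0, 2.0),
    "crack_width_mm": (0.0, 0.3),
}


def check(reading: SensorReading) -> list[str]:
    """Return a human-readable alert for every parameter outside its limits."""
    alerts = []
    for field, (low, high) in LIMITS.items():
        value = getattr(reading, field)
        if not low <= value <= high:
            alerts.append(f"{reading.member_id}: {field}={value} outside [{low}, {high}]")
    return alerts


if __name__ == "__main__":
    r = SensorReading("slab-07", surface_humidity_pct=72.0, temperature_c=38.5,
                      moisture_content_pct=6.1, vibration_mm_s=0.4, crack_width_mm=0.1)
    for alert in check(r):
        print(alert)
```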


2019 ◽  
Vol 8 (4) ◽  
pp. 9266-9270

The Internet of Things (IoT) is a fast-growing collection of web-connected sensors embedded in a wide-ranging assortment of physical objects. A "thing" can be any physical object in the world, animate or inanimate, to which a sensor can be attached or embedded. Sensors can take a vast number of possible measurements, producing huge volumes of new structured, unstructured, and real-time data that together form big data. IoT data is extremely large and complex, and can provide real-time context and state information about physical objects or the environment. Among the various challenges facing today's IoT, the three prime areas of concern are the need for an efficient framework to receive IoT data, the need for a new scalable parallel indexing technique to store IoT data efficiently, and the need to secure IoT-generated data at every stage, from the edge devices to the cloud. A new, efficient framework is introduced that can retrieve meaningful information from these IoT devices and index it efficiently. To process the enormous volumes of real-time data generated by IoT devices, new techniques are introduced that are scalable and secure. The research proposes a general IoT network architecture and describes the interconnectivity among the different elements, such as sensors, receivers, and the cloud. The proposed architecture efficiently receives real-time data from all the sensors. The prime focus is on eliminating existing issues in IoT, along with making provision for future-proofing the proposed schemes.
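
One plausible way to receive and index sensor readings for parallel retrieval is to partition them by sensor and time bucket, as in the Python sketch below; the partitioning scheme and names are illustrative assumptions, since the proposed architecture is described only at a high level.

```python
"""Minimal sketch of partitioned ingestion and indexing of IoT readings.
The (sensor, hour) bucketing scheme is an illustrative assumption."""

from collections import defaultdict
from datetime import datetime, timezone

# index[(sensor_id, hour_bucket)] -> list of (timestamp, value)
index = defaultdict(list)


def ingest(sensor_id: str, timestamp: float, value: float) -> None:
    """Route a reading to its (sensor, hour) partition; partitions are independent,
    so separate workers could write them in parallel."""
    hour = datetime.fromtimestamp(timestamp, tz=timezone.utc).strftime("%Y-%m-%dT%H")
    index[(sensor_id, hour)].append((timestamp, value))


def query(sensor_id: str, hour: str):
    """Fetch readings for one sensor in one hour without scanning other partitions."""
    return index.get((sensor_id, hour), [])


if __name__ == "__main__":
    import time
    now = time.time()
    for i in range(5):
        ingest("temp-001", now + i, 21.0 + 0.1 * i)
    bucket = datetime.fromtimestamp(now, tz=timezone.utc).strftime("%Y-%m-%dT%H")
    print(query("temp-001", bucket))
```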


2021 ◽  
Author(s):  
Salvatore Della Villa ◽  
Robert Steele ◽  
Dongwon Shin ◽  
Sangkeun (Matt) Lee ◽  
Travis Johnston ◽  
...  

Abstract At the Turbo Expo 2018: Turbomachinery Technical Conference & Exposition in Oslo, Norway, an innovative approach for assessing operating and near real-time data from power generating assets with meaningful predictive analytics was presented and discussed. GT2018-75030, entitled "Energy Innovation: A Focus on Power Generation Data Capture & Analytics in a Competitive Market," established a challenging objective for the industry: "To advance the notion that the fusion of total plant data, from three primary sources, with the ability to transform, analyze, and act based on integrating subject matter expertise is essential for effectively managing assets for optimum performance and profitability; executing and delivering on the promise of 'Big Data' and advanced analytics." Throughout 2019 and 2020, a team comprising members from Strategic Power Systems, Inc.® (SPS), Turbine Logic (TL), and two national laboratories, the National Energy Technology Laboratory (NETL) and Oak Ridge National Laboratory (ORNL), collaborated on the paper's hypothesis. The team worked with the support of funding from DOE's Fossil Energy Program through its HPC4Materials program, which provided access to the high-performance computing assets at both laboratories. The team brought unique skills, strengths, and capabilities that served as the basis for an effective, open, and challenging collaboration. The engineering and data science disciplines that converged on this project provided the backbone for the unbiased analysis and model building that took place, relying on a unique and up-to-date source of plant operating and design data essential for performing the engineering scope of work. A key objective was to use the data and modeling predictively: to characterize remaining life and expended life, and to determine the "next failure" for critical systems and components. Proof-of-concepts were tested for longer-term, data-driven reliability prediction for fleets of power generating assets, near real-time prediction of power plant faults which could lead to imminent failure, and physics-based model prediction of life consumption of critical parts. Each of these pilot-scale projects is summarized with key results presented.
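
As a hedged illustration of the fleet-level reliability calculations the pilot projects describe (characterizing remaining and expended life), the sketch below uses a simple Weibull survival model; the parameters and the component are hypothetical and are not results from the SPS/TL/NETL/ORNL work.

```python
"""Hedged sketch of a fleet-level reliability calculation using a Weibull model.
Shape/scale parameters and the component are illustrative assumptions."""

import math


def weibull_reliability(hours: float, shape: float, scale: float) -> float:
    """Probability the component survives beyond the given operating hours."""
    return math.exp(-((hours / scale) ** shape))


def b10_life(shape: float, scale: float) -> float:
    """Operating hours by which 10% of a fleet is expected to have failed."""
    return scale * (-math.log(0.90)) ** (1.0 / shape)


if __name__ == "__main__":
    # Hypothetical hot-gas-path component with wear-out behaviour (shape > 1).
    shape, scale = 2.5, 30000.0
    print(f"R(24000 h) = {weibull_reliability(24000, shape, scale):.3f}")
    print(f"B10 life   = {b10_life(shape, scale):.0f} h")
```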


The rise of real-time IoT data has led to new demands for mining systems that learn complex models with millions to billions of parameters, promising adequate capacity to digest massive datasets and offer powerful predictive analytics. To support big data mining, high-performance computing platforms are required, whose design must unleash the full power of the big data. Pattern mining poses many interesting research problems, and there are many areas that are still not well understood. A particularly elementary challenge is to separate the meaningful data, referred to as "smart data," from the junk data flowing into the internet: eighty-five percent of all data is noisy or meaningless, and verifying, separating, and refining the data from the noisy junk is a difficult task. The researchers propose a distributed pattern-mining algorithm to address the heterogeneity, scaling, and hidden-data problems of big data. The algorithm is evaluated on parameters such as cost, speed, space, and overhead, with IoT used as the source of heterogeneous big data. In this paper, we present test results showing that the new method gives accurate results and valid outputs when verified against the results of other valid methods. The results also show that the new method can handle large datasets, determines frequent patterns and produces association rule sets faster than conventional methods, and requires less memory for processing. Overall, the new method offers competitive performance in memory usage and processing speed compared with conventional frequent pattern mining techniques such as Apriori and FP-Growth.
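
To illustrate the general idea behind distributed pattern mining, the following Python sketch counts candidate itemsets per data partition and merges the counts before applying a global support threshold; this is a generic partition-count scheme for illustration, not the authors' algorithm.

```python
"""Minimal sketch of partition-based frequent-itemset counting: each partition
counts candidates locally, counts are merged, then a global support threshold
is applied. A generic illustration, not the proposed algorithm."""

from collections import Counter
from itertools import combinations


def local_counts(transactions, size=2):
    """Count itemsets of a given size within one data partition."""
    counts = Counter()
    for t in transactions:
        for itemset in combinations(sorted(set(t)), size):
            counts[itemset] += 1
    return counts


def frequent_itemsets(partitions, min_support, size=2):
    """Merge per-partition counts and keep itemsets meeting global support."""
    total = Counter()
    for part in partitions:
        total.update(local_counts(part, size))
    return {itemset: c for itemset, c in total.items() if c >= min_support}


if __name__ == "__main__":
    partitions = [
        [["milk", "bread", "eggs"], ["milk", "bread"], ["bread", "eggs"]],
        [["milk", "bread"], ["milk", "eggs"], ["milk", "bread", "eggs"]],
    ]
    print(frequent_itemsets(partitions, min_support=3))
```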


2021 ◽  
Author(s):  
Carla Sanasi ◽  
Luca Dal Forno ◽  
Giorgio Ricci Maccarini ◽  
Luigi Mutidieri ◽  
Pamela Tempone ◽  
...  

Abstract The evolution of the energy market requires companies to increase their operating efficiency, leveraging a collaborative environment and existing assets, including data. A new focus on data governance and integration is needed to maximize the value of data and ensure an efficient "real-time" response. Decoupling data from applications enables organization by domain and data type in one cross-functional data hub. This scheme is independent of the scope of the activity and will therefore remain valid when dealing with new businesses requiring subsurface data utilization. The integrated data platform will feed advanced digital tools capable of controlling risks, optimizing performance, and reducing the emissions associated with operations. Eni is putting this idea into practice with a new data infrastructure which is integrated across all the subsurface disciplines (G&G, Exploration, Upstream Laboratories, Reservoir and Well Operations departments). In this paper, the example of real-time data exploitation is discussed. The real-time data workflow was first established in well operations for operational supervision and later developed for real-time performance optimization through the introduction of predictive analytics. Its latest evolution in the broader subsurface domain encompasses the application of AI to operations geology processes and the extension to all operated activities. This approach will equally support new company goals such as decarbonization, increasing the performance of subsurface activities related to the underground storage of CO2 in depleted reservoirs.


2007 ◽  
Vol 51 (3.4) ◽  
pp. 465-475 ◽  
Author(s):  
G. F. Anderson ◽  
D. A. Selby ◽  
M. Ramsey

2020 ◽  
Vol 2020 (185-186) ◽  
pp. 11-24
Author(s):  
Kara Larkan‐Skinner ◽  
Jessica M. Shedd

2019 ◽  
Vol 17 (1/2) ◽  
pp. 69-75 ◽  
Author(s):  
Dean Wilson

Policing, particularly in the United States, is being progressively datafied. This process has a historical trajectory that is crucial to the analysis and critique of new platform-based security architectures. Predictive policing has already attracted considerable attention, partially due to its seemingly novel fusion of predictive analytics and police work. Hyperbolic early claims—often mobilizing science fiction imagery—that the future could, in fact, be predicted, pointed towards utopic/dystopic imaginaries of seamlessly integrated control. Predictive policing is, however, increasingly only one component of cloud-based data systems that are coursing through police activity. The imaginary of these transformations can be analysed through the security imaginary of policing as a process of real-time data transmission, perpetually self-adjusting and self-augmenting through machine calculation. The historical contextualization of this imaginary suggests useful vectors of inquiry that position platform policing squarely within the mechanisms of contemporary capitalism.

