SERRANO: Transparent Application Deployment in a Secure, Accelerated and Cognitive Cloud Continuum

Author(s):  
Aristotelis Kretsis ◽  
Panagiotis Kokkinos ◽  
Polyzois Soumplis ◽  
Juan Jose Vegas Olmos ◽  
Marcell Feher ◽  
...  
Author(s):  
Fabrizio Moggio ◽  
Mauro Boldi ◽  
Silvia Canale ◽  
Vincenzo Suraci ◽  
Claudio Casetti ◽  
...  

Author(s):  
Yanish Pradhananga ◽  
Pothuraju Rajarajeswari

The evolution of the Internet of Things (IoT) has brought several challenges for existing hardware, networks, and application development, including handling real-time streaming and batch big data, real-time event handling, dynamic cluster resource allocation for computation, and wired and wireless networks of things. Many new technologies and strategies are being developed to address these difficulties. Tiarrah Computing integrates the concepts of Cloud Computing, Fog Computing, and Edge Computing. Its main objectives are to decouple application deployment and to achieve high performance, flexible application development, high availability, ease of development, and ease of maintenance. Tiarrah Computing focuses on using existing open-source technologies to overcome the challenges that evolve along with IoT. This paper gives an overview of these technologies, shows how to design applications with them, and elaborates how to overcome most of the existing challenges.


Author(s):  
Zhengzhe Xiang ◽  
Yuhang Zheng ◽  
Mengzhu He ◽  
Longxiang Shi ◽  
Dongjing Wang ◽  
...  

Abstract: Recently, the Internet-of-Things technique has come to be seen as playing an important role as the foundation of the coming Artificial Intelligence age, owing to its capability to sense and collect real-time context information about the world, and the concept Artificial Intelligence of Things (AIoT) was developed to summarize this vision. However, in a typical centralized architecture, the increase in device links and massive data brings heavy congestion to the network, so the latency introduced by unstable and time-consuming long-distance transmission limits its development. The multi-access edge computing (MEC) technique is now regarded as the key tool to solve this problem. By establishing a MEC-based AIoT service system at the edge of the network, latency can be reduced with the help of the corresponding AIoT services deployed on nearby edge servers. However, as edge servers are resource-constrained and energy-intensive, the related AIoT services must be deployed carefully, especially when they can be composed into complex applications. In this paper, we model complex AIoT applications as directed acyclic graphs (DAGs) and investigate the relationship between AIoT application performance and energy cost in the MEC-based service system by translating it into a multi-objective optimization problem, namely the CA$^3$D problem, which is solved efficiently with the help of a heuristic algorithm. In addition, using actual simple and complex workflow data sets such as the Alibaba Cloud traces and the Montage project, we conducted comprehensive experiments to evaluate the results of our approach. The results show that the proposed approach can effectively obtain balanced solutions, and the factors that may impact the results are also adequately explored.
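To make the abstract's setup concrete, the following is a minimal illustrative sketch, not the paper's CA$^3$D formulation: an AIoT application is represented as a DAG of services, a placement of services onto servers is scored on the two competing objectives (latency and energy), and a toy scalarizing random search stands in for the paper's heuristic. All service names, server costs, and the weighting scheme are invented for illustration.

```python
import random

# Illustrative AIoT application as a DAG: each service lists its predecessors.
SERVICES = {"sense": [], "filter": ["sense"], "infer": ["filter"], "act": ["infer"]}
# Hypothetical servers: (latency per unit of work, energy per unit of work).
SERVERS = {"edge-a": (1.0, 3.0), "edge-b": (2.0, 1.5), "cloud": (5.0, 1.0)}
WORK = {"sense": 1, "filter": 2, "infer": 4, "act": 1}  # work units per service

def evaluate(placement):
    """Score a placement {service: server} on the two objectives."""
    latency = sum(SERVERS[placement[s]][0] * WORK[s] for s in SERVICES)
    # Fixed transfer penalty whenever a DAG edge crosses servers.
    latency += sum(1.0 for s, preds in SERVICES.items()
                   for p in preds if placement[p] != placement[s])
    energy = sum(SERVERS[placement[s]][1] * WORK[s] for s in SERVICES)
    return latency, energy

def random_search(iters=2000, weight=0.5, seed=0):
    """Toy heuristic: scalarize the two objectives and keep the best placement."""
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(iters):
        placement = {s: rng.choice(list(SERVERS)) for s in SERVICES}
        lat, en = evaluate(placement)
        score = weight * lat + (1 - weight) * en
        if score < best_score:
            best, best_score = placement, score
    return best, evaluate(best)
```

Sweeping `weight` between 0 and 1 traces out the latency/energy trade-off, which is the balance the multi-objective formulation in the abstract explores with a far more capable algorithm.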


2018 ◽  
Vol 17 (2) ◽  
pp. 1-31 ◽  
Author(s):  
Maria Méndez Real ◽  
Philipp Wehner ◽  
Vianney Lapotre ◽  
Diana Göhringer ◽  
Guy Gogniat

Author(s):  
Marco Gribaudo ◽  
Thi Thao Nguyen Ho ◽  
Barbara Pernici ◽  
Giuseppe Serazzi

2021 ◽  
Author(s):  
Mehdi Alipour K ◽  
Bin Dai ◽  
Jimmy Price ◽  
Christopher Michaell Jones ◽  
...  

Measuring formation pressure and collecting representative samples are the essential tasks of formation-testing operations. Where, when, and how to measure pressure or collect samples are critical questions that must be answered to complete any job successfully. Formation-testing data plays a crucial role in reserve estimation, especially at the field exploration and appraisal stage, but acquiring it can be time-consuming and expensive. The choice of location has a major impact on both the time spent on pressure testing and sampling and their success. Optimizing rig time therefore requires pre-job planning that is careful and extensive yet also quick. Current practice for finding optimum testing locations relies heavily on expert knowledge. With the nearly complete digitization of data collection, the oil industry now faces a massive data flow, raising questions about how the data should be applied and whether it needs to be collected at all. Some of it is so-called "dark data," of which only a tiny portion is used for decision making. For instance, a variety of petrophysical logs may be collected in a single well to measure formation properties, including conventional gamma ray, neutron, density, caliper, and resistivity logs, as well as more advanced tools such as high-resolution image logs, acoustic logs, or NMR. These data can be integrated to help decide where to pressure test and sample; however, this effort is driven almost exclusively by experts and is manpower-intensive. In this paper we present a workflow to gather, process, and analyze conventional log data in order to optimize formation-testing operations. The data come from wells spanning an enormous geographic distribution, and substantial effort was required to extract, transform, and load (ETL) them into a usable format.
The stored files contain millions to billions of rows of data, creating technological challenges in reading, processing, and analyzing them quickly enough for pre-job planning. We address these challenges by deploying modern data technology. Upon completing the workflow, we were able to build a scalable petrophysical log-interpretation platform that can easily be used for machine learning and application deployment. Such a database is an invaluable asset, especially where knowledge of analogous wells is needed. We perform and present exploratory data analysis on worldwide mobility data and on key features influencing pressure-test and sampling quality, and we further show how these data are integrated and analyzed to automate the selection of formation-testing locations.
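The extract-transform-load step the abstract describes can be sketched in miniature. This is an illustrative example, not the authors' pipeline: it streams CSV rows of a hypothetical `well,depth,gr` schema one at a time (so file size never matters), drops incomplete rows in the transform step, and loads a per-well aggregate. The column names and aggregation are invented for illustration.

```python
import csv
import io
from collections import defaultdict

def etl_aggregate(lines):
    """Stream log rows and aggregate mean gamma-ray (gr) per well.

    `lines` is any iterable of CSV text lines with a well,depth,gr header
    (a hypothetical schema). Rows are processed one at a time, so memory
    use stays constant no matter how many rows the source file holds.
    """
    reader = csv.DictReader(lines)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for row in reader:          # extract: one row at a time
        gr = row.get("gr")
        if not gr:              # transform: drop incomplete rows
            continue
        sums[row["well"]] += float(gr)
        counts[row["well"]] += 1
    # load: emit the per-well aggregate
    return {well: sums[well] / counts[well] for well in sums}

# usage with an in-memory sample standing in for a multi-million-row file
sample = io.StringIO("well,depth,gr\nA,100,50\nA,110,70\nB,100,\nB,110,80\n")
print(etl_aggregate(sample))  # {'A': 60.0, 'B': 80.0}
```

At the scale the abstract describes, this pattern would be handed to a distributed engine rather than a single Python loop, but the extract/transform/load structure is the same.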

