models at runtime
Recently Published Documents

TOTAL DOCUMENTS: 33 (five years: 7)
H-INDEX: 8 (five years: 0)

2021 ◽  
Vol 11 (20) ◽  
pp. 9743
Author(s):  
Mohammed Mounir Bouhamed ◽  
Gregorio Díaz ◽  
Allaoua Chaoui ◽  
Oussama Kamel ◽  
Radouane Nouara

Models@runtime (models at runtime) are based on computational reflection. Runtime models can be regarded as a reflective layer causally connected with the underlying system. Hence, every change in the runtime model entails a change in the reflected system, and vice versa. To the best of our knowledge, there are no runtime models for Python applications. We therefore propose a formal approach based on Petri Nets (PNs) to model, develop, and reconfigure Python applications at runtime. The framework is supported by a tool whose architecture consists of two modules connecting the model and its execution. The proposed framework handles execution exceptions and allows users to monitor Python expressions at runtime. Additionally, the application behavior can be reconfigured by applying Graph Rewriting Rules (GRRs). A case study on Service-Level Agreement (SLA) violations illustrates the approach.
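As a rough illustration of the causal connection described above (and not the authors' PN-based tool), the following Python sketch keeps a simple runtime model synchronized with a managed object in both directions. The `Service` class, its attributes, and the `RuntimeModel` API are invented for this example.

```python
# Minimal sketch (not the paper's tool): a runtime model kept causally
# connected to a managed Python object. Changes in the object can be pulled
# into the model, and changes applied to the model are pushed back to the
# object, illustrating the models@runtime principle.

class RuntimeModel:
    """Reflective view of a managed object's observable attributes."""

    def __init__(self, managed, attributes):
        self._managed = managed
        self._attributes = set(attributes)
        self.snapshot = {a: getattr(managed, a) for a in self._attributes}

    def refresh(self):
        """System -> model: pull the current state of the managed object."""
        for a in self._attributes:
            self.snapshot[a] = getattr(self._managed, a)

    def apply(self, **changes):
        """Model -> system: push a reconfiguration back to the object."""
        for name, value in changes.items():
            if name not in self._attributes:
                raise KeyError(f"{name} is not part of the runtime model")
            setattr(self._managed, name, value)
            self.snapshot[name] = value


class Service:
    """Hypothetical managed application component."""

    def __init__(self):
        self.workers = 2
        self.timeout_s = 30


if __name__ == "__main__":
    service = Service()
    model = RuntimeModel(service, ["workers", "timeout_s"])

    service.workers = 4          # change in the running system...
    model.refresh()              # ...is reflected into the model
    model.apply(timeout_s=10)    # change in the model reconfigures the system
    print(model.snapshot, service.timeout_s)
```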


2021 ◽  
Author(s):  
Felipe A. Lopes

The programmable network architectures that emerged in the last decade have opened new ways to enable Autonomic Networks. However, several open issues must be addressed before this possibility becomes a practical reality. For instance, defining network goals, translating them into network rules, and ensuring the correct functioning of the network control loop in a self-adaptive manner are examples of the complex tasks required for an autonomic networking environment. Fortunately, architectures based on the concept of Models at Runtime (MART) provide ways to manage such complexity. This paper proposes a MART-based framework, using RFC 7575 (definitions and design goals for autonomic networking) as a reference, to implement autonomic management in a programmable network. The evaluation shows that the proposed framework satisfies the functional and performance requirements of a simulated network.
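For illustration only, the sketch below shows one common way to realize such a MART-driven control loop: a MAPE-style monitor/analyze/plan/execute cycle over a runtime model of simulated links. The `LinkModel` class, the utilization threshold, and the rate-limit adjustment are hypothetical and are not taken from the paper or from RFC 7575.

```python
# Illustrative sketch only (not the paper's framework): a MAPE-style control
# loop in which a runtime model of the network is monitored and adapted
# against a high-level goal. All names and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class LinkModel:
    name: str
    utilization: float   # 0.0 - 1.0
    rate_limit_mbps: int


def monitor(links):
    """Collect the current runtime model of the (simulated) network."""
    return {link.name: link for link in links}


def analyze(model, max_utilization=0.8):
    """Compare the model against the network goal: keep utilization bounded."""
    return [l for l in model.values() if l.utilization > max_utilization]


def plan(violations):
    """Derive concrete reconfiguration actions from detected violations."""
    return [(l, int(l.rate_limit_mbps * 1.5)) for l in violations]


def execute(actions):
    """Apply the actions back to the runtime model (and, in a real system,
    to the programmable data plane)."""
    for link, new_limit in actions:
        link.rate_limit_mbps = new_limit
        link.utilization *= 2 / 3  # assume extra capacity absorbs the load


if __name__ == "__main__":
    links = [LinkModel("uplink-1", 0.92, 100), LinkModel("uplink-2", 0.40, 100)]
    for _ in range(3):  # closed control loop
        execute(plan(analyze(monitor(links))))
    print(links)
```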


Author(s):  
Sebastian Pilarski ◽  
Martin Staniszewski ◽  
Matthew Bryan ◽  
Frederic Villeneuve ◽  
Dániel Varró

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1339
Author(s):  
Miguel de Prado ◽  
Manuele Rusci ◽  
Alessandro Capotondi ◽  
Romain Donze ◽  
Luca Benini ◽  
...  

Standard-sized autonomous vehicles have improved rapidly thanks to breakthroughs in deep learning. However, scaling autonomous driving down to mini-vehicles poses several challenges due to their limited on-board storage and computing capabilities. Moreover, autonomous systems lack robustness when deployed in dynamic environments whose underlying distribution differs from the distribution learned during training. To address these challenges, we propose a closed-loop learning flow for autonomous driving mini-vehicles that includes the target deployment environment in the loop. We leverage a family of compact, high-throughput tinyCNNs that control the mini-vehicle and learn by imitating a computer vision algorithm, i.e., the expert, in the target environment. Thus, the tinyCNNs, with access only to an on-board fast-rate linear camera, gain robustness to lighting conditions and improve over time. Moreover, we introduce an online predictor that can choose between different tinyCNN models at runtime, trading accuracy against latency, which reduces the inference energy consumption by up to 3.2×. Finally, we leverage GAP8, a parallel ultra-low-power RISC-V-based microcontroller unit (MCU), to meet the real-time inference requirements. When running the family of tinyCNNs, our GAP8-based solution outperforms any other implementation on the STM32L4 and NXP k64f (traditional single-core MCUs), reducing latency by over 13× and energy consumption by 92%.
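As a hedged sketch of the general technique of switching between models at runtime (not the paper's online predictor), the snippet below selects the lowest-energy model from a hypothetical tinyCNN family that still meets given accuracy and latency targets; all profile numbers are invented.

```python
# Hedged sketch of switching between several models at runtime to trade
# accuracy against latency/energy. The model table and the selection rule
# are illustrative assumptions, not the predictor described in the paper.

from dataclasses import dataclass


@dataclass(frozen=True)
class ModelProfile:
    name: str
    accuracy: float      # expected task accuracy (0-1)
    latency_ms: float    # per-inference latency on the target MCU
    energy_mj: float     # per-inference energy

# Hypothetical profiles for a family of tinyCNN variants.
MODEL_FAMILY = [
    ModelProfile("tinycnn-small",  0.86, 2.0, 0.9),
    ModelProfile("tinycnn-medium", 0.91, 4.5, 2.1),
    ModelProfile("tinycnn-large",  0.95, 9.0, 4.8),
]


def select_model(required_accuracy, deadline_ms):
    """Pick the lowest-energy model meeting the accuracy and latency targets,
    falling back to the most accurate model if nothing qualifies."""
    candidates = [m for m in MODEL_FAMILY
                  if m.accuracy >= required_accuracy and m.latency_ms <= deadline_ms]
    if not candidates:
        return max(MODEL_FAMILY, key=lambda m: m.accuracy)
    return min(candidates, key=lambda m: m.energy_mj)


if __name__ == "__main__":
    # Loose accuracy target -> the cheapest model suffices.
    print(select_model(required_accuracy=0.85, deadline_ms=10).name)
    # Tight accuracy target -> pay more latency and energy for accuracy.
    print(select_model(required_accuracy=0.94, deadline_ms=10).name)
```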


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Priscila Cedillo ◽  
Emilio Insfran ◽  
Silvia Abrahao ◽  
Jean Vanderdonckt

2018 ◽  
Author(s):  
Paulo César F. Melo ◽  
Fábio M. Costa

Making cities smarter can help improve city services, optimize resource and infrastructure utilization, and increase quality of life. Smart Cities connect citizens in novel ways by leveraging the latest advances in information and communication technologies (ICT). The integration of rich sensing capabilities in today's mobile devices allows their users to actively participate in sensing the environment. In Mobile CrowdSensing (MCS), citizens of a Smart City collect, share, and jointly use services based on sensed data. The main challenges for smart cities regarding MCS are the heterogeneity of devices and the dynamism of the environment. To overcome these challenges, this paper presents an architecture based on models at runtime (M@rt) to support dynamic MCS queries in Smart Cities. The architecture is proposed as an extension of the InterSCity platform, leveraging its existing services and its capability to integrate city infrastructure resources.
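The sketch below illustrates, in general terms, how a crowdsensing query could be kept as a runtime model and re-evaluated as devices change. It is an assumption-laden example and does not reflect the InterSCity platform's actual API; the `Device` and `MCSQueryModel` names are hypothetical.

```python
# Illustrative sketch (not the InterSCity API): a crowdsensing query expressed
# as a runtime model and matched against heterogeneous devices, so the query
# can be re-evaluated as devices join, leave, or change capabilities.

from dataclasses import dataclass, field


@dataclass
class Device:
    device_id: str
    sensors: set
    region: str


@dataclass
class MCSQueryModel:
    """Runtime model of a mobile crowdsensing query."""
    required_sensors: set
    region: str
    matched: list = field(default_factory=list)

    def re_evaluate(self, devices):
        """Recompute the set of devices able to serve the query right now."""
        self.matched = [d for d in devices
                        if d.region == self.region
                        and self.required_sensors <= d.sensors]
        return self.matched


if __name__ == "__main__":
    devices = [
        Device("phone-1", {"gps", "noise"}, "downtown"),
        Device("phone-2", {"gps"}, "downtown"),
    ]
    query = MCSQueryModel(required_sensors={"gps", "noise"}, region="downtown")
    print([d.device_id for d in query.re_evaluate(devices)])

    devices[1].sensors.add("noise")                           # environment changes...
    print([d.device_id for d in query.re_evaluate(devices)])  # ...the query adapts
```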


2017 ◽  
Vol 63 ◽  
pp. 332-352 ◽  
Author(s):  
Germán H. Alférez ◽  
Vicente Pelechano
