Static Worst-Case Execution Time Optimization using DPSO for ASIP Architecture

2018 ◽  
Vol 14 (25) ◽  
pp. 1-11
Author(s):  
Mood Venkanna ◽  
Rameshwar Rao

Introduction: Application-specific instructions can significantly improve the energy consumption, performance, and code size of configurable processors. These instructions are designed by converting patterns of application-specific operations into effective complex instructions. This research was presented at the ICITKM Conference, University of Delhi, India, in 2017.

Methods: Static analysis was a prominent research method in the late 1980s, whereas end-to-end measurement is the standard approach in industrial settings. Static analysis tools operate at a high level to determine the program structure, working either on source code or on a disassembled executable binary. They can also work at a low level when real hardware timing information for the executable task is available.

Results: We experimented with, tested, and evaluated an H.264 encoder application that uses nine custom instructions (CIs), covering most of its computation-intensive kernels. Multimedia applications in the field of computer vision are frequently subject to hard real-time constraints. The H.264 encoder has complicated control flow with many decisions and nested loops. The evaluated parameters were different numbers of partitions (300 slices each on a Xilinx Virtex-7), reconfiguration bandwidths, and ratios of CPU frequency to fabric frequency, f_CPU/f_fabric. f_fabric was kept constant at 100 MHz, and we selected several values of f_CPU that resemble realistic configurations. Note that while we expect the WCET in seconds (WCET_cycles / f_CPU) to be lower (better) at higher f_CPU, the WCET in cycles increases (at constant f_fabric) because hardware CIs perform less computation on the reconfigurable fabric within one CPU cycle.

Conclusions: The method is comparable to the less precise tree-hybridization and path-based methods and to the more precise global IPET method. Optimization of the WCET is carried out with the Discrete Particle Swarm Optimization (DPSO) algorithm. For several real-world applications on embedded processors, the proposed technique produces improved instruction sets in comparison to the native instruction sets.

Originality: WCET estimation has to consider the flow-analysis, low-level-analysis, and calculation phases of the program. The flow-analysis (high-level) phase extracts the program's dynamic behavior: which functions are called, the number of loop iterations, dependencies among if-statements, and so on. This is necessary because the analysis is otherwise unaware of which execution path corresponds to the longest execution time.

Limitations: Which path is executed within a kernel iteration depends on the nature of the macroblock (MB), either I-MB or P-MB, as determined by the motion-estimation kernel; the I-MB and P-MB paths contain separate CIs, which makes the worst-case path unstable: adding more partitions to the current worst-case path can cause the other path to become the worst case. The pipeline stalls for the reconfiguration delay and continues once the reconfiguration process finishes when entering the kernel.
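As an illustration of the optimization step, the following is a minimal Python sketch of a discrete (binary) PSO that selects a subset of candidate CIs to minimize a WCET model under a fabric-area budget. The cost model and all numbers are invented for illustration; the paper's actual cost function, H.264 kernels, and DPSO parameters are not reproduced here.

```python
import random
from math import exp

# Toy model: each candidate custom instruction (CI) saves some CPU cycles on the
# worst-case path but occupies fabric slices; the fabric holds a limited number
# of slices. All values below are illustrative assumptions, not from the paper.
CYCLE_SAVINGS = [1200, 900, 2500, 400, 1700, 800, 2100, 600, 1400]  # per CI
SLICE_COST    = [300,  250, 600,  100, 450,  200, 550,  150, 350]
FABRIC_SLICES = 1800
BASE_WCET     = 20000  # worst-case cycles with no CIs implemented

def wcet(selection):
    """WCET (cycles) for a 0/1 CI selection; infeasible selections are penalized."""
    slices = sum(c for c, s in zip(SLICE_COST, selection) if s)
    if slices > FABRIC_SLICES:
        return BASE_WCET + (slices - FABRIC_SLICES)  # penalty keeps search feasible
    return BASE_WCET - sum(g for g, s in zip(CYCLE_SAVINGS, selection) if s)

def dpso(n=9, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Binary PSO: velocities pass through a sigmoid to give per-bit flip
    probabilities, following the classic discrete PSO formulation."""
    sig = lambda v: 1.0 / (1.0 + exp(-v))
    pos = [[random.randint(0, 1) for _ in range(n)] for _ in range(particles)]
    vel = [[0.0] * n for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=wcet)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(n):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if random.random() < sig(vel[i][d]) else 0
            if wcet(pos[i]) < wcet(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=wcet)[:]
    return gbest, wcet(gbest)

selection, cycles = dpso()
print("selected CIs:", selection, "estimated WCET:", cycles, "cycles")
```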

2003 ◽  
Vol 10 (19) ◽  
Author(s):  
Christian Kirkegaard ◽  
Anders Møller ◽  
Michael I. Schwartzbach

XML documents generated dynamically by programs are typically represented as text strings or DOM trees. This is a low-level approach for several reasons: 1) traversing and modifying such structures can be tedious and error-prone; 2) although schema languages, e.g. DTD, allow classes of XML documents to be defined, there are generally no automatic mechanisms for statically checking that a program transforms documents from one class to another as intended. We introduce XACT, a high-level approach for Java using XML templates as a first-class data type with operations for manipulating XML values based on XPath. In addition to an efficient runtime representation, the data type permits static type checking using DTD schemas as types. By specifying schemas for the input and output of a program, our algorithm will statically verify that valid input data is always transformed into valid output data and that no errors occur during processing.
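For readers unfamiliar with the problem, the Python sketch below contrasts the manipulate-and-validate idea with raw string concatenation. It uses lxml and a made-up two-element DTD, and it only approximates at runtime what XACT guarantees statically at compile time in Java; it is not XACT's API.

```python
# A minimal sketch of the idea behind XACT: treat XML as a typed value,
# manipulate it via XPath, and check the result against a DTD instead of
# splicing strings. The document and DTD here are invented for illustration.
from io import StringIO
from lxml import etree

dtd = etree.DTD(StringIO("""\
<!ELEMENT list (item*)>
<!ELEMENT item (#PCDATA)>
"""))

doc = etree.fromstring("<list><item>alpha</item><item>beta</item></list>")

# XPath-based manipulation instead of error-prone string concatenation.
for item in doc.xpath("//item"):
    item.text = item.text.upper()

# XACT would verify validity statically; here we can only approximate that
# guarantee with a runtime check against the schema.
assert dtd.validate(doc), dtd.error_log
print(etree.tostring(doc).decode())
```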


Author(s):  
Александр Викторович Кныш ◽  
Дмитрий Александрович Кобзев ◽  
Оксана Николаевна Давиденко ◽  
Сергей Анатольевич Детистов ◽  
Иван Александрович Шечев ◽  
...  

With the existing variety of automated process control systems (APCS) and the increasing risks of computer incidents caused by the development of information technology, improving the quality of APCS software remains a topical issue. Using the APCS of Transneft system entities as an example, this article presents the possibility of using static analysis of software source code to ensure the information security of the APCS. The reasons for low software quality and approaches to improving it are considered. Methods of source code analysis (static, dynamic, interactive) are compared, and it is concluded that the most promising is a combination of three kinds of static analysis: signature analysis, control-flow analysis, and data-flow analysis. This combination forms the basis of a methodology for detecting errors, potentially dangerous constructs, logic bombs, and unused variables in APCS software, developed as part of the research project "Creation of a System for Analyzing the Source Code of Automated Process Control System Software". The main advantage of the methodology is its invariance with respect to programming languages and defect types: the general defect-search algorithm remains unchanged, and only the signatures and detection rules change.
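The toy Python sketch below illustrates just one of the defect classes mentioned above (unused variables) with a small data-flow-style pass over an abstract syntax tree. The article's actual methodology is language-invariant and signature-driven; this fragment does not attempt to reproduce it, only to show the principle on Python source.

```python
# Flag variables that are assigned but never read: a simple data-flow-style
# check of the kind a source-code analyzer for control-system software might run.
import ast

SOURCE = """
def control_loop(setpoint, reading):
    error = setpoint - reading
    unused_gain = 0.8        # defect: assigned but never read
    return error * 0.5
"""

tree = ast.parse(SOURCE)
assigned, read = {}, set()
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        if isinstance(node.ctx, ast.Store):
            assigned[node.id] = node.lineno   # record where the name is written
        else:
            read.add(node.id)                 # record every name that is read

for name, line in sorted(assigned.items()):
    if name not in read:
        print(f"line {line}: variable '{name}' is assigned but never used")
```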


Author(s):  
Marco Autili ◽  
Ivano Malavolta ◽  
Alexander Perucci ◽  
Gian Luca Scoccia ◽  
Roberto Verdecchia

Mobile platforms are rapidly and continuously changing, with support for new sensors, APIs, and programming abstractions. Static analysis is gaining growing interest, allowing developers to predict properties of the run-time behavior of mobile apps without executing them. Over the years, literally hundreds of static analysis techniques have been proposed, ranging from structural and control-flow analysis to state-based analysis.

In this paper, we present a systematic mapping study aimed at identifying, evaluating, and classifying the characteristics, trends, and potential for industrial adoption of existing research in static analysis of mobile apps. Starting from over 12,000 potentially relevant studies, we applied a rigorous selection procedure resulting in 261 primary studies over a time span of 9 years. We analyzed each primary study according to a rigorously defined classification framework. The results of this study give a solid foundation for assessing existing and future approaches to static analysis of mobile apps, especially in terms of their industrial adoptability. Researchers and practitioners can use the results of this study to (i) identify existing research/technical gaps to target, (ii) understand how approaches developed in academia can be successfully transferred to industry, and (iii) better position their (past and future) approaches for static analysis of mobile apps.


2019 ◽  
Vol 1 (1) ◽  
pp. 31-39
Author(s):  
Ilham Safitra Damanik ◽  
Sundari Retno Andani ◽  
Dedi Sehendro

Milk is an important part of nutritional intake, consumed by both children and adults. Indonesia has many producers of fresh milk, but domestic production is not sufficient for national demand. Data mining is a branch of computer science widely used in research; one of its techniques is clustering, a method for grouping data that works better the more data is available. The data used are provincial data for Indonesia from 2000 to 2017, obtained from the Central Statistics Agency. The result of this study is a clustering into two milk-producing groups: high-producing and low-producing regions. Of the 27 fresh-milk production records for Indonesia, two provinces emerge at the high level, namely West Java and East Java; the remaining 25, along with 7 provinces that were not part of the K-Means clustering calculation, are included in the low-level cluster.
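For concreteness, here is a minimal sketch of the clustering step using scikit-learn's KMeans with k = 2 (high vs. low producers). The province list and production figures are placeholders, not the Central Statistics Agency data used in the study.

```python
# Cluster provinces into two groups by fresh-milk production using K-Means.
import numpy as np
from sklearn.cluster import KMeans

provinces = ["West Java", "East Java", "Central Java", "North Sumatra", "Bali"]
production = np.array([[255.0], [480.0], [95.0], [10.0], [32.0]])  # illustrative values

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(production)
high = labels[np.argmax(production)]  # the cluster containing the largest producer
for name, lab in zip(provinces, labels):
    print(f"{name}: {'high' if lab == high else 'low'} production cluster")
```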


Author(s):  
Margarita Khomyakova

The author analyzes definitions of the concept of determinants of crime given by various scholars and offers her own definition. In this study, determinants of crime are understood as a set of its causes, the circumstances that contribute to its commission, and the dynamics of crime. It is noted that the Russian legislator, in Article 244 of the Criminal Code, defines the object of this criminal assault as public morality. Despite the use of evaluative concepts both in the disposition of this norm and in determining the specific object of the given crime, the position of criminologists is unequivocal: crimes of this kind are immoral and stand in irreconcilable conflict with generally accepted moral and legal norms. The paper considers some views on making value judgments that could hardly apply to legal norms. According to the author, the reasons for abuse of the bodies of the dead include the economic problems of the perpetrator and a low level of culture and legal awareness; this list is not exhaustive. The main circumstances that contribute to the abuse of the bodies of the dead and their burial places are low income and unemployment, a low level of criminological prevention, and poor maintenance and protection of medical institutions and cemeteries due to the underperformance of state and municipal bodies. This list of circumstances is also open-ended. Owing to several factors, including a high level of latency, it is not possible to objectively reflect the dynamics of such crimes. At the same time, identifying the determinants of abuse of the bodies of the dead will help reduce the number of such crimes.


2021 ◽  
pp. 002224372199837
Author(s):  
Walter Herzog ◽  
Johannes D. Hattula ◽  
Darren W. Dahl

This research explores how marketing managers can avoid the so-called false consensus effect—the egocentric tendency to project personal preferences onto consumers. Two pilot studies were conducted to provide evidence for the managerial importance of this research question and to explore how marketing managers attempt to avoid false consensus effects in practice. The results suggest that the debiasing tactic most frequently used by marketers is to suppress their personal preferences when predicting consumer preferences. Four subsequent studies show that, ironically, this debiasing tactic can backfire and increase managers’ susceptibility to the false consensus effect. Specifically, the results suggest that these backfire effects are most likely to occur for managers with a low level of preference certainty. In contrast, the results imply that preference suppression does not backfire but instead decreases false consensus effects for managers with a high level of preference certainty. Finally, the studies explore the mechanism behind these results and show how managers can ultimately avoid false consensus effects—regardless of their level of preference certainty and without risking backfire effects.


Author(s):  
Richard Stone ◽  
Minglu Wang ◽  
Thomas Schnieders ◽  
Esraa Abdelall

Human-robot interaction systems are increasingly being integrated into industrial, commercial, and emergency service agencies. It is critical that human operators understand and trust automation when these systems support, and even make, important decisions. The following study focused on a human-in-the-loop telerobotic system performing a reconnaissance operation. Twenty-four subjects were divided into groups based on the level of automation (Low-Level Automation, LLA, and High-Level Automation, HLA). Results indicated a significant difference in hit rate between the low and high levels of automation when a permanent error occurred. In the LLA group, the type of error had a significant effect on the hit rate. In general, the high level of automation performed better than the low level, especially when it was more reliable, suggesting that subjects in the HLA group could rely on the automatic implementation to perform the task more effectively and accurately.


2020 ◽  
Vol 4 (POPL) ◽  
pp. 1-32 ◽  
Author(s):  
Michael Sammler ◽  
Deepak Garg ◽  
Derek Dreyer ◽  
Tadeusz Litak

Materials ◽  
2021 ◽  
Vol 14 (5) ◽  
pp. 1226
Author(s):  
Beatriz Fraga-De Cal ◽  
Antonio Garrido-Marijuan ◽  
Olaia Eguiarte ◽  
Beñat Arregi ◽  
Ander Romero-Amorrortu ◽  
...  

Prefabricated solutions incorporating thermal insulation are increasingly adopted as an energy conservation measure for building renovation. The InnoWEE European project developed three technologies from Construction and Demolition Waste (CDW) materials through a manufacturing process that supports the circular economy strategy of the European Union. Two of them consisted of geopolymer panels incorporated into an External Thermal Insulation Composite System (ETICS) and a ventilated façade. This study evaluates their thermal performance by means of monitoring data from three pilot case studies in Greece, Italy, and Romania, together with calibrated building simulation models that enable reliable prediction of energy savings in different climates and use scenarios. Results showed a reduction in energy demand for all demo buildings, with annual energy savings of up to 25% after installing the novel insulation solutions. However, savings are highly dependent on weather conditions, since the panels affect cooling and heating loads differently. Finally, a parametric study is performed to assess the impact of insulation thickness through energy performance prediction and a cash flow analysis.
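As a rough illustration of such a parametric assessment, the sketch below sweeps insulation thickness, estimates annual savings from a one-dimensional U-value model, and reports a simple payback period. Every figure (conductivity, costs, degree-hours, prices) is an assumption for the sake of the example, not InnoWEE project data.

```python
# Parametric sweep: insulation thickness -> annual savings -> simple payback.
LAMBDA = 0.04          # insulation conductivity, W/(m.K)   (assumed)
U_WALL = 1.8           # existing wall U-value, W/(m2.K)    (assumed)
AREA = 500.0           # renovated facade area, m2          (assumed)
DEGREE_HOURS = 70000.0 # heating degree-hours per year, climate-dependent (assumed)
ENERGY_PRICE = 0.10    # EUR per kWh                        (assumed)
PANEL_COST = 60.0      # EUR per m2, plus 300 EUR per m3 of insulation (assumed)

def annual_savings(thickness_m):
    """Energy cost saved per year when the panel lowers the wall U-value."""
    u_new = 1.0 / (1.0 / U_WALL + thickness_m / LAMBDA)   # series thermal resistance
    kwh_saved = (U_WALL - u_new) * AREA * DEGREE_HOURS / 1000.0
    return kwh_saved * ENERGY_PRICE

for t_cm in (4, 8, 12, 16):
    t = t_cm / 100.0
    invest = AREA * (PANEL_COST + 300.0 * t)
    save = annual_savings(t)
    print(f"{t_cm:2d} cm: saves {save:7.0f} EUR/yr, simple payback {invest / save:5.1f} yr")
```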

