The Implement of a Linux Database Data Processing Mechanism Based on Cache

2014 ◽  
Vol 989-994 ◽  
pp. 4447-4451
Author(s):  
Yong Jun Zhang ◽  
Chun Hui Li

Although completing data processing solely through real-time database requests offers a simple structure and easy operation, a database call takes far longer than a cache lookup. If the database is called too frequently, processing speed inevitably drops and real-time performance degrades. Based on this situation, this paper puts forward a method that combines a cache with the database to improve system performance in the main controller of an integrated security system for equipment management.
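
A minimal sketch of the cache-before-database idea described above, assuming a simple key-value lookup; the `get_record` helper, the SQLite table, and the unbounded dictionary cache are illustrative assumptions, not the authors' implementation.

```python
import sqlite3

# In-memory stand-in for the database that the controller would otherwise query every time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE equipment (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO equipment VALUES (1, 'armed'), (2, 'disarmed')")

cache = {}  # key -> status; a real system would bound and invalidate this

def get_record(equipment_id):
    """Cache-aside read: try the fast cache first, fall back to the slower database call."""
    if equipment_id in cache:                      # cache lookup (fast path)
        return cache[equipment_id]
    row = conn.execute(                            # database call (slow path)
        "SELECT status FROM equipment WHERE id = ?", (equipment_id,)
    ).fetchone()
    if row is not None:
        cache[equipment_id] = row[0]               # populate the cache for the next request
        return row[0]
    return None

print(get_record(1))  # miss: reads the database and fills the cache
print(get_record(1))  # hit: served from the cache, no database call
```

The point of the pattern is that repeated requests for the same record no longer pay the database call time, which is the bottleneck the abstract identifies.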

2015 ◽  
Vol 2 (1) ◽  
pp. 35-41
Author(s):  
Rivan Risdaryanto ◽  
Houtman P. Siregar ◽  
Dedy Loebis

Real-time systems are now used in many fields, such as telecommunications, the military, information systems, and even medicine, to obtain information quickly, on time, and accurately. Needless to say, a real-time system always considers execution time. In our application, we define a time target/deadline, so the system should execute all tasks within the predefined deadline. If the system fails to finish the tasks, this leads to fatal failure; in other words, if a task cannot be executed on time, it affects the subsequent tasks. In this paper, we propose a real-time system for sending data and evaluate its effectiveness and efficiency. The sending process is constructed in MATLAB, and it has a time target that specifies when the data must be sent.
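
The paper builds the sending process in MATLAB; the Python sketch below only illustrates the deadline idea it relies on, i.e. checking each send against its time target. The 50 ms target, the `send_data` placeholder, and the simulated latency are assumptions for illustration.

```python
import time

DEADLINE_S = 0.050  # assumed 50 ms time target per send; the paper's value is not stated

def send_data(payload):
    """Placeholder for the actual transmission step (implemented in MATLAB in the paper)."""
    time.sleep(0.01)  # simulate transmission latency

def send_with_deadline(payload):
    start = time.monotonic()
    send_data(payload)
    elapsed = time.monotonic() - start
    if elapsed > DEADLINE_S:
        # A missed deadline delays subsequent tasks, the failure mode the abstract warns about.
        print(f"deadline miss: {elapsed * 1e3:.1f} ms > {DEADLINE_S * 1e3:.0f} ms")
    else:
        print(f"sent on time in {elapsed * 1e3:.1f} ms")

for sample in range(3):
    send_with_deadline({"sample": sample})
```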


2021 ◽  
Vol 40 (3) ◽  
pp. 1-12
Author(s):  
Hao Zhang ◽  
Yuxiao Zhou ◽  
Yifei Tian ◽  
Jun-Hai Yong ◽  
Feng Xu

Reconstructing hand-object interactions is a challenging task due to strong occlusions and complex motions. This article proposes a real-time system that uses a single depth stream to simultaneously reconstruct hand poses, object shape, and rigid/non-rigid motions. To achieve this, we first train a joint learning network to segment the hand and object in a depth image and to predict the 3D keypoints of the hand. With most layers shared between the two tasks, computation cost is reduced, which preserves real-time performance. A hybrid dataset is constructed to train the network with real data (to learn real-world distributions) and synthetic data (to cover variations of objects, motions, and viewpoints). Next, the depths of the two targets and the keypoints are used in a uniform optimization to reconstruct the interacting motions. Benefiting from a novel tangential contact constraint, the system not only resolves the remaining ambiguities but also maintains real-time performance. Experiments show that our system handles different hand and object shapes, various interactive motions, and moving cameras.
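
A rough sketch of the "mostly shared layers, two task heads" idea from the abstract: one encoder feeds both a hand/object segmentation head and a hand-keypoint head. The channel counts, the 21-keypoint assumption, and the overall architecture are illustrative guesses, not the paper's network.

```python
import torch
import torch.nn as nn

class JointHandObjectNet(nn.Module):
    """Shared encoder with two heads: hand/object segmentation and 3D hand keypoints."""
    def __init__(self, num_keypoints=21):
        super().__init__()
        self.num_keypoints = num_keypoints
        # Shared backbone: most layers are reused by both tasks to save computation.
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Head 1: per-pixel logits for background / hand / object.
        self.seg_head = nn.Conv2d(128, 3, 1)
        # Head 2: 3D coordinates for each hand keypoint, regressed from pooled features.
        self.kp_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, num_keypoints * 3),
        )

    def forward(self, depth):
        feat = self.backbone(depth)
        seg_logits = self.seg_head(feat)                            # (B, 3, H/8, W/8)
        keypoints = self.kp_head(feat).view(-1, self.num_keypoints, 3)
        return seg_logits, keypoints

net = JointHandObjectNet()
seg, kps = net(torch.randn(1, 1, 224, 224))  # single-channel depth input
print(seg.shape, kps.shape)
```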


2020 ◽  
Vol 14 ◽  
pp. 174830262096239 ◽  
Author(s):  
Chuang Wang ◽  
Wenbo Du ◽  
Zhixiang Zhu ◽  
Zhifeng Yue

With the wide application of intelligent sensors and the Internet of Things (IoT) in the smart job shop, a large amount of real-time production data is collected. Accurate analysis of the collected data can help producers make effective decisions. Compared with traditional data processing methods, artificial intelligence, as the main big data analysis method, is increasingly applied in the manufacturing industry. However, different AI models differ in their ability to process real-time data from smart job shop production. Based on this, a real-time big data processing method for the job shop production process based on Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU) is proposed. This method uses historical production data extracted from the IoT job shop as the original data set and, after data preprocessing, uses the LSTM and GRU models to train on and predict the real-time data of the job shop. Through the description and implementation of the models, they are compared with KNN, DT, and traditional neural network models. The results show that in the real-time big data processing of the production process, the LSTM and GRU models outperform the traditional neural network, K-nearest neighbors (KNN), and decision tree (DT). While its performance is similar to that of LSTM, the training time of GRU is much lower than that of the LSTM model.
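
A minimal sketch of the LSTM-versus-GRU comparison the abstract reports, using synthetic sequences in place of the IoT job-shop data; the window length, feature count, layer sizes, and training budget are assumptions, not the paper's setup.

```python
import time
import torch
import torch.nn as nn

class SeqRegressor(nn.Module):
    """One recurrent layer (LSTM or GRU) followed by a linear readout of the last time step."""
    def __init__(self, cell):
        super().__init__()
        self.rnn = cell(input_size=8, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, 1)

    def forward(self, x):
        out, _ = self.rnn(x)
        return self.fc(out[:, -1, :])  # predict from the final step of each window

# Synthetic stand-in for preprocessed sensor sequences: 256 windows of 20 steps x 8 features.
x = torch.randn(256, 20, 8)
y = torch.randn(256, 1)

for name, cell in [("LSTM", nn.LSTM), ("GRU", nn.GRU)]:
    model = SeqRegressor(cell)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    start = time.perf_counter()
    for _ in range(50):                 # short training loop, enough to compare wall-clock time
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print(f"{name}: final loss {loss.item():.4f}, train time {time.perf_counter() - start:.2f}s")
```

The GRU has fewer gates (and so fewer parameters) than the LSTM at the same hidden size, which is the usual reason for the shorter training time the paper observes.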


2014 ◽  
Vol 933 ◽  
pp. 584-589
Author(s):  
Zhi Chun Zhang ◽  
Song Wei Li ◽  
Wei Ren Wang ◽  
Wei Zhang ◽  
Li Jun Qi

This paper presents a system in which cluster devices are controlled by single-chip microcomputers (SCMs), with emphasis on the cluster management techniques for SCMs. Each device in a cluster is controlled by an SCM that collects sample data sent to the cluster management computer and drives the device using driving data received from that same computer through COM ports. The cluster management system running on the cluster management computer carries out control tasks such as initial SCM identification, run-time slice management, communication resource utilization, fault tolerance, and error correction on the SCMs. Initial SCM identification is achieved through signal responses between the SCMs and the cluster management computer. By using port priority and parallelized serial communication, the system's real-time performance is maximized. Real-time performance can be adjusted and improved by increasing or decreasing the number of COM ports and the ports linked to each COM, and it can also be raised by configuring more cluster management computers. Fault-tolerant control occurs in both the initialization phase and the operational phase. In the initialization phase, the cluster management system incorporates unidentified SCMs into the system based on history information recorded on external storage media. In the operational phase, if read or write errors on an SCM reach a predetermined threshold, that SCM is regarded as seriously faulty or absent. The cluster management system maintains an accuracy-maintenance database on an external storage medium to handle the nonlinear control of specific devices and the accuracy maintenance required due to wear. The cluster management system uses an object-oriented method to design a unified driving framework so that its implementation is simplified, standardized, and easy to port. The system has been applied in a large-scale simulation system of 230 single-chip microcomputers, which shows that the system is reliable, real-time, and easy to maintain.
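
An illustrative sketch of the operational-phase fault rule described above: an SCM whose read/write errors reach a threshold is marked faulty or absent. The threshold value, the class names, and the choice to reset the count on a successful transfer are assumptions; the paper only says the threshold is predetermined.

```python
ERROR_THRESHOLD = 5  # assumed value; the paper does not state the actual threshold

class ScmChannel:
    """Book-keeping for one single-chip microcomputer as seen by the cluster manager."""
    def __init__(self, scm_id):
        self.scm_id = scm_id
        self.error_count = 0
        self.healthy = True

    def record_io(self, ok):
        """Count read/write failures; declare the SCM faulty once the threshold is reached."""
        if ok:
            self.error_count = 0          # assumption: a successful transfer clears the error run
            return
        self.error_count += 1
        if self.error_count >= ERROR_THRESHOLD:
            self.healthy = False          # treated as seriously faulty or not present

scm = ScmChannel(scm_id=17)
for result in [True, False, False, False, False, False]:
    scm.record_io(result)
print(scm.healthy)  # False: the cluster management system would now exclude this SCM
```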


2017 ◽  
Vol 8 (2) ◽  
pp. 88-105 ◽  
Author(s):  
Gunasekaran Manogaran ◽  
Daphne Lopez

Ambient intelligence is an emerging platform that brings together advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This continuously generates several exabytes of unstructured sensor data, which is therefore often called big climate data. Nowadays, researchers are trying to use big climate data to monitor and predict climate change and possible diseases. Traditional data processing techniques and tools are not capable of handling such a huge amount of climate data. Hence, there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big data based surveillance system that analyzes spatial climate big data and performs continuous monitoring of the correlation between climate change and Dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.
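
The paper implements its surveillance pipeline on Apache Hadoop MapReduce; the pure-Python sketch below only illustrates the shape of such a job, grouping climate readings and dengue case counts by region and week. The record fields, the rainfall variable, and the sample data are hypothetical.

```python
from collections import defaultdict

# Hypothetical records standing in for sensor-derived climate data joined with dengue reports.
records = [
    {"region": "R1", "week": 12, "rainfall_mm": 80.0, "dengue_cases": 14},
    {"region": "R1", "week": 12, "rainfall_mm": 95.0, "dengue_cases": 14},
    {"region": "R2", "week": 12, "rainfall_mm": 10.0, "dengue_cases": 2},
]

def map_phase(record):
    """Emit (region, week)-keyed pairs of climate reading and case count, as a mapper would."""
    yield (record["region"], record["week"]), (record["rainfall_mm"], record["dengue_cases"])

def reduce_phase(key, values):
    """Aggregate per key: mean rainfall alongside the reported weekly case count."""
    rains = [v[0] for v in values]
    cases = values[0][1]  # the weekly case count is repeated on every reading for that key
    return key, {"mean_rainfall_mm": sum(rains) / len(rains), "dengue_cases": cases}

# Shuffle/sort step, simulated here with a dictionary grouping values by key.
grouped = defaultdict(list)
for rec in records:
    for key, value in map_phase(rec):
        grouped[key].append(value)

for key, values in grouped.items():
    print(reduce_phase(key, values))
```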


Author(s):  
Junyi Hou ◽  
Lei Yu ◽  
Yifan Fang ◽  
Shumin Fei

To address the problem that filtering the mixed noise introduced by a projection system is inaccurate and has poor real-time performance, this article proposes an adaptive switching filtering method based on Bayesian-estimation switching rules. The method takes joint bilateral filtering and an improved adaptive median filter as the filtering subsystems and uses the switching rules to select the sub-filter suited to the current noise, so that the noise is removed effectively. Simulation experiments were carried out on a self-developed human-computer interactive projection image system platform. Subjective evaluation, objective evaluation, and running-time comparisons show that a better filtering effect is achieved and that a good balance is obtained between filtering precision and the real-time performance of the interactive system. Therefore, the proposed method can be widely applied to various human-computer interactive image filtering systems.
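
A rough OpenCV sketch of the switching idea: route each frame to the sub-filter better suited to its noise. The switching rule here is a crude impulse-noise heuristic standing in for the paper's Bayesian-estimation rule, and plain median blur and bilateral filtering stand in for the improved adaptive median filter and joint bilateral filter; the threshold and the random test frame are assumptions.

```python
import cv2
import numpy as np

def impulse_noise_fraction(gray):
    """Crude stand-in for the Bayesian switching rule: fraction of near-saturated pixels."""
    return float(np.mean((gray <= 5) | (gray >= 250)))

def switched_filter(gray, impulse_threshold=0.02):
    """Route the frame to the sub-filter suited to its dominant noise type."""
    if impulse_noise_fraction(gray) > impulse_threshold:
        # Salt-and-pepper-like noise: a median filter removes impulses well.
        return cv2.medianBlur(gray, 5)
    # Otherwise treat it as Gaussian-like noise: bilateral filtering smooths while keeping edges.
    return cv2.bilateralFilter(gray, 9, 75, 75)

frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)  # placeholder projection frame
denoised = switched_filter(frame)
print(denoised.shape)
```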


2016 ◽  
Vol 4 (3) ◽  
pp. 163-181
Author(s):  
Pouria Sarhadi ◽  
Reza Nad Ali Niachari ◽  
Morteza Pouyan Rad ◽  
Javad Enayati

Purpose: The purpose of this paper is to propose a software engineering procedure for the real-time software development and verification of an autonomous underwater robotic system. High-performance, robust software is one of the requirements of autonomous system design, and a simple error in the software can easily lead to a catastrophic failure of a complex system. A systematic procedure is therefore presented for this purpose. Design/methodology/approach: This paper utilizes software engineering tools and hardware-in-the-loop (HIL) simulations for the real-time system design of an autonomous underwater robot. Findings: In this paper, the architecture of the system is extracted. Then, using software engineering techniques, a suitable structure for the control software is presented. Considering the desired objectives of the robot, suitable algorithms and functions are developed. After the development stage, the real-time performance of the software is verified. Originality/value: A suitable approach for analyzing real-time performance is presented. This approach is implemented using HIL simulations. The developed structure is applicable to other autonomous systems.
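
The paper verifies real-time performance through HIL simulation; the sketch below only illustrates the kind of check involved, measuring the worst-case execution time of a control iteration against its period. The 10 ms period, the `control_step` placeholder, and the simulated workload are assumptions, not details from the paper.

```python
import time

CONTROL_PERIOD_S = 0.010  # assumed 10 ms control period; the robot's actual rate is not stated

def control_step():
    """Placeholder for one guidance/navigation/control iteration of the underwater robot."""
    time.sleep(0.002)  # simulated computation

worst_case = 0.0
for _ in range(100):
    start = time.perf_counter()
    control_step()
    worst_case = max(worst_case, time.perf_counter() - start)

# In an HIL campaign, the observed worst-case execution time must stay below the control period.
verdict = "meets" if worst_case < CONTROL_PERIOD_S else "misses"
print(f"worst-case step: {worst_case * 1e3:.2f} ms "
      f"({verdict} the {CONTROL_PERIOD_S * 1e3:.0f} ms deadline)")
```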

