Data Communication Methods for the Fault Diagnosis Instrument System

2010 ◽  
Vol 42 ◽  
pp. 386-390
Author(s):  
Deng Pan Zhang ◽  
Hong Li Zhu ◽  
Yong Gang Shi

Data communication plays a very important role in a fault diagnosis instrument system. In this paper, a flexible software-bus method is proposed for communication among instrument components. In this way, the data processing modules can be assembled dynamically and run in parallel. To carry out communication between the bus and the components, the design principles and architecture of the components are discussed, and fault diagnosis instrument cases can then be constructed by configuring the instrument components with communication addresses. Based on the software bus, processed fault diagnosis data can propagate asynchronously along the configured data flow paths, and users can plug any component into, or unplug it from, the instrument platform. Application cases show that the proposed approach accelerates data exchange among the component-based modules and improves the efficiency of the working processes.
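The software-bus scheme described above — components attached to the bus at communication addresses, data propagating asynchronously, and components pluggable or unpluggable at run time — could be sketched roughly as a minimal message bus. The class and method names here are illustrative assumptions, not the authors' implementation:

```python
import queue
import threading

class SoftwareBus:
    """Minimal software bus: components register under an address and
    receive data asynchronously through per-component queues."""

    def __init__(self):
        self._components = {}          # address -> inbox queue
        self._lock = threading.Lock()

    def plug(self, address):
        """Attach a component to the bus; returns its inbox queue."""
        with self._lock:
            inbox = queue.Queue()
            self._components[address] = inbox
            return inbox

    def unplug(self, address):
        """Detach a component without disturbing the others."""
        with self._lock:
            self._components.pop(address, None)

    def send(self, address, data):
        """Route processed data to the component at the given address."""
        with self._lock:
            inbox = self._components.get(address)
        if inbox is None:
            return False
        inbox.put(data)
        return True

# Configure a diagnosis component by address and pass data to it.
bus = SoftwareBus()
fft_inbox = bus.plug("fft-module")
bus.send("fft-module", {"signal": [0.1, 0.5, 0.2]})
print(fft_inbox.get())   # {'signal': [0.1, 0.5, 0.2]}
```

Because each component only sees its own queue, unplugging one module cannot block the data flow of the others, which matches the plug-and-unplug behavior the abstract describes.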

2014 ◽  
Vol 926-930 ◽  
pp. 2155-2159
Author(s):  
Chao Zhang ◽  
Xiang Jun Pan

Remote monitoring and fault diagnosis of wind turbines are key to improving unit efficiency, reducing maintenance costs, and ensuring safe and stable operation of wind farms. The wide variety of wind farm monitoring systems, non-uniform communication standards, and heterogeneous communication platforms make information exchange difficult. In this paper, following the IEC 61400-25 standard, we propose using XML technology to encapsulate data and thus avoid the formation of "information islands" in wind farms. By establishing a network communication system with VC++ Socket technology, we solve the data exchange problem of the wind turbine remote monitoring and fault diagnosis system through the implementation of data sending and receiving. Finally, we display the parsed XML data as waveform curves, which makes it convenient for staff to observe and analyze the data.
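The encapsulate-in-XML-and-send-over-a-socket flow described above can be sketched in a few lines. The element names below are illustrative only, not the actual IEC 61400-25 information model, and a local socket pair stands in for the farm-to-monitoring-center TCP link:

```python
import socket
import xml.etree.ElementTree as ET

def encapsulate(turbine_id, rotor_speed, power):
    """Wrap turbine measurements in XML (element and attribute names
    are illustrative, not taken from the IEC 61400-25 model)."""
    root = ET.Element("TurbineData", id=turbine_id)
    ET.SubElement(root, "RotorSpeed", unit="rpm").text = str(rotor_speed)
    ET.SubElement(root, "ActivePower", unit="kW").text = str(power)
    return ET.tostring(root, encoding="utf-8")

# A socketpair stands in for the wind-farm-to-center network link.
sender, receiver = socket.socketpair()
sender.sendall(encapsulate("WT-07", 14.2, 1850.0))
sender.close()

payload = b""
while chunk := receiver.recv(4096):   # read until the sender closes
    payload += chunk

doc = ET.fromstring(payload)          # parse on the receiving side
print(doc.find("ActivePower").text)   # 1850.0
```

On the receiving side, the parsed values could then feed the waveform-curve display the paper mentions.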


2011 ◽  
Vol 141 ◽  
pp. 244-250
Author(s):  
Jian Wan ◽  
Tai Yong Wang ◽  
Jing Chuan Dong ◽  
Pan Zhang ◽  
Yan Hao

To ensure that sampled-signal integrity, accuracy, and real-time performance can keep pace with the development of rotating machine fault diagnosis technology, a handheld rotating machine fault diagnosis instrument with a master-slave architecture was developed, based on the S3C2410 ARM and TMS320VC5509A DSP chips. It provides an effective method for field monitoring and diagnosis of large rotating machines. The overall design concept and the structure of the hardware and software are systematically introduced. The paper focuses on the master-slave hardware architecture, the communication methods between the master and slave processors, and the design of the signal pretreatment module. In practical use, the practicability, reliability, and stability of the instrument were confirmed.


Author(s):  
И.В. Бычков ◽  
Г.М. Ружников ◽  
В.В. Парамонов ◽  
А.С. Шумилов ◽  
Р.К. Фёдоров

An infrastructural approach to spatial data processing for territorial development management is considered, based on the service-oriented paradigm, OGC standards, web technologies, WPS services, and a geoportal. The development of territories is a multi-dimensional, multi-aspect process characterized by large volumes of financial, natural-resource, social, ecological, and economic data. These data are highly localized and uncoordinated, which limits their integrated analysis and use. One method of processing large volumes of data is an information-analytical environment. The architecture and implementation of an information-analytical environment for territorial development in the form of a Geoportal are presented. The Geoportal provides its users with software tools for spatial and thematic data exchange, as well as OGC-based distributed services for data processing. Implementing data processing and storage as services located on distributed servers simplifies their updating and maintenance; in addition, it makes publication and processing more open and controllable. The Geoportal consists of the following modules: the content management system Calipso (user interface, user management, data visualization), the RDBMS PostgreSQL with a spatial data processing extension, services for entering and editing relational data, a subsystem for launching and executing WPS services, and spatial data processing services deployed in a local cloud environment. The article argues for the infrastructural approach when creating an information-analytical environment for territory management, which is characterized by large volumes of spatial and thematic data that must be processed. The data are stored in various formats, and the environment applies the service-oriented paradigm, OGC standards, web technologies, the Geoportal, and distributed WPS services. The developed software system was tested on a number of tasks that arise during territorial development.
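The distributed WPS services mentioned above are invoked through the standard OGC WPS request interface; a minimal sketch of building a WPS 1.0.0 Execute request in KVP (GET) encoding follows. The endpoint URL and process name are hypothetical, not taken from the Geoportal described in the paper:

```python
from urllib.parse import urlencode

def wps_execute_url(endpoint, process_id, inputs):
    """Build an OGC WPS 1.0.0 Execute request in KVP (GET) encoding.
    DataInputs entries are semicolon-separated key=value pairs."""
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "DataInputs": data_inputs,
    }
    return endpoint + "?" + urlencode(params)

url = wps_execute_url(
    "https://geoportal.example/wps",   # hypothetical endpoint
    "buffer",                          # hypothetical spatial process
    {"layer": "land_parcels", "distance": "100"},
)
print(url)
```

Because the request is an ordinary HTTP GET, each processing service can live on its own server, which is what makes the distributed updating and maintenance described above straightforward.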


2016 ◽  
Vol 49 (1) ◽  
pp. 302-310 ◽  
Author(s):  
Michael Kachala ◽  
John Westbrook ◽  
Dmitri Svergun

Recent advances in small-angle scattering (SAS) experimental facilities and data analysis methods have prompted a dramatic increase in the number of users and of projects conducted, causing an upsurge in the number of objects studied, experimental data available and structural models generated. To organize the data and models and make them accessible to the community, the Task Forces on SAS and hybrid methods for the International Union of Crystallography and the Worldwide Protein Data Bank envisage developing a federated approach to SAS data and model archiving. Within the framework of this approach, the existing databases may exchange information and provide independent but synchronized entries to users. At present, ways of exchanging information between the various SAS databases are not established, leading to possible duplication and incompatibility of entries, and limiting the opportunities for data-driven research for SAS users. In this work, a solution is developed to resolve these issues and provide a universal exchange format for the community, based on the use of the widely adopted crystallographic information framework (CIF). The previous version of the sasCIF format, implemented as an extension of the core CIF dictionary, has been available since 2000 to facilitate SAS data exchange between laboratories. The sasCIF format has now been extended to describe comprehensively the necessary experimental information, results and models, including relevant metadata for SAS data analysis and for deposition into a database. Processing tools for these files (sasCIFtools) have been developed, and these are available both as standalone open-source programs and integrated into the SAS Biological Data Bank, allowing the export and import of data entries as sasCIF files. Software modules to save the relevant information directly from beamline data-processing pipelines in sasCIF format are also developed. 
This update of sasCIF and the relevant tools is an important step in standardizing the way SAS data are presented and exchanged, making the results easily accessible to users and further promoting the application of SAS in the structural biology community.
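A sasCIF file follows the general CIF conventions: a `data_` block containing `_tag value` pairs (plus loops for tabular data). A toy reader for the simple pair form is sketched below; the tag names in the sample are illustrative, not items from the actual sasCIF dictionary:

```python
def parse_cif_block(text):
    """Very small CIF reader: handles one data block of simple
    `_tag value` pairs (no loop_ constructs)."""
    entry = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("_"):
            tag, _, value = line.partition(" ")
            entry[tag] = value.strip().strip("'\"")
    return entry

# Illustrative fragment; tag names are NOT from the sasCIF dictionary.
sample = """\
data_example_sas_entry
_sample.name            'Lysozyme'
_experiment.wavelength  0.124
_result.rg              1.52
"""

entry = parse_cif_block(sample)
print(entry["_sample.name"])   # Lysozyme
```

Real sasCIF processing should of course use the sasCIFtools mentioned above rather than an ad hoc parser; the sketch only shows why the CIF pair syntax makes database import/export straightforward.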


Author(s):  
Aryo Pinandito

Information systems are among the most important business supports in organizations. Web-based applications are an appropriate solution to the dynamically changing environment among different units in an organization. Model-View-Controller (MVC) is a well-known design pattern in web application development because it separates an application into several parts, making it easy to reuse and maintain. However, this design pattern requires improvements when the information system handles business process choreography and integration between applications. Therefore, modifying the interaction of objects and classes in a design pattern becomes a challenging problem. In this paper, an application framework based on the Model-CollectionService-Controller-Presenter (MCCP) design pattern, a modification of MVC, is proposed. The proposed framework allows multiple different applications to run and provides inter-application data exchange mechanisms to improve data communication between applications. Several performance comparisons with another popular web application framework are also presented.
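The layer split implied by the MCCP name can be sketched as a class skeleton. The division of responsibilities below — a CollectionService sitting between the models and the controller and exposing data for inter-application exchange — is our reading of the pattern, not the paper's exact interfaces:

```python
class Model:
    """Plain data object (one record)."""
    def __init__(self, name, price):
        self.name, self.price = name, price

class CollectionService:
    """Data-access and exchange layer: the part MCCP adds over MVC.
    Exposes records in a serialized form other applications can consume."""
    def __init__(self):
        self._store = []
    def add(self, model):
        self._store.append(model)
    def all(self):
        return list(self._store)
    def export_records(self):
        return [{"name": m.name, "price": m.price} for m in self._store]

class Presenter:
    """Formats models for the view."""
    def render(self, models):
        return [f"{m.name}: {m.price:.2f}" for m in models]

class Controller:
    """Coordinates requests between the service and the presenter."""
    def __init__(self, service, presenter):
        self.service, self.presenter = service, presenter
    def create(self, name, price):
        self.service.add(Model(name, price))
    def index(self):
        return self.presenter.render(self.service.all())

svc = CollectionService()
app = Controller(svc, Presenter())
app.create("widget", 9.5)
print(app.index())   # ['widget: 9.50']
```

Keeping serialization in the CollectionService rather than the Controller is what would let two applications exchange records without sharing controller or view code.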


Author(s):  
Dan Pescaru ◽  
Daniel-Ioan Curiac

This chapter presents the main challenges in developing complex systems built around the core concept of Video-Based Wireless Sensor Networks. It summarizes some innovative solutions proposed in the scientific literature in this field. Besides discussing various issues related to such systems, the authors focus on two crucial aspects: video data processing and data exchange. Special attention is paid to localization algorithms for randomly deployed nodes that have no dedicated localization hardware installed. Solutions for data exchange are presented by highlighting data compression and communication efficiency in terms of energy saving. Finally, some open research topics related to Video-Based Wireless Sensor Networks are identified and explained.
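One classic range-free approach to the localization problem mentioned above is centroid localization: a randomly deployed node with no positioning hardware estimates its position as the centroid of the anchor nodes whose beacons it can hear. This is shown as a generic illustration of the problem class, not as the chapter's specific algorithm:

```python
def centroid_localization(anchors_in_range):
    """Range-free centroid localization: estimate a node's position as
    the centroid of the known positions of anchors within radio range."""
    if not anchors_in_range:
        raise ValueError("no anchor beacons received")
    n = len(anchors_in_range)
    x = sum(a[0] for a in anchors_in_range) / n
    y = sum(a[1] for a in anchors_in_range) / n
    return (x, y)

# A randomly deployed camera node hears three anchors at known positions.
estimate = centroid_localization([(0.0, 0.0), (10.0, 0.0), (5.0, 9.0)])
print(estimate)   # (5.0, 3.0)
```

The appeal for video sensor networks is that it needs only beacon reception, no ranging hardware, at the cost of accuracy that degrades with sparse anchor coverage.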


Author(s):  
Mais Haj Qasem ◽  
Alaa Abu-Srhan ◽  
Hutaf Natoureah ◽  
Esra Alzaghoul

Fog computing is a new network architecture and computing paradigm that uses user devices or near-user devices (the network edge) to carry out some processing tasks. It thus extends cloud computing with more of the flexibility found in ubiquitous networks. A smart city based on fog computing with a flexible hierarchy is proposed in this paper. The aim of the proposed design is to overcome the limitations of previous approaches, which depend on various network architectures such as cloud computing, autonomic network architecture, and ubiquitous network architecture. The proposed approach reduces the latency of data processing and transmission, enabling real-time applications; distributes processing tasks over edge devices to reduce the cost of data processing; and allows collaborative data exchange among the applications of the smart city. The design is made up of five major layers, which can be increased or merged according to the amount of data processing and transmission in each application: the connection layer, real-time processing layer, neighborhood linking layer, main-processing layer, and data server layer. A case study of a novel smart public car parking, traveling and direction advisor was implemented using iFogSim, and the results showed that the design significantly reduces the delay of real-time applications and reduces cost and network usage compared with the cloud computing paradigm. Moreover, although the proposed approach increases the scalability and reliability of users' access, it sacrifices little in time, cost, or network usage compared with a fixed fog computing design.
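The layered dispatch idea — keep latency-critical work near the edge, push heavy batch work up the hierarchy — can be sketched as a toy placement function. The thresholds and layer choices below are illustrative assumptions, not rules from the paper:

```python
def assign_layer(task_size_mb, needs_realtime):
    """Toy dispatcher over a fog layer hierarchy: real-time jobs stay near
    the edge; heavier batch jobs climb toward the main-processing layer.
    Thresholds are illustrative, not taken from the paper."""
    if needs_realtime and task_size_mb <= 1:
        return "real-time processing layer"
    if task_size_mb <= 50:
        return "neighborhood linking layer"
    return "main-processing layer"

# A small parking-sensor update vs. a bulk traffic-history analysis.
print(assign_layer(0.2, True))    # real-time processing layer
print(assign_layer(500, False))   # main-processing layer
```

In a flexible hierarchy of the kind proposed, such a placement rule is what would change when layers are merged or added per application.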


2019 ◽  
Vol 12 (12) ◽  
pp. 3254-3264 ◽  
Author(s):  
José Ángel Pecina Sánchez ◽  
Daniel U. Campos‐Delgado ◽  
Diego R. Espinoza‐Trejo ◽  
Andres A. Valdez‐Fernández ◽  
Cristian H. De Angelo

2018 ◽  
Vol 1 (102) ◽  
pp. 305
Author(s):  
Rosario Serra Cristóbal

Summary: 1. Jihadist terrorism as a cross-border phenomenon. 2. The benefit of data exchange on border crossings in the Schengen area. 3. New guidelines on data processing and the safeguarding of national security. 4. The register of passengers (the Passenger Name Record or PNR). 5. When the data cross the external borders: the exchange of data with third countries. 5.1. The failed PNR Agreement with Canada and the EU Court of Justice's standards regarding the transfer of passengers' data. 5.2. The exchange of data with the United States: the EU-US Umbrella Agreement and the Privacy Shield. 6. The use of profiles and blacklists of alleged terrorists in border crossing. 7. Conclusions
Abstract: Coordinated border management in the EU and the effective functioning of data processing systems related to the movement of persons may serve as an early warning mechanism against the risk of terrorist attacks. They can strengthen the collective capacity of States to detect, prevent, and combat terrorism by facilitating the timely exchange of information, thereby enabling crucial decisions to be adopted responsibly. This paper analyzes the specific border data management tools that can be useful in the fight against terrorism. The first step in intelligence lies in obtaining information, which is then analyzed and processed to turn it into useful knowledge. As we will have occasion to verify, numerous border databases were created to control the entry of immigrants at European borders, but the information offered by these systems can also serve to fight the challenge that threatens us: jihadist terrorism. Nevertheless, we emphasize that terrorism and immigration are different phenomena. The new wave of jihadist attacks did coincide with the largest migratory crisis Europe has had to face, driven by humanitarian crises and later by the war in Syria and other conflicts. But they represent different realities. Jihadist terrorism and immigration have little or nothing in common, however much some have sought to link the two in order to justify certain anti-immigration policies as necessary for coping with jihadist terrorism, on the simple argument that holding back immigration prevents the entry of potential terrorists into Europe. This paper warns of the risk that the fight against terrorism will be used to reinforce controls on persons at the borders when the true objective of these measures is to curb migratory flows. At the same time, it underlines the need for clear guidelines and practices to be followed in such controls, and for States to fully observe their obligations under international law, as recalled by the European Court of Human Rights and the Court of Justice of the European Union. In fact, in many cases these courts have highlighted the undoubted relevance of principles such as statutory reservation, necessity, and proportionality as the legal basis for measures involving personal data processing.


2006 ◽  
Author(s):  
Nicholas A. Walton ◽  
Mark Cropper ◽  
Gerard Gilmore ◽  
Floor van Leeuwen ◽  
Mike Irwin ◽  
...  
