An Integrated Architecture for Satellite Control Center System

2014 ◽  
Vol 951 ◽  
pp. 269-273 ◽  
Author(s):  
Hui Ping Shang ◽  
Ying Jia ◽  
Jian Xing Wang

This paper presents a layered, integrated approach to the satellite control center (SCC) system in the TT&C domain, and the SCC system is developed to encapsulate this approach. A four-layer hierarchy is proposed for the SCC system: a user agent layer, an application process layer, a lower-layer services and support layer, and a hardware layer. The SCC system includes nine functional components: monitoring and control, data display, command operation, data processing, data storage and management, orbit calculation and analysis, interface computer software, duplex management, and time service. The SCC system is implemented as five functional subsystems, including a data processing system, a database management system, an operation and control system, and an interface computer system. Details of the overall system architecture and of the integration of these components and subsystems are given in the paper.
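
The layering idea can be sketched as a simple mapping from layers to functional components. The assignment of the nine components to layers below is an illustrative assumption for the sketch, not taken from the paper:

```python
# Minimal sketch of the four-layer SCC hierarchy. The component-to-layer
# assignments are assumptions made for illustration only.
SCC_LAYERS = {
    "user_agent": ["data display", "command operation"],
    "application_process": [
        "monitoring and control",
        "data processing",
        "orbit calculation and analysis",
    ],
    "lower_layer_services": [
        "data storage and management",
        "interface computer software",
        "duplex management",
        "time service",
    ],
    "hardware": [],
}

def layer_of(component: str) -> str:
    """Return the layer hosting a given functional component."""
    for layer, components in SCC_LAYERS.items():
        if component in components:
            return layer
    raise KeyError(component)
```

A structure like this makes the layer boundaries explicit, so each subsystem only talks to components in its own or an adjacent layer.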

2018 ◽  
Vol 1 (2) ◽  
pp. 14-24
Author(s):  
Dame Christine Sagala ◽  
Ali Sadikin ◽  
Beni Irawan

A data processing system is a necessary means of turning data into useful information. Such a system integrates data storage, insertion, modification, scheduling and reporting, so that departments can exchange information and make decisions quickly. The problem faced by GKPI Pal Merah Jambi is that it currently still relies on Microsoft Office Word, and disseminates information such as worship schedules, church activities and other worship routines on paper and notice boards. Printing worship bulletins and reports requires substantial operational funds; in addition, data collection and storage still have shortcomings, including manual recording in books, difficulty in processing large amounts of data, and storage in a single, passive location. Based on these problems, the author conducted research entitled "Designing a Web-Based Church Data Processing System for the GKPI Pal Merah Church in Jambi". The purpose of this study is to design and produce a data processing system for the church; using this system can facilitate data processing in the GKPI Pal Merah Jambi Church. The study uses the waterfall development method, which provides a systematic, sequential approach through requirements analysis, design, implementation and unit testing, system testing and maintenance. The application is built for the web with a MySQL database, the PHP programming language and the Laravel framework.


Author(s):  
Abou_el_ela Abdou Hussein

Day by day, advanced web technologies have led to tremendous growth in the volume of data generated. This mountain of huge, dispersed data sets gives rise to the phenomenon called big data: collections of massive, heterogeneous, unstructured and complex data. The big data life cycle can be represented as collecting (capture), storing, distributing, manipulating, interpreting, analyzing, investigating and visualizing data. Traditional techniques such as the relational database management system (RDBMS) cannot handle big data because of their inherent limitations, so advances in computing architecture are required to meet both the data storage requisites and the heavy processing needed to analyze huge volumes and varieties of data economically. Among the many technologies for manipulating big data, one is Hadoop. Hadoop can be understood as an open-source distributed data processing framework that is one of the most prominent and well-known solutions to the problem of handling big data. Apache Hadoop was based on the Google File System and the MapReduce programming paradigm. In this paper we survey big data characteristics, starting from the first three V's, which research has extended over time to more than fifty-six V's, and compare researchers' treatments in order to reach the best representation and a precise clarification of all the V characteristics of big data. We highlight the challenges that face big data processing, how to overcome them using Hadoop, and Hadoop's use in processing big data sets as a solution to various problems in a distributed cloud-based environment. The paper focuses mainly on the different components of Hadoop, such as Hive, Pig and HBase, and also gives a full account of Hadoop's pros and cons and of improvements that address Hadoop's problems, via a proposed cost-efficient scheduler algorithm for heterogeneous Hadoop systems.
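
The MapReduce paradigm the abstract attributes to Hadoop can be illustrated with a toy, single-process word count: map emits (key, 1) pairs, a shuffle step groups pairs by key, and reduce sums each group. This is only a sketch of the programming model; real Hadoop distributes these phases across a cluster over HDFS:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    # Map: emit one (word, 1) pair per word in the input record.
    for word in record.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each key.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data needs big storage", "Hadoop handles big data"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle(pairs))  # e.g. counts["big"] == 3
```

The same three-phase structure scales because map and reduce are independent per record and per key, which is what lets Hadoop parallelize them across nodes.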


2012 ◽  
Vol 503-504 ◽  
pp. 1330-1333
Author(s):  
Yan Kun Wang ◽  
Yun Xu Shi ◽  
Hong Mei Fan

The mine safety monitoring system integrates sensor technology, electronics, power electronics, computer technology, and wireless communication and network technology into one of China's leading multi-functional computer network systems. It comprises the underground and surface environment and equipment detection network, and the surface monitoring data processing system. The environment and equipment detection network performs physical monitoring and control of the underground and surface environment; the monitoring data processing system comprehensively processes the collected data in order to configure substations and control equipment and detection sensors. By sharing detection information over a LAN, the system can form part of the enterprise information system.


Author(s):  
Ahmad Junaidi

Rapid developments in science and technology have pushed people to seek and implement new methods of surveillance and control so that data processing systems run smoothly. The ability to process very large amounts of data quickly and repeatedly is indispensable for generating the reports required in strategic decision making, so many companies and government agencies now want to apply computer technology to help solve their data processing problems. In the Directorate of Prisoners and Evidence (DITTAHTI) of the West Sumatra Regional Police, prisoner and evidence data are processed frequently, but the results have not been optimal. This is because information technology is still very little used, and processing is still carried out offline and manually by city and district police. Optimizing data processing is necessary so that integrity, access rights and data availability can be properly maintained. The proposed application system is built with PHP and MySQL. All data entry is processed in a database; diverse data are processed more easily and quickly in a well-structured system. Keywords: Information System, Directorate of Prisoners and Evidence, PHP, MySQL


2020 ◽  
Vol 09 (01) ◽  
pp. 2050003
Author(s):  
D. Cutajar ◽  
A. Magro ◽  
J. Borg ◽  
K. Z. Adami ◽  
G. Bianchi ◽  
...  

The growing population of artificial satellites in near-Earth orbit has made the monitoring of orbital debris objects ever more important. Orbital debris objects pose a threat to these satellites, as a debris object's orbit cannot be changed in order to avoid a collision. In recent years, the European Space Agency (ESA)'s Space Surveillance and Tracking (SST) programme has been assisting national institutions in upgrading their space debris detection and monitoring capabilities. One of the latest such systems within this programme is the BIRALES space surveillance system based in Italy. The receiving antenna is a radio telescope made up of 32 receivers placed on eight parabolic cylindrical reflectors of the North–South arm of the Istituto Nazionale di Astrofisica (INAF)'s Northern Cross. This work introduces a new software backend which was developed for this novel space debris sensor. The system was designed to be a fast, highly configurable software backend for the radio telescope's acquisition and processing system, whose monitoring and control can be realized through a simple web-based front-end application. The real-time detection of Resident Space Objects (RSOs) is an important prerequisite for such a system, as it gives the operator an immediate feedback loop on any detections whilst keeping storage requirements to a minimum, since there is no need to save the raw data. The detection of high-velocity objects is achieved by means of a specially developed data processing pipeline that uses the received raw antenna voltages to generate a number of beams, collectively known as a multipixel, that cover the Field of View (FoV) of the instrument. The trajectory of a detected object is determined by considering the illumination sequence within this multipixel. The initial results on known objects represent the first steps in extending the growing network of European SST systems.
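
The multipixel idea can be sketched in miniature: each "pixel" is a beam formed by phase-weighting the raw voltages of an array of receivers and summing, with one beam per steering direction. The geometry below (a uniform linear array with half-wavelength element spacing) and all numbers are illustrative assumptions; the actual BIRALES receiver layout and beamforming pipeline differ:

```python
import cmath
import math

N_RECEIVERS = 32
SPACING = 0.5  # assumed element spacing, in wavelengths

def beam_power(voltages, steer_angle_rad):
    """Power of one beam steered towards steer_angle_rad (delay-and-sum)."""
    acc = 0j
    for n, v in enumerate(voltages):
        # Phase weight compensating the geometric delay at element n.
        phase = -2 * math.pi * SPACING * n * math.sin(steer_angle_rad)
        acc += v * cmath.exp(1j * phase)
    return abs(acc) ** 2

def multipixel(voltages, angles):
    """One beam per pixel: the set of beams tiles the field of view."""
    return [beam_power(voltages, a) for a in angles]

# A plane wave arriving from broadside (0 rad) illuminates the
# broadside pixel most strongly.
voltages = [1 + 0j] * N_RECEIVERS
angles = [-0.02, 0.0, 0.02]
powers = multipixel(voltages, angles)
```

As an object crosses the field of view, the pixel with maximum power changes over time, and that illumination sequence is what constrains the object's trajectory.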


2014 ◽  
Vol 556-562 ◽  
pp. 6302-6306 ◽  
Author(s):  
Chun Mei Duan

To address the limitations of traditional data processing technology for big data, a big data processing system architecture based on Hadoop is designed, drawing on the massive-scale, unstructured and dynamic characteristics of cloud computing. HDFS is responsible for big data storage, MapReduce for big data computation, and HBase serves as the database for unstructured data. At the same time, a storage system and a cloud computing security model are designed in order to implement efficient storage, management and retrieval of data, thereby saving construction cost and guaranteeing system stability, reliability and security.
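
The HDFS-style storage role described here can be illustrated with a toy sketch: a file is split into fixed-size blocks, and each block is replicated on several data nodes so that a node failure loses no data. The block size, replication factor and node names below are toy values (HDFS itself defaults to 128 MB blocks and 3 replicas):

```python
BLOCK_SIZE = 4        # bytes per block (toy value for illustration)
REPLICATION = 2       # replicas per block (HDFS default is 3)
NODES = ["node-a", "node-b", "node-c"]

def split_into_blocks(data: bytes):
    """Split a byte string into fixed-size blocks."""
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def place_blocks(blocks):
    """Round-robin placement: each block lands on REPLICATION distinct nodes."""
    placement = {}
    for i, block in enumerate(blocks):
        replicas = [NODES[(i + r) % len(NODES)] for r in range(REPLICATION)]
        placement[i] = {"data": block, "nodes": replicas}
    return placement

placement = place_blocks(split_into_blocks(b"big data storage"))
```

Spreading replicas across nodes is the design choice that lets MapReduce schedule computation next to the data, which is the main reason the combination saves cost at scale.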


2011 ◽  
Vol 148-149 ◽  
pp. 1280-1284
Author(s):  
Zhi Huang Huang ◽  
Jun Lin ◽  
Dan Lv

The design principles and implementation of a high-precision temperature and humidity monitoring and control system are introduced in this paper. Data acquisition and transmission for the temperature and humidity nodes in a distributed network are realized through the effective use of high-precision temperature and humidity sensors and the CAN bus. Proof-of-principle experiments were then carried out, indicating that the system overcomes the shortcomings of traditional temperature and humidity monitoring systems, such as low transmission rate and poor real-time performance, improves the acquisition accuracy of the system, and meets increasingly stringent monitoring and control requirements.
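
A node on a classic CAN bus carries at most 8 data bytes per frame, so a sensor node must pack each reading compactly. The field layout below (node id, then temperature and relative humidity as hundredths of a unit in 16-bit fields) is an illustrative assumption, not the encoding used in the paper:

```python
import struct

# Assumed 8-byte payload: node id (1 byte), temperature*100 as signed
# 16-bit, relative humidity*100 as unsigned 16-bit, 3 padding bytes.
FRAME_FMT = ">BhHxxx"

def pack_reading(node_id: int, temp_c: float, rh_pct: float) -> bytes:
    """Pack one temperature/humidity reading into an 8-byte CAN payload."""
    return struct.pack(FRAME_FMT, node_id,
                       round(temp_c * 100), round(rh_pct * 100))

def unpack_reading(frame: bytes):
    """Recover (node_id, temp_c, rh_pct) from an 8-byte payload."""
    node_id, temp_raw, rh_raw = struct.unpack(FRAME_FMT, frame)
    return node_id, temp_raw / 100, rh_raw / 100
```

Fixed-point scaling by 100 keeps 0.01-unit resolution without sending floats, which suits the small fixed payload of a CAN frame.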


2019 ◽  
Vol 9 (1) ◽  
pp. 1-8
Author(s):  
Marliana Budhiningtias Winanti ◽  
Meylan Lesnusa

The development of globalization has had a significant impact on every layer of society, in particular through the many technologies now needed by every human being, not least in fields such as health. The Public Lung Health Center (BKPM) of Maluku province is a laboratory service center for health examinations. Every day many people come from different places to have their condition checked and obtain the required health results, but the performance of the health services still falls short of most people's expectations: the patient data recording system is still manual, and the long-standing manual storage system makes searching for patient data time-consuming, so it is considered ineffective, and processing patient examination data is still slow. An information system was therefore created to help the agency address these problems and overcome some of the existing difficulties. The examination data processing system is designed to computerize patient data entry, data storage, and related processes. The study uses a structured method and develops a desktop-based information system for processing health examination data using the Prototype method, with system development tools such as flowmaps, context diagrams, DFDs and database design tools. Keywords: Information Systems, Health Services, data processing.


Terminology ◽  
1994 ◽  
Vol 1 (2) ◽  
pp. 351-373 ◽  
Author(s):  
Juan C. Sager ◽  
Marie-Claude L'Homme

The pattern of existing terminological definitions is analysed, and a model for the terminological definition of concepts is proposed that is considered more appropriate to data-processing applications than present patterns of definition. The model consists of a regularised form of the traditional analytical definition by categorising and restricting the modes of description and thereby reducing the free-text element in the defining phrase. The new model should permit automatic prompting and control of the defining activity, highly efficient data storage and greater information extraction in retrieval. The model is currently being tested in a real application.
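
The regularised analytical definition described here can be sketched as a record type: a definition is reduced to a superordinate category plus a restricted set of description modes, shrinking the free-text element so definitions can be stored, prompted for, and retrieved efficiently. The category and mode inventories below are illustrative assumptions, not the ones proposed in the article:

```python
from dataclasses import dataclass, field

# Assumed controlled vocabularies; the article's actual inventories differ.
CATEGORIES = {"entity", "activity", "property", "relation"}
MODES = {"function", "composition", "form", "position", "origin"}

@dataclass
class TermDefinition:
    term: str
    category: str                  # superordinate concept
    characteristics: dict = field(default_factory=dict)  # mode -> free text

    def __post_init__(self):
        # Automatic control of the defining activity: reject entries
        # outside the permitted categories and description modes.
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")
        for mode in self.characteristics:
            if mode not in MODES:
                raise ValueError(f"unknown description mode: {mode}")

    def render(self) -> str:
        """Regenerate a traditional analytical definition from the record."""
        parts = "; ".join(f"{m}: {v}" for m, v in self.characteristics.items())
        return f"{self.term}: {self.category} ({parts})"
```

Because the free text is confined to the values of a few known modes, the same record supports both prompting during definition writing and field-level extraction at retrieval time.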

