Calibration of magnetic compass using an improved extreme learning machine based on reverse tuning

Sensor Review ◽  
2019 ◽  
Vol 39 (1) ◽  
pp. 121-128 ◽  
Author(s):  
Yanxia Liu ◽  
JianJun Fang ◽  
Gang Shi

Purpose The sources of magnetic sensor error are numerous, including surrounding currents and soft and hard magnetic materials. Traditional methods mainly use explicit error models, and it is difficult to include all interference factors. This paper aims to present an implicit error model and studies its high-precision training method. Design/methodology/approach A multi-level extreme learning machine based on reverse tuning (MR-ELM) is presented to compensate for magnetic compass measurement errors by increasing the depth of the network. To ensure the real-time performance of the algorithm, the network structure is fixed to two ELM levels, and the maximum number of levels and neurons is not increased further. The parameters of MR-ELM are further modified by reverse tuning to ensure network accuracy. Because the network parameters have been largely determined by least squares, the number of iterations is far smaller than in a traditional BP neural network, and real-time performance can still be guaranteed. Findings The results show that the training time of MR-ELM is 19.65 s, about four times that of the fixed extreme learning algorithm, but the training accuracy and generalization performance of the error model are better. The heading error is reduced from ±2.5° before compensation to ±0.125°, and the root mean square error is 0.055°, about 0.46 times that of the fixed extreme learning algorithm. Originality/value MR-ELM is presented to compensate for magnetic compass measurement errors by increasing the depth of the network. The multi-level ELM network parameters are further modified by reverse tuning to ensure network accuracy. Because the network parameters have been largely determined by least squares, the number of iterations is far smaller than in a traditional BP neural network, and real-time training can still be guaranteed. The paper improves the ELM algorithm itself (referred to as MR-ELM) and brings new ideas to peers in the field of magnetic compass error compensation.
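
A minimal sketch of the core ELM idea the abstract relies on: hidden-layer weights are random and the output weights come from a single least-squares solve, so no iterative back-propagation is needed. This is not the authors' MR-ELM (the multi-level structure and reverse tuning are omitted), and the class name, layer size and usage variables are hypothetical.

```python
# Minimal single-hidden-layer ELM sketch: not the authors' MR-ELM (the
# multi-level structure and reverse tuning are omitted); it only illustrates
# the least-squares output-weight step the abstract relies on.
import numpy as np

class SimpleELM:
    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(0) if rng is None else rng

    def fit(self, X, y):
        n_features = X.shape[1]
        # Hidden-layer weights and biases are random and never trained.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)       # hidden-layer activations
        # Output weights come from a single least-squares solve (pseudo-inverse),
        # which is why ELM training needs no iterative back-propagation.
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Hypothetical usage: X_train holds magnetometer readings, y_train the heading
# error to be compensated.
# model = SimpleELM().fit(X_train, y_train)
# corrected_heading = raw_heading - model.predict(X_test)
```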

2016 ◽  
Vol 4 (3) ◽  
pp. 163-181
Author(s):  
Pouria Sarhadi ◽  
Reza Nad Ali Niachari ◽  
Morteza Pouyan Rad ◽  
Javad Enayati

Purpose The purpose of this paper is to propose a software engineering procedure for real-time software development and verification of an autonomous underwater robotic system. High-performance, robust software is one of the requirements of autonomous system design, and a simple error in the software can easily lead to a catastrophic failure of a complex system. A systematic procedure is therefore presented for this purpose. Design/methodology/approach This paper utilizes software engineering tools and hardware-in-the-loop (HIL) simulations for real-time system design of an autonomous underwater robot. Findings In this paper, the architecture of the system is extracted. Then, using software engineering techniques, a suitable structure for the control software is presented. Considering the desired targets of the robot, suitable algorithms and functions are developed. After the development stage, the real-time performance of the software is verified. Originality/value A suitable approach for analyzing real-time performance is presented. This approach is implemented using HIL simulations. The developed structure is applicable to other autonomous systems.
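
The abstract does not publish code, but the real-time claim rests on showing that the control software meets its deadline. Below is a minimal sketch, assuming a 10 ms control period and a placeholder task, of the kind of worst-case execution time check one might run while replaying recorded HIL data; all names and budgets are hypothetical, not the authors' verification procedure.

```python
# Minimal sketch of a deadline check one might run during HIL testing
# (hypothetical task and timing budget; not the authors' verification procedure).
import time

DEADLINE_S = 0.010  # assumed 10 ms control period

def control_task(sensor_sample):
    # Placeholder for the robot's guidance/control computation.
    return sum(sensor_sample) / len(sensor_sample)

def measure_wcet(samples, deadline_s=DEADLINE_S):
    """Run the task over recorded HIL samples and report the worst-case execution time."""
    worst = 0.0
    for sample in samples:
        start = time.perf_counter()
        control_task(sample)
        elapsed = time.perf_counter() - start
        worst = max(worst, elapsed)
        if elapsed > deadline_s:
            print(f"deadline miss: {elapsed * 1e3:.3f} ms")
    return worst

# Example: replay 1,000 fake sensor frames and compare WCET against the period.
frames = [[0.1, 0.2, 0.3]] * 1000
print(f"worst-case execution time: {measure_wcet(frames) * 1e3:.3f} ms")
```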


2019 ◽  
Vol 31 (1) ◽  
pp. 265-290 ◽  
Author(s):  
Ganjar Alfian ◽  
Muhammad Fazal Ijaz ◽  
Muhammad Syafrudin ◽  
M. Alex Syaekhoni ◽  
Norma Latif Fitriyani ◽  
...  

Purpose The purpose of this paper is to propose customer behavior analysis based on real-time data processing and association rules for a digital signage-based online store (DSOS). Real-time data processing based on big data technology (such as NoSQL MongoDB and Apache Kafka) is utilized to handle the vast amount of customer behavior data. Design/methodology/approach In order to extract customer behavior patterns, customers' browsing history and transactional data from digital signage (DS) could be used as the input for decision making. First, the authors developed a DSOS and installed it in different locations, so that customers could have the experience of browsing and buying a product. Second, the real-time data processing system gathered customers' browsing history and transaction data as they occurred. In addition, the authors utilized association rules to extract useful information from customer behavior, so it may be used by managers to efficiently enhance service quality. Findings First, as the number of customers and DS increases, the proposed system was capable of processing the large volume of input data conveniently. Second, the data set showed that as the number of visits and the shopping duration increase, the chance of products being purchased also increases. Third, by combining purchasing and browsing data from customers, association rules were derived from the frequent transaction patterns. Thus, products have a high likelihood of being purchased if they are used as recommendations. Research limitations/implications This research empirically supports the theory of association rules, namely that frequent patterns, correlations or causal relationships can be found in various kinds of databases. The scope of the present study is limited to the DSOS, although the findings can be interpreted and generalized in a global business scenario. Practical implications The proposed system is expected to help management in taking decisions such as improving the layout of the DS and providing better product suggestions to the customer. Social implications The proposed system may be utilized to promote green products to the customer, having a positive impact on sustainability. Originality/value The key novelty of the present study lies in developing a system based on big data technology to handle the enormous amount of data and to analyze customer behavior in real time in the DSOS. Real-time data processing based on big data technology (such as NoSQL MongoDB and Apache Kafka) is used to handle the vast amount of customer behavior data. In addition, the present study proposes association rules to extract useful information from customer behavior. These results can be used for promotion as well as relevant product recommendations to DSOS customers. Besides, in today's changing retail environment, analyzing customer behavior in real time in the DSOS helps to attract and retain customers more efficiently and effectively, and retailers can gain a competitive advantage over their competitors.
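
A minimal sketch of the association-rule step, computing support and confidence for pairwise rules over a few hypothetical transactions. It is not the authors' pipeline (no Kafka, MongoDB or DS data); the item names, thresholds and brute-force enumeration are illustrative only.

```python
# Minimal support/confidence computation for pairwise association rules
# (hypothetical transactions; not the authors' Kafka/MongoDB pipeline).
from itertools import permutations

transactions = [
    {"green_tea", "mug"},
    {"green_tea", "mug", "kettle"},
    {"kettle", "mug"},
    {"green_tea"},
]

def pairwise_rules(transactions, min_support=0.25, min_confidence=0.5):
    n = len(transactions)
    items = set().union(*transactions)
    rules = []
    for a, b in permutations(items, 2):
        support_ab = sum(1 for t in transactions if a in t and b in t) / n
        support_a = sum(1 for t in transactions if a in t) / n
        if support_ab >= min_support and support_a > 0:
            confidence = support_ab / support_a
            if confidence >= min_confidence:
                rules.append((a, b, support_ab, confidence))
    return rules

for a, b, s, c in pairwise_rules(transactions):
    print(f"{a} -> {b}: support={s:.2f}, confidence={c:.2f}")
```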


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Sandeep Kumar Singh ◽  
Mamata Jenamani

Purpose The purpose of this paper is to design a supply chain database schema for Cassandra to store real-time data generated by radio frequency identification (RFID) technology in a traceability system. Design/methodology/approach The real-time data generated in such traceability systems are of high frequency and volume, making them difficult to handle with traditional relational database technologies. To overcome this difficulty, a NoSQL database repository based on Cassandra is proposed. The efficacy of the proposed schema is compared with that of two such databases suitable for storing traceability data, document-based MongoDB and column family-based Cassandra. Findings The proposed Cassandra-based data repository outperforms the traditional Structured Query Language-based and MongoDB systems from the literature in terms of concurrent reading, and works on par with respect to writing and updating of tracing queries. Originality/value The proposed schema is able to store the real-time data generated in a supply chain with low latency. To test the performance of the Cassandra-based data repository, a test-bed is designed in the lab and supply chain operations of the Indian Public Distribution System are simulated to generate data.
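
A hypothetical Cassandra table for RFID trace events, sketched with the DataStax Python driver; the keyspace, column names and partitioning choices are illustrative assumptions, not the paper's schema.

```python
# A hypothetical Cassandra table for RFID trace events, sketched with the
# DataStax Python driver (table layout is illustrative, not the paper's schema).
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # assumes a local Cassandra node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS traceability
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")

# Partitioning by EPC (tag id) and clustering by read time keeps all reads for
# a tag together and lets tracing queries scan them in time order.
session.execute("""
    CREATE TABLE IF NOT EXISTS traceability.rfid_events (
        epc text,
        read_time timestamp,
        reader_id text,
        location text,
        business_step text,
        PRIMARY KEY (epc, read_time)
    ) WITH CLUSTERING ORDER BY (read_time DESC)
""")

session.execute(
    "INSERT INTO traceability.rfid_events (epc, read_time, reader_id, location, business_step) "
    "VALUES (%s, toTimestamp(now()), %s, %s, %s)",
    ("urn:epc:id:sgtin:0614141.107346.2018", "reader-01", "warehouse-A", "shipping"),
)
```

Keying each partition by tag identifier with time-ordered clustering is one common way to serve per-item tracing queries from a single partition; the actual schema proposed in the paper may differ.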


2014 ◽  
Vol 494-495 ◽  
pp. 206-209
Author(s):  
Xue Feng Yang ◽  
Li Gang Chen ◽  
Xian Feng Zhong

There is a considerable difference between the actual distance and that measured by infrared or ultrasonic ranging, and car reversing is not intelligent enough. To address these issues, this paper designs an automatic car-reversing auxiliary system based on monocular vision. The system hardware mainly consists of an image collection module, an embedded micro-controller and an electronic braking module. On the basis of a distance measurement algorithm based on monocular vision, the real-time distance to the vehicle in front can be measured and auxiliary control can be applied via data exchange among the vehicle's electronic control units. Vehicle dynamic driving experiments verify the high reliability of the automatic reversing auxiliary system based on monocular vision. The distance measurement error is less than 2% when the distance to obstacles in front is in the range of 20 m to 70 m. The system can satisfy the real-time requirements of intelligent auxiliary braking.
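
The paper's exact ranging algorithm is not given, but monocular distance estimation commonly reduces to similar triangles in the pinhole camera model. A minimal sketch with a hypothetical focal length and target height:

```python
# Standard similar-triangles monocular ranging sketch (the paper's exact
# algorithm is not given; focal length and target height are hypothetical).

def monocular_distance(focal_length_px, real_height_m, image_height_px):
    """Estimate distance to an object of known real height from its pixel height.

    Pinhole model: real_height / distance = image_height / focal_length,
    so distance = focal_length * real_height / image_height.
    """
    if image_height_px <= 0:
        raise ValueError("object not detected in the image")
    return focal_length_px * real_height_m / image_height_px

# Example: assumed 1,200 px focal length, a 1.5 m-tall obstacle imaged at 36 px.
print(f"estimated distance: {monocular_distance(1200, 1.5, 36):.1f} m")  # ~50 m
```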


2017 ◽  
Vol 117 (9) ◽  
pp. 1890-1905 ◽  
Author(s):  
Yingfeng Zhang ◽  
Lin Zhao ◽  
Cheng Qian

Purpose The huge demand for fresh goods has stimulated a great deal of research on the perishable food supply chain. The characteristics of perishable food and cross-regional transportation have brought many challenges to the operation models of the perishable food supply chain. The purpose of this paper is to address these challenges based on the real-time data acquired by Internet of Things (IoT) devices. Design/methodology/approach IoT and the modeling of the Supply Hub in Industrial Parks were adopted in the perishable food supply chain. Findings A conceptual model was established for the IoT-enabled perishable food supply chain with two-echelon supply hubs. The performance of the supply chain improved when the proposed model was implemented, as demonstrated by a case study. Originality/value In the proposed model, the supply hubs, which act as the dominators of the supply chain, can respond to real-time information captured from the operation processes of an IoT-enabled supply chain and thus provide public warehousing and logistics services.


2017 ◽  
Vol 10 (2) ◽  
pp. 130-144 ◽  
Author(s):  
Iwan Aang Soenandi ◽  
Taufik Djatna ◽  
Ani Suryani ◽  
Irzaman Irzaman

Purpose The production of glycerol derivatives by the esterification process is subject to many constraints related to the yield of the production target and the lack of process efficiency. Accurate monitoring and control of the process can improve production yield and efficiency. The purpose of this paper is to propose a real-time optimization (RTO) approach using gradient adaptive selection and classification from infrared sensor measurements to cover various disturbances and uncertainties in the reactor. Design/methodology/approach The esterification process optimization was developed by integrating self-optimization (SO) with a classification process combined with necessary condition optimum (NCO) as gradient adaptive selection, supported by laboratory-scale medium-wavelength infrared (mid-IR) sensors, and the proposed optimization system indicators were measured in the batch process. Business Process Modeling and Notation (BPMN 2.0) was used to describe the tasks of the SO workflow in collaboration with NCO as an abstraction for the conceptual phase. Next, Stateflow modeling was deployed to simulate the three states of gradient-based adaptive control combined with support vector machine (SVM) classification and an Arduino microcontroller for implementation. Findings This new method shows that the responsiveness of real-time optimization control increased product yield by up to 13 percent, lowered measurement error to a percentage error of 1.11 percent and reduced the process duration by up to 22 minutes, with an effective range of stirrer rotation set between 300 and 400 rpm and a final temperature between 200 and 210°C, which was more efficient as it consumed less energy. Research limitations/implications In this research, the authors only experimented with the esterification process using glycerol, but as a development concept of RTO, it would be possible to apply it to other chemical reactions or systems. Practical implications This research introduces a new development of an RTO approach to optimal control and as such marks the starting point for more research on its properties. As the methodology is generic, it can be applied to different optimization problems for batch systems in the chemical industries. Originality/value The paper is original as it presents the first application of adaptive selection based on the gradient value of mid-IR sensor data, applied to real-time determination of the control state by classification with the SVM algorithm for esterification process control to increase efficiency.
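
A hypothetical sketch of the classification step: reducing a window of mid-IR readings to gradient-style features and letting an SVM pick one of three control states. The features, labels and data are illustrative assumptions, not the authors' trained model or sensor interface.

```python
# Hypothetical sketch of classifying a control state from the gradient of
# mid-IR readings with an SVM (feature design, labels and data are illustrative,
# not the authors' trained model).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Fake training data: [mean absorbance gradient, temperature] -> control state
# 0 = hold, 1 = increase stirring, 2 = stop heating
X = np.vstack([
    rng.normal([0.00, 180], [0.02, 5], size=(50, 2)),
    rng.normal([0.10, 195], [0.02, 5], size=(50, 2)),
    rng.normal([0.02, 212], [0.02, 5], size=(50, 2)),
])
y = np.repeat([0, 1, 2], 50)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)

# In operation, each new sensor window would be reduced to the same features
# and the predicted state forwarded to the controller (e.g. an Arduino).
new_window = np.array([[0.09, 196]])
print("predicted control state:", clf.predict(new_window)[0])
```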


2014 ◽  
Vol 631-632 ◽  
pp. 516-520
Author(s):  
Chao Yang ◽  
Shui Yan Dai ◽  
Ling Da Wu ◽  
Rong Huan Yu

A method for view-dependent smooth rendering of large-scale vector data based on vector textures on a virtual globe is presented. The vector texture is rasterized from the vector data based on a view-dependent quadtree LOD and projected onto the top of the terrain. The smooth transition between multi-level textures is realized by dynamically adjusting texture transparency based on the view range in two processes to avoid texture “popping”: in the “IN” process, the texture's alpha value increases as the view range goes up, while in the “OUT” process, the alpha value decreases. A vector texture buffer updating method based on the least-recently-used algorithm is used to accelerate texture fetching. Finally, real-time rendering of large-scale vector data is implemented on the virtual globe. The results show that this method can render large-scale vector data smoothly in real time.
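
A minimal sketch of a least-recently-used texture buffer of the kind described, with tile rasterization faked; the class and key layout are illustrative, not the paper's implementation.

```python
# Minimal LRU texture-tile buffer sketch (illustrative; not the paper's
# buffer implementation, and tile rasterization is faked).
from collections import OrderedDict

class TextureBuffer:
    def __init__(self, capacity=64):
        self.capacity = capacity
        self._tiles = OrderedDict()   # tile key -> rasterized texture

    def get(self, key):
        """Fetch a tile, rasterizing on a miss and evicting the least recently used."""
        if key in self._tiles:
            self._tiles.move_to_end(key)          # mark as most recently used
            return self._tiles[key]
        tile = self._rasterize(key)
        self._tiles[key] = tile
        if len(self._tiles) > self.capacity:
            self._tiles.popitem(last=False)       # evict least recently used
        return tile

    def _rasterize(self, key):
        level, x, y = key
        return f"texture for quadtree node level={level}, x={x}, y={y}"

buffer = TextureBuffer(capacity=2)
buffer.get((3, 1, 2))
buffer.get((3, 1, 3))
buffer.get((3, 1, 2))       # refreshes recency of (3, 1, 2)
buffer.get((4, 2, 4))       # evicts (3, 1, 3)
print(list(buffer._tiles))  # [(3, 1, 2), (4, 2, 4)]
```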


2015 ◽  
Vol 53 (8) ◽  
pp. 2693-2696 ◽  
Author(s):  
Ramzi Ghodbane ◽  
Shady Asmar ◽  
Marlena Betzner ◽  
Marie Linet ◽  
Joseph Pierquin ◽  
...  

Culture remains the cornerstone of diagnosis for pulmonary tuberculosis, but the fastidiousness of Mycobacterium tuberculosis may delay culture-based diagnosis for weeks. We evaluated the performance of real-time high-resolution imaging for the rapid detection of M. tuberculosis colonies growing on a solid medium. A total of 50 clinical specimens, including 42 sputum specimens, 4 stool specimens, 2 bronchoalveolar lavage fluid specimens, and 2 bronchial aspirate fluid specimens were prospectively inoculated into (i) a commercially available Middlebrook broth and evaluated for mycobacterial growth indirectly detected by measuring oxygen consumption (standard protocol) and (ii) a home-made solid medium incubated in an incubator featuring real-time high-resolution imaging of colonies (real-time protocol). Isolates were identified by Ziehl-Neelsen staining and matrix-assisted laser desorption ionization–time of flight mass spectrometry. Use of the standard protocol yielded 14/50 (28%) M. tuberculosis isolates, which is not significantly different from the 13/50 (26%) M. tuberculosis isolates found using the real-time protocol (P = 1.00 by Fisher's exact test), and the contamination rate of 1/50 (2%) was not significantly different from the contamination rate of 2/50 (4%) using the real-time protocol (P = 1.00). The real-time imaging protocol showed a 4.4-fold reduction in time to detection, 82 ± 54 h versus 360 ± 142 h (P < 0.05). These preliminary data give the proof of concept that real-time high-resolution imaging of M. tuberculosis colonies is a new technology that shortens the time to growth detection and the laboratory diagnosis of pulmonary tuberculosis.
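
The reported detection rates can be checked with Fisher's exact test on the 2-by-2 table of detected versus undetected cultures per protocol; a short sketch using SciPy (a re-computation of the published numbers, not the authors' analysis code):

```python
# Recomputing the reported comparison of detection rates (14/50 vs. 13/50)
# with Fisher's exact test; a check on the reported statistics, not the
# authors' analysis code.
from scipy.stats import fisher_exact

#                     detected  not detected
standard_protocol  = [14,       36]
real_time_protocol = [13,       37]

odds_ratio, p_value = fisher_exact([standard_protocol, real_time_protocol])
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.2f}")  # P ~ 1.00, as reported
```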


2019 ◽  
Vol 8 (4) ◽  
pp. 338-350
Author(s):  
Mauricio Loyola

Purpose The purpose of this paper is to propose a simple, fast, and effective method for detecting measurement errors in data collected with low-cost environmental sensors typically used in building monitoring, evaluation, and automation applications. Design/methodology/approach The method combines two unsupervised learning techniques: a distance-based anomaly detection algorithm analyzing temporal patterns in data, and a density-based algorithm comparing data across different spatially related sensors. Findings Results of tests using 60,000 observations of temperature and humidity collected from 20 sensors during three weeks show that the method effectively identified measurement errors and was not affected by valid unusual events. Precision, recall, and accuracy were 0.999 or higher for all cases tested. Originality/value The method is simple to implement, computationally inexpensive, and fast enough to be used in real time with modest open-source microprocessors and a wide variety of environmental sensors. It is a robust and convenient approach for overcoming the hardware constraints of low-cost sensors, allowing users to improve the quality of collected data at almost no additional cost and effort.
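
A sketch of how a distance-based temporal check and a density-based cross-sensor check might be combined, in the spirit of the abstract; the window size, thresholds and LOF settings are illustrative assumptions, not the authors' algorithms.

```python
# Illustrative combination of a distance-based temporal check and a
# density-based cross-sensor check (not the authors' exact algorithms
# or thresholds).
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def temporal_outliers(series, window=12, threshold=3.0):
    """Flag readings far from the median of the preceding window (distance-based)."""
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        recent = series[i - window:i]
        spread = np.median(np.abs(recent - np.median(recent))) + 1e-9
        flags[i] = abs(series[i] - np.median(recent)) / spread > threshold
    return flags

def spatial_outliers(snapshot):
    """Flag sensors whose reading has low density relative to neighbouring sensors."""
    lof = LocalOutlierFactor(n_neighbors=5)
    return lof.fit_predict(np.asarray(snapshot).reshape(-1, 1)) == -1

# Fake data: one sensor's temperature series and one snapshot across 20 sensors,
# each with an injected measurement error.
rng = np.random.default_rng(1)
one_sensor = 22 + 0.1 * rng.standard_normal(100)
one_sensor[60] = 35.0
snapshot = 22 + 0.1 * rng.standard_normal(20)
snapshot[7] = 35.0

print("temporal flag at t=60:", temporal_outliers(one_sensor)[60])
print("spatial flag, sensor 7:", spatial_outliers(snapshot)[7])
```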

