Map Matching for Fixed Sensor Data Based on Utility Theory

2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Kangkang He ◽  
Qi Cao ◽  
Gang Ren ◽  
Dawei Li ◽  
Shuichao Zhang

Map matching can provide useful traffic information by aligning the observed trajectories of vehicles with the road network on a digital map. It plays an essential role in many advanced intelligent traffic systems (ITSs). Unfortunately, almost all current map-matching approaches were developed for GPS trajectories generated by probe sensors mounted in a small number of vehicles and cannot handle the trajectories of massive vehicle samples recorded by fixed sensors, such as camera detectors. In this paper, we propose a novel map-matching model, termed Fixed-MM, designed specifically for fixed sensor data. Based on two key observations from real-world data, Fixed-MM considers (1) the utility of each path and (2) the travel time constraint to match fixed-sensor trajectories to a specific path. Meanwhile, guided by laws derived from the distribution of GPS trajectories, a path generation algorithm was developed to search for candidate paths. The proposed Fixed-MM was examined with field-test data. The experimental results show that Fixed-MM outperforms two types of classical map-matching algorithms in both accuracy and efficiency when fixed sensor data are used. The proposed Fixed-MM identifies 68.38% of links correctly even when the spatial gap between the sensor pair is increased to five kilometers. The average computation time spent by Fixed-MM on one point is only 0.067 s, and we argue that the proposed method can be used online for many real-time ITS applications.
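To make the travel-time constraint concrete, the sketch below shows one way a candidate path between a fixed sensor pair could be filtered and scored. It is not the authors' Fixed-MM implementation; the speed bounds, nominal speed, utility form, and data structures are illustrative assumptions.

```python
# Sketch of a travel-time-constrained candidate filter for fixed-sensor map
# matching. NOT the authors' Fixed-MM: speed bounds and utility are assumed.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CandidatePath:
    links: List[int]     # ordered link IDs connecting the two fixed sensors
    length_m: float      # total path length in metres

def is_feasible(path: CandidatePath, dt_s: float,
                v_min: float = 2.0, v_max: float = 40.0) -> bool:
    """Keep a candidate only if the speed implied by the detection time gap
    (path length / dt) lies inside plausible bounds (m/s)."""
    return dt_s > 0 and v_min <= path.length_m / dt_s <= v_max

def match(candidates: List[CandidatePath], dt_s: float) -> Optional[CandidatePath]:
    """Pick the feasible candidate with the lowest cost: implied speed close
    to a nominal urban speed, plus a small penalty for longer detours."""
    v_nominal = 12.0  # m/s, assumed
    feasible = [p for p in candidates if is_feasible(p, dt_s)]
    if not feasible:
        return None
    return min(feasible,
               key=lambda p: abs(p.length_m / dt_s - v_nominal) + 1e-4 * p.length_m)
```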


2020 ◽  
Vol 32 (6) ◽  
pp. 1112-1120
Author(s):  
Kazuki Takahashi ◽  
◽  
Jumpei Arima ◽  
Toshihiro Hayata ◽  
Yoshitaka Nagai ◽  
...  

In this study, a novel framework for an autonomous robot navigation system is proposed. The navigation system uses an edge-node map, which is easily created from electronic maps. Unlike general self-localization methods that use an occupancy grid map or a 3D point cloud map, there is no need to drive the robot through the target environment in advance to collect sensor data. In this system, internal sensors are mainly used for self-localization. Assuming that the robot is running on the road, the position of the robot is estimated by associating the robot's travel trajectory with an edge. In addition, node arrival is determined using branch-point information obtained from the edge-node map. Because this system does not use map matching, robust self-localization is possible even in a dynamic environment.
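As a rough illustration of associating the travel trajectory with an edge, the toy sketch below advances a progress estimate along a straight edge whenever the odometry heading agrees with the edge direction; the straight-edge assumption and the tolerance value are ours, not the paper's.

```python
# Toy edge-following self-localization on an edge-node map using only odometry.
# Assumes straight edges and a naive heading comparison; not the paper's system.
import math

def localize_on_edge(edge_length_m: float, edge_heading_rad: float,
                     odom_steps, heading_tol_rad: float = math.radians(30)):
    """odom_steps: iterable of (distance_m, heading_rad) per control cycle.
    Returns (progress along the edge, whether the end node was reached)."""
    s = 0.0
    for d, heading in odom_steps:
        # Angular difference wrapped to (-pi, pi]; only motion roughly aligned
        # with the edge direction advances the progress estimate.
        diff = math.atan2(math.sin(heading - edge_heading_rad),
                          math.cos(heading - edge_heading_rad))
        if abs(diff) < heading_tol_rad:
            s += d
    return s, s >= edge_length_m
```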



Sensors ◽  
2020 ◽  
Vol 20 (7) ◽  
pp. 2057 ◽  
Author(s):  
Wentao Bian ◽  
Ge Cui ◽  
Xin Wang

GPS (Global Positioning System) trajectories with low sampling rates are prevalent in many applications. However, current map matching methods do not perform well for low-sampling-rate GPS trajectories due to the large uncertainty between consecutive GPS points. In this paper, a collaborative map matching method (CMM) is proposed for low-sampling-rate GPS trajectories. CMM processes GPS trajectories in batches. First, it groups similar GPS trajectories into clusters and then supplements the missing information by resampling. A collaborative GPS trajectory is then extracted for each cluster and matched to the road network based on longest common subsequence (LCSS) distance. Experiments are conducted on a real GPS trajectory dataset and a simulated GPS trajectory dataset. The results show that the proposed CMM outperforms the baseline methods in both effectiveness and efficiency.
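For reference, an LCSS similarity between two trajectories (the distance family used for matching above) can be written as a standard dynamic programme; the matching threshold and the normalisation below are assumed values for the sketch, not CMM's exact parameters.

```python
# Minimal LCSS similarity/distance between two GPS trajectories in a metric
# projection. Threshold and normalisation are illustrative assumptions.
import math

def lcss(traj_a, traj_b, eps_m=50.0):
    """traj_a, traj_b: lists of (x, y). Two points 'match' if within eps_m."""
    n, m = len(traj_a), len(traj_b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if math.dist(traj_a[i - 1], traj_b[j - 1]) <= eps_m:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m]

def lcss_distance(traj_a, traj_b, eps_m=50.0):
    """Normalised LCSS distance in [0, 1]; 0 means a full match."""
    if not traj_a or not traj_b:
        return 1.0
    return 1.0 - lcss(traj_a, traj_b, eps_m) / min(len(traj_a), len(traj_b))
```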



2015 ◽  
Vol 2015 ◽  
pp. 1-9
Author(s):  
Deepa Devasenapathy ◽  
Kathiravan Kannan

Traffic in road networks is increasing rapidly. Good knowledge of network traffic can help minimize congestion, using information about the road network obtained from sources such as community reports and pavement detectors. Using these methods, however, only low-feature information about users in the road network is generated. Although existing schemes obtain urban traffic information, they neither calculate the energy drain rate of nodes nor balance the overhead against the quality of the routing protocol, which poses a great challenge. Thus, an energy-efficient cluster-based vehicle detection in road networks using the intention numeration method (CVDRN-IN) is developed. Initially, sensor nodes that detect a vehicle are grouped into separate clusters. Further, we approximate the node drain rate for each cluster using a polynomial regression function. In addition, the total node energy is estimated by taking the integral over the area. Finally, enhanced data aggregation is performed to reduce the amount of data transmitted, using a digital signature tree. The experimental performance is evaluated with the Dodgers loop sensor dataset from the UCI repository, and the proposed method outperforms existing work in energy consumption, clustering efficiency, and node drain rate.
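The two numerical steps mentioned above, approximating the drain rate with a polynomial regression and estimating energy by integration, might look roughly like the following sketch; the polynomial degree, units, and data layout are assumptions rather than the paper's exact formulation.

```python
# Sketch: fit a polynomial to a node's residual-energy readings to approximate
# its drain rate, then integrate to estimate energy spent over the window.
# Degree, units, and data layout are assumed, not the paper's formulation.
import numpy as np

def drain_rate_poly(t_s: np.ndarray, energy_j: np.ndarray, degree: int = 2):
    """Fit energy(t) with a polynomial and return its derivative (drain rate, J/s)."""
    energy_poly = np.poly1d(np.polyfit(t_s, energy_j, degree))
    return energy_poly.deriv()

def energy_spent(t_s: np.ndarray, energy_j: np.ndarray, degree: int = 2) -> float:
    """Integrate the drain rate over the observation window (energy consumed, J)."""
    rate = drain_rate_poly(t_s, energy_j, degree)
    integral = rate.integ()
    return -(integral(t_s[-1]) - integral(t_s[0]))
```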



Author(s):  
Ervina Varijki ◽  
Bambang Krismono Triwijoyo

One type of cancer that can be identified using MRI technology is breast cancer. Breast cancer is still a leading cause of death worldwide; therefore, early detection of this disease is needed. In identifying breast cancer, a doctor or radiologist analyzes magnetic resonance images stored in the Digital Imaging and Communications in Medicine (DICOM) format. Sufficient skill and experience are required for an appropriate and accurate diagnosis, so it is necessary to create a digital image processing application that uses object segmentation and edge detection to assist the physician or radiologist in identifying breast cancer. MRI image segmentation with edge detection for breast cancer identification proceeds in stages: the image is first converted to grayscale, then binarized by thresholding, and edges are finally detected using the Roberts operator. For the 20 input images tested, the method produced images in which the boundary line of each region or object is visible and no edges are cut off, with an average computation time of less than one minute.
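A minimal version of the described pipeline (grayscale conversion, binary thresholding, Roberts-cross edge detection) is sketched below in plain NumPy; the threshold value and grayscale weights are conventional assumptions, not values from the paper.

```python
# Minimal grayscale -> threshold -> Roberts-cross pipeline in plain NumPy.
# Threshold and grayscale weights are conventional assumptions.
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Luma-weighted grayscale conversion of an (H, W, 3) image."""
    return rgb[..., :3] @ np.array([0.299, 0.587, 0.114])

def binarize(gray: np.ndarray, thresh: float = 128.0) -> np.ndarray:
    """Binary image: 1.0 where the gray level is at or above the threshold."""
    return (gray >= thresh).astype(float)

def roberts_edges(img: np.ndarray) -> np.ndarray:
    """Roberts cross operator: gradient magnitude along the two diagonals."""
    gx = img[:-1, :-1] - img[1:, 1:]
    gy = img[:-1, 1:] - img[1:, :-1]
    return np.hypot(gx, gy)

# Usage sketch: edges = roberts_edges(binarize(to_grayscale(mri_rgb)))
```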



2020 ◽  
Author(s):  
Ansarullah ◽  
Ramli Rahim ◽  
Baharuddin Hamzah ◽  
Asniawaty Kusno ◽  
Muhammad Tayeb

Chicken feathers are a waste product of slaughterhouses, and billions of kilograms of this waste are produced by various kinds of poultry processing. This is a very serious problem for the environment because of the pollution it causes. Chicken feather waste has many uses, such as making feather dusters, accessories, upholstery materials, and brackets, up to the manufacture of animal feed, but these activities cannot reduce the production of chicken feathers, which continues to increase every year. This is because the selling price of chicken meat is affordable for consumers at middle to upper economic levels, and chicken dishes can easily be found on the menu in almost all restaurants, down to roadside food stalls. An alternative way of utilizing chicken feathers is to make composite materials in the form of panels. Recent studies have shown that PVAc can be used as a mixing and adhesive material with mashed or ground feathers to form a panel that can later be used as an acoustic material. The test results show that panels made of chicken feathers and PVAc glue absorb sound well, with an absorption coefficient of 0.59, and are lightweight. This result is very economical, so the material is worth recommending as an acoustic material. Beyond the results themselves, the research method carried out is an environmentally friendly activity, in particular for handling waste problems.



2021 ◽  
Vol 13 (2) ◽  
pp. 690
Author(s):  
Tao Wu ◽  
Huiqing Shen ◽  
Jianxin Qin ◽  
Longgang Xiang

Identifying stops from GPS trajectories is one of the main concerns in the study of moving objects and has a major effect on a wide variety of location-based services and applications. Although the spatial and non-spatial characteristics of trajectories have been widely investigated for the identification of stops, few studies have concentrated on the impact of contextual features, which are also connected to the road network and nearby Points of Interest (POIs). In order to obtain more precise stop information from moving objects, this paper proposes and implements a novel approach that represents the spatio-temporal dynamic relationship between stopping behaviors and geospatial elements to detect stops. The relationship between candidate stops, generated by the standard time-distance threshold approach, and the surrounding environmental elements is integrated in a mobility context cube to extract stop features and precisely derive stops using a classifier. The presented methodology is designed to reduce the error rate of stop detection in trajectory data mining. It turns out that 26 features can contribute to recognizing stop behaviors from trajectory data. Additionally, experiments on a real-world trajectory dataset further demonstrate the effectiveness of the proposed approach in improving the accuracy of identifying stops from trajectories.
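The standard time-distance threshold step used to generate candidate stops could be implemented along the following lines; the distance and duration thresholds are illustrative, and the contextual (road network and POI) features are outside the scope of this sketch.

```python
# Candidate-stop generation with a simple time-distance threshold.
# d_max_m and t_min_s are illustrative values, not the paper's settings.
import math

def candidate_stops(points, d_max_m=100.0, t_min_s=300.0):
    """points: time-ordered list of (t_seconds, x, y) in a metric projection.
    A candidate stop is a run of points staying within d_max_m of its anchor
    point and lasting at least t_min_s; returns (start, end) time pairs."""
    stops, i, n = [], 0, len(points)
    while i < n:
        j = i + 1
        while j < n and math.dist(points[i][1:], points[j][1:]) <= d_max_m:
            j += 1
        if points[j - 1][0] - points[i][0] >= t_min_s:
            stops.append((points[i][0], points[j - 1][0]))
            i = j          # jump past the detected stop
        else:
            i += 1         # slide the anchor forward by one point
    return stops
```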



Mathematics ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 634
Author(s):  
Tarek Frahi ◽  
Francisco Chinesta ◽  
Antonio Falcó ◽  
Alberto Badias ◽  
Elias Cueto ◽  
...  

We are interested in evaluating the state of drivers to determine whether they are attentive to the road or not by using motion sensor data collected from car driving experiments. That is, our goal is to design a predictive model that can estimate the state of drivers given the data collected from motion sensors. For that purpose, we leverage recent developments in topological data analysis (TDA) to analyze and transform the data coming from sensor time series and build a machine learning model based on the topological features extracted with the TDA. We provide some experiments showing that our model proves to be accurate in the identification of the state of the user, predicting whether they are relaxed or tense.
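In the spirit of that pipeline, a sketch might delay-embed each sensor time series, summarise its persistence diagrams, and feed the summaries to a classifier; the ripser/scikit-learn toolchain and the particular features below are assumptions for illustration, not necessarily the authors' setup.

```python
# Illustrative TDA pipeline: delay embedding -> persistence summaries -> classifier.
# Toolchain and feature choices are assumptions, not the authors' exact setup.
import numpy as np
from ripser import ripser                         # assumed TDA backend
from sklearn.ensemble import RandomForestClassifier

def delay_embed(x: np.ndarray, dim: int = 3, tau: int = 5) -> np.ndarray:
    """Takens-style sliding-window embedding of a 1-D signal into R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i:i + n] for i in range(0, dim * tau, tau)], axis=1)

def tda_features(x: np.ndarray) -> np.ndarray:
    """Max lifetime and total persistence of the H0 and H1 diagrams."""
    feats = []
    for dgm in ripser(delay_embed(x), maxdim=1)["dgms"]:
        finite = dgm[np.isfinite(dgm[:, 1])]
        life = finite[:, 1] - finite[:, 0]
        feats += [float(life.max()) if life.size else 0.0, float(life.sum())]
    return np.array(feats)

# Hypothetical usage: X_windows is a list of 1-D sensor windows,
# y has labels 0 = relaxed, 1 = tense.
# clf = RandomForestClassifier().fit([tda_features(x) for x in X_windows], y)
```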



Author(s):  
FATHALLAH NOUBOUD ◽  
RÉJEAN PLAMONDON

This paper presents a real-time, constraint-free handprinted character recognition system based on a structural approach. After the preprocessing operation, a chain code is extracted to represent the character. The classification is based on the use of a processor dedicated to string comparison. The average computation time to recognize a character is about 0.07 seconds. During the learning step, the user can define any set of characters or symbols to be recognized by the system; thus there are no constraints on the handprinting. The experimental tests show a high degree of accuracy (96%) for writer-dependent applications. Comparisons with other systems and methods are discussed. We also present a comparison between the processor used in this system and the Wagner and Fischer algorithm. Finally, we describe some applications of the system.
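The string comparison referred to above corresponds, in software, to the classic Wagner-Fischer edit distance between chain-code strings, sketched here with a simple nearest-template classification; the template dictionary is hypothetical.

```python
# Wagner-Fischer edit distance over Freeman chain codes, as a software stand-in
# for the dedicated string-comparison processor. Templates are hypothetical.
def edit_distance(a: str, b: str) -> int:
    """Dynamic programme over two chain-code strings (rolling rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def classify(chain_code: str, templates: dict) -> str:
    """templates: {label: reference chain code}; the nearest template wins."""
    return min(templates, key=lambda lbl: edit_distance(chain_code, templates[lbl]))
```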



2007 ◽  
Vol 46 (03) ◽  
pp. 324-331 ◽  
Author(s):  
P. Jäger ◽  
S. Vogel ◽  
A. Knepper ◽  
T. Kraus ◽  
T. Aach ◽  
...  

Summary Objectives: Pleural thickenings, as a biomarker of exposure to asbestos, may evolve into malignant pleural mesothelioma. In its early stage, pleurectomy with perioperative treatment can reduce morbidity and mortality. The diagnosis is based on a visual investigation of CT images, which is a time-consuming and subjective procedure. Our aim is to develop an automatic image processing approach to detect and quantitatively assess pleural thickenings. Methods: We first segment the lung areas and identify the pleural contours. A convexity model is then used together with a Hounsfield unit threshold to detect pleural thickenings. The assessment of the detected pleural thickenings is based on a spline-based model of the healthy pleura. Results: Tests were carried out on 14 data sets from three patients. In all cases, pleural contours were reliably identified and pleural thickenings detected. PC-based computation times were 85 min for a data set of 716 slices, 35 min for 401 slices, and 4 min for 75 slices, resulting in an average computation time of about 5.2 s per slice. Visualizations of pleurae and detected thickenings were provided. Conclusion: Results obtained so far indicate that our approach is able to assist physicians in the tedious task of finding and quantifying pleural thickenings in CT data. In the next step, our system will undergo an evaluation in a clinical test setting using routine CT data to quantify its performance.
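A much-simplified sketch of the detection idea, thresholding in Hounsfield units and measuring how far pleural-contour points deviate from the contour's convex hull, is given below; the HU range and the deviation measure are assumptions, not the authors' convexity or spline models.

```python
# Simplified sketch: Hounsfield-unit thresholding plus a crude convexity
# deviation along the pleural contour. HU range and measure are assumed.
import numpy as np
from scipy.spatial import ConvexHull

def soft_tissue_mask(slice_hu: np.ndarray, lo: float = -50.0, hi: float = 150.0):
    """Binary mask of voxels whose HU value falls in an assumed soft-tissue range."""
    return (slice_hu >= lo) & (slice_hu <= hi)

def convexity_deviation(contour_xy: np.ndarray) -> np.ndarray:
    """Approximate distance of each contour point to the facet lines of the
    contour's convex hull; larger values suggest candidate thickenings."""
    hull = ConvexHull(contour_xy)
    # hull.equations rows are [a, b, c] with a*x + b*y + c <= 0 inside the hull.
    signed = contour_xy @ hull.equations[:, :2].T + hull.equations[:, 2]
    return np.abs(signed).min(axis=1)
```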



2010 ◽  
Vol 3 (6) ◽  
pp. 1555-1568 ◽  
Author(s):  
B. Mijling ◽  
O. N. E. Tuinder ◽  
R. F. van Oss ◽  
R. J. van der A

Abstract. The Ozone Profile Algorithm (OPERA), developed at KNMI, retrieves the vertical ozone distribution from nadir spectral satellite measurements of backscattered sunlight in the ultraviolet and visible wavelength range. To produce consistent global datasets, the algorithm needs to have good global performance, while a short computation time facilitates the use of the algorithm in near-real-time applications. To test the global performance of the algorithm, we look at the convergence behaviour as a diagnostic tool for the ozone profile retrievals from the GOME instrument (on board ERS-2) for February and October 1998. In this way, we uncover different classes of retrieval problems, related to the South Atlantic Anomaly, low cloud fractions over deserts, desert dust outflow over the ocean, and the intertropical convergence zone. The influence of the first guess and of the external input data, including the ozone cross-sections and the ozone climatologies, on the retrieval performance is also investigated. By using a priori ozone profiles which are selected on the expected total ozone column, retrieval problems due to anomalous ozone distributions (such as in the ozone hole) can be avoided. By applying the algorithm adaptations, the convergence statistics improve considerably, not only increasing the number of successful retrievals, but also reducing the average computation time due to fewer iteration steps per retrieval. For February 1998, non-convergence was brought down from 10.7% to 2.1%, while the mean number of iteration steps (which dominates the computational time) dropped 26%, from 5.11 to 3.79.
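The a priori selection step mentioned above (choosing the climatological profile by expected total ozone column) can be illustrated with a toy lookup; the climatology structure shown is an assumption for illustration only.

```python
# Toy lookup for a priori selection: pick the climatological profile whose
# total ozone column is closest to the expected column. Structure is assumed.
import numpy as np

def select_a_priori(expected_column_du: float, climatology: dict) -> np.ndarray:
    """climatology: {total column in Dobson units: profile of layer partial columns}."""
    best = min(climatology, key=lambda col: abs(col - expected_column_du))
    return climatology[best]
```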


