Development of Magnetic-Based Navigation by Constructing Maps Using Machine Learning for Autonomous Mobile Robots in Real Environments

Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 3972
Author(s):  
Takumi Takebayashi ◽  
Renato Miyagusuku ◽  
Koichi Ozaki

Localization is fundamental to enable the use of autonomous mobile robots. In this work, we use magnetic-based localization. As Earth’s geomagnetic field is stable in time and is not affected by nonmagnetic materials, such as a large number of people in the robot’s surroundings, magnetic-based localization is ideal for service robotics in supermarkets, hotels, etc. A common approach to magnetic-based localization is to first create a magnetic map of the environment where the robot will be deployed, using magnetic samples acquired a priori. To generate this map, the collected data is interpolated by training a Gaussian process regression model. Gaussian processes are nonparametric, data-driven models, where the most important design choice is the selection of an adequate kernel function. These models are flexible and generate mean predictions as well as the confidence of those predictions, making them ideal for use in probabilistic approaches. However, their computational and memory cost scales poorly when large datasets are used for training, making their use in large-scale environments challenging. The purpose of this study is to: (i) enable magnetic-based localization in large-scale environments by using a sparse representation of Gaussian processes, (ii) test the effect of several kernel functions on robot localization, and (iii) evaluate the accuracy of the approach experimentally in different large-scale environments.
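As a minimal sketch of the map-building step, the following shows exact Gaussian process regression with a squared-exponential (RBF) kernel on hypothetical 1-D magnetic magnitude samples. The data, length scale, and noise level are illustrative, not the authors' values, and the paper's sparse approximation is not reproduced here.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, sigma_f=1.0):
    """Squared-exponential kernel; the paper compares several kernel choices."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return sigma_f ** 2 * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-4):
    """Exact GP regression: predictive mean and standard deviation."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)              # K^-1 y
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Hypothetical magnetic-field magnitudes (microtesla) sampled along a 1-D path
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([47.1, 46.8, 48.0, 47.5, 46.9])
mean, std = gp_predict(x, y, np.array([1.0, 1.5]))   # query at a sample and between samples
```

The predictive standard deviation is what makes GP maps attractive for probabilistic localization: it tells the filter how much to trust the map at each location. The O(n³) linear solve is also visible here, which is precisely what motivates the sparse representation for large-scale environments.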

2021 ◽  
Vol 8 ◽  
Author(s):  
Radu Mariescu-Istodor ◽  
Pasi Fränti

The scalability of traveling salesperson problem (TSP) algorithms for handling large-scale problem instances has long been an open problem. We arranged a so-called Santa Claus challenge and invited people to submit algorithms to solve a TSP instance of more than 1 M nodes given only 1 h of computing time. In this article, we analyze the results and show which design choices are decisive in providing the best solution to the problem under the given constraints. There were three valid submissions, all based on local search, including k-opt up to k = 5. The most important design choice turned out to be the localization of the operator using a neighborhood graph. The divide-and-merge strategy suffers a 2% loss of quality. However, via parallelization, the result can be obtained in less than 2 min, which can make a key difference in real-life applications.
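All submissions relied on local search; as an illustrative sketch (not any contestant's code), here is 2-opt, the simplest member of the k-opt family, in plain Python and without the neighborhood-graph localization that the analysis found decisive:

```python
import math

def tour_length(pts, tour):
    """Total length of a closed tour over 2-D points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(pts, tour):
    """First-improvement 2-opt: reverse a segment whenever it shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                if tour_length(pts, cand) < tour_length(pts, tour) - 1e-12:
                    tour, improved = cand, True
    return tour

pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
best = two_opt(pts, [0, 1, 2, 3])   # starts from a self-crossing tour
```

Without restricting candidate moves to near neighbors, each pass costs O(n²) length evaluations, which is hopeless at 1 M nodes within an hour; localizing the operator via a neighborhood graph is what makes such moves affordable at scale.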


2014 ◽  
Vol 658 ◽  
pp. 587-592
Author(s):  
Ionel Conduraru ◽  
Ioan Doroftei ◽  
Dorin Luca ◽  
Alina Conduraru Slatineanu

Mobile robots have large-scale use in industry, military operations, exploration and other applications where human intervention is risky. When a mobile robot has to move in small, narrow spaces and avoid obstacles, mobility is one of its main issues. An omni-directional drive mechanism is very attractive because it guarantees very good mobility in such cases. Also, accurate estimation of position is a key component for the successful operation of most autonomous mobile robots. In this work, some odometry aspects of an omni-directional robot are presented and a simple odometry solution is proposed.
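A minimal sketch of the dead-reckoning step for an omni-directional platform, assuming body-frame velocities (vx, vy, omega) have already been recovered from the wheel encoders; the kinematics of the specific drive mechanism proposed in the paper are not reproduced here:

```python
import math

def integrate_pose(pose, vx, vy, omega, dt):
    """Euler-integrate an omni-directional robot's pose (x, y, heading).

    vx and vy are body-frame velocities; an omni-directional drive can
    command both, unlike a differential drive, which has vy = 0.
    """
    x, y, th = pose
    x += (vx * math.cos(th) - vy * math.sin(th)) * dt
    y += (vx * math.sin(th) + vy * math.cos(th)) * dt
    th += omega * dt
    return (x, y, th)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                      # strafe sideways for 1 s at 0.5 m/s
    pose = integrate_pose(pose, 0.0, 0.5, 0.0, 0.01)
```

The sideways motion in the example is exactly the capability a differential drive lacks; for an omni platform, odometry must therefore track both translational components as well as the heading.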


2013 ◽  
Vol 837 ◽  
pp. 561-566 ◽  
Author(s):  
Ionel Conduraru ◽  
Ioan Doroftei ◽  
Alina Conduraru (Slătineanu)

In recent years, increasing emphasis has been placed on autonomous mobile robots, and research in the field is constantly growing. Mobile robots have large-scale use in industry, military operations, exploration and other applications where human intervention is risky. Accurate estimation of position is a key component for the successful operation of most autonomous mobile robots. The localization of an autonomous robot system refers mainly to the precise determination of the coordinates where the system is present at a certain moment in time. In many applications, the orientation and an initial estimate of the robot's position are known, being supplied directly or indirectly by the user or the supervisor. During the execution of its tasks, the robot must update this estimate using measurements from its sensors. This is known as local localization. Using only sensors that measure relative movements, the error in the pose estimate increases over time as errors accumulate. Localization is thus a fundamental operation for navigating mobile robots.
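The error accumulation described above can be illustrated with a toy Monte-Carlo simulation (all values are illustrative): if each of n relative-motion updates adds independent zero-mean noise of standard deviation s, the position uncertainty grows roughly as s·√n, so it never stops growing without an absolute position fix.

```python
import random
import statistics

def dead_reckoning_spread(n_steps, step_noise, trials=2000, seed=0):
    """Std. dev. of accumulated 1-D position error after n_steps odometry updates."""
    rng = random.Random(seed)
    finals = []
    for _ in range(trials):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, step_noise)   # per-step relative-measurement error
        finals.append(x)
    return statistics.pstdev(finals)

short = dead_reckoning_spread(100, 0.01)    # expected ~0.01 * sqrt(100) = 0.1
long = dead_reckoning_spread(400, 0.01)     # expected ~0.01 * sqrt(400) = 0.2
```

Quadrupling the number of steps only doubles the spread, but the growth is unbounded, which is why local localization must eventually be corrected by absolute measurements.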


1996 ◽  
Vol 76 (06) ◽  
pp. 0939-0943 ◽  
Author(s):  
B Boneu ◽  
G Destelle

Summary
The anti-aggregating activity of five rising doses of clopidogrel has been compared to that of ticlopidine in atherosclerotic patients. The aim of this study was to determine the dose of clopidogrel which should be tested in a large-scale clinical trial of secondary prevention of ischemic events in patients suffering from vascular manifestations of atherosclerosis [the CAPRIE (Clopidogrel vs Aspirin in Patients at Risk of Ischemic Events) trial]. A multicenter study involving 9 haematological laboratories and 29 clinical centers was set up. One hundred and fifty ambulatory patients were randomized into one of the seven following groups: clopidogrel at doses of 10, 25, 50, 75 or 100 mg OD, ticlopidine 250 mg BID, or placebo. ADP- and collagen-induced platelet aggregation tests were performed before starting treatment and after 7 and 28 days. Bleeding time was measured on days 0 and 28. Patients were seen on days 0, 7 and 28 to check the clinical and biological tolerability of the treatment. Clopidogrel exerted a dose-related inhibition of ADP-induced platelet aggregation and a dose-related prolongation of bleeding time. In the presence of ADP (5 µM), this inhibition ranged between 29% and 44% in comparison to pretreatment values. Bleeding times were prolonged by 1.5 to 1.7 times. These effects were not significantly different from those produced by ticlopidine. Clinical tolerability was good or fair in 97.5% of the patients. No haematological adverse events were recorded. These results allowed the selection of 75 mg once a day to evaluate and compare the antithrombotic activity of clopidogrel to that of aspirin in the CAPRIE trial.


2020 ◽  
pp. 9-13
Author(s):  
A. V. Lapko ◽  
V. A. Lapko

An original technique is justified for the fast selection of kernel-function bandwidths in a nonparametric estimate of a multidimensional probability density of the Rosenblatt–Parzen type. The proposed method makes it possible to significantly increase the computational efficiency of the optimization procedure for kernel probability density estimates under conditions of large-volume statistical data, in comparison with traditional approaches. The basis of the proposed approach is the analysis of the formula for the optimal bandwidths of a multidimensional kernel probability density estimate. Dependencies are found between a nonlinear functional of the probability density and its derivatives, up to the second order inclusive, and the antikurtosis coefficients of the random variables. The bandwidth for each random variable is represented as the product of an undefined parameter and the variable's mean square deviation. The influence of the error in restoring the established functional dependencies on the approximation properties of the kernel probability density estimate is determined. The obtained results are implemented as a method for the synthesis and analysis of fast bandwidth selection for the kernel estimate of the two-dimensional probability density of independent random variables. This method uses data on the quantitative characteristics of a family of lognormal distribution laws.
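The idea of expressing each bandwidth as an undefined multiplier times the variable's standard deviation can be sketched with a classic plug-in rule (a Scott-style rule, shown here as a generic stand-in, not the authors' fast selection method):

```python
import math
import random
import statistics

def plug_in_bandwidth(sample, d=1):
    """h = n^(-1/(d+4)) * sigma: bandwidth scaled by the sample's std. deviation."""
    return statistics.pstdev(sample) * len(sample) ** (-1.0 / (d + 4))

def kde(sample, x, h):
    """Rosenblatt-Parzen estimate with a Gaussian kernel, evaluated at x."""
    n = len(sample)
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in sample) \
        / (n * h * math.sqrt(2.0 * math.pi))

rng = random.Random(1)
sample = [rng.gauss(0.0, 1.0) for _ in range(2000)]
h = plug_in_bandwidth(sample)
```

Because the bandwidth is a closed-form function of the sample size and standard deviation, it costs O(n) to compute, whereas cross-validation-style bandwidth optimization requires many full passes over the data; this is the kind of speedup the fast-selection approach targets.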


Author(s):  
Maria A. Milkova

Nowadays, information accumulates so rapidly that the concept of the usual iterative search requires revision. In a world oversaturated with information, comprehensively covering and analyzing a problem under study places high demands on search methods. An innovative approach to search should flexibly take into account the large amount of already accumulated knowledge as well as a priori requirements for the results. The results, in turn, should immediately provide a roadmap of the direction being studied, with the possibility of drilling down into as much detail as needed. An approach based on topic modeling, so-called topic search, satisfies these requirements and thereby streamlines the nature of working with information, increases the efficiency of knowledge production, and helps avoid cognitive biases in the perception of information, which is important at both the micro and macro levels. To demonstrate an application of topic search, the article considers the task of analyzing an import substitution program based on patent data. The program includes plans for 22 industries and contains more than 1,500 products and technologies proposed for import substitution. Patent search based on topic modeling makes it possible to search directly by blocks of a priori information, namely the terms of the industrial plans for import substitution, and to obtain as output a selection of relevant documents for each industry. This approach not only provides a comprehensive picture of the effectiveness of the program as a whole, but also yields more detailed information about which groups of products and technologies have been patented.
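As a highly simplified stand-in for the topic-based patent search described above (the actual work uses topic modeling, which is not shown here), the following scores documents against a priori blocks of terms, such as the product lists in each industry plan; all block names and terms are illustrative:

```python
def block_relevance(doc_tokens, term_blocks):
    """Fraction of each a priori term block that appears in a document."""
    tokens = set(doc_tokens)
    return {name: len(tokens & set(terms)) / len(terms)
            for name, terms in term_blocks.items()}

# Illustrative query blocks, loosely mirroring industry-plan term lists
blocks = {
    "machine_tools": ["lathe", "spindle", "milling"],
    "pharma": ["antibiotic", "synthesis", "compound"],
}
doc = "a vertical milling machine with an improved spindle drive".split()
scores = block_relevance(doc, blocks)
```

Querying by whole blocks rather than single keywords is what lets each industry plan retrieve its own ranked selection of patents in one pass; topic modeling generalizes this by matching on latent topics instead of exact term overlap.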


Author(s):  
Laure Fournier ◽  
Lena Costaridou ◽  
Luc Bidaut ◽  
Nicolas Michoux ◽  
Frederic E. Lecouvet ◽  
...  

Abstract Existing quantitative imaging biomarkers (QIBs) are associated with known biological tissue characteristics and follow a well-understood path of technical, biological and clinical validation before incorporation into clinical trials. In radiomics, novel data-driven processes extract numerous visually imperceptible statistical features from the imaging data with no a priori assumptions on their correlation with biological processes. The selection of relevant features (radiomic signature) and incorporation into clinical trials therefore requires additional considerations to ensure meaningful imaging endpoints. Also, the number of radiomic features tested means that power calculations would result in sample sizes impossible to achieve within clinical trials. This article examines how the process of standardising and validating data-driven imaging biomarkers differs from those based on biological associations. Radiomic signatures are best developed initially on datasets that represent diversity of acquisition protocols as well as diversity of disease and of normal findings, rather than within clinical trials with standardised and optimised protocols as this would risk the selection of radiomic features being linked to the imaging process rather than the pathology. Normalisation through discretisation and feature harmonisation are essential pre-processing steps. Biological correlation may be performed after the technical and clinical validity of a radiomic signature is established, but is not mandatory. Feature selection may be part of discovery within a radiomics-specific trial or represent exploratory endpoints within an established trial; a previously validated radiomic signature may even be used as a primary/secondary endpoint, particularly if associations are demonstrated with specific biological processes and pathways being targeted within clinical trials. 
Key Points
• Data-driven processes like radiomics risk false discoveries due to the high dimensionality of the dataset compared to the sample size, making adequate diversity of the data, cross-validation and external validation essential to mitigate the risks of spurious associations and overfitting.
• Use of radiomic signatures within clinical trials requires multistep standardisation of image acquisition, image analysis and data mining processes.
• Biological correlation may be established after clinical validation but is not mandatory.
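As a sketch of the discretisation pre-processing step mentioned above, fixed-bin-width discretisation maps continuous intensities to bin indices so that feature values remain comparable across images; the bin width here is illustrative, not a recommended value:

```python
def discretise(intensities, bin_width):
    """Fixed-bin-width discretisation: bin index 0 starts at the minimum intensity.

    Unlike a fixed bin *count*, a fixed bin *width* keeps the same intensity
    resolution across images, which aids harmonisation of radiomic features.
    """
    lo = min(intensities)
    return [int((v - lo) // bin_width) for v in intensities]

bins = discretise([100.0, 112.5, 149.9, 150.0], 25.0)
```

Texture features computed on such bin indices depend on the chosen bin width, which is why the discretisation scheme must be fixed and reported as part of a radiomic signature's standardisation.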

