Data acquisition process
Recently Published Documents

Total documents: 71 (five years: 26)
H-index: 6 (five years: 2)

Author(s):  
P. Pushpalatha

Abstract: Optical coherence tomography angiography (OCTA) is an imaging modality that can be applied in ophthalmology to provide detailed visualization of the perfusion of vascular networks in the eye. Compared to the previous state-of-the-art dye-based imaging, such as fluorescein angiography, OCTA is non-invasive, time efficient, and allows the retinal vasculature to be examined in 3D. These advantages of the technique, combined with the good usability of commercial devices, led to a quick adoption of the new modality in the clinical routine. However, the interpretation of OCTA data is not without problems: commonly observed image artifacts and the quite involved algorithmic details of OCTA signal construction can make the clinical assessment of OCTA exams challenging. In this paper we describe the technical background of OCTA and discuss the data acquisition process, common image visualization techniques, and 3D-to-2D projection using high-pass filtering, the ReLU function, and a convolutional neural network (CNN) for more accurate segmentation results.
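The 3D-to-2D projection step mentioned above can be illustrated with a minimal sketch: high-pass filter the volume along the depth axis, clip negative responses (ReLU), and take a maximum-intensity projection. This is a generic illustration on synthetic data under assumed parameters, not the authors' pipeline; the `project_octa_volume` helper and the filter settings are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def project_octa_volume(volume, sigma=4.0):
    """Collapse a 3D OCTA volume (depth, height, width) into a 2D en-face image."""
    # High-pass filter along depth: subtract a Gaussian-smoothed (low-pass) copy.
    low_pass = gaussian_filter1d(volume, sigma=sigma, axis=0)
    high_pass = volume - low_pass
    # ReLU: keep only positive (flow-related) responses.
    high_pass = np.maximum(high_pass, 0.0)
    # 3D -> 2D: maximum-intensity projection along the depth axis.
    return high_pass.max(axis=0)

# Synthetic volume: 64 depth samples over a 128 x 128 en-face grid.
volume = np.random.rand(64, 128, 128).astype(np.float32)
enface = project_octa_volume(volume)
print(enface.shape)  # (128, 128)
```

The resulting en-face image is the kind of 2D input that a CNN could then segment.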


Vehicles ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 721-735
Author(s):  
Mohammed Alharbi ◽  
Hassan A. Karimi

Sensor uncertainty significantly affects the performance of autonomous vehicles (AVs). Sensor uncertainty is predominantly linked to sensor specifications, and because sensor behaviors change dynamically, the machine learning approach is not suitable for learning them. This paper presents a novel learning approach for predicting sensor performance in challenging environments. The design of our approach incorporates both epistemic uncertainties, which are related to the lack of knowledge, and aleatoric uncertainties, which are related to the stochastic nature of the data acquisition process. The proposed approach combines a state-based model with a predictive model, where the former estimates the uncertainty in the current environment and the latter finds the correlations between the source of the uncertainty and its environmental characteristics. The proposed approach has been evaluated on real data to predict the uncertainties associated with global navigation satellite systems (GNSSs), showing that our approach can predict sensor uncertainty with high confidence.
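As a rough illustration of the two kinds of uncertainty the abstract distinguishes, the sketch below predicts a GNSS error estimate together with an aleatoric term (a learned variance head) and an epistemic term (the spread of Monte Carlo dropout samples). The network, its input features, and all parameter choices are hypothetical and do not reproduce the authors' state-based or predictive models.

```python
import torch
import torch.nn as nn

class GnssUncertaintyNet(nn.Module):
    """Toy predictor mapping environment features (e.g., satellite count,
    multipath proxy) to an error estimate plus a log-variance."""

    def __init__(self, n_features=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Dropout(p=0.2),    # dropout kept active for MC sampling
            nn.Linear(32, 2),     # outputs: mean and log-variance
        )

    def forward(self, x):
        mean, log_var = self.net(x).chunk(2, dim=-1)
        return mean, log_var

def predict_with_uncertainty(model, x, n_samples=50):
    """Aleatoric term from the predicted variance, epistemic term from the
    spread of Monte Carlo dropout samples."""
    model.train()  # keep dropout stochastic at inference time
    means, variances = [], []
    with torch.no_grad():
        for _ in range(n_samples):
            mean, log_var = model(x)
            means.append(mean)
            variances.append(log_var.exp())
    means = torch.stack(means)
    aleatoric = torch.stack(variances).mean(dim=0)
    epistemic = means.var(dim=0)
    return means.mean(dim=0), aleatoric, epistemic

features = torch.randn(8, 4)  # hypothetical environment features
model = GnssUncertaintyNet()
pred, aleatoric, epistemic = predict_with_uncertainty(model, features)
print(pred.shape, aleatoric.shape, epistemic.shape)
```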


CONVERTER ◽  
2021 ◽  
pp. 606-612
Author(s):  
Dong Jin

Collecting data from the Internet is the key to solving the problem of data sources. This paper studies and develops an image information acquisition system based on Python web crawler technology, which realizes the automatic collection of subject data. Using the urllib, BeautifulSoup, and threading libraries, we design and develop a system framework comprising data crawling, exception handling, robots protocol management, and multithreading management modules. Through the application of specific cases, this paper introduces the data acquisition process. Experimental data show that, compared with traditional manual data acquisition, the proposed method greatly improves work efficiency.
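A minimal sketch of the module layout described above (data crawling, exception handling, robots protocol management, and multithreading management) might look as follows. The target URLs are placeholders and the overall structure is an assumption, not the paper's actual system.

```python
import threading
import urllib.request
import urllib.robotparser
from bs4 import BeautifulSoup

ROBOTS = urllib.robotparser.RobotFileParser()
ROBOTS.set_url("https://example.com/robots.txt")   # hypothetical target site
ROBOTS.read()

def crawl_images(page_url, results, lock):
    """Fetch one page, parse it, and record the image URLs it references."""
    if not ROBOTS.can_fetch("*", page_url):         # robots protocol management
        return
    try:
        with urllib.request.urlopen(page_url, timeout=10) as resp:
            soup = BeautifulSoup(resp.read(), "html.parser")
    except Exception as exc:                        # basic exception handling
        print(f"skip {page_url}: {exc}")
        return
    urls = [img.get("src") for img in soup.find_all("img") if img.get("src")]
    with lock:                                      # protect shared state
        results.extend(urls)

pages = ["https://example.com/gallery?page=%d" % i for i in range(1, 4)]
results, lock = [], threading.Lock()
threads = [threading.Thread(target=crawl_images, args=(p, results, lock)) for p in pages]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"collected {len(results)} image URLs")
```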


2021 ◽  
Vol 11 (1) ◽  
pp. 524-539
Author(s):  
Rosa Mirelly García Jara ◽  
◽  
Katty Susana Gutiérrez Villanueva ◽  
Katherin Vanessa Rodríguez Zevallos ◽  
Frank Edmundo Escobedo Bailón

Business Intelligence (BI) is becoming increasingly relevant in companies, because decision makers rely on it to carry out their work. Based on this assertion, it is necessary to facilitate the BI process, reducing response times and increasing effectiveness and efficiency. This article shows different ways to perform Business Intelligence, from the origin, i.e. the extraction of data, to the last link of the process, decision making. To this end, new alternatives are presented which, with the necessary study, have been shown to go beyond what we now know as BI, allowing not only decisions to be made but also to be supported in an automated way, letting the data be processed almost autonomously and producing more realistic reports based on data from various sources. The objective of the study is to analyze new trends in the development of processes related to business intelligence, for which a meticulous bibliographic review has been carried out by consulting scientific articles, books, and scientific conferences. First, the terms are described and the information collected through the research is presented, covering the various innovative trends for the deployment of Business Intelligence and showing new definitions, architectures, and trends that are currently being pursued. Finally, proposals for SOA architectures, open data acquisition, process automation, and data warehouse reengineering would allow business intelligence to be optimized through these alternatives.


2021 ◽  
Vol 7 ◽  
pp. e349
Author(s):  
Alex Noel Joseph Raj ◽  
Haipeng Zhu ◽  
Asiya Khan ◽  
Zhemin Zhuang ◽  
Zengbiao Yang ◽  
...  

Currently, the new coronavirus disease (COVID-19) is one of the biggest health crises threatening the world. Automatic detection from computed tomography (CT) scans is a classic method to detect lung infection, but it faces problems such as high variations in intensity, indistinct edges near the infected lung regions, and noise due to the data acquisition process. Therefore, this article proposes a new COVID-19 pulmonary infection segmentation depth network, referred to as the Attention Gate-Dense Network-Improved Dilation Convolution-UNET (ADID-UNET). The dense network replaces the convolution and maximum pooling functions to enhance feature propagation and solve the gradient disappearance problem. An improved dilation convolution is used to increase the receptive field of the encoder output and thereby obtain more edge features from the small infected regions. The integration of an attention gate into the model suppresses the background and improves prediction accuracy. The experimental results show that the ADID-UNET model can accurately segment COVID-19 lung infected areas, with performance measures greater than 80% for metrics such as Accuracy, Specificity, and Dice Coefficient (DC). Further, when compared to other state-of-the-art architectures, the proposed model shows excellent segmentation performance, with a high DC of 0.8031 and an F1 score of 0.82.
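The Dice Coefficient (DC) used above to report segmentation quality is the overlap measure DC = 2|P ∩ G| / (|P| + |G|) between a predicted mask P and a ground-truth mask G. A minimal sketch on hypothetical binary masks:

```python
import numpy as np

def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    """Dice Coefficient: DC = 2 * |pred ∩ true| / (|pred| + |true|)."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

# Hypothetical 256 x 256 lung-infection masks (prediction vs. ground truth).
rng = np.random.default_rng(0)
true_mask = rng.random((256, 256)) > 0.7
pred_mask = np.logical_and(true_mask, rng.random((256, 256)) > 0.1)
print(f"DC = {dice_coefficient(pred_mask, true_mask):.4f}")
```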


2021 ◽  
Vol 11 (1) ◽  
pp. 294-304
Author(s):  
Siraj Munir ◽  
Syed Imran Jami ◽  
Shaukat Wasi

Abstract: In this work we propose a model for Citizen Profiling. It uses veillance (surveillance and sousveillance) for data acquisition. A Temporal Knowledge Graph is used to represent the Citizen Profile, through which semantic queries can be answered. Previous work has largely lacked a representation of the Citizen Profile and has relied on surveillance alone for data acquisition. Our contribution is towards enriching the data acquisition process by adding a sousveillance mechanism and facilitating semantic queries through the representation of Citizen Profiles as Temporal Knowledge Graphs. Our proposed solution is storage efficient, as we store only data logs for Citizen Profiling instead of storing images, audio, and video for profiling purposes. The proposed system can be extended to Smart City, Smart Traffic Management, Workplace profiling, etc. An agent-based mechanism can be used for data acquisition, where each Citizen has their own agent. Another improvement would be to incorporate a decentralized database for maintaining Citizen Profiles.
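The paper does not spell out a storage schema, but the idea of answering semantic queries over a Temporal Knowledge Graph built from data logs can be sketched with time-stamped (subject, predicate, object) facts. All entity names, predicates, and timestamps below are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TemporalFact:
    """One edge of the temporal knowledge graph: (subject, predicate, object)
    plus the time interval during which the fact holds."""
    subject: str
    predicate: str
    obj: str
    start: datetime
    end: datetime

graph = [
    TemporalFact("citizen:42", "entered", "zone:market",
                 datetime(2021, 3, 1, 9, 0), datetime(2021, 3, 1, 9, 45)),
    TemporalFact("citizen:42", "reported", "event:pothole",
                 datetime(2021, 3, 1, 9, 30), datetime(2021, 3, 1, 9, 30)),
    TemporalFact("citizen:7", "entered", "zone:market",
                 datetime(2021, 3, 1, 10, 0), datetime(2021, 3, 1, 10, 20)),
]

def query(graph, subject=None, predicate=None, at=None):
    """Return facts matching the given subject/predicate that hold at time `at`."""
    return [
        f for f in graph
        if (subject is None or f.subject == subject)
        and (predicate is None or f.predicate == predicate)
        and (at is None or f.start <= at <= f.end)
    ]

# "Which citizens were in the market zone at 09:30?"
for fact in query(graph, predicate="entered", at=datetime(2021, 3, 1, 9, 30)):
    print(fact.subject, "->", fact.obj)
```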


Electronics ◽  
2020 ◽  
Vol 9 (11) ◽  
pp. 1972
Author(s):  
Dhiraj Neupane ◽  
Jongwon Seok

Underwater acoustics has been applied mostly in the field of sound navigation and ranging (SONAR) procedures for submarine communication, the examination of maritime assets and environment surveying, target and object recognition, and the measurement and study of acoustic sources in the underwater environment. With the rapid development of science and technology, sonar systems have advanced, resulting in a decrease in underwater casualties. Sonar signal processing and automatic target recognition using sonar signals or imagery is itself a challenging process. Meanwhile, highly advanced data-driven machine learning and deep learning-based methods are being implemented to extract several types of information from underwater sound data. This paper reviews recent sonar automatic target recognition, tracking, and detection works that use deep learning algorithms. A thorough study of the available works is done, and the operating procedures, results, and other necessary details regarding the data acquisition process, the datasets used, and the hyper-parameters are presented in this article. This paper will be of great assistance to upcoming scholars starting their work on sonar automatic target recognition.


2020 ◽  
Vol 4 (5) ◽  
pp. 820-828
Author(s):  
Imam Riadi ◽  
Abdul Fadlil ◽  
Muhammad Immawan Aulia

DVD-R is a type of optical disc that can store data in a single burning process. However, the multisession feature makes it possible to erase data on this read-only medium. This research implements the acquisition of data that had been deleted from a DVD-R, using the Autopsy and FTK Imager forensic tools. The National Institute of Standards and Technology (NIST) method is commonly used in digital forensics for storage media, with the stages collection, examination, analysis, and reporting. The acquisition results from Autopsy and FTK Imager show the same content as the original files before deletion, validated by matching hash values. Based on the results obtained from the analysis and reporting stages, the following can be concluded about the ten files acquired from the DVD-R using FTK Imager and Autopsy: FTK Imager detects two file systems, ISO9660 and Joliet, while Autopsy detects only one, UDF. FTK Imager successfully acquired all ten files with matching hash values, whereas Autopsy detected only seven files and did not find the three files with the extensions *.MOV, *.exe, and *.rar. In the comparative performance test, FTK Imager scored 100% because it found all deleted files, while Autopsy scored 70% because the three files with the extensions *.exe, *.rar, and *.MOV were not detected and their hash values were empty. This is because the Autopsy tool cannot detect these three file extensions.
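The hash-based validation used to confirm that acquired files match the originals can be sketched as follows; the directory names are hypothetical, and the snippet simply streams each file and compares MD5/SHA-1 digests.

```python
import hashlib
from pathlib import Path

def file_hashes(path, chunk_size=1 << 20):
    """Compute MD5 and SHA-1 of a file by streaming it in chunks."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()

def verify_acquisition(original_dir, recovered_dir):
    """Compare hashes of each original file against its recovered counterpart."""
    for original in Path(original_dir).iterdir():
        recovered = Path(recovered_dir) / original.name
        if not recovered.exists():
            print(f"MISSING   {original.name}")
            continue
        status = "MATCH" if file_hashes(original) == file_hashes(recovered) else "MISMATCH"
        print(f"{status:9} {original.name}")

# Hypothetical folders with the pre-burn originals and the recovered files.
verify_acquisition("originals", "recovered_from_dvdr")
```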


2020 ◽  
Vol 3 (S1) ◽  
Author(s):  
Michael Brand ◽  
Davood Babazadeh ◽  
Carsten Krüger ◽  
Björn Siemers ◽  
Sebastian Lehnhoff

Abstract: Modern power systems are cyber-physical systems with increasing relevance and influence of information and communication technology. This influence comprises all processes and functional and non-functional aspects such as functional correctness, safety, security, and reliability. An example of such a process is the data acquisition process. The questions addressed in this paper are, first, how one can trust process data in the data acquisition process of a highly complex cyber-physical power system and, second, how trust in process data can be integrated into a state estimation so that the estimated results reflect the trustworthiness of the input. We present the concept of an anomaly-sensitive state estimation that tackles these questions. The concept is based on a multi-faceted trust model for power system network assessment. Furthermore, we provide a proof of concept by enriching measurements in the context of the IEEE 39-bus system with reasonable trust values. The proof of concept shows the benefits but also the limitations of the approach.
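The paper's multi-faceted trust model is not reproduced here, but the basic idea of letting trust values temper a state estimation can be sketched with a linear weighted-least-squares estimator in which each measurement's weight is scaled by its trust. The matrices and values below are toy assumptions, not the IEEE 39-bus setup.

```python
import numpy as np

def trust_weighted_wls(H, z, sigma, trust):
    """Linear WLS estimate x_hat = (H^T W H)^-1 H^T W z, where each
    measurement's weight 1/sigma^2 is scaled by a trust value in (0, 1],
    so low-trust (potentially anomalous) measurements count less."""
    weights = trust / sigma**2
    W = np.diag(weights)
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

# Toy example: two states observed by four redundant measurements.
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0]])
x_true = np.array([1.0, 0.5])
z = H @ x_true + np.array([0.01, -0.02, 0.01, 0.8])  # last measurement is corrupted
sigma = np.full(4, 0.05)

x_naive = trust_weighted_wls(H, z, sigma, trust=np.ones(4))
x_trust = trust_weighted_wls(H, z, sigma, trust=np.array([1.0, 1.0, 1.0, 0.05]))
print("naive      :", x_naive)
print("trust-aware:", x_trust)
```

Down-weighting the corrupted fourth measurement pulls the estimate back toward the true state, which is the intuition behind making the estimation anomaly-sensitive.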


Author(s):  
Imadeddine Mountasser ◽  
Brahim Ouhbi ◽  
Bouchra Frikh ◽  
Ferdaous Hdioud

Nowadays, people and things are becoming permanently interconnected. This interaction has overloaded the world with an incredible digital data deluge, termed big data, generated from a wide range of data sources. Indeed, big data has entered the domain of tourism as a source of innovation that serves to better understand tourists' behavior and to enhance tourism destination management and marketing. Thus, tourism stakeholders have increasingly been leveraging tourism-related big data sources to gather abundant information concerning all axes of the tourism industry. However, big data has several aspects of complexity and brings commensurate challenges that go along with its exploitation. It has specifically changed the way data is acquired and managed, which may influence the nature and quality of the analyses conducted and the decisions made. Thus, this article investigates the big data acquisition process and thoroughly identifies its challenges and requirements. It also reviews the current state-of-the-art protocols and frameworks.

