DOCODE 3.0 (DOcument COpy DEtector): A system for plagiarism detection by applying an information fusion process from multiple documental data sources

2016 ◽  
Vol 27 ◽  
pp. 64-75 ◽  
Author(s):  
Juan D. Velásquez ◽  
Yerko Covacevich ◽  
Francisco Molina ◽  
Edison Marrese-Taylor ◽  
Cristián Rodríguez ◽  
...  
Sensors ◽  
2019 ◽  
Vol 19 (8) ◽  
pp. 1929 ◽  
Author(s):  
Farag Azzedin ◽  
Mustafa Ghaleb

The advent of the Internet of Things (IoT) is creating an ecosystem of smart applications and services enabled by a multitude of sensors. The real value of these IoT smart applications comes from analyzing the information provided by these sensors. Information fusion improves information completeness and quality and, hence, enhances estimation of the state of things. A lack of trust, and therefore the possibility of malicious activity, renders the information fusion process, and hence IoT smart applications, unreliable. Behavior-related issues associated with the data sources, such as trustworthiness, honesty, and accuracy, must be addressed before these smart applications can be fully utilized. In this article, we argue that behavior trust modeling is indispensable to the success of information fusion and, hence, to smart applications. Unfortunately, the area is still in its infancy and needs further research to enhance information fusion. The aim of this article is to raise awareness of the need for behavior trust modeling and of its effect on information fusion. Moreover, this survey describes IoT architectures for modeling trust as well as a classification of current IoT trust models. Finally, we discuss future directions towards trustworthy, reliable fusion techniques.
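The abstract's central claim, that fusion should account for the behavior trust of each source, can be illustrated with a minimal sketch. The linear trust-weighting rule, the sensor values, and the trust scores below are assumptions made for illustration only; they are not a trust model described in the survey.

```python
# Minimal sketch: trust-weighted fusion of redundant sensor readings.
# The trust scores and the weighted-average rule are illustrative assumptions,
# not a model surveyed in the article.

def fuse_readings(readings, trust_scores):
    """Fuse readings from several sensors, weighting each by its trust score."""
    if not readings or len(readings) != len(trust_scores):
        raise ValueError("readings and trust_scores must be non-empty and equal length")
    total_trust = sum(trust_scores)
    if total_trust == 0:
        raise ValueError("at least one source must have non-zero trust")
    return sum(r * t for r, t in zip(readings, trust_scores)) / total_trust

# Example: three temperature sensors, one of which has behaved suspiciously.
readings = [21.4, 21.6, 35.0]   # degrees Celsius
trust    = [0.9, 0.85, 0.1]     # behavior-based trust in [0, 1]
print(fuse_readings(readings, trust))  # dominated by the two trusted sensors
```

With the low-trust sensor effectively discounted, the fused estimate stays close to the readings of the two well-behaved sources, which is the behavior the authors argue fusion pipelines must provide.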


2019 ◽  
Vol 8 (8) ◽  
pp. 330 ◽  
Author(s):  
Robert Jeansoulin

Since the launch of Landsat-1 in 1972, the scientific domain of geo-information has been shaped incrementally through successive periods of technological evolution: in devices (satellites, UAV, IoT), in sensors (optical, radar, LiDAR), in software (GIS, WebGIS, 3D), and in communication (Big Data). Land Cover and Disaster Management remain the major application areas where these technologies are most needed. Data fusion methods and tools have been adapted progressively to new data sources, which are growing in volume, variety, and ease of access. This Special Issue gives a snapshot of the current status of that adaptation, as well as a look at the challenges that lie ahead.


2020 ◽  
Vol 63 ◽  
pp. 256-272 ◽  
Author(s):  
S. Salcedo-Sanz ◽  
P. Ghamisi ◽  
M. Piles ◽  
M. Werner ◽  
L. Cuadra ◽  
...  

2021 ◽  
Vol 20 ◽  
pp. 352-361
Author(s):  
Xiang Lin

In the big data environment, visualization techniques have been increasingly adopted to mine library and information (L&I) data, as data sources diversify and data volumes grow. However, research on information association in L&I visualization networks suffers from several defects: network layout algorithms are not optimized, and L&I information fusion and comparison across multiple disciplines are absent. To overcome these defects, this paper explores the visualization of L&I from the perspective of big data analysis and fusion. Firstly, the authors analyzed the topology of the L&I visualization network and calculated the metrics needed to construct the L&I visualization topology map. Next, the importance of the meta-paths of the L&I visualization network was calculated. Finally, a complex big data L&I visualization network was established, and the associations between information nodes were analyzed in detail. Experimental results verify the effectiveness of the proposed algorithm.
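As a rough illustration of the topology metrics such a visualization network could rely on, the sketch below computes degree and betweenness centrality on a toy keyword co-occurrence graph with networkx. The choice of metrics and the toy graph are assumptions made for illustration; they are not the paper's actual algorithm or data.

```python
# Minimal sketch of topology metrics for a bibliographic visualization network.
# The metrics (degree and betweenness centrality) and the toy keyword
# co-occurrence graph are illustrative assumptions, not the paper's method.
import networkx as nx

# Toy L&I co-occurrence network: nodes are keywords, edges mean two keywords
# appeared in the same record.
G = nx.Graph()
G.add_edges_from([
    ("big data", "visualization"),
    ("big data", "information fusion"),
    ("visualization", "network layout"),
    ("information fusion", "network layout"),
    ("visualization", "bibliometrics"),
])

degree = nx.degree_centrality(G)            # how connected each node is
betweenness = nx.betweenness_centrality(G)  # how often a node bridges others

for node in G.nodes:
    print(f"{node:20s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")
```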


Author(s):  
Luiz Alberto Pereira Afonso Ribeiro ◽  
Ana Cristina Bicharra Garcia ◽  
Paulo Sérgio Medeiros Dos Santos

The use of big data and information fusion in electronic health records (EHR) allowed the identification of adverse drug reactions (ADR) through the integration of heterogeneous sources such as clinical notes (CN), medication prescriptions, and pathological examinations. This heterogeneity of data sources entails the need to address redundancy, conflict, and uncertainty caused by the high dimensionality present in EHR. The use of multisensor information fusion (MSIF) presents an ideal scenario to deal with uncertainty, especially when adding resources of the theory of evidence, also called Dempster–Shafer Theory (DST). In that scenario there is a challenge, which is to specify the attribution of belief through the mass function, from the datasets, named basic probability assignment (BPA). The objective of the present work is to create a form of BPA generation using analysis of data regarding causal and time relationships between sources, entities and sensors, not only through correlation, but by causal inference.
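For readers unfamiliar with Dempster–Shafer theory, the sketch below shows what a basic probability assignment (mass function) looks like and how two of them combine under Dempster's rule. The frame of discernment (ADR vs. no ADR) and the example masses are invented for illustration; this is not the BPA-generation method the authors propose.

```python
# Minimal sketch of Dempster–Shafer mass functions (BPAs) and Dempster's rule
# of combination. The frame of discernment and the example masses are invented
# for illustration; the paper's contribution is how BPAs are derived from EHR
# data, which is not shown here.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts mapping frozensets to masses)."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Frame of discernment: did the drug cause the reaction (ADR) or not?
ADR, NO_ADR = frozenset({"adr"}), frozenset({"no_adr"})
BOTH = ADR | NO_ADR                       # ignorance: mass on the whole frame

m_notes = {ADR: 0.6, BOTH: 0.4}                 # evidence from clinical notes
m_labs  = {ADR: 0.3, NO_ADR: 0.2, BOTH: 0.5}    # evidence from lab exams

print(combine(m_notes, m_labs))
```

Running the sketch shows mass initially assigned to the whole frame (ignorance) being redistributed once the two evidence sources are combined; the abstract's concern is deriving such masses from EHR data through causal and temporal analysis rather than fixing them by hand.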

