time stamp
Recently Published Documents

TOTAL DOCUMENTS: 230 (FIVE YEARS: 50)
H-INDEX: 16 (FIVE YEARS: 3)
2021 ◽  
Author(s):  
S S Rajasekar ◽  
C. Palanisamy ◽  
K. Saranya

Abstract Location-based Service Selection (LSS) in WSNs has been well studied. Towards effective LSS, this article presents an efficient Mobility Aware Displacement Approximation (MADA-LSS) approach. The model monitors the mobility of the mobile device and predicts its possible locations at different future time stamps. For each predicted time stamp, the list of service locations at the possible positions is identified. From the possible locations and the service set, the method discovers a set of routes to reach each service point. For each route, it estimates the Data Arrival Rate (DAR), Data Claim Rate (DCR), and Data Rate Support Measure (DRSM) for both the access point and the identified route. The method then ranks the service points by their DRSM values and estimates a Trusted Handover Measure (THM) for each route based on the IoT (Internet of Things) devices along it. By considering both THM and DRSM values, an optimal service point and route are selected for data transmission. Privacy preservation is performed with the same displacement approximation scheme, which selects an optimal encryption method based on the mobility parameters and on the time complexity and security of the different encryption schemes. The proposed method improves the performance of LSS and secure routing.
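As a purely illustrative sketch of the final selection step, the Python snippet below ranks candidate routes by DRSM and then combines DRSM and THM to pick one route; the data fields and the product-based scoring rule are assumptions, since the abstract does not give the underlying formulas.

```python
# Hypothetical sketch of the selection step described above; field names and the
# combined DRSM*THM scoring rule are assumptions, not the paper's formulas.
from dataclasses import dataclass

@dataclass
class Route:
    service_point: str
    drsm: float  # Data Rate Support Measure of the route
    thm: float   # Trusted Handover Measure of the route

def rank_by_drsm(routes):
    # Rank service points/routes by their DRSM value (highest first).
    return sorted(routes, key=lambda r: r.drsm, reverse=True)

def select_route(routes):
    # Combine THM and DRSM to pick the optimal service point and route.
    return max(routes, key=lambda r: r.drsm * r.thm)

candidates = [Route("SP-1", drsm=0.82, thm=0.70),
              Route("SP-2", drsm=0.75, thm=0.95)]
print([r.service_point for r in rank_by_drsm(candidates)])  # ['SP-1', 'SP-2']
print(select_route(candidates).service_point)               # SP-2 wins on the combined score
```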


Author(s):  
Wojciech Wasko ◽  
Dotan David Levi ◽  
Teferet Geula ◽  
Amit Mandelbaum
Keyword(s):  

2021 ◽  
Vol 11 (2) ◽  
pp. 321-328
Author(s):  
Prisca I. Okochi ◽  
Stanley A. Okolie ◽  
Juliet N. Odii

An Improved Data Leakage Detection System is designed to mitigate the leakage of crucial and sensitive data in a cloud computing environment. Leakage of data in computing systems has caused irreparable damage to institutions and organizations worldwide. This research therefore aims at detecting and preventing both intentional and unintentional data leakage, using a dynamic password or key as the data decryption security mechanism. To achieve this, the OOADM methodology was adopted. The new system was implemented using ASP.NET MVC with Microsoft SQL Server Management Studio as the backend. By incorporating an audit trail/transaction log mechanism, the system monitors activities within and outside the computing environment with a date and time stamp. Hence, the system can be applied in any environment for the prevention and detection of data leakage.
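The system itself is built on ASP.NET MVC with a SQL Server backend; the minimal Python/SQLite sketch below only illustrates the audit trail/transaction log idea, i.e. recording every sensitive action together with a date and time stamp. The table and column names are hypothetical.

```python
# Minimal sketch of an audit trail: every access to a protected document is
# appended to a transaction log with a date/time stamp. Table and column names
# are illustrative, not taken from the paper.
import sqlite3
from datetime import datetime, timezone

def log_access(db: sqlite3.Connection, user: str, document_id: str, action: str) -> None:
    db.execute(
        "INSERT INTO audit_trail (user, document_id, action, time_stamp) VALUES (?, ?, ?, ?)",
        (user, document_id, action, datetime.now(timezone.utc).isoformat()),
    )
    db.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit_trail (user TEXT, document_id TEXT, action TEXT, time_stamp TEXT)")
log_access(conn, "alice", "doc-42", "decrypt")
print(conn.execute("SELECT * FROM audit_trail").fetchall())
```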


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Furqan Aziz ◽  
Victor Roth Cardoso ◽  
Laura Bravo-Merodio ◽  
Dominic Russ ◽  
Samantha C. Pendleton ◽  
...  

Abstract Multimorbidity, frequently associated with aging, can be operationally defined as the presence of two or more chronic conditions. Predicting the likelihood of a patient with multimorbidity developing a further particular disease in the future is one of the key challenges in multimorbidity research. In this paper we use a network-based approach to analyze multimorbidity data and develop methods for predicting diseases that a patient is likely to develop. The multimorbidity data is represented using a temporal bipartite network whose nodes represent patients and diseases, and a link between these nodes indicates that the patient has been diagnosed with the disease. Disease prediction is then reduced to the problem of predicting those missing links in the network that are likely to appear in the future. We develop a novel link prediction method for static bipartite networks and validate its performance on benchmark datasets. Using a probabilistic framework, we then report on the development of a method for predicting future links in the network, where links are labelled with a time stamp. We apply the proposed method to three different multimorbidity datasets and report its performance measured by different metrics including AUC, precision, recall, and F-score.
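A toy sketch of the data representation described above: patients and diseases form the two sides of a bipartite graph, edges carry the diagnosis time stamp, and a candidate (patient, disease) link is scored with a simple shared-neighbour baseline. The scoring rule is illustrative only and is not the method proposed in the paper.

```python
# Toy bipartite patient-disease network with time-stamped edges, plus a simple
# baseline score for a candidate (patient, disease) link.
import networkx as nx

G = nx.Graph()
# (patient, disease, year of diagnosis) -- made-up example data
diagnoses = [("p1", "diabetes", 2015), ("p1", "hypertension", 2018),
             ("p2", "diabetes", 2016), ("p2", "ckd", 2019),
             ("p3", "hypertension", 2017)]
for patient, disease, year in diagnoses:
    G.add_edge(patient, disease, time=year)

def score(patient: str, disease: str) -> int:
    """Count diseases shared between `patient` and patients already diagnosed with `disease`."""
    total = 0
    for other in G.neighbors(disease):  # patients who already have the target disease
        if other == patient:
            continue
        total += len(set(G.neighbors(patient)) & set(G.neighbors(other)))
    return total

print(score("p1", "ckd"))  # p1 shares 'diabetes' with p2, who has ckd -> 1
```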


Author(s):  
Alexander Rusch ◽  
Thomas Roesgen

Event-based cameras (Lichtsteiner et al., 2008; Posch et al., 2010; Gallego et al., 2020) operate fundamentally differently from frame-based cameras: each pixel of the sensor array reacts asynchronously to relative brightness changes, creating a sequential stream of events in address-event representation (AER). Each event is defined by a microsecond-accurate time stamp, the pixel position and a binary polarity indicating a relative increase or decrease of light intensity. Thus, event-based cameras only sense changes in a scene while effectively suppressing static, redundant information. This makes the camera technology promising for flow diagnostics as well. Established approaches such as PIV or PTV generate vast amounts of data, only for a large part of the redundant information to be eliminated in post-processing. In contrast, event-based cameras effectively compress the data stream already at the source. To make full use of this potential, new data-processing algorithms are needed, since event-based cameras do not generate conventional frame-based data. This work utilizes an event-based camera to identify and track flow tracers such as helium-filled soap bubbles (HFSBs) with real-time visual feedback in measurement volumes of the order of several cubic meters.
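The sketch below illustrates the address-event representation: each event carries a microsecond time stamp, a pixel position, and a binary polarity, and events can be filtered by time window to estimate a tracer position. It is a simplified illustration, not the authors' bubble-tracking pipeline.

```python
# Simplified AER event stream: one event per row (timestamp_us, x, y, polarity).
import numpy as np

events = np.array([(1_000, 120, 64, +1),
                   (1_050, 121, 64, +1),
                   (1_900, 45, 10, -1)],
                  dtype=[("t", "u8"), ("x", "u2"), ("y", "u2"), ("p", "i1")])

def events_in_window(ev: np.ndarray, t0: int, t1: int) -> np.ndarray:
    """Return events whose time stamps fall in [t0, t1) microseconds."""
    return ev[(ev["t"] >= t0) & (ev["t"] < t1)]

window = events_in_window(events, 900, 1_500)
centroid = (window["x"].mean(), window["y"].mean())  # crude tracer position estimate
print(centroid)  # (120.5, 64.0)
```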


Author(s):  
Thivaharan. S

Modern communication devices generate huge amounts of data through the widespread use of various social media applications. Of all the generated data, more than 40% is unstructured in nature. The industry is also reluctant to retain data with the following characteristics: data with asynchronous time stamps, replicated data, data corrupted during transmission, and data that leads to misclassification. It is therefore important to drop irrelevant data and consider only the synchronous records. In this article, a sentiment extraction model is proposed that handles various social media contents. spaCy is used as the preferred implementation framework, as it offers many readily available libraries for content classification. To avoid over-fitting, activation functions such as "relu" and "sigmoid" are used. Although many such classifiers are available for content classification, this article, with an appropriate setting of the epoch count, obtains a categorical accuracy of 68%. The entire model is implemented on the TensorFlow platform.
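A minimal sketch of such a pipeline, assuming spaCy-style tokenised, integer-encoded input fed into a small TensorFlow/Keras classifier with "relu" and "sigmoid" activations; layer sizes, vocabulary size, and other settings are placeholders rather than the configuration used in the article.

```python
# Small TensorFlow/Keras text classifier with relu hidden layer and sigmoid output.
import numpy as np
import tensorflow as tf

VOCAB_SIZE, MAX_LEN = 10_000, 100  # placeholder values

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 32),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(16, activation="relu"),    # hidden layer with relu
    tf.keras.layers.Dense(1, activation="sigmoid"),  # sigmoid output for sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy batch of integer-encoded tokens (e.g. produced by spaCy preprocessing).
dummy = np.random.randint(0, VOCAB_SIZE, size=(2, MAX_LEN))
print(model(dummy).shape)  # (2, 1)
```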


Author(s):  
Sayalee Ghule

Log records generally contain data such as the user name, IP address, time stamp, access request, number of bytes transferred, result status, referring URL, and user agent. These log records are maintained by web servers, and analysing them gives a clear picture of client behaviour. The World Wide Web is a vast repository of web pages that provides users with a wealth of information; with the growth in the number and complexity of websites, the size of the web has become enormous. Web Usage Mining is a branch of web mining that applies mining techniques to web server logs in order to extract the behaviour of users. Log records contain essential information about the execution of a system, which is frequently used for debugging, operational profiling, anomaly detection, detecting security threats, and measuring performance.
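For illustration, the snippet below extracts the fields listed above (user name, IP address, time stamp, request, bytes transferred, status, referring URL, user agent) from a line in the common Apache/NCSA combined log format; the sample line is made up.

```python
# Parse one combined-log-format line into the named web-server log fields.
import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('203.0.113.7 - alice [10/Oct/2021:13:55:36 +0000] "GET /index.html HTTP/1.1" '
        '200 2326 "http://example.com/start" "Mozilla/5.0"')

match = LOG_PATTERN.match(line)
if match:
    record = match.groupdict()
    print(record["user"], record["timestamp"], record["status"])
```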

