Cross-Site Search Attacks: Unauthorized Queries over Private Data

Author(s):  
Bar Meyuhas ◽  
Nethanel Gelernter ◽  
Amir Herzberg
DOI: 10.17158/183 ◽  
2011 ◽  
Vol 17 (2) ◽  
Author(s):  
Eric John G. Emberda ◽  
Siegfried C. Capon ◽  
Johanah A. Maunda

Stealing information from a user’s computer through the Internet is a growing concern. One type of Internet attack, or cybercrime, is Cross-Site Scripting (XSS). It allows an attacker to retrieve information from an Internet user by inserting a script into a vulnerable website, where it automatically mines private data from the victim and then sends those data to another website. This study was conducted to examine the different vulnerable aspects of a website. A list of XSS-vulnerable websites was gathered, as well as a list of different XSS scripts. These websites were tested with the XSS scripts to determine the entry points through which the scripts can penetrate. A web proxy application was created that implements different mechanisms to prevent these XSS scripts from successfully mining private data. The web proxy application was able to minimize XSS attacks by comparing the scripts inside the website against a database of known XSS scripts, as sketched below. The researchers, however, recommend that the process of preventing XSS scripts be improved by adding artificially intelligent algorithms that can learn the patterns of XSS scripts and distinguish them from safe scripts.
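A minimal sketch of the signature-matching idea behind such a filtering proxy. The pattern list, helper names, and blocking behaviour are illustrative assumptions, not the paper's actual implementation:

```python
# Sketch: check a page body against a small database of known XSS patterns,
# the way a filtering proxy might before forwarding the response.
# The signatures below are illustrative examples only.
import re

XSS_SIGNATURES = [
    re.compile(r"<script[^>]*>.*?document\.cookie", re.IGNORECASE | re.DOTALL),
    re.compile(r"onerror\s*=\s*['\"]?\s*alert\s*\(", re.IGNORECASE),
    re.compile(r"javascript\s*:", re.IGNORECASE),
]

def contains_known_xss(html: str) -> bool:
    """Return True if the page body matches any stored XSS signature."""
    return any(sig.search(html) for sig in XSS_SIGNATURES)

def filter_response(html: str) -> str:
    """Block the response if it matches a known signature, else pass it through."""
    if contains_known_xss(html):
        return "<!-- response blocked: matched a known XSS signature -->"
    return html

if __name__ == "__main__":
    page = '<img src=x onerror="alert(document.cookie)">'
    print(filter_response(page))
```

Signature matching of this kind only catches scripts already present in the database, which is exactly the limitation the recommended pattern-learning extension is meant to address.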


2014 ◽  
Vol 3 (2) ◽  
pp. 13-21 ◽  
Author(s):  
Bharti Nagpal ◽  
Naresh Chauhan ◽  
Nanhay Singh

Author(s):  
Wanlu Zhang ◽  
Qigang Wang ◽  
Mei Li

Background: As artificial intelligence and big data analysis develop rapidly, data privacy, especially the privacy of patient medical data, is receiving more and more attention. Objective: To strengthen the protection of private data while supporting model training, this article introduces a multi-Blockchain-based decentralized collaborative machine learning training method for medical image analysis. In this way, researchers from different medical institutions are able to collaborate to train models without exchanging sensitive patient data. Method: A partial parameter update method is applied to prevent indirect privacy leakage during model propagation. With peer-to-peer communication in the multi-Blockchain system, a machine learning task can leverage auxiliary information from a similar task on another Blockchain. In addition, after the collaborative training process, personalized models for the different medical institutions are trained. Results: The experimental results show that our method achieves performance similar to that of the centralized training method, which collects the data sets of all participants, while preventing private data leakage. Transferring auxiliary information from a similar task on another Blockchain has also been shown to effectively accelerate model convergence and improve model accuracy, especially when data are scarce. The personalization training process further improves model performance. Conclusion: Our approach can effectively help researchers from different organizations achieve collaborative training without disclosing their private data.
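A minimal sketch of what a partial parameter update could look like, assuming the model is represented as a flat NumPy vector. The 10% share fraction, the random coordinate selection, and the function names are illustrative assumptions, not the paper's exact protocol:

```python
# Sketch: publish only a randomly chosen subset of locally trained parameters,
# so a peer never sees the full model update from any one institution.
import numpy as np

def partial_update(local_params, share_fraction=0.1, rng=None):
    """Select a random subset of parameters to publish; keep the rest private."""
    rng = rng or np.random.default_rng()
    n_shared = max(1, int(share_fraction * local_params.size))
    shared_idx = rng.choice(local_params.size, size=n_shared, replace=False)
    return shared_idx, local_params[shared_idx]

def merge_partial_update(peer_params, shared_idx, shared_values):
    """Overwrite only the published coordinates in a peer's copy of the model."""
    merged = peer_params.copy()
    merged[shared_idx] = shared_values
    return merged

if __name__ == "__main__":
    local = np.random.randn(1000)   # one institution's locally trained parameters
    peer = np.zeros(1000)           # a peer's current copy of the model
    idx, vals = partial_update(local, share_fraction=0.1)
    peer = merge_partial_update(peer, idx, vals)
    print(f"shared {idx.size} of {local.size} parameters")
```

The design intuition is that withholding most coordinates each round limits how much an observer of the propagated updates can reconstruct about the underlying training data.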


2021 ◽  
Vol 21 (2) ◽  
pp. 1-31
Author(s):  
Bjarne Pfitzner ◽  
Nico Steckhan ◽  
Bert Arnrich

Data privacy is a very important issue. Especially in fields like medicine, it is paramount to abide by the existing privacy regulations to preserve patients’ anonymity. However, data is required for research and for training machine learning models that could help reveal complex correlations or personalised treatments that might otherwise stay undiscovered. Those models generally scale with the amount of data available, but the current situation often prohibits building large databases across sites. It would therefore be beneficial to be able to combine similar or related data from different sites all over the world while still preserving data privacy. Federated learning has been proposed as a solution, because it relies on sharing machine learning models instead of the raw data itself, so private data never leaves the site or device on which it was collected. Federated learning is an emerging research area, and many domains have been identified for the application of these methods. This systematic literature review provides an extensive look at the concept of, and research into, federated learning and its applicability to confidential healthcare datasets.
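A minimal sketch of federated averaging (FedAvg), the canonical pattern behind the idea that only model parameters, never raw patient records, leave a site. The simple linear-regression update and the sample-count weighting are illustrative assumptions, not a summary of any specific system reviewed:

```python
# Sketch: each site trains on its own data; only updated weight vectors are
# shared, and a coordinator averages them, weighted by local sample counts.
import numpy as np

def local_training_step(weights, data, labels, lr=0.1):
    """One gradient step of least-squares linear regression on a site's local data."""
    preds = data @ weights
    grad = data.T @ (preds - labels) / len(labels)
    return weights - lr * grad

def federated_average(site_weights, site_sizes):
    """Combine site models, weighting each by its number of local samples."""
    total = sum(site_sizes)
    return sum(w * (n / total) for w, n in zip(site_weights, site_sizes))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = np.zeros(5)
    # Each site holds its own (data, labels); nothing below leaves the site
    # except the updated weight vector.
    sites = [(rng.normal(size=(50, 5)), rng.normal(size=50)) for _ in range(3)]
    for _ in range(10):  # communication rounds
        updates, sizes = [], []
        for data, labels in sites:
            updates.append(local_training_step(global_w.copy(), data, labels))
            sizes.append(len(labels))
        global_w = federated_average(updates, sizes)
    print(global_w)
```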

