Context and trade-offs characterize real-world threat detection systems: A review and comprehensive framework to improve research practice and resolve the translational crisis

2020 · Vol 115 · pp. 25-33
Author(s):  
Markus Fendt ◽  
Michael H. Parsons ◽  
Raimund Apfelbach ◽  
Alexandra J.R. Carthey ◽  
Chris R. Dickman ◽  
...


Author(s):
Chris Dawson ◽  
Stuart Inkpen ◽  
Chris Nolan ◽  
David Bonnell

Many different approaches have been adopted for identifying leaks in pipelines. Leak detection systems, however, generally suffer from a number of difficulties and limitations. For existing and new pipelines, these inevitably force significant trade-offs between detection accuracy, operational range, responsiveness, deployment cost, system reliability, and overall effectiveness. Existing leak detection systems frequently rely on measuring secondary effects such as temperature changes, acoustic signatures, or flow differences to infer the existence of a leak. This paper presents an alternative approach to leak detection that employs electromagnetic measurements of the material in the vicinity of the pipeline and can potentially overcome some of the difficulties encountered with existing approaches. This sensing technique measures the material near the pipeline directly, resulting in reliable detection and minimal risk of false alarms. The technology has been used successfully in other industries to make critical measurements of materials under challenging circumstances. A number of prototype sensors were constructed using this technology and tested by an independent research laboratory. The test results show that sensors based on this technique exhibit a strong capability to detect oil and to distinguish oil from water (a key challenge with in-situ sensors).


2017 · Vol 139 (11)
Author(s):  
Wei Chen ◽  
Mark Fuge

To solve a design problem, it is sometimes necessary to identify the feasible design space. For design spaces with implicit constraints, sampling methods are usually used. These methods typically bound the design space, that is, limit the range of the design variables. But bounds that are too tight will fail to cover all possible designs, while bounds that are too loose will waste sampling budget. This paper addresses the problem of efficiently discovering (possibly disconnected) feasible domains in an unbounded design space. We propose ε-margin sampling, a data-driven adaptive sampling technique that learns the domain boundary of feasible designs and also expands our knowledge of the design space as the available budget increases. This technique is data-efficient, in that it makes principled probabilistic trade-offs between refining existing domain boundaries and expanding the design space. We demonstrate that this method identifies feasible domains on standard test functions better than both random sampling and active sampling (via uncertainty sampling). However, a fundamental problem when applying adaptive sampling to real-world designs is that designs often have high dimensionality and thus require (in the worst case) exponentially more samples per dimension. We show how coupling design manifolds with ε-margin sampling allows us to actively expand high-dimensional design spaces without incurring this exponential penalty. We demonstrate this on real-world examples of glassware and bottle design, where our method discovers designs whose appearance and functionality differ from those in the initial design set.
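A minimal sketch of the idea behind boundary-focused adaptive sampling, assuming a Gaussian process classifier as the feasibility model; the feasibility test, candidate pool, and margin rule below are illustrative stand-ins, not the paper's actual ε-margin algorithm:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

def feasible(x):
    # Hypothetical implicit constraint: two disconnected feasible discs.
    return (np.linalg.norm(x - np.array([0.0, 0.0])) < 0.8
            or np.linalg.norm(x - np.array([3.0, 0.0])) < 1.0)

rng = np.random.default_rng(0)
# Small initial design set, seeded with one known feasible and one
# known infeasible point so both classes are present from the start.
X = np.vstack([rng.uniform(-1.0, 1.0, size=(20, 2)),
               [[0.0, 0.0]], [[1.0, 1.0]]])
y = np.array([feasible(x) for x in X])

for _ in range(30):                            # sampling budget
    gp = GaussianProcessClassifier(kernel=RBF(1.0)).fit(X, y)
    # Candidate pool drawn from the current bounds, slightly inflated so
    # the explored region can grow as the budget increases.
    lo, hi = X.min(axis=0) - 0.5, X.max(axis=0) + 0.5
    cand = rng.uniform(lo, hi, size=(500, 2))
    p = gp.predict_proba(cand)[:, 1]
    # Margin rule: query the candidate closest to the boundary p = 0.5.
    x_next = cand[np.argmin(np.abs(p - 0.5))]
    X = np.vstack([X, x_next])
    y = np.append(y, feasible(x_next))
```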


Author(s):  
Femi A. Aderohunmu ◽  
Giacomo Paci ◽  
Davide Brunelli ◽  
Jeremiah Deng ◽  
Luca Benini ◽  
...  

Author(s):  
Poushali Sengupta ◽  
Sudipta Paul ◽  
Subhankar Mishra

The leakage of data can have severe consequences at the personal level if the data contain sensitive information. Common prevention methods such as encryption, endpoint protection, and intrusion detection systems remain prone to leakage. Differential privacy comes to the rescue with a formal promise of protection against leakage: it applies a randomized response technique at the time of data collection, providing strong privacy with good utility. Differential privacy allows one to see the forest of the data, describing patterns across groups, without disclosing any individual trees. Its recent adoption by leading technology companies and academia encourages the authors to explore the topic in detail. This survey discusses the different aspects of differential privacy, its application to privacy protection and information leakage, a comparative discussion of current research approaches in the field, and its real-world utility and trade-offs.
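A minimal sketch of the randomized response mechanism the abstract refers to, for a single sensitive bit; the parameter choices and the survey-proportion example are illustrative, not drawn from the paper:

```python
import math
import random

def randomized_response(truth: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1).

    The ratio of report probabilities is e^eps, so the mechanism
    satisfies eps-differential privacy for the respondent.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return truth if random.random() < p_truth else not truth

def estimate_proportion(reports, epsilon):
    # Debias the noisy reports to recover the population proportion:
    # observed = p*pi + (1-p)*(1-pi)  =>  pi = (observed - (1-p)) / (2p-1).
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Example: 10,000 respondents, 30% hold the sensitive attribute, eps = 1.
truths = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(t, 1.0) for t in truths]
print(round(estimate_proportion(reports, 1.0), 3))  # close to 0.3
```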


2010 · pp. 2310-2325
Author(s):  
Adam Slagell ◽  
Kiran Lakkaraju

It is desirable for many reasons to share information, particularly computer and network logs. Researchers need such data for experiments, incident responders need it for collaborative security, and educators need it for real-world examples. However, the sensitive nature of this information often prevents its sharing. Anonymization techniques have been developed in recent years that help reduce risk and navigate the trade-offs between privacy, security, and the need to openly share information. This chapter looks at the progress made in this area of research over the past several years, identifies the major problems left to solve, and sets a roadmap for future research.
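As a hedged illustration of the kind of trade-off the chapter discusses, the sketch below anonymizes two common log fields: truncating the IP keeps subnet-level structure that researchers can still use, while a keyed hash yields consistent but unrecoverable user pseudonyms. The field formats and key handling are hypothetical, not techniques taken from the chapter:

```python
import hashlib
import hmac

SECRET = b"rotate-me-per-dataset"  # key for consistent pseudonyms

def anonymize_ip(ip: str) -> str:
    # Truncate the last octet: preserves /24 locality, hides the host.
    a, b, c, _ = ip.split(".")
    return f"{a}.{b}.{c}.0"

def pseudonymize(user: str) -> str:
    # Keyed hash: the same user always maps to the same token, but the
    # original name cannot be recovered without the secret key.
    return hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()[:12]

print(anonymize_ip("192.168.10.42"), pseudonymize("alice"))
```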


2010 · Vol 2 (2) · pp. 51-67
Author(s):  
Zafar Sultan ◽  
Paul Kwan

In this paper, a hybrid identity fusion model at the decision level is proposed for Simultaneous Threat Detection Systems. The hybrid model comprises mathematical and statistical data fusion engines: Dempster-Shafer, Extended Dempster, and Generalized Evidential Processing (GEP). Simultaneous Threat Detection Systems improve the threat detection rate by 39%. In terms of efficiency and performance, a comparison of the three inference engines showed that GEP is the best data fusion model; GEP increased the precision of threat detection from 56% to 95%. Furthermore, set cover packing was used as a middle-tier data fusion tool to discover reduced-size groups of threat data. Set cover provided a significant improvement, reducing the threat population from 2272 to 295, which helped minimize the evidential processing cost and the time needed to determine the combined probability mass of the proposed Multiple Simultaneous Threat Detection System. This technique is particularly relevant to online and Internet-dependent applications, including portals.
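For background on the first of the three engines, here is a small sketch of Dempster's rule of combination for two basic probability assignments over a tiny frame of discernment; the paper's Extended Dempster and GEP engines go beyond this rule, so this is context, not the authors' model:

```python
from itertools import product

def combine(m1, m2):
    """Combine two basic probability assignments (dicts: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to contradictory evidence
    # Dempster's normalization: rescale by the non-conflicting mass.
    return {k: v / (1 - conflict) for k, v in combined.items()}

T, N = frozenset({"threat"}), frozenset({"no_threat"})
both = T | N                       # ignorance: mass on the whole frame
m1 = {T: 0.6, both: 0.4}           # sensor 1: weak evidence of a threat
m2 = {T: 0.7, N: 0.1, both: 0.2}   # sensor 2: stronger evidence
print(combine(m1, m2))             # combined belief concentrates on "threat"
```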


2020 · Vol 10 (15) · pp. 5208
Author(s):  
Mohammed Nasser Al-Mhiqani ◽  
Rabiah Ahmad ◽  
Z. Zainal Abidin ◽  
Warusia Yassin ◽  
Aslinda Hassan ◽  
...  

Insider threat has become a widely recognized issue and one of the major challenges in cybersecurity. Such threats require dedicated detection systems, methods, and tools that can facilitate accurate and fast detection of a malicious insider. Several studies on insider threat detection and related areas have been proposed, and various studies have aimed to deepen the conceptual understanding of insider threats. However, existing work has notable limitations: a lack of real cases, biases in drawing conclusions, and the absence of a study that surveys insider threats from many different perspectives while covering their theoretical, technical, and statistical aspects. This survey presents a taxonomy of contemporary insider types, access, level, motivation, insider profiling, affected security properties, and the methods attackers use to conduct attacks, together with a review of notable recent work on insider threat detection covering the analyzed behaviors, machine-learning techniques, datasets, detection methodologies, and evaluation metrics. Several real cases of insider threats are analyzed to provide statistical information about insiders. In addition, the survey highlights the challenges faced by other researchers and provides recommendations to minimize obstacles.
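As a hedged illustration of one machine-learning technique commonly covered in such surveys, the sketch below applies unsupervised anomaly detection to per-user behavior features; the features, numbers, and model choice are hypothetical and not taken from the survey:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Rows: users; columns: e.g. [logins/day, after-hours logins, MB uploaded].
normal = rng.normal(loc=[10, 1, 50], scale=[2, 1, 10], size=(500, 3))
insider = np.array([[11, 9, 400]])  # unusual after-hours activity and volume
X = np.vstack([normal, insider])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
scores = model.decision_function(X)   # lower score = more anomalous
flagged = np.argsort(scores)[:5]      # indices of the 5 most anomalous users
print(flagged)                        # the injected insider (index 500) ranks high
```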


Author(s):  
Tatyana O. Sharpee

Sensory systems exist to provide an organism with information about the state of the environment that can be used to guide future actions and decisions. Remarkably, two conceptually simple yet general theorems from information theory can be used to evaluate the performance of any sensory system. One theorem states that there is a minimal amount of energy that an organism has to spend in order to capture a given amount of information about the environment. The second theorem states that the maximum rate at which the organism can acquire resources from the environment, relative to its competitors, is limited by the information this organism collects about the environment, also relative to its competitors. These two theorems provide a scaffold for formulating and testing general principles of sensory coding but leave unanswered many important practical questions of implementation in neural circuits. These implementation questions have guided thinking in entire subfields of sensory neuroscience, and include: What features in the sensory environment should be measured? Given that we make decisions on a variety of time scales, how should one resolve the trade-off between making simpler measurements to guide minimal decisions and building more elaborate sensory systems that must overcome multiple delays between sensation and action? Once we agree on the types of features that are important to represent, how should they be represented? How should resources be allocated between different stages of processing, and where is the impact of noise most damaging? Finally, one should consider trade-offs between implementing a fixed strategy and an adaptive scheme that readjusts resources based on current needs. Where adaptation is considered, under what conditions does it become optimal to switch strategies? Research over the past 60 years has provided answers to almost all of these questions, but primarily in early sensory systems. Joining these answers into a comprehensive framework is a challenge that will help us understand who we are and how we can make better use of limited natural resources.
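As a hedged rendering, the two theorems are often written in forms like the following (a Landauer-type bound and a Kelly-type bound); the review states them verbally, so these formulas are plausible formalizations rather than the author's exact statements:

```latex
% Theorem 1 (Landauer-type bound): acquiring I bits of information about
% the environment at temperature T requires dissipating at least
E \;\ge\; I \, k_{B} T \ln 2
% Theorem 2 (Kelly-type bound): the excess resource-growth rate \Delta r
% an organism can sustain over its competitors is limited by the excess
% information I it collects about the environment (in bits)
\Delta r \;\le\; I
```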

