input event
Recently Published Documents

TOTAL DOCUMENTS: 16 (FIVE YEARS: 9)
H-INDEX: 4 (FIVE YEARS: 1)

2021 ◽  
Vol 4 ◽  
Author(s):  
Mayank Kejriwal

Often thought of as higher-order entities, events have recently become important subjects of research in the computational sciences, including within complex systems and natural language processing (NLP). One such application is event link prediction. Given an input event, event link prediction is the problem of retrieving a relevant set of events, similar to the problem of retrieving relevant documents on the Web in response to keyword queries. Since geopolitical events have complex semantics, it is an open question as to how to best model and represent events within the framework of event link prediction. In this paper, we formalize the problem and discuss how established representation learning algorithms from the machine learning community could potentially be applied to it. We then conduct a detailed empirical study on the Global Terrorism Database (GTD) using a set of metrics inspired by the information retrieval community. Our results show that, while there is considerable signal in both network-theoretic and text-centric models of the problem, classic text-only models such as bag-of-words prove surprisingly difficult to outperform. Our results establish both a baseline for event link prediction on GTD, and currently outstanding challenges for the research community to tackle in this space.
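The bag-of-words baseline that the paper finds surprisingly hard to outperform can be sketched as a simple similarity retrieval over event descriptions. The event texts and the cosine-similarity ranking below are illustrative assumptions, not the paper's exact setup or GTD data.

```python
from collections import Counter
from math import sqrt

def bow(text):
    """Represent an event description as a bag-of-words vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def link_predict(query_event, events, k=2):
    """Event link prediction as retrieval: return the k events
    most similar to the query event."""
    q = bow(query_event)
    ranked = sorted(events, key=lambda e: cosine(q, bow(e)), reverse=True)
    return ranked[:k]

# Hypothetical event descriptions, not taken from GTD.
events = [
    "bombing attack on embassy in capital city",
    "armed assault on police station",
    "bombing of embassy compound",
]
print(link_predict("embassy bombing incident", events, k=1))
```

In a real evaluation one would score the retrieved list with IR metrics such as precision-at-k, as the paper does with metrics inspired by the information retrieval community.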


2021 ◽  
Author(s):  
Andreas Schmid ◽  
Raphael Wimmer

End-to-end latency - the time a computer system needs from an input event until output is displayed - directly influences task difficulty and user experience. It is therefore an important topic in HCI research. Different human-computer interfaces require different ways to measure latency, as it is influenced by all involved hardware and software components. However, many approaches to measuring latency rely on professional lab equipment and are therefore hard to replicate. We propose a method for accurately measuring the end-to-end latency of traditional setups with a button-equipped input device and a display. To this end, a microcontroller closes the electrical contact of a mouse button to trigger an input event and captures the screen response via a photo-resistor. Our approach combines parts of different existing methods for measuring latency and relies only on cheap, off-the-shelf components to allow for easy replication. The latency values measured by our device are very close to those measured with a high-speed smartphone camera (240 Hz). The maximum error is below 2.64 ms - lower than the camera's temporal resolution and the screen refresh periods of high-fps computer displays. Our approach therefore allows for repeated and reliable measurement of end-to-end latency.
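The core computation - time from the synthetic button press until the photo-resistor detects a screen change - can be sketched as follows. The sampling format and detection threshold are hypothetical stand-ins for the authors' microcontroller firmware, which is not shown in the abstract.

```python
def end_to_end_latency(trigger_us, brightness_samples, threshold):
    """Return end-to-end latency in milliseconds: the time from the
    synthetic button press (trigger_us, in microseconds) until the
    first brightness sample at or above the detection threshold.

    brightness_samples is a list of (timestamp_us, value) pairs, as a
    photo-resistor polled by a microcontroller might produce.
    """
    for t_us, value in brightness_samples:
        if t_us >= trigger_us and value >= threshold:
            return (t_us - trigger_us) / 1000.0
    return None  # no screen response detected

# Hypothetical samples: the screen brightens at t = 23,500 us.
samples = [(0, 10), (5000, 12), (23500, 200), (40000, 210)]
print(end_to_end_latency(0, samples, threshold=100))  # 23.5
```

Repeating this measurement many times and reporting the distribution, rather than a single value, is what makes the setup comparable to the 240 Hz camera baseline.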


2020 ◽  
pp. 875529302097096
Author(s):  
Jawad Fayaz ◽  
Sarah Azar ◽  
Mayssa Dabaghi ◽  
Farzin Zareian

This study presents an efficient algorithm that can be used to simulate ground motion waveforms using the site-based approach developed by Dabaghi and Der Kiureghian, and Rezaeian and Der Kiureghian, such that the simulated motions not only correspond to a specified seismic scenario (e.g. magnitude, distance, site conditions) but are also certain to achieve a target ground motion intensity measure within a narrow range. The suggested algorithm alleviates the need to scale simulated ground motions generated using the above-mentioned site-based approach; the resulting hazard-targeted simulated ground motions have consistent amplitude and time- and frequency-domain characteristics, which are required for proper seismic demand analysis of structures. The proposed algorithm takes as input a set of seismic Event Parameters and the target hazard intensity measure [Formula: see text] and generates a corresponding set of Model Parameters (i.e. input to the site-based ground motion simulation model). These Model Parameters are then used to simulate ground motion waveforms that not only represent the set of input Event Parameters (Mw, Rrup, Vs30) but also maintain the target [Formula: see text]. To generate the set of Model Parameters, predictive relations between the Model Parameters and the [Formula: see text] of ground motions are developed. Among the Model Parameters, those classified as important by statistical procedures (such as Random Forests and Forward Selection) are used to develop the predictive relations. The developed relations are then validated against a large dataset of recorded ground motions. The final implementation is provided as a graphical user interface (GUI) called "Hazard-Targeted Time-Series Simulator" (HATSim), which efficiently simulates site-based ground motions with minimal inputs.
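The idea of a predictive relation from an event parameter to a model parameter can be illustrated with a one-variable least-squares fit. The paper's actual relations involve Random Forests and forward selection over many model parameters; the data and the linear form below are purely hypothetical.

```python
def fit_linear(xs, ys):
    """Least-squares fit y = a + b*x: a pure-Python stand-in for a
    predictive relation between an event parameter and a model
    parameter of the ground motion simulation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical data: magnitude Mw vs. one log-scale model parameter.
mw = [5.0, 6.0, 7.0, 8.0]
theta = [0.5, 1.0, 1.5, 2.0]
a, b = fit_linear(mw, theta)
print(round(a + b * 6.5, 3))  # predicted parameter at Mw = 6.5
```

Once such relations are fitted for the important model parameters, they can be inverted or sampled so that the simulated waveform hits the target intensity measure within a narrow range.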


Intrusion Detection Systems (IDS) are used to detect intrusions on IT networks, supporting network monitoring to identify and avoid possible attacks. Most such approaches adopt signature-based methods, which match the input event against predefined database signatures. Signature-based intrusion detection thus acts as an adaptable device-security safeguard technology. This paper discusses various signature-based intrusion detection systems and their advantages: given a set of signatures and basic patterns that estimate the relative importance of each intrusion detection system feature, system administrators can help identify cyber-attacks and threats to the network and computer system. When signature-based detection is used as a precautionary phase for vulnerability detection, roughly eighty percent of incidents can be detected easily and promptly; the remaining twenty percent require an anomaly-based intrusion detection system, which compares definitions of normal activity or event behavior with observed events to identify significant deviations and decide which traffic to flag.
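Matching an input event against a signature database can be sketched in a few lines. The signature names and regular-expression patterns below are toy illustrations, not rules from any real IDS ruleset.

```python
import re

# Toy signature database; the patterns are illustrative only.
SIGNATURES = {
    "sql_injection": re.compile(r"union\s+select", re.IGNORECASE),
    "path_traversal": re.compile(r"\.\./"),
}

def match_signatures(event):
    """Return the names of all signatures matched by an input event.

    A signature-based IDS flags the event if this list is non-empty;
    events matching no signature fall through to anomaly-based
    detection."""
    return [name for name, pat in SIGNATURES.items() if pat.search(event)]

print(match_signatures("GET /index.php?id=1 UNION SELECT password"))
```

Real systems such as Snort or Suricata use far richer rule languages (protocol fields, byte offsets, flow state), but the match-against-known-patterns structure is the same.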


2020 ◽  
Vol 223 ◽  
pp. 103792
Author(s):  
José Luis Mas ◽  
Jacobo Martín ◽  
Mai Khanh Pham ◽  
Elena Chamizo ◽  
Juan-Carlos Miquel ◽  
...  

2020 ◽  
Vol 245 ◽  
pp. 06039
Author(s):  
Kinga Anna Woźniak ◽  
Olmo Cerri ◽  
Javier M. Duarte ◽  
Torsten Möller ◽  
Jennifer Ngadiuba ◽  
...  

We discuss a model-independent strategy for boosting new physics searches with the help of an unsupervised anomaly detection algorithm. Prior to a search, each input event is preprocessed by the algorithm, a variational autoencoder (VAE). Based on the loss assigned to each event, the input data can be split into a background control sample and a signal-enriched sample. Following this strategy, one can enhance the sensitivity to new physics with no assumption on the underlying new physics signature. Our results show that a typical BSM search on the signal-enriched sample is more sensitive than an equivalent search on the original dataset.
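The loss-based split into a control sample and a signal-enriched sample can be sketched as a quantile cut. The quantile value and the example losses are illustrative assumptions; the VAE that produces the losses is not shown.

```python
def split_by_loss(events, losses, quantile=0.9):
    """Split events into a background control sample (low VAE
    reconstruction loss) and a signal-enriched sample (high loss).

    Events above the given loss quantile are treated as anomalous
    and routed to the signal-enriched sample."""
    cut = sorted(losses)[min(int(quantile * len(losses)), len(losses) - 1)]
    control = [e for e, l in zip(events, losses) if l < cut]
    enriched = [e for e, l in zip(events, losses) if l >= cut]
    return control, enriched

# Hypothetical events: the last one reconstructs poorly.
control, enriched = split_by_loss(
    ["e1", "e2", "e3", "e4"], [0.1, 0.2, 0.3, 9.0], quantile=0.75
)
print(enriched)  # ['e4']
```

The search is then run on the enriched sample, while the control sample is used to model the background, which is what makes the strategy model-independent.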


2019 ◽  
Vol 214 ◽  
pp. 04034
Author(s):  
Alex Brino ◽  
Alessandro Di Girolamo ◽  
Wen Guan ◽  
Mario Lassnig ◽  
Tadashi Maeno ◽  
...  

The ATLAS experiment at the LHC is gradually transitioning from the traditional file-based processing model to dynamic workflow management at the event level with the ATLAS Event Service (AES). The AES assigns fine-grained processing jobs to workers and streams out the data in quasi-real time, ensuring fully efficient utilization of all resources, including the most volatile. The next major step in this evolution is the possibility to intelligently stream the input data itself to workers. The Event Streaming Service (ESS) is now in development to asynchronously deliver only the input data required for processing when it is needed, protecting the application payload from WAN latency without creating expensive long-term replicas. In the current prototype implementation, ESS processes run on compute nodes in parallel to the payload, reading the input event ranges remotely over the network and replicating them in small input files that are passed to the application. In this contribution, we present the performance of the ESS prototype for different types of workflows in comparison to tasks accessing remote data directly. Based on the experience gained with the current prototype, we are now moving to the development of a server-side component of the ESS. The service can evolve progressively into a powerful Content Delivery Network-like capability for data streaming, ultimately enabling the delivery of ‘virtual data’ generated on demand.
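The prototype's pattern of reading remote event ranges and staging them as small local inputs can be sketched as follows. The `read_remote` callable and the in-memory staging are hypothetical stand-ins for the ESS's actual remote I/O and file handling.

```python
import io

def stream_event_ranges(read_remote, ranges):
    """Sketch of the ESS idea: fetch only the requested event ranges
    from a remote input and stage each range as a small local 'file'
    for the payload.

    read_remote(start, n) is a hypothetical callable standing in for
    remote reads over the network; here it returns a list of event
    records."""
    staged = []
    for start, n in ranges:
        events = read_remote(start, n)
        staged.append(io.StringIO("\n".join(events)))
    return staged

# Hypothetical remote dataset of 10 event records.
dataset = [f"event-{i}" for i in range(10)]
remote = lambda start, n: dataset[start:start + n]
files = stream_event_ranges(remote, [(0, 2), (5, 3)])
print(files[0].getvalue())
```

Because only the requested ranges ever cross the WAN, the payload is shielded from network latency without any long-term replica of the full input file.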


2018 ◽  
Author(s):  
Arvind Satyanarayan ◽  
Dominik Moritz ◽  
Kanit Wongsuphasawat ◽  
Jeffrey Heer

We present Vega-Lite, a high-level grammar that enables rapid specification of interactive data visualizations. Vega-Lite combines a traditional grammar of graphics, providing visual encoding rules and a composition algebra for layered and multi-view displays, with a novel grammar of interaction. Users specify interactive semantics by composing selections. In Vega-Lite, a selection is an abstraction that defines input event processing, points of interest, and a predicate function for inclusion testing. Selections parameterize visual encodings by serving as input data, defining scale extents, or by driving conditional logic. The Vega-Lite compiler automatically synthesizes requisite data flow and event handling logic, which users can override for further customization. In contrast to existing reactive specifications, Vega-Lite selections decompose an interaction design into concise, enumerable semantic units. We evaluate Vega-Lite through a range of examples, demonstrating succinct specification of both customized interaction methods and common techniques such as panning, zooming, and linked selection.
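A selection that parameterizes a visual encoding can be written very compactly. The spec below is expressed as a Python dict and uses Vega-Lite's pre-v5 `selection` syntax (later versions use `params`); the data URL and field names are illustrative.

```python
import json

# Minimal Vega-Lite-style spec: an interval selection ("brush")
# drives a conditional color encoding. Pre-v5 `selection` syntax;
# data URL and field names are illustrative assumptions.
spec = {
    "data": {"url": "data/cars.json"},
    "mark": "point",
    "selection": {"brush": {"type": "interval"}},
    "encoding": {
        "x": {"field": "Horsepower", "type": "quantitative"},
        "y": {"field": "Miles_per_Gallon", "type": "quantitative"},
        "color": {
            "condition": {
                "selection": "brush",
                "field": "Origin",
                "type": "nominal",
            },
            "value": "grey",
        },
    },
}
print(json.dumps(spec)[:60])
```

Points inside the brushed interval are colored by `Origin`, while points outside fall back to grey; the compiler synthesizes all the event handling and data flow this implies.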

