Statistical Detection of Malicious PE-Executables for Fast Offline Analysis

Author(s): Ronny Merkel, Tobias Hoppe, Christian Kraetzer, Jana Dittmann

Author(s): Julian Prell, Christian Scheller, Sebastian Simmermacher, Christian Strauss, Stefan Rampp

Abstract
Objective: The quantity of A-trains, a high-frequency pattern of free-running facial nerve electromyography, is correlated with the risk of postoperative high-grade facial nerve paresis. This correlation has been confirmed by automated analysis with dedicated algorithms and by visual offline analysis, but not by audiovisual real-time analysis.
Methods: An investigator was presented with 29 complete data sets, measured during actual surgeries, in real time, without breaks, and in random order. Data were presented either solely via loudspeaker (audio) or simultaneously by loudspeaker and computer screen (audiovisual). Visible and/or audible A-train activity was then quantified by the investigator with the computerized equivalent of a stopwatch. The same data were also analyzed with automated algorithms that quantify A-trains.
Results: Automated (auto) traintime (TT), known to be a small yet highly representative fraction of overall A-train activity, ranged from 0.01 to 10.86 s (median: 0.58 s). In contrast, audio-TT ranged from 0 to 1,357.44 s (median: 29.69 s), and audiovisual-TT ranged from 0 to 786.57 s (median: 46.19 s). All three modalities were highly significantly correlated with each other. Likewise, all three modalities correlated significantly with the extent of postoperative facial paresis. As a rule of thumb, patients with less than 1 minute of visible/audible A-train activity presented with a more favorable clinical outcome than patients with more than 1 minute of A-train activity.
Conclusion: Detection and even quantification of A-trains is technically possible not only with intraoperative automated real-time calculation or postoperative visual offline analysis, but also with very basic monitoring equipment and good-quality real-time audiovisual analysis. However, the investigator found audiovisual real-time analysis to be very demanding; tools for automated quantification can therefore be very helpful in this respect.
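
As a rough illustration of the quantification described above, a minimal C++ sketch is given below. It is hypothetical: the ATrainSegment struct, the TotalTraintimeSeconds function, and the segment values are illustrative placeholders, not the study's software. It sums the durations of marked A-train segments into a total traintime (the "stopwatch" total) and compares it against the 60-second rule-of-thumb threshold from the Results.

```cpp
// Minimal sketch (hypothetical, not the study's software): accumulate the
// durations of detected A-train segments into a total "traintime" and apply
// the 1-minute rule of thumb reported in the abstract.
#include <iostream>
#include <numeric>
#include <vector>

struct ATrainSegment {
    double start_s;  // onset of the A-train within the recording, in seconds
    double end_s;    // offset of the A-train within the recording, in seconds
};

// Sum the durations of all detected segments (the "stopwatch" total).
double TotalTraintimeSeconds(const std::vector<ATrainSegment>& segments) {
    return std::accumulate(segments.begin(), segments.end(), 0.0,
                           [](double sum, const ATrainSegment& s) {
                               return sum + (s.end_s - s.start_s);
                           });
}

int main() {
    // Illustrative onsets/offsets; real values would come from audiovisual
    // marking by the investigator or from an automated detector.
    std::vector<ATrainSegment> segments = {{12.0, 15.5}, {40.2, 71.0}, {90.0, 123.7}};

    const double traintime = TotalTraintimeSeconds(segments);
    std::cout << "Total traintime: " << traintime << " s\n";

    // Rule of thumb from the abstract: less than 1 minute of A-train activity
    // was associated with a more favorable outcome than more than 1 minute.
    if (traintime < 60.0) {
        std::cout << "Below the 1-minute rule-of-thumb threshold.\n";
    } else {
        std::cout << "At or above the 1-minute rule-of-thumb threshold.\n";
    }
    return 0;
}
```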


2010, Vol. 36 (S1), pp. 124-124
Author(s): M. Salman, H. Mousa, P. Twining, D. K. James, M. Momtaz, ...

2004
Author(s): Mark Bernhardt, William J. Oxford, Philip E. Clare, Vicky A. Wilkinson, Damien G. Clarke

2016, Vol. 14 (11), pp. 1097-1107
Author(s): Rachel Park, Thomas F. O’Brien, Susan S. Huang, Meghan A. Baker, Deborah S. Yokoe, ...

2021, Vol. 251, pp. 04011
Author(s): Fabrizio Ameli, Marco Battaglieri, Mariangela Bondí, Andrea Celentano, Sergey Boyarinov, ...

An effort is underway to develop a streaming readout data acquisition system for the CLAS12 detector in Jefferson Lab's experimental Hall B. Successful beam tests were performed in the spring and summer of 2020 using a 10 GeV electron beam from Jefferson Lab's CEBAF accelerator. The prototype system combined elements of the TriDAS and CODA data acquisition systems with the JANA2 analysis/reconstruction framework, successfully merging components that included an FPGA stream source, a distributed hit-processing system, and software plugins that allowed offline analysis code written in C++ to be used for online event filtering. Details of the system design and performance are presented.
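
The abstract describes this architecture only at a high level, so the following C++ sketch is a hypothetical illustration rather than the actual TriDAS, CODA, or JANA2 API: the Hit and Event structs, the PassesFilter function, and the cut values are placeholders. It shows the underlying idea that selection code written once in C++ can serve both offline analysis and online filtering of events assembled from the hit stream.

```cpp
// Hypothetical sketch (not the actual JANA2/TriDAS/CODA API): reusing the
// same C++ selection code both offline and as an online event filter in a
// streaming readout chain.
#include <cstdint>
#include <iostream>
#include <vector>

// A reconstructed hit as it might arrive from a distributed hit-processing stage.
struct Hit {
    int channel;
    double energy_mev;
    uint64_t timestamp_ns;
};

// A time-sliced group of hits assembled from the stream.
struct Event {
    std::vector<Hit> hits;
};

// Selection logic written once; offline analysis and the online filter share it.
// The energy threshold and multiplicity cut are placeholders, not CLAS12 values.
bool PassesFilter(const Event& event) {
    int hits_above_threshold = 0;
    for (const Hit& hit : event.hits) {
        if (hit.energy_mev > 100.0) {
            ++hits_above_threshold;
        }
    }
    return hits_above_threshold >= 2;
}

int main() {
    // Two illustrative events standing in for time slices from the FPGA stream.
    std::vector<Event> stream = {
        {{{3, 150.0, 1000}, {7, 220.0, 1010}}},  // kept: two hits above threshold
        {{{5, 40.0, 2000}}},                     // rejected: no hit above threshold
    };

    int index = 0;
    for (const Event& event : stream) {
        std::cout << "Event " << index++ << ": "
                  << (PassesFilter(event) ? "kept" : "rejected") << "\n";
    }
    return 0;
}
```

In the real system, the abstract states that this role is played by plugins loaded into the reconstruction framework, so that the same C++ analysis logic runs online for event filtering and offline for analysis.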

