A Framework for Visual Dynamic Analysis of Ray Tracing Algorithms

2014 ◽  
Vol 14 (2) ◽  
pp. 38-49 ◽  
Author(s):  
Hristo Lesev ◽  
Alexander Penev

A novel approach is presented for recording high-volume data about the runtime state of ray tracing rendering systems, and for the subsequent dynamic analysis and interactive visualization of those data in the algorithm's computational domain. Our framework extracts the light paths traced by the system and leverages a powerful filtering subsystem that supports interactive visualization and exploration of any desired subset of the recorded data. We introduce a versatile data-logging format and acceleration structures for easy access and filtering. We have implemented a plugin-based framework and a tool set that realize all the ideas presented in this paper. The framework provides a data-logging API for instrumenting production-ready, multithreaded, distributed renderers, and its visualization tool enables a deeper understanding of ray tracing algorithms for novices as well as experts.
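The abstract does not include code; as a rough illustration of what such a per-bounce data-logging API with a filtering subsystem might look like, here is a minimal Python sketch. All names (`RayEvent`, `PathLogger`, the predicate-based filter) are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass, field

# Hypothetical record for one ray-surface interaction (not the paper's format).
@dataclass
class RayEvent:
    path_id: int      # light path this bounce belongs to
    depth: int        # bounce index along the path
    origin: tuple     # ray origin (x, y, z)
    hit_point: tuple  # intersection point (x, y, z)
    event_type: str   # e.g. "diffuse", "specular", "emission"

@dataclass
class PathLogger:
    """Sketch of a logger a renderer thread could call once per bounce."""
    events: list = field(default_factory=list)

    def log(self, event: RayEvent) -> None:
        self.events.append(event)

    def filter(self, predicate) -> list:
        # The paper's filtering subsystem selects a subset of the recorded
        # data for interactive exploration; a predicate is one simple analogue.
        return [e for e in self.events if predicate(e)]

# Usage: keep only specular bounces beyond the first hit.
logger = PathLogger()
logger.log(RayEvent(0, 0, (0, 0, 0), (1, 0, 0), "diffuse"))
logger.log(RayEvent(0, 1, (1, 0, 0), (1, 1, 0), "specular"))
specular = logger.filter(lambda e: e.event_type == "specular" and e.depth >= 1)
```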

Symmetry ◽  
2018 ◽  
Vol 10 (12) ◽  
pp. 698 ◽  
Author(s):  
Shabana Ramzan ◽  
Imran Bajwa ◽  
Rafaqut Kazmi

Handling complexity in the data of information systems has emerged as a serious challenge in recent times. Typical relational databases have a limited ability to manage the discrete and heterogeneous nature of modern data, and the complexity of data in relational databases is so high that efficient retrieval of information has become a bottleneck in traditional information systems. Big Data, on the other hand, has emerged as a viable solution for heterogeneous and complex data (structured, semi-structured and unstructured) by providing architectural support for handling such data and a tool-kit for its efficient analysis. Organizations that still rely on relational databases and face the challenge of handling complex data need to migrate to a Big Data solution to gain benefits such as horizontal scalability, real-time interaction, and the capacity to handle high-volume data. However, such a migration from relational databases to Big Data is itself a challenge due to the complexity of the data. In this paper, we introduce a novel approach that handles the complexity of automatically transforming an existing relational database (MySQL) into a Big Data solution (Oracle NoSQL). The approach supports a bi-fold transformation (schema-to-schema and data-to-data) to minimize the complexity of the data and to allow improved analysis. A software prototype for this transformation was also developed as a proof of concept. The results of our experiments show the correctness of our transformations, which outperform other similar approaches.
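As a hedged illustration of the bi-fold (schema-to-schema and data-to-data) transformation the abstract describes, the following Python sketch maps a relational table to a key-value layout of the kind used by NoSQL stores. The table, column names, and key scheme are invented for illustration; the paper's actual mapping rules for Oracle NoSQL are not reproduced here.

```python
# Hypothetical relational schema: table name -> ordered column list,
# with the first column acting as the primary key.
relational_schema = {
    "customer": ["customer_id", "name", "city"],
}

rows = [
    (101, "Alice", "Lahore"),
    (102, "Bilal", "Karachi"),
]

def transform_schema(columns: list) -> dict:
    # Schema-to-schema step: the primary key becomes the record key,
    # remaining columns become named value fields.
    return {"key_field": columns[0], "value_fields": columns[1:]}

def transform_data(table: str, columns: list, rows: list) -> dict:
    # Data-to-data step: each row becomes key -> field dict,
    # mirroring a key-value record in a NoSQL store.
    mapping = transform_schema(columns)
    store = {}
    for row in rows:
        record = dict(zip(columns, row))
        key = f"{table}/{record.pop(mapping['key_field'])}"
        store[key] = record
    return store

print(transform_data("customer", relational_schema["customer"], rows))
# {'customer/101': {'name': 'Alice', 'city': 'Lahore'}, ...}
```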


Cancers ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 86
Author(s):  
Mohit Kumar ◽  
Chellappagounder Thangavel ◽  
Richard C. Becker ◽  
Sakthivel Sadayappan

Immunotherapy is one of the most effective therapeutic options for cancer patients. Five specific classes of immunotherapy are in use: cell-based chimeric antigen receptor T-cells, checkpoint inhibitors, cancer vaccines, antibody-based targeted therapies, and oncolytic viruses. Immunotherapies can improve survival rates among cancer patients. At the same time, however, they can cause inflammation and promote adverse cardiac immune modulation and cardiac failure in some cancer patients as late as five to ten years after immunotherapy. In this review, we discuss cardiotoxicity associated with immunotherapy. We also propose using human induced pluripotent stem cell-derived cardiomyocytes/cardiac-stromal progenitor cells and cardiac organoid cultures as innovative experimental model systems to (1) mimic clinical treatment, yielding reproducible data, and (2) promote the identification of immunotherapy-induced biomarkers of both early and late cardiotoxicity. Finally, we introduce the integration of omics-derived high-volume data and cardiac biology as a pathway toward the discovery of new, efficient, non-toxic immunotherapies.


2021 ◽  
Vol 6 (5) ◽  
pp. 62
Author(s):  
John Morris ◽  
Mark Robinson ◽  
Roberto Palacin

The ‘short’ neutral section is a feature of alternating current (AC) railway overhead line electrification that is often unreliable and a source of train delays, yet hardly any dynamic analysis of its behaviour has been undertaken. This paper briefly describes work investigating the possibility of modelling that behaviour using a novel approach, and evaluates the potential for thereby improving the performance of short neutral sections, with particular reference to the UK situation. The analysis used dynamic simulation of the pantograph and overhead contact line (OCL) interface, implemented with a proprietary finite element analysis tool. The neutral section model was constructed from physical characteristics and laboratory test data, and was included in a validated pantograph/OCL simulation model. Simulation output of the neutral section behaviour was validated satisfactorily against real line test data. Using this method, the sensitivity of the neutral section's performance to particular parameters of its construction was examined. A limited number of parameter adjustments were studied in search of potential improvements; one improvement identified was the inclusion of an additional lever arm at the trailing end of the neutral section. This novel application of pantograph/OCL dynamic simulation to modelling neutral section behaviour has been shown to be useful in assessing modifications to neutral section parameters.
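The paper uses a proprietary finite element tool; as a much simpler stand-in for the idea of dynamic pantograph/OCL simulation, here is a Python sketch of a single-mass pantograph head pressed against a contact wire whose stiffness varies along the span. All parameter values and the one-degree-of-freedom model are invented for illustration and are far cruder than the paper's validated model.

```python
import math

# Invented lumped parameters for a single-mass pantograph head model.
m = 7.0                  # head mass [kg]
c = 45.0                 # damping [N*s/m]
F0 = 120.0               # static uplift force [N]
k0, dk = 2500.0, 1500.0  # mean catenary stiffness and span variation [N/m]
span = 55.0              # span length [m]
v = 30.0                 # train speed [m/s]

def stiffness(x):
    # Catenary stiffness varies periodically along the span
    # (stiffer near supports, softer at mid-span).
    return k0 + dk * math.cos(2 * math.pi * x / span)

# Semi-implicit Euler integration of m*y'' = F0 - c*y' - k(x)*y
dt, y, yd = 1e-4, 0.0, 0.0
for step in range(int(2.0 / dt)):  # simulate 2 s of running
    x = v * step * dt              # position along the line
    ydd = (F0 - c * yd - stiffness(x) * y) / m
    yd += ydd * dt
    y += yd * dt

print(f"head uplift after 2 s: {y * 1000:.1f} mm")
```

A parameter sensitivity study of the kind the paper describes would rerun such a simulation while sweeping one construction parameter at a time and comparing the resulting contact behaviour.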


2006 ◽  
Vol 128 (9) ◽  
pp. 945-952 ◽  
Author(s):  
Sandip Mazumder

Two different algorithms for accelerating ray tracing in surface-to-surface radiation Monte Carlo calculations are investigated. The first is the well-known binary spatial partitioning (BSP) algorithm, which recursively bisects the computational domain into a set of hierarchically linked boxes that are then used to narrow down the number of ray-surface intersection calculations. The second is the volume-by-volume advancement (VVA) algorithm. This algorithm is new and employs the volumetric mesh to advance the ray through the computational domain until a legitimate intersection point is found. The algorithms are tested on two classical problems, namely an open box and a box within a box, in both two-dimensional (2D) and three-dimensional (3D) geometries with various mesh sizes. Both algorithms are found to yield orders-of-magnitude gains in computational efficiency over direct calculations that employ no acceleration strategy. For three-dimensional geometries, the VVA algorithm is clearly superior to BSP, particularly for cases with obstructions within the computational domain. For two-dimensional geometries, the VVA algorithm is superior to the BSP algorithm only when obstructions are present and densely packed.
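To make the BSP idea concrete, here is a minimal Python sketch of recursive bisection into hierarchically linked boxes, with a slab-test traversal that narrows the set of candidate surfaces a ray must be tested against. The surface representation (bounding-box centroids), leaf threshold, and all names are illustrative assumptions, not the paper's implementation; the VVA algorithm is not sketched here.

```python
# Minimal binary spatial partitioning (BSP) sketch for ray tracing.

def ray_hits_box(orig, inv_dir, lo, hi):
    # Standard slab test; inv_dir holds 1/d per axis (assumes no zero component).
    tmin, tmax = -float("inf"), float("inf")
    for a in range(3):
        t1 = (lo[a] - orig[a]) * inv_dir[a]
        t2 = (hi[a] - orig[a]) * inv_dir[a]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmax >= max(tmin, 0.0)

def build_bsp(surfaces, lo, hi, leaf_size=4, axis=0):
    # Recursively bisect the box along alternating axes; each leaf keeps
    # only the surfaces whose centroids fall inside it.
    if len(surfaces) <= leaf_size:
        return {"lo": lo, "hi": hi, "leaf": surfaces}
    mid = 0.5 * (lo[axis] + hi[axis])
    left = [s for s in surfaces if s["centroid"][axis] < mid]
    right = [s for s in surfaces if s["centroid"][axis] >= mid]
    if not left or not right:               # degenerate split: stop here
        return {"lo": lo, "hi": hi, "leaf": surfaces}
    lhi, rlo = list(hi), list(lo)
    lhi[axis], rlo[axis] = mid, mid
    nxt = (axis + 1) % 3
    return {"lo": lo, "hi": hi,
            "children": [build_bsp(left, lo, lhi, leaf_size, nxt),
                         build_bsp(right, rlo, hi, leaf_size, nxt)]}

def candidates(node, orig, inv_dir):
    # Collect surfaces from every leaf box the ray passes through;
    # only these need full ray-surface intersection tests.
    if not ray_hits_box(orig, inv_dir, node["lo"], node["hi"]):
        return []
    if "leaf" in node:
        return node["leaf"]
    return sum((candidates(ch, orig, inv_dir) for ch in node["children"]), [])

# Usage: ten surfaces along a line inside the unit cube.
tris = [{"centroid": (0.1 * i, 0.2, 0.3)} for i in range(10)]
root = build_bsp(tris, [0.0, 0.0, 0.0], [1.0, 1.0, 1.0])
print(len(candidates(root, [0.0, 0.0, 0.0], [1.733, 1.733, 1.733])))
```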


2015 ◽  
Vol 2015 (DPC) ◽  
pp. 000995-001015
Author(s):  
Tom Strothmann

The potential of thermocompression bonding (TCB) has been widely discussed for several years, but it has not previously achieved widespread production use. TCB has now begun the transition to an accepted high-volume manufacturing technology, driven primarily by the memory market but with wider adoption imminent for non-memory applications. Several key factors have enabled this transition, including advanced TCB equipment with higher units per hour (UPH) for cost reduction and advanced methods of inline process control. The unique requirements of TCB demand absolute process control, simultaneous data-logging capability for multiple key factors in the process, and portability of the process between tools. This introduces a level of sophistication that has not previously been required for back-end (BE) assembly processes. This presentation reviews state-of-the-art TCB technology and the fundamental equipment requirements to support the transition to HVM.


2020 ◽  
Vol 26 (10) ◽  
pp. 3008-3021 ◽  
Author(s):  
Jonathan Sarton ◽  
Nicolas Courilleau ◽  
Yannick Remion ◽  
Laurent Lucas

2003 ◽  
Vol 47 (2) ◽  
pp. 43-51 ◽  
Author(s):  
M.B. Beck ◽  
Z. Lin

In spite of a long history of automated instruments being deployed in the water industry, only recently has the difficulty of extracting timely insights from high-grade, high-volume data sets become an important problem. Put simply, it is now relatively easy to be “data-rich”, much less easy to become “information-rich”. Whether the availability of so many data arises from “technological push” or the “demand pull” of practical problem solving is not the subject of discussion. The paper focuses instead on two issues: first, an outline of a methodological framework, based largely on the algorithms of (on-line) recursive estimation and involving a sequence of transformations to which the data can be subjected; and second, presentation and discussion of the results of applying these transformations in a case study of a biological system of wastewater treatment. The principal conclusion is that the difficulty of transforming data into information may lie not so much in coping with the high sampling intensity enabled by automated monitoring networks, but in coming to terms with the complexity of the higher-order, multi-variable character of the data sets, i.e., in interpreting the interactions among many contemporaneously measured quantities.
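The paper's framework rests on on-line recursive estimation. As a generic illustration of that class of algorithm (not the authors' specific method), here is a minimal recursive least squares (RLS) update in Python for tracking the parameters of a linear relation between contemporaneously measured quantities; the forgetting factor and the synthetic data are illustrative.

```python
import numpy as np

# Textbook recursive least squares (RLS) with a forgetting factor:
# each new observation (x, y) updates the parameter estimate on-line.
def rls_update(theta, P, x, y, lam=0.99):
    x = x.reshape(-1, 1)
    K = P @ x / (lam + (x.T @ P @ x).item())         # gain vector
    err = y - (x.T @ theta).item()                   # prediction error
    theta = theta + K.flatten() * err                # correct the estimate
    P = (P - K @ x.T @ P) / lam                      # update covariance
    return theta, P

# Synthetic "sensor" stream with two measured quantities per sample.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(500):
    x = rng.normal(size=2)
    y = true_w @ x + rng.normal(scale=0.1)           # noisy reading
    theta, P = rls_update(theta, P, x, y)

print(theta)  # converges toward [2.0, -1.0]
```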


2019 ◽  
Vol 8 (1) ◽  
pp. 4 ◽  
Author(s):  
Saleh Altowaijri ◽  
Mohamed Ayari ◽  
Yamen El Touati

By nature, some jobs are performed in closed environments where employees may stay for long periods. This is the case for many professional activities, such as military watch duty at borders and at civilian buildings and facilities that require efficient control processes. The role assigned to personnel in such environments is usually sensitive and of high importance, especially in terms of security and protection. With this in mind, we propose in our research a novel approach that uses multi-sensor technology to monitor many safety and security parameters, including the health status of indoor workers such as those in watchtowers and at guard posts. The data gathered for these employees (heart rate, temperature, eye movement, human motion, etc.), combined with the room's sensor data (temperature, oxygen ratio, toxic gases, air quality, etc.), are saved by appropriate cloud services, which ensure easy access to the data without neglecting the privacy protection of such critical material. This information can later be used by specialists to monitor the evolution of a worker's health status cost-effectively, making it possible to improve productivity in the workplace and general employee health.
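As a sketch of how the worker and room readings described above might be combined into a single record before being pushed to a cloud service, consider the following Python fragment. The field names and the pseudonymization step are illustrative assumptions, not the authors' implementation, and truncated hashing is only a gesture toward the privacy concern the paper raises.

```python
import hashlib
import json
import time

def make_record(worker_id: str, worker_sensors: dict, room_sensors: dict) -> str:
    # Pseudonymize the worker identifier before storage; one simple
    # (and by itself insufficient) nod to privacy protection.
    pseudo_id = hashlib.sha256(worker_id.encode()).hexdigest()[:16]
    record = {
        "worker": pseudo_id,
        "timestamp": time.time(),
        "vitals": worker_sensors,      # e.g. heart rate, body temperature
        "environment": room_sensors,   # e.g. oxygen ratio, air quality
    }
    return json.dumps(record)          # payload for a cloud storage API

payload = make_record(
    "guard-007",
    {"heart_rate_bpm": 72, "body_temp_c": 36.9},
    {"room_temp_c": 24.5, "o2_percent": 20.8, "co_ppm": 2},
)
```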


Author(s):  
Daniel C McFarlane ◽  
Alexa K Doig ◽  
James A Agutter ◽  
Jonathan L Mercurio ◽  
Ranjeev Mittu ◽  
...  

Modern sensors for health surveillance generate high volumes and rates of data that currently overwhelm operational decision-makers. These data are collected with the intention of enabling front-line clinicians to make effective clinical judgments. Ironically, prior human–systems integration (HSI) studies show that the flood of data degrades rather than aids decision-making performance. Health surveillance operations can focus on aggregate changes to population health or on the status of individual people. In the case of clinical monitoring, medical device alarms currently create an information overload for front-line clinical workers, such as hospital nurses. Consequently, alarms are often missed or ignored, and an impending patient adverse event may not be recognized in time to prevent a crisis. One innovation used to improve decision making in data-rich environments is the Human Alerting and Interruption Logistics (HAIL) technology, originally sponsored by the US Office of Naval Research. HAIL delivers metacognitive HSI services that empower end-users to quickly triage interruptions and dynamically manage their multitasking. HAIL informed our development of an experimental prototype that provides a set of context-enabled alarm notification services (without automated alarm filtering) to support users' metacognition for information triage. This application, called HAIL Clinical Alarm Triage (HAIL-CAT), was designed and implemented on a smartwatch to support the mobile multitasking of hospital nurses. An empirical study was conducted in a 20-bed virtual hospital with high-fidelity patient simulators. Four teams of four registered nurses (16 in total) participated in a 180-minute simulated patient care scenario. Each nurse was assigned responsibility for five simulated patients, and high rates of simulated health surveillance data were available from patient monitors, infusion pumps, and a call light system. Thirty alarms per nurse were generated in each 90-minute segment of the data collection sessions, only three of which were clinically important. The within-subjects experimental design included a treatment condition in which the nurses used HAIL-CAT on a smartwatch to triage and manage alarms, and a control condition without the smartwatch. The results show that, when using the smartwatch, nurses responded three times faster to clinically important and actionable alarms. An analysis of nurse performance also shows no negative effects on their other duties. Subjective results show favorable opinions about utility, usability, training requirements, and adoptability. These positive findings suggest the potential for the HAIL HSI system to be transferable to the domain of health surveillance, to achieve the currently unrealized potential utility of high-volume data.
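HAIL-CAT itself is not described at the code level. Purely as an illustration of context-enabled alarm triage without automated filtering, here is a Python sketch that ranks incoming alarms by clinical context while still presenting every one of them; the scoring fields and weights are invented and do not reflect the actual HAIL services.

```python
from dataclasses import dataclass

# Illustrative alarm triage: every alarm is kept (no automated filtering),
# but contextual scoring orders them for the nurse. Weights are invented.

@dataclass
class Alarm:
    patient: str
    source: str     # "monitor", "infusion_pump", "call_light"
    severity: int   # device-reported severity, 1 (low) to 3 (high)
    assigned: bool  # is this nurse assigned to the patient?
    repeats: int    # times this alarm re-fired recently

def triage_score(a: Alarm) -> float:
    score = 2.0 * a.severity + 1.5 * a.repeats
    if a.assigned:
        score += 3.0  # context: this nurse's own patient ranks higher
    return score

alarms = [
    Alarm("bed-04", "monitor", 3, True, 2),
    Alarm("bed-11", "call_light", 1, False, 0),
    Alarm("bed-07", "infusion_pump", 2, True, 1),
]

# Present every alarm, highest priority first: triage, not suppression.
for a in sorted(alarms, key=triage_score, reverse=True):
    print(f"{triage_score(a):4.1f}  {a.patient}  {a.source}")
```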

