A Gaze-Contingent System for Foveated Multiresolution Visualization of Vector and Volumetric Data

2020 ◽  
Vol 2020 (1) ◽  
pp. 374-1-374-11
Author(s):  
Thanawut Ananpiriyakul ◽  
Joshua Anghel ◽  
Kristi Potter ◽  
Alark Joshi

Computational complexity is a limiting factor for visualizing large-scale scientific data. Most approaches to rendering large datasets focus on novel algorithms that leverage cutting-edge graphics hardware to provide users with an interactive experience. In this paper, we instead demonstrate foveated imaging, which allows interactive exploration on low-cost hardware by tracking the gaze of a participant to drive the rendering quality of an image. Foveated imaging exploits the fact that the spatial resolution of the human visual system decreases dramatically away from the central point of gaze, allowing computational resources to be reserved for areas of importance. We demonstrate this approach using face tracking to identify the gaze point of the participant for both vector and volumetric datasets and evaluate our results by comparing against traditional techniques. In our evaluation, we found a significant increase in computational performance using our foveated imaging approach while maintaining high image quality in regions of visual attention.
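The core idea of foveated rendering can be sketched in a few lines: assign each pixel a level of detail that grows with its eccentricity from the tracked gaze point. This is a minimal illustration only, not the authors' renderer; the foveal radius and level count are assumed parameters.

```python
import numpy as np

def foveation_levels(width, height, gaze_x, gaze_y, fovea_radius=60, max_level=4):
    """Assign a level of detail per pixel: 0 (full resolution) at the gaze
    point, one coarser level per foveal radius of eccentricity."""
    ys, xs = np.mgrid[0:height, 0:width]
    ecc = np.hypot(xs - gaze_x, ys - gaze_y)          # eccentricity in pixels
    levels = (ecc // fovea_radius).astype(int)
    return np.clip(levels, 0, max_level)

levels = foveation_levels(320, 240, gaze_x=160, gaze_y=120)
print(levels[120, 160])   # full resolution at the gaze point
print(levels[0, 0])       # coarse level in the periphery
```

A renderer would then sample each pixel from the mip level (or reduced-resolution pass) indicated by `levels`, spending full-resolution work only inside the fovea.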

2018 ◽  
Vol 2018 ◽  
pp. 1-23 ◽  
Author(s):  
Hao Chen ◽  
Shu Yang ◽  
Jun Li ◽  
Ning Jing

With the development of aerospace science and technology, Earth observation satellite clusters, which consist of heterogeneous satellites carrying many kinds of payloads, have gradually appeared. Compared with traditional satellite systems, a satellite cluster has some particular characteristics, such as large scale, heterogeneous satellite platforms, various payloads, and the capacity to perform all observation tasks. How to select a subset of a satellite cluster that performs all observation tasks effectively at low cost is a new challenge arising in the field of aerospace resource scheduling. This is the agent team formation problem for an observation-task-oriented satellite cluster. A mathematical scheduling model is built. Three novel algorithms, i.e., a complete search algorithm, a heuristic search algorithm, and a swarm intelligence optimization algorithm, are proposed to solve the problem at different scales. Finally, experiments are conducted to validate the effectiveness and practicability of our algorithms.
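Selecting a low-cost subset of satellites that covers all observation tasks is a weighted set-cover problem, so a standard greedy heuristic gives a feel for the heuristic-search branch described above. This sketch is illustrative and not from the paper; the satellite names, costs, and task sets are invented.

```python
def greedy_team(satellites, tasks):
    """Greedy heuristic: repeatedly pick the satellite with the best
    (newly covered tasks) / cost ratio until every task is covered.
    satellites: {name: (cost, set of tasks it can observe)}"""
    uncovered = set(tasks)
    team, total_cost = [], 0.0
    while uncovered:
        best = max(
            (s for s in satellites if s not in team),
            key=lambda s: len(satellites[s][1] & uncovered) / satellites[s][0],
        )
        gain = satellites[best][1] & uncovered
        if not gain:
            raise ValueError("remaining tasks cannot be covered")
        team.append(best)
        total_cost += satellites[best][0]
        uncovered -= gain
    return team, total_cost

sats = {
    "optical-1": (3.0, {"t1", "t2"}),
    "radar-1":   (5.0, {"t2", "t3", "t4"}),
    "optical-2": (2.0, {"t4"}),
}
team, cost = greedy_team(sats, ["t1", "t2", "t3", "t4"])
print(sorted(team), cost)
```

A complete search would enumerate subsets exactly, and a swarm-intelligence method would evolve candidate teams; the greedy ratio rule above trades optimality for speed on large clusters.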


2021 ◽  
Vol 35 (11) ◽  
pp. 1272-1273
Author(s):  
Charles Varin ◽  
Rhys Emms ◽  
Graeme Bart ◽  
Thomas Fennel ◽  
Thomas Brabec

Including optical nonlinearity in FDTD software in a stable, efficient, and rigorous way can be challenging. Traditional methods address this challenge by solving an implicit form of Maxwell’s equations iteratively. Reaching numerical convergence over the entire numerical space at each time step demands significant computational resources, which can be a limiting factor for the modeling of large-scale three-dimensional nonlinear optics problems (complex photonics devices, laser filamentation, ...). Recently, we proposed an explicit methodology based on a nonlinear generalization of the Lorentz dispersion model and developed example cases where it was used to account for both linear and nonlinear optical effects. An overview of this work is presented here.
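The appeal of an explicit scheme can be illustrated with a toy, zero-dimensional oscillator: the polarization is advanced by a leapfrog finite difference with no iteration per time step. This is only a sketch under assumed parameters, not the authors' scheme; the cubic restoring term stands in for a Kerr-like nonlinearity in a Lorentz model.

```python
# Illustrative explicit update for a nonlinear Lorentz oscillator
#   P'' + gamma*P' + w0^2*P + a*P^3 = c*E
# (all coefficients here are assumed, dimensionless toy values).
def lorentz_step(P, P_prev, E, dt, w0=1.0, gamma=0.05, a=0.1, c=1.0):
    accel = c * E - gamma * (P - P_prev) / dt - w0**2 * P - a * P**3
    return 2 * P - P_prev + dt**2 * accel  # leapfrog: no implicit solve

# Drive the oscillator with a constant field; it relaxes toward the
# nonlinear equilibrium w0^2*P + a*P^3 = c*E.
P_prev, P = 0.0, 0.0
dt = 0.05
for _ in range(2000):
    P, P_prev = lorentz_step(P, P_prev, E=0.5, dt=dt), P
print(round(P, 3))
```

In a full FDTD code the same local update would run independently at every grid cell, which is what makes the explicit approach cheap compared with converging an implicit system over the whole domain each step.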


2013 ◽  
Vol 42 (5) ◽  
pp. e32-e32 ◽  
Author(s):  
Jun Li ◽  
Hairong Wei ◽  
Tingsong Liu ◽  
Patrick Xuechun Zhao

Abstract The accurate construction and interpretation of gene association networks (GANs) is challenging, but crucial, to the understanding of gene function, interaction, and cellular behavior at the genome level. Most current state-of-the-art computational methods for genome-wide GAN reconstruction require high-performance computational resources. However, even high-performance computing cannot fully address the complexity involved in constructing GANs from very large-scale expression profile datasets, especially for organisms with medium to large genomes, such as most plant species. Here, we present a new approach, GPLEXUS (http://plantgrn.noble.org/GPLEXUS/), which integrates a series of novel algorithms in a parallel-computing environment to construct and analyze genome-wide GANs. GPLEXUS adopts an ultra-fast estimation of pairwise mutual information that is similar in accuracy and sensitivity to the Algorithm for the Reconstruction of Accurate Cellular Networks (ARACNE) and runs ∼1000 times faster. GPLEXUS integrates the Markov Clustering Algorithm to effectively identify functional subnetworks. Furthermore, GPLEXUS includes a novel ‘condition-removing’ method to identify, from very large-scale gene expression datasets spanning many experimental conditions, the major experimental conditions in which each subnetwork operates, allowing users to annotate the various subnetworks with experiment-specific conditions. We demonstrate GPLEXUS’s capabilities by constructing global GANs and analyzing subnetworks related to defense against biotic and abiotic stress, cell cycle, and growth and division in Arabidopsis thaliana.
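The scoring step underlying ARACNE-style GAN reconstruction is pairwise mutual information between expression profiles. The histogram estimator below is a generic stand-in to show what is being computed, not GPLEXUS's accelerated estimator; the bin count and test data are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information (in nats) between two
    expression profiles: MI = sum p(x,y) * log(p(x,y) / (p(x)p(y)))."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = rng.normal(size=1000)     # independent of a
print(mutual_information(a, a), mutual_information(a, b))
```

A perfectly dependent pair scores far higher than an independent pair; a network builder would compute this score for every gene pair and keep edges above a significance threshold, which is exactly why a ∼1000x faster estimator matters at genome scale.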


2020 ◽  
Author(s):  
David Bastviken ◽  
Jonatan Nygren ◽  
Jonathan Schenk ◽  
Roser Parrelada Massana ◽  
Nguyen Thanh Duc

The lack of reliable low-cost greenhouse gas flux measurement approaches limits our ability to quantify, regulate, and verify mitigation efforts at the local level. Methane (CH4), one of the most important greenhouse gases, is particularly dependent on local measurements because its levels are regulated by a complex combination of sources, sinks, and environmental conditions. There are still major gaps in the global methane budget, and the reasons for its irregular development over time remain unclear. Facilitating local flux measurements in all parts of the world therefore seems important for constraining large-scale assessments. As the high cost of gas analysers is a limiting factor for flux measurements, we here present how low-cost CH4 sensors can be used outside their specified range to yield reasonably accurate chamber-based flux measurements. Using a two-step calibration approach, and testing multiple alternatives for modelling interference from temperature and humidity, an R² ≥ 0.99 was achieved over a CH4 concentration range of 2–700 ppm under variable temperature and relative humidity. We also demonstrate ways to reach such calibration results without complicated calibration experiments, instead using on the order of 20 in situ reference measurements under different environmental conditions. Finally, we constructed and describe a make-it-yourself Arduino-based logger with the tested sensors for CH4, temperature, humidity, and carbon dioxide (CO2), intended for flux chamber use, with a material cost of approximately 200 Euro. We hope that this can contribute to more widespread greenhouse gas flux measurements in many environments and countries.
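A calibration of this kind can be sketched as a least-squares fit mapping the raw sensor signal plus temperature and humidity readings onto reference concentrations. The data and sensor response model below are entirely synthetic assumptions for illustration; they are not the paper's calibration model or coefficients.

```python
import numpy as np

# Hypothetical sensor model: raw voltage responds to log(CH4) with
# additive temperature (T) and relative-humidity (RH) interference.
rng = np.random.default_rng(1)
ch4 = rng.uniform(2, 700, size=20)                 # ppm, ~20 in situ references
T = rng.uniform(5, 35, size=20)                    # deg C
RH = rng.uniform(30, 95, size=20)                  # %
raw = 0.8 * np.log(ch4) + 0.01 * T - 0.005 * RH + rng.normal(0, 0.01, 20)

# Fit log(CH4) as a linear function of raw signal, T and RH.
X = np.column_stack([raw, T, RH, np.ones_like(raw)])
coef, *_ = np.linalg.lstsq(X, np.log(ch4), rcond=None)

pred = np.exp(X @ coef)
resid = np.log(pred) - np.log(ch4)
r2 = 1 - np.sum(resid**2) / np.sum((np.log(ch4) - np.log(ch4).mean())**2)
print(round(r2, 3))
```

With only ~20 reference points spanning the temperature and humidity range, the interference terms are identifiable and the fit explains essentially all of the variance in this toy setup, mirroring the role of the in situ reference measurements described above.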


Author(s):  
Christopher Yeates ◽  
Cornelia Schmidt-Hattenberger ◽  
Wolfgang Weinzierl ◽  
David Bruhn

Abstract Designing low-cost network layouts is an essential step in planning linked infrastructure. For the case of capacitated trees, such as oil or gas pipeline networks, the cost is usually a function of both pipeline diameter (i.e. ability to carry flow, or transferred capacity) and pipeline length. Even for the case of incompressible, steady flow, minimizing cost becomes particularly difficult because network topology itself dictates local flow material balances, rendering the optimization space non-linear. The combinatorial nature of potential trees requires the use of graph optimization heuristics to achieve good solutions in reasonable time. In this work we compare network optimization heuristics and metaheuristics known from the literature for finding minimum-cost capacitated trees without Steiner nodes, and propose novel algorithms, including a metaheuristic based on transferring edges of high-valency nodes. Our metaheuristic outperforms the similar algorithms studied, especially for larger graphs, usually producing a significantly higher proportion of optimal solutions while remaining in line with the time complexity of algorithms found in the literature. Data points for graph node positions and capacities are first randomly generated and secondly obtained from the German emissions trading CO2 source registry. As political will grows to find applications and storage for hard-to-abate industrial CO2 emissions, efficient network design methods become relevant for new large-scale CO2 pipeline networks.
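The coupling between topology and cost can be made concrete: in a capacitated tree, each edge must carry the total demand of the subtree it feeds, and edge cost grows sublinearly with that flow (economies of scale in pipe diameter). The cost model below, including the exponent, is an assumed illustration rather than the paper's objective function.

```python
import math
from collections import defaultdict

def tree_cost(edges, demand, pos, alpha=0.6):
    """Cost of a capacitated tree rooted at the source: an edge's flow is
    the total demand of the subtree below it; edge cost = length * flow**alpha.
    edges: list of (parent, child) pairs; pos: node -> (x, y)."""
    children = defaultdict(list)
    for u, v in edges:
        children[u].append(v)

    def subtree_flow(n):
        return demand.get(n, 0.0) + sum(subtree_flow(c) for c in children[n])

    return sum(
        math.dist(pos[u], pos[v]) * subtree_flow(v) ** alpha for u, v in edges
    )

pos = {"src": (0, 0), "a": (1, 0), "b": (1, 1)}
demand = {"a": 4.0, "b": 2.0}
# two candidate topologies rooted at the CO2 source
chain = [("src", "a"), ("a", "b")]
star = [("src", "a"), ("src", "b")]
print(tree_cost(chain, demand, pos), tree_cost(star, demand, pos))
```

Moving a single edge changes the flow, and hence the effective diameter, on every edge above it; this is why the optimization space is non-linear and why heuristics such as transferring edges away from high-valency nodes must re-evaluate the whole tree cost after each move.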


2020 ◽  
Author(s):  
Christopher Yeates ◽  
Cornelia Schmidt-Hattenberger ◽  
Wolfgang Weinzierl ◽  
David Bruhn

Designing low-cost networks is an essential step in planning linked infrastructure. For the case of capacitated trees, such as oil or gas pipeline networks, the cost is usually a function of both pipeline thickness (i.e. capacity) and pipeline length. Minimizing cost becomes particularly difficult because network topology itself dictates local flow material balances, rendering the optimization space non-linear. The combinatorial nature of potential trees requires the use of graph optimization heuristics to achieve good solutions in reasonable time. In this work we compare network optimization heuristics and metaheuristics known from the literature, and propose novel algorithms, including a metaheuristic based on transferring edges of high-valency nodes. Our metaheuristic outperforms the similar algorithms studied, especially for larger graphs, usually producing a significantly higher proportion of optimal solutions while remaining in line with the time complexity of algorithms found in the literature. Data points for graph node positions and capacities are first randomly generated and secondly obtained from the German emissions trading CO2 source registry. Driven by the increasing need to find applications and storage for industrial CO2 emissions, minimum-cost network design strengthens the business case for large-scale CO2 transportation pipeline infrastructure.


1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia (Figure 1). The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilized the brown coal resource of the region, e.g. gasification and char production. Consequently, industrial wastewaters have been dominant in the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures which could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when the total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


BMC Biology ◽  
2019 ◽  
Vol 17 (1) ◽  
Author(s):  
Amrita Srivathsan ◽  
Emily Hartop ◽  
Jayanthi Puniamoorthy ◽  
Wan Ting Lee ◽  
Sujatha Narayanan Kutty ◽  
...  

Abstract Background More than 80% of all animal species remain unknown to science. Most of these species live in the tropics and belong to animal taxa that combine small body size with high specimen abundance and large species richness. For such clades, using morphology for species discovery is slow because large numbers of specimens must be sorted based on detailed microscopic investigations. Fortunately, species discovery could be greatly accelerated if DNA sequences could be used for sorting specimens to species. Morphological verification of such “molecular operational taxonomic units” (mOTUs) could then be based on dissection of a small subset of specimens. However, this approach requires cost-effective and low-tech DNA barcoding techniques, because well-equipped, well-funded molecular laboratories are not readily available in many biodiverse countries. Results We here document how MinION sequencing can be used for large-scale species discovery in a specimen- and species-rich taxon like the hyperdiverse fly family Phoridae (Diptera). We sequenced 7059 specimens collected in a single Malaise trap in Kibale National Park, Uganda, over the short period of 8 weeks. We discovered >650 species, which exceeds the number of phorid species currently described for the entire Afrotropical region. The barcodes were obtained using an improved low-cost MinION pipeline that increased the barcoding capacity sevenfold, from 500 to 3500 barcodes per flowcell. This was achieved by adopting 1D sequencing, resequencing weak amplicons on a used flowcell, and improving demultiplexing. Comparison with Illumina data revealed that the MinION barcodes were very accurate (99.99% accuracy, 0.46% Ns) and thus yielded very similar species units (match ratio 0.991). Morphological examination of 100 mOTUs also confirmed good congruence with morphology (93% of mOTUs; >99% of specimens) and revealed that 90% of the putative species belong to the neglected, megadiverse genus Megaselia.
We demonstrate for one Megaselia species how the molecular data can guide the description of a new species (Megaselia sepsioides sp. nov.). Conclusions We document that one field site in Africa can be home to an estimated 1000 species of phorids and speculate that the Afrotropical diversity could exceed 200,000 species. We furthermore conclude that low-cost MinION sequencers are very suitable for reliable, rapid, and large-scale species discovery in hyperdiverse taxa. MinION sequencing could quickly reveal the extent of the unknown diversity and is especially suitable for biodiverse countries with limited access to capital-intensive sequencing facilities.
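Sorting thousands of barcodes into mOTUs amounts to clustering sequences by pairwise distance. The sketch below is a deliberately simplified, greedy single-linkage version with toy 6-bp "barcodes"; real pipelines use full-length COI barcodes and dedicated clustering tools, and the 3% cut-off is a commonly used convention rather than this paper's exact method.

```python
def p_distance(a, b):
    """Uncorrected pairwise distance between two aligned barcodes."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def cluster_motus(seqs, threshold=0.03):
    """Greedy single-linkage clustering: a sequence joins the first mOTU
    containing a member within the distance threshold, else founds a
    new mOTU. 3% is a common cut-off for COI barcodes."""
    motus = []
    for s in seqs:
        for m in motus:
            if any(p_distance(s, t) <= threshold for t in m):
                m.append(s)
                break
        else:
            motus.append([s])
    return motus

barcodes = ["AAGTCC", "AAGTCC", "AAGACC", "TTGGAA"]
print(len(cluster_motus(barcodes)))
```

Each resulting mOTU is then a candidate species whose morphology only needs to be checked on a few representative specimens, which is the workflow the abstract describes.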


2021 ◽  
Author(s):  
Parsoa Khorsand ◽  
Fereydoun Hormozdiari

Abstract Large-scale catalogs of common genetic variants (including indels and structural variants) are being created using data from second- and third-generation whole-genome sequencing technologies. However, genotyping these variants in newly sequenced samples is a nontrivial task that requires extensive computational resources. Furthermore, current approaches are mostly limited to specific types of variants and are generally prone to various errors and ambiguities when genotyping complex events. We propose an ultra-efficient approach for genotyping any type of structural variation that is not limited by the shortcomings and complexities of current mapping-based approaches. Our method, Nebula, utilizes changes in k-mer counts to predict the genotype of structural variants. We show that Nebula is not only an order of magnitude faster than mapping-based approaches for genotyping structural variants but also has accuracy comparable to state-of-the-art approaches. Furthermore, Nebula is a generic framework not limited to any specific type of event. Nebula is publicly available at https://github.com/Parsoa/Nebula.
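The k-mer-count idea can be shown with a toy deletion genotyper: count how often k-mers unique to the reference junction versus the deletion (alt) junction appear in the reads, and call the genotype from the ratio, with no read mapping at all. This is an illustrative simplification, not Nebula's actual model; sequences, k, and the calling rule are invented.

```python
def count_kmers(seq, k):
    """All k-mers of a sequence, in order."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def genotype_deletion(reads, ref_junction, alt_junction, k=5):
    """Toy k-mer-count genotyping: k-mers unique to each allele's junction
    vote for that allele; call 0/0, 0/1 or 1/1 from which side has support."""
    ref_kmers = set(count_kmers(ref_junction, k)) - set(count_kmers(alt_junction, k))
    alt_kmers = set(count_kmers(alt_junction, k)) - set(count_kmers(ref_junction, k))
    read_kmers = [km for r in reads for km in count_kmers(r, k)]
    ref_n = sum(km in ref_kmers for km in read_kmers)
    alt_n = sum(km in alt_kmers for km in read_kmers)
    if alt_n == 0:
        return "0/0"
    if ref_n == 0:
        return "1/1"
    return "0/1"

# Reads drawn only from the deletion allele's junction sequence:
print(genotype_deletion(["GATTACGT", "TTACGTAC"], "GATTCCCCACGT", "GATTACGT"))
```

Because only exact k-mer lookups are needed, this scales with read count rather than with alignment cost, which is the source of the order-of-magnitude speedup the abstract reports.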

