Visualizing Changes in Coordinate Terms over Time: An Example of Mining Repositories of Temporal Data through their Search Interfaces

Author(s):  
Hiroaki Ohshima ◽  
Adam Jatowt ◽  
Satoshi Oyama ◽  
Katsumi Tanaka
2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Ermanno Cordelli ◽  
Paolo Soda ◽  
Giulio Iannello

Abstract Background Biological phenomena usually evolve over time, and recent advances in high-throughput microscopy have made it possible to collect multiple 3D images over time, generating 3D+t (or 4D) datasets. Extracting useful information requires spatial and temporal data on the particles in the images, but particle tracking and feature extraction need some kind of assistance. Results This manuscript introduces our new freely downloadable toolbox, the Visual4DTracker. It is a MATLAB package implementing several useful functionalities to navigate, analyse and proof-read the track of each particle detected in any 3D+t stack. Furthermore, it allows users to proof-read and evaluate the traces with respect to a given gold standard. The Visual4DTracker toolbox permits users to visualize and save all the generated results through a user-friendly graphical user interface. This tool has been successfully used in three applicative examples. The first processes synthetic data to show all the software functionalities. The second shows how to process a 4D image stack capturing the time-lapse growth of Drosophila cells in an embryo. The third presents a quantitative analysis of insulin granules in living beta-cells, showing that such particles have two main dynamics that coexist inside the cells. Conclusions Visual4DTracker is a software package for MATLAB to visualize, handle and manually track 3D+t stacks of microscopy images containing objects such as cells and granules. With its unique set of functions, it permits the user to analyze and proof-read 4D data in a friendly 3D fashion. The tool is freely available at https://drive.google.com/drive/folders/19AEn0TqP-2B8Z10kOavEAopTUxsKUV73?usp=sharing
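The core task such a tool assists with is linking detections across frames into tracks. A minimal sketch of frame-to-frame particle linking (illustrative only, not Visual4DTracker's actual algorithm; the function name and distance gate are invented): each frame is a list of (x, y, z) detections, and every particle is greedily matched to the nearest unclaimed detection in the next frame within a gating distance.

```python
import math

def link_frames(frame_a, frame_b, max_dist=5.0):
    """Greedily link each particle in frame_a to its nearest unclaimed
    neighbour in frame_b; return a list of (index_a, index_b) pairs."""
    links, claimed = [], set()
    for i, p in enumerate(frame_a):
        best_j, best_d = None, max_dist
        for j, q in enumerate(frame_b):
            if j in claimed:
                continue
            d = math.dist(p, q)  # Euclidean distance in 3D
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            links.append((i, best_j))
            claimed.add(best_j)
    return links

# Two particles persist between frames; the far-away detection at t1 is
# a new appearance and stays unlinked.
t0 = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
t1 = [(0.5, 0.2, 0.0), (10.3, 0.1, 0.0), (50.0, 50.0, 50.0)]
tracks = link_frames(t0, t1)
```

Real trackers replace the greedy nearest-neighbour step with global assignment and motion models; manual proof-reading, as the toolbox provides, is what fixes the cases where such heuristics fail.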


2021 ◽  
Vol 11 (13) ◽  
pp. 6078
Author(s):  
Tiffany T. Ly ◽  
Jie Wang ◽  
Kanchan Bisht ◽  
Ukpong Eyo ◽  
Scott T. Acton

Automatic glia reconstruction is essential for the dynamic analysis of microglia motility and morphology, notably in research on neurodegenerative diseases. In this paper, we propose an automatic 3D tracing algorithm called C3VFC that uses vector field convolution to find the critical points along the centerline of an object and trace paths that traverse back to the soma of every cell in an image. The solution provides detection and labeling of multiple cells in an image over time, leading to multi-object reconstruction. The reconstruction results can be used to extract bioinformatics from temporal data in different settings. C3VFC achieved up to a 53% improvement over the next best performing state-of-the-art tracing method, and recorded the highest accuracy scores, relative to the baseline results, in four of the five measures: entire structure average, average bi-directional entire structure average, different structure average, and percentage of different structures.
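C3VFC operates on 3D images; as a loose 1D illustration of the underlying idea (not the published algorithm), a vector field derived from image intensity points toward bright structure, and critical points are where that field changes direction, i.e. locations that attract from both sides:

```python
def vector_field(signal):
    """Finite-difference field pointing 'uphill': positive pulls right,
    negative pulls left, toward brighter structure."""
    return [signal[i + 1] - signal[i - 1] for i in range(1, len(signal) - 1)]

def critical_points(signal):
    """Signal indices where the field flips from positive to non-positive:
    points that attract from both sides (local intensity maxima).
    field[i] is the centered difference at signal index i + 1, so a flip
    between field[i] and field[i + 1] localizes a maximum near i + 2."""
    field = vector_field(signal)
    return [i + 2 for i in range(len(field) - 1)
            if field[i] > 0 and field[i + 1] <= 0]

# Toy intensity profile with two bright blobs (centerline cross-section)
intensity = [0, 1, 3, 7, 3, 1, 0, 2, 5, 2, 0]
cps = critical_points(intensity)
```

In the actual method, the field comes from convolving the image with a vector kernel, which makes the attraction long-range and noise-robust; this finite-difference toy only shows why critical points of such a field land on the centerline.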


2017 ◽  
Author(s):  
Prashanti Manda ◽  
Todd J Vision

The scientific literature contains a historic record of the changing ways in which we describe the world. Shifts in the understanding of scientific concepts are reflected in the introduction of new terms and in the changing usage and context of existing ones. We conducted an ontology-based temporal data mining analysis of biodiversity literature from the 1700s to the 2000s to quantitatively measure how the context of usage for vertebrate anatomical concepts has changed over time. The corpus of literature was divided into nine non-overlapping time periods with comparable amounts of data, and context vectors of anatomical concepts were compared to measure the magnitude of concept drift, both between adjacent time periods and cumulatively relative to the initial state. Surprisingly, we found that while anatomical concept drift between adjacent time periods was substantial (55% to 68%), it was of the same magnitude as cumulative concept drift across multiple time periods. Such a process, bound by an overall mean drift, fits the expectations of a mean-reverting process.
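The drift measurement described above can be sketched as follows (vectors and period contents invented for illustration; the paper's actual vectors come from ontology-annotated corpora): each concept gets one context vector per period, and drift is one minus the cosine similarity between two vectors, computed either between adjacent periods or cumulatively against the first period.

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def drift(u, v):
    """Concept drift as cosine distance between two context vectors."""
    return 1.0 - cosine(u, v)

# Toy context vectors for one anatomical concept across three periods
periods = [(1.0, 0.2, 0.0), (0.7, 0.6, 0.1), (0.3, 0.9, 0.4)]

# Drift between adjacent periods vs. cumulative drift from the first period
adjacent = [drift(periods[i], periods[i + 1]) for i in range(len(periods) - 1)]
cumulative = [drift(periods[0], p) for p in periods[1:]]
```

A mean-reverting process, as the paper concludes, would show adjacent and cumulative drift of similar magnitude rather than cumulative drift growing without bound.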


2020 ◽  
Vol 9 (1) ◽  
pp. 34
Author(s):  
Luigi Barazzetti ◽  
Mattia Previtali ◽  
Marco Scaioni

The identification of deterioration mechanisms and their monitoring over time is an essential phase of conservation. This work aimed to develop a novel approach to deterioration mapping and monitoring based on 360° images, which allow for simple and rapid data collection. The opportunity to capture the whole scene around a 360° camera reduces the number of images needed in a condition mapping project, making it a powerful solution for documenting small and narrow spaces. The paper describes the implemented workflow for deterioration mapping based on 360° images, which highlights pathologies on surfaces and quantitatively measures their extension. The results are available as standard outputs as well as through an innovative virtual environment for immersive visualization. The case of multi-temporal data acquisition is also considered and discussed: multiple 360° images acquired at different epochs from slightly different points are co-registered to obtain pixel-to-pixel correspondence, providing a solution to quantify and track deterioration effects.
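Once two epochs are co-registered pixel-to-pixel, tracking deterioration reduces to per-pixel comparison. This toy sketch (the threshold, data, and function names are invented; the 360° co-registration itself is far more involved and is not shown) flags pixels whose intensity change between epochs exceeds a threshold and reports the changed fraction of the surface:

```python
def change_map(epoch1, epoch2, threshold=10):
    """Binary map over two co-registered grayscale images:
    1 where |epoch2 - epoch1| > threshold, else 0."""
    return [[1 if abs(b - a) > threshold else 0
             for a, b in zip(row1, row2)]
            for row1, row2 in zip(epoch1, epoch2)]

def changed_fraction(cmap):
    """Fraction of pixels flagged as changed (proxy for pathology growth)."""
    flat = [v for row in cmap for v in row]
    return sum(flat) / len(flat)

# Tiny 2x2 example: one pixel darkened markedly between surveys
before = [[100, 100], [100, 100]]
after  = [[100,  60], [100, 100]]
cmap = change_map(before, after)
```

In practice radiometric normalization between epochs would precede the differencing, otherwise lighting changes masquerade as deterioration.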


GigaScience ◽  
2020 ◽  
Vol 9 (8) ◽  
Author(s):  
Carlos Sáez ◽  
Alba Gutiérrez-Sacristán ◽  
Isaac Kohane ◽  
Juan M García-Gómez ◽  
Paul Avillach

Abstract Background Temporal variability in health-care processes or protocols is intrinsic to medicine. Such variability can potentially introduce dataset shifts, a data quality issue when reusing electronic health records (EHRs) for secondary purposes. Temporal dataset shifts can present as trends, as well as abrupt or seasonal changes, in the statistical distributions of data over time. The latter are particularly complicated to address in multimodal and highly coded data. These changes, if not delineated, can harm population and data-driven research, such as machine learning. Given that biomedical research repositories are increasingly being populated with large sets of historical data from EHRs, there is a need for specific software methods to help delineate temporal dataset shifts to ensure reliable data reuse. Results EHRtemporalVariability is an open-source R package and Shiny app designed to explore and identify temporal dataset shifts. EHRtemporalVariability estimates the statistical distributions of coded and numerical data over time; projects their temporal evolution through non-parametric information geometric temporal plots; and enables the exploration of changes in variables through data temporal heat maps. We demonstrate the capability of EHRtemporalVariability to delineate dataset shifts in three impact case studies, one of which is available for reproducibility. Conclusions EHRtemporalVariability enables the exploration and identification of dataset shifts, contributing to the broad examination and repurposing of large, longitudinal data sets. Our goal is to help ensure reliable data reuse for a wide range of biomedical data users. EHRtemporalVariability is designed for technical users who use the R package programmatically, as well as for users unfamiliar with programming, via the Shiny user interface.
Availability: https://github.com/hms-dbmi/EHRtemporalVariability/
Reproducible vignette: https://cran.r-project.org/web/packages/EHRtemporalVariability/vignettes/EHRtemporalVariability.html
Online demo: http://ehrtemporalvariability.upv.es/
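The core computation behind this kind of temporal-variability check can be sketched as follows (in Python for illustration; the package itself is in R, and the batch data here are invented): estimate a variable's distribution per time batch, then measure the distance between batch distributions, with a large jump indicating an abrupt dataset shift. Jensen-Shannon distance is used below as one common information-theoretic choice.

```python
import math

def js_distance(p, q):
    """Jensen-Shannon distance between two discrete distributions
    (square root of the JS divergence, base-2 logs, so it lies in [0, 1])."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    def kl(x, y):
        return sum(a * math.log2(a / b) for a, b in zip(x, y) if a > 0)
    return math.sqrt((kl(p, m) + kl(q, m)) / 2)

# Monthly distribution of one coded variable over three categories
jan = [0.70, 0.20, 0.10]
feb = [0.68, 0.22, 0.10]   # stable month-to-month variation
mar = [0.20, 0.30, 0.50]   # abrupt dataset shift (e.g. a coding change)
stable_shift = js_distance(jan, feb)
abrupt_shift = js_distance(jan, mar)
```

Plotting such pairwise distances for every pair of batches is essentially what produces the heat maps mentioned above; projecting the distance matrix to low dimension yields the temporal plots.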


2019 ◽  
Author(s):  
Namita J Bhan ◽  
Jonathan Strutz ◽  
Joshua Glaser ◽  
Reza Kalhor ◽  
Edward Boyden ◽  
...  

Abstract Recording biological signals can be difficult in three-dimensional matrices, such as tissue. We present a DNA polymerase-based strategy that records temporal biosignals locally onto DNA to be read out later, which could obviate the need to extract information from tissue on the fly. We use a template-independent DNA polymerase, terminal deoxynucleotidyl transferase (TdT), which probabilistically adds dNTPs to single-stranded DNA (ssDNA) substrates without a template. We show that in vitro, the dNTP-incorporation preference of TdT changes with the presence of Co2+, Ca2+, and Zn2+, and with temperature. The signal profile over time can be extracted by examining the dNTP incorporation preference along the length of the synthesized ssDNA strands, like a molecular ticker tape. We call this TdT-based untemplated recording of temporal local environmental signals (TURTLES). We show that we can determine the time of Co2+ addition to within two minutes over a 60-minute period. Further, TURTLES can record multiple fluctuations: we can estimate the rise and fall of an input Co2+ pulse to within three minutes. TURTLES has at least 200-fold better temporal resolution than all previous DNA-based recording techniques.
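The "molecular ticker tape" read-out can be illustrated with a toy example (the strand, window size, and the assumption that the cation biases incorporation toward G are all invented for illustration, not the paper's measured preferences): scan base composition in windows along the strand from 5' to 3' (early to late), and the window where composition jumps marks when the signal changed.

```python
def g_fraction(window):
    """Fraction of G bases in a window of sequence."""
    return window.count("G") / len(window)

def composition_profile(strand, window=10):
    """Non-overlapping sliding-window G fraction along the strand
    (5' to 3' corresponds to early to late in the recording)."""
    return [g_fraction(strand[i:i + window])
            for i in range(0, len(strand) - window + 1, window)]

# Hypothetical strand: A/T-rich early, G-rich after a simulated cation spike
strand = "ATTAATATTA" * 3 + "GGTGAGGGTG" * 3
profile = composition_profile(strand)

# The first window whose G fraction jumps marks the signal onset
onset_window = next(i for i, f in enumerate(profile) if f > 0.5)
```

Converting the window index back to wall-clock time requires an estimate of the polymerase's incorporation rate, which is part of what sets the method's temporal resolution.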


Author(s):  
Matteo Golfarelli ◽  
Stefano Rizzi

Data warehouses are information repositories specialized in supporting decision making. Since the decisional process typically requires an analysis of historical trends, time and its management acquire a huge importance. In this paper we consider the variety of issues, often grouped under the term temporal data warehousing, implied by the need to accurately describe how information changes over time in data warehousing systems. We recognize that, with reference to a three-level architecture, these issues can be classified into four topics, namely: handling data/schema changes in the data warehouse, handling data/schema changes in the data mart, querying temporal data, and designing temporal data warehouses. After introducing the main concepts and terminology of temporal databases, we survey each of these topics. Finally, we discuss the open research issues, also in connection with their implementation in commercial tools.
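One of the surveyed topics, querying temporal data, can be made concrete with a minimal sketch of a versioned dimension (a type-2 slowly changing dimension; the table contents and field names are invented for illustration): each dimension row carries a validity interval, and a lookup must pick the version that was valid at the fact's date.

```python
from datetime import date

# (customer_id, city, valid_from, valid_to) -- half-open intervals;
# the current version is open-ended via date.max
customer_dim = [
    (42, "Bologna", date(2015, 1, 1), date(2018, 6, 1)),
    (42, "Milano",  date(2018, 6, 1), date.max),
]

def version_at(dim, key, when):
    """Return the attribute value of the row version valid at `when`."""
    for k, value, start, end in dim:
        if k == key and start <= when < end:
            return value
    return None

# The same customer resolves to different cities at different fact dates
city_2016 = version_at(customer_dim, 42, date(2016, 3, 1))
city_2020 = version_at(customer_dim, 42, date(2020, 3, 1))
```

The design choice here, half-open intervals with an open-ended current version, is what makes "as-of" aggregations reproducible when the dimension later changes again.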


Author(s):  
Safa Brahmia ◽  
Zouhaier Brahmia ◽  
Fabio Grandi ◽  
Rafik Bouaziz

The JSON Schema language lacks explicit support for defining time-varying schemas of JSON documents. Moreover, existing JSON NoSQL databases (e.g., MongoDB, CouchDB) do not provide any support for managing temporal data. Hence, administrators of JSON NoSQL databases have to use ad hoc techniques to specify JSON schemas for time-varying instances. In this chapter, the authors propose a disciplined approach, named Temporal JSON Schema (τJSchema), for the temporal management of JSON documents. τJSchema allows creating a temporal JSON schema from (1) a conventional JSON schema, (2) a set of temporal logical characteristics, which specify which components of a JSON document can vary over time, and (3) a set of temporal physical characteristics, which specify how the time-varying aspects are represented in the document. By using such characteristics to describe the temporal aspects of JSON data, τJSchema guarantees logical and physical data independence and provides a low-impact solution, since it requires neither updates to existing JSON documents nor extensions to related JSON technologies.
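A sketch of what a time-varying JSON component might look like under such an approach (the structure, field names, and helper below are invented for illustration; the chapter's temporal physical characteristics define the actual representation): a time-varying component holds a list of versions stamped with validity intervals, and a read picks the version covering the requested time.

```python
# One employee document with a time-varying salary component
doc = {
    "employee": "E1",
    "salary": [
        {"value": 50, "from": "2019-01", "to": "2021-01"},
        {"value": 60, "from": "2021-01", "to": "9999-12"},  # current version
    ],
}

def value_at(versions, month):
    """Pick the version whose [from, to) interval contains `month`.
    ISO-style "YYYY-MM" strings compare correctly lexicographically."""
    for v in versions:
        if v["from"] <= month < v["to"]:
            return v["value"]
    return None

salary_2020 = value_at(doc["salary"], "2020-06")
salary_2022 = value_at(doc["salary"], "2022-06")
```

The point of separating logical from physical characteristics is that this list-of-versions layout could be swapped for another (e.g. one document per version) without changing what "the salary valid in 2020-06" means.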




Author(s):  
Christian Beilschmidt ◽  
Johannes Drönner ◽  
Néstor Fernández ◽  
Christian Langer ◽  
Michael Mattig ◽  
...  

The Essential Biodiversity Variables (EBVs) are important information sources for scientists and decision makers. They are developed and promoted by the Group on Earth Observations Biodiversity Observation Network (GEO BON) together with the community. EBVs provide an abstraction level between measurements and indicators. This enables access to biodiversity observations and allows different groups of users to detect temporal trends as well as regional deviations. In particular, the analysis of EBVs supports finding countermeasures for current challenges like biodiversity loss and climate change. A visual assessment is an intuitive way to drive the analysis: for example, researchers can recognize and interpret changes in forest cover maps over time. The VAT System, where VAT stands for visualization, analysis and transformation, is an ideal candidate platform for the creation of such an analytical application. It is a geographical processing system that supports a variety of spatio-temporal data types and allows computations on heterogeneous data. For user interaction, it offers a web-based user interface built with state-of-the-art web technology. Users can perform interactive analysis of spatio-temporal data by visualizing data on maps and using various graphs and diagrams linked to the user’s area of interest. Furthermore, users can browse through the temporal dimension of the data using a time slider tool, which provides easy access to large spatio-temporal data sets. One exemplary use case is the creation of EBV statistics for selected countries or areas. This functionality is provided as an app built upon the VAT System. Here, users select EBVs, a time range and a metric, and create temporal charts that display developments over time. The charts are constructed internally by employing R scripts created by domain experts; the scripts are executed using VAT’s R connectivity module.
Finally, users can export the results to their local computers. An export contains the result itself and, additionally, a list of citations for the included EBVs as well as a workflow description of all processing steps, for reasons of reproducibility. This use case exemplifies the suitability of the VAT System for facilitating the creation of similar projects or applications without the need for programming, using VAT’s modular and flexible components.
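The chart-building step described above is, at its core, a temporal aggregation: select an EBV, a region, and a time range, then reduce the observations per time step with the chosen metric. A toy group-by-year mean (data, names, and the mean metric are invented for illustration; the VAT System runs the real computation via R scripts):

```python
from collections import defaultdict

# (year, value) observations for one selected EBV and country,
# e.g. forest cover percentage
observations = [
    (2000, 41.0), (2000, 39.0),
    (2001, 38.5), (2001, 37.5),
    (2002, 36.0),
]

def yearly_metric(obs, metric=lambda xs: sum(xs) / len(xs)):
    """Group observations by year and reduce each group with `metric`;
    the result is the series a temporal chart would plot."""
    grouped = defaultdict(list)
    for year, value in obs:
        grouped[year].append(value)
    return {year: metric(values) for year, values in sorted(grouped.items())}

chart = yearly_metric(observations)
```

Swapping the `metric` argument (median, minimum, area-weighted mean) is the analogue of the metric selection offered in the app's user interface.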

