The cone method: Inferring decision times from single-trial 3D movement trajectories in choice behavior

2020 ◽  
Author(s):  
Philipp Ulbrich ◽  
Alexander Gail

Abstract. Ongoing goal-directed movements can be rapidly adjusted following new environmental information, e.g., when chasing prey or foraging. This makes movement trajectories in go-before-you-know decision-making a suitable behavioral readout of the ongoing decision process. Yet, existing methods of movement analysis are often based on statistically comparing two groups of trial-averaged trajectories and are not easily applied to three-dimensional data, preventing them from being applicable to natural free behavior. We developed and tested the cone method to estimate the point of overt commitment (POC) along a single two- or three-dimensional trajectory, i.e., the position where the movement is adjusted towards a newly selected spatial target. In Experiment 1, we established a “ground truth” data set in which the cone method successfully identified the experimentally constrained POCs across a wide range of all but the shallowest adjustment angles. In Experiment 2, we demonstrate the power of the method in a typical decision-making task with expected decision time differences known from previous findings. The POCs identified by the cone method matched these expected effects. In both experiments, we compared the cone method’s single-trial performance with a trial-averaging method and obtained comparable results. We discuss the advantages of the single-trajectory cone method over trial-averaging methods and possible applications beyond the examples presented in this study. The cone method provides a distinct addition to existing tools used to study decisions during ongoing movement behavior, which we consider particularly promising towards studies of non-repetitive free behavior.
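
For readers wanting to experiment with single-trajectory POC estimation, here is a minimal sketch of one way to implement the geometric test the abstract suggests: walk along the trajectory and return the earliest sample from which all later positions stay inside a cone aimed at the newly selected target. The function name, the fixed half-angle, and the all-subsequent-samples criterion are our own illustrative assumptions, not the authors' published algorithm or parameters.

```python
import numpy as np

def estimate_poc(trajectory, target, half_angle_deg=30.0):
    """Estimate the point of overt commitment (POC) along one trajectory.

    Returns the earliest sample index from which every subsequent position
    lies inside a cone whose apex is that sample and whose axis points at
    the (newly selected) target. `half_angle_deg` is an illustrative free
    parameter, not a published value.
    """
    trajectory = np.asarray(trajectory, dtype=float)   # shape (N, 2) or (N, 3)
    target = np.asarray(target, dtype=float)
    cos_thresh = np.cos(np.radians(half_angle_deg))

    for i in range(len(trajectory) - 1):
        apex = trajectory[i]
        axis = target - apex
        axis_norm = np.linalg.norm(axis)
        if axis_norm == 0:
            continue
        rest = trajectory[i + 1:] - apex               # vectors from apex to later samples
        norms = np.linalg.norm(rest, axis=1)
        valid = norms > 0
        # Cosine of the angle between each later sample and the cone axis.
        cosines = (rest[valid] @ axis) / (norms[valid] * axis_norm)
        if np.all(cosines >= cos_thresh):
            return i                                   # earliest committed sample
    return None                                        # no commitment found
```

In practice the half-angle would need calibration, e.g., against a constrained ground-truth set like the one described for Experiment 1.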


1992 ◽  
Vol 16 (3) ◽  
pp. 319-338 ◽  
Author(s):  
Trevor Hoey

Temporal variability in bedload transport rates and spatial variability in sediment storage have been reported with increasing frequency in recent years. A spatial and temporal classification for these features is suggested, based on the gravel bedform classification of Church and Jones (1982). The identified scales (meso-, macro-, and mega-) are each broad, and within each there is a wide range of processes acting to produce bedload fluctuations. Sampling the same data set with different sampling intervals yields a near-linear relationship between sampling interval and pulse period. A range of modelling strategies has been applied to bed waves. The most successful have been those which allow for the three-dimensional nature of sediment storage processes and which allow changes in the width and depth of stored sediment. The existence of bed waves makes equilibrium in gravel-bed rivers necessarily dynamic. Bedload pulses and bed waves can be regarded as equilibrium forms at sufficiently long timescales.
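
The reported near-linear dependence of pulse period on sampling interval can be illustrated with a toy experiment; the synthetic transport series and the peak-based pulse detector below are our own assumptions, purely for demonstration.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(42)
# Synthetic bedload transport series (arbitrary units, 1-minute resolution).
rate = rng.exponential(scale=1.0, size=20_000)

for interval in (1, 2, 5, 10, 20):                 # sampling interval in minutes
    sampled = rate[::interval]
    peaks, _ = find_peaks(sampled, height=1.0)     # crude "pulse" detector
    if len(peaks) > 1:
        period = np.mean(np.diff(peaks)) * interval  # mean pulse period, minutes
        print(f"interval {interval:2d} min -> mean pulse period ~{period:.1f} min")
```

Because crude peak detection finds pulses at a roughly constant spacing in samples, the inferred period in minutes grows roughly in proportion to the sampling interval, echoing the relationship described above.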


2017 ◽  
Vol 14 (1) ◽  
pp. 172988141668713 ◽  
Author(s):  
Seongjo Lee ◽  
Seoungjae Cho ◽  
Sungdae Sim ◽  
Kiho Kwak ◽  
Yong Woon Park ◽  
...  

Obstacle avoidance and available-road identification technologies have been investigated for the autonomous driving of unmanned vehicles. In order to apply research results to autonomous driving in real environments, it is necessary to consider moving objects. This article proposes a preprocessing method to identify the dynamic zones where moving objects exist around an unmanned vehicle. This method accumulates three-dimensional points from a light detection and ranging sensor mounted on an unmanned vehicle in voxel space. Next, features are identified from the cumulative data at high speed, and zones with significant feature changes are estimated as zones where dynamic objects exist. The approach proposed in this article can identify dynamic zones even for a moving vehicle and processes data quickly using several features based on the geometry, height map, and distribution of the three-dimensional space data. The experiment evaluating the performance of the proposed approach was conducted using ground-truth data on simulation and real-environment data sets.
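
A minimal sketch of the accumulate-and-compare idea described above, assuming per-voxel features reduced to point count and maximum height; the voxel size, feature set, and change criterion are illustrative simplifications of the article's method.

```python
import numpy as np

def voxel_features(points, voxel_size=0.5):
    """Map a point cloud (N, 3) to per-voxel features: point count and max height."""
    keys = np.floor(points[:, :3] / voxel_size).astype(np.int64)
    feats = {}
    for key, z in zip(map(tuple, keys), points[:, 2]):
        count, zmax = feats.get(key, (0, -np.inf))
        feats[key] = (count + 1, max(zmax, z))
    return feats

def dynamic_zones(prev_feats, curr_feats, count_ratio=2.0):
    """Flag voxels whose occupancy changed strongly between accumulations."""
    zones = []
    for key in set(prev_feats) | set(curr_feats):
        c0 = prev_feats.get(key, (0, 0))[0]
        c1 = curr_feats.get(key, (0, 0))[0]
        if max(c0, c1) >= count_ratio * max(1, min(c0, c1)):
            zones.append(key)                      # candidate dynamic voxel
    return zones
```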


2014 ◽  
Vol 70 (a1) ◽  
pp. C368-C368 ◽  
Author(s):  
Alexander Eggeman ◽  
Robert Krakow ◽  
Paul Midgley

STEM- and TEM-based tomography has been used widely to study the 3D morphology of a wide range of materials. Similarly, reciprocal-space tomography, in which a tilt series of diffraction patterns is acquired, offers a powerful method for the analysis of the atomic structure of crystalline materials. The natural progression is to combine these techniques into a complete three-dimensional morphology and crystallography data set, allowing both features to be studied simultaneously. Using a tilt series of scanning precession electron diffraction measurements from a commercially available Ni-base superalloy as an example, the complete reciprocal-lattice orientation for a number of components embedded within the matrix could be determined. It was straightforward to identify reciprocal-lattice vectors that allowed dark-field images representing each phase to be produced post-acquisition. In turn, these were combined using geometric tomography methods to yield a 3-D tomogram of the superalloy. Imaging these phases using conventional ADF STEM tomography would potentially be challenging given the compositional similarity between the different phases. From the combined dataset the spatial distribution of the component phases could be easily recovered, but more importantly the orientational relationships between these different components could be unambiguously determined. In this way the thermo-mechanical history of the sample could be inferred from the arrangement of coherent and semi-coherent interfaces, and a previously unreported crystallographic registry between the metal carbide (MC) and matrix f.c.c. phases could be identified. The possibilities for development and applications of this technique will be discussed further.
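
The post-acquisition dark-field step lends itself to a compact illustration: given a 4D scanning-diffraction data set, a "virtual" dark-field image can be formed by integrating intensity around one chosen diffraction spot. The sketch below assumes a (scan_y, scan_x, ky, kx) array layout; the spot coordinates and disk radius are placeholders, and this is not the authors' actual processing pipeline.

```python
import numpy as np

def virtual_dark_field(data4d, spot_ky, spot_kx, radius=3):
    """Virtual dark-field image from a 4D scanning-diffraction data set.

    data4d : array (scan_y, scan_x, ky, kx) of diffraction patterns.
    Integrates intensity in a small disk around one diffraction spot, so
    scan positions where that reflection is excited appear bright.
    """
    ny, nx, ky, kx = data4d.shape
    yy, xx = np.ogrid[:ky, :kx]
    mask = (yy - spot_ky) ** 2 + (xx - spot_kx) ** 2 <= radius ** 2
    return data4d[..., mask].sum(axis=-1)          # (scan_y, scan_x) image
```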


2014 ◽  
Vol 60 ◽  
pp. 39-55
Author(s):  
R. A. Crowther ◽  
A. G. W. Leslie

Ulrich (Uli) Arndt was a physicist and engineer whose contributions to the development of a wide range of instrumentation for X-ray crystallography played an important part in our ability to solve the atomic structure of large biological molecules. Such detailed information about protein structures has for the past 50 years underpinned the huge advances in the field of molecular biology. His innovations spanned all aspects of data generation and collection, from improvements in X-ray tubes, through novel designs for diffractometers and cameras to film scanners and more direct methods of X-ray detection. When he started in the field, the intensities of individual X-ray reflections were often estimated by eye from films. By the end of his career the whole process of collecting from a crystal a three-dimensional data set, possibly comprising hundreds of thousands of measurements, was fully automated and very rapid.


2021 ◽  
Author(s):  
Mikko Johannes Lensu ◽  
Markku Henrik Similä

Abstract. The statistics of ice ridging signatures were studied using a high-resolution (1.25 m) and a medium-resolution (20 m) SAR image over the Baltic sea ice cover, acquired in 2016 and 2011, respectively. Ice surface profiles measured during a 2011 Baltic campaign were used as ground-truth data for both. The images did not delineate individual ridges well as linear features. This was attributed to the random, intermittent occurrence of ridge rubble block arrangements with bright SAR return. Instead, the ridging signature was approached in terms of the density of bright pixels, and its relations with the corresponding surface-profile quantity, ice ridge density, were studied. In order to apply discrete statistics, these densities were quantified by counting bright pixel numbers (BPN) in pixel blocks of side length L, and by counting ridge sail numbers (RSN) in profile segments of length L. The scale L is a variable parameter of the approach. The other variable parameter is the pixel intensity threshold defining bright pixels (equivalently, the bright pixel percentage, BPP) or, respectively, the ridge sail height threshold used to select ridges from the surface profiles. Applied as a sliding image operation, the BPN count enhanced the ridging signature and improved the applicability of SAR to ice information production. A distribution model for the BPN statistics was derived by considering how the BPN values change as the BPP changes. The model was found to apply over a wide range of values for BPP and L. The same distribution model was found to apply to the RSN statistics. This reduces the problem of correspondence between the two density concepts to connections between the parameters of the respective distribution models. The correspondence was studied for the medium-resolution image, for which the 2011 surface data set has a close temporal match. The comparison was done by estimating ridge rubble coverage in 1 km² squares from the surface profile data and, on the other hand, assuming that the bright pixel density can be used as a proxy for ridge rubble coverage. Apart from a scaling factor, both were found to follow the presented distribution model.
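
A sketch of the BPN count as we read it, assuming non-overlapping blocks for brevity (the paper applies it as a sliding operation); the default BPP and block size are illustrative.

```python
import numpy as np

def bright_pixel_numbers(image, bpp=5.0, block=32):
    """Count bright pixels (BPN) in non-overlapping blocks of side `block`.

    The intensity threshold is chosen so that `bpp` percent of all pixels
    are 'bright', matching the bright-pixel-percentage parameterisation.
    """
    thresh = np.percentile(image, 100.0 - bpp)
    bright = image > thresh
    h, w = bright.shape
    h, w = h - h % block, w - w % block            # trim to whole blocks
    blocks = bright[:h, :w].reshape(h // block, block, w // block, block)
    return blocks.sum(axis=(1, 3))                 # BPN per block
```

A sliding-window variant could be obtained by convolving the binary bright-pixel mask with an L×L box kernel instead of reshaping into blocks.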


Author(s):  
Bridget Carragher

Structural biologists typically acquire data in the form of a two-dimensional image (or set of images) from which the three-dimensional structure of the object of interest must be inferred. Examples can be found over a range of sizes spanning many orders of magnitude, and covering structures from the macroscopic to the atomic scale. A correspondingly wide range of different instruments is used in the collection of this data, from CT/MRI scanners, through light and electron microscopes, and recently, atomic force instruments. The images which are collected from these instruments may be in the form of a series of 2D slices through the 3D data set (and these may be either physical sections or optical sections) or a series of tomographic 2D projections of the 3D dataset. In either case it is highly likely that computer software tools will be used on the data set either as an aid in the qualitative interpretation of the structure or as a means of extracting quantitative morphological measurements.
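
As a concrete (if simplified) instance of inferring structure from tomographic projections, the sketch below reconstructs a 2D slice from 1D projections by unfiltered backprojection; real packages use filtered backprojection or iterative schemes, so this is only a didactic toy, not any particular instrument's software.

```python
import numpy as np
from scipy.ndimage import rotate

def backproject(sinogram, angles_deg):
    """Reconstruct a 2D slice from 1D projections by unfiltered backprojection.

    sinogram : array (n_angles, n_detectors), one projection per angle.
    Each projection is smeared back across the image plane at its angle
    and the contributions are summed -- the simplest tomographic inverse.
    """
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, angle in zip(sinogram, angles_deg):
        smear = np.tile(proj, (n, 1))                       # constant along rays
        recon += rotate(smear, angle, reshape=False, order=1)
    return recon / len(angles_deg)
```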


Geophysics ◽  
1996 ◽  
Vol 61 (4) ◽  
pp. 1050-1064 ◽  
Author(s):  
Mark Grasmueck

Three‐dimensional, ground‐penetrating radar (georadar) techniques suitable for geological engineering applications have been developed and tested. Initial experiments were conducted on the floor of a quarry in southern Switzerland from which ornamental gneissic rock is extracted. During a brief two‐day period, constant‐offset georadar data were recorded over a [Formula: see text] area with a grid cell size of 0.1 m × 0.2 m. Georadar velocities were estimated from the results of expanding spread surveys. All georadar data and associated geometry files were recorded automatically in seismic industry formats. The experimental georadar data set was processed, image‐enhanced, and interpreted using [Formula: see text] seismic reflection software operating on a workstation. Arbitrary vertical sections, time slices, [Formula: see text] images, and animated movies in which the observer “travels” through the entire data volume were constructed from the resultant migrated georadar data. Semi‐automatic tracking routines allowed continuous subhorizontal reflections to a maximum depth of 30 m to be mapped through the rock mass. These reflections, which are characterized by negative polarity onsets, are probably caused by a system of ubiquitous water‐filled fractures, 2–4 cm thick. Volumes of rock bounded by the subhorizontal fractures were estimated from isopach maps and rock quality was assessed on the basis of root‐mean‐square (rms) amplitudes of reflections. An extension of a steep‐dipping fault exposed on a nearby quarry wall was best delineated on maps representing the horizontal gradients of reflection times. To synthesize in a single figure the principal geological results of the study, picked reflection times were presented in the form of shaded relief surfaces, in which remarkably vivid structural details of the subhorizontal fractures and intersecting near‐vertical fault could be discerned. It is concluded that 3-D georadar methods have the potential to resolve a wide range of engineering and environmental problems.
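
Two of the interpretation products mentioned, rms reflection amplitudes and horizontal gradients of reflection times, are simple to compute once the migrated volume and picked horizons are in hand. The sketch below assumes a (n_x, n_y, n_t) trace volume and a map of picked times; the array layout is an assumption, and the grid spacings default to the 0.1 m × 0.2 m cell size quoted above.

```python
import numpy as np

def rms_amplitude_map(volume, t0, t1):
    """RMS amplitude per trace within a time window of a 3D georadar volume.

    volume : array (n_x, n_y, n_t) of migrated traces;
    t0, t1 : sample indices bracketing the reflection of interest.
    """
    window = volume[:, :, t0:t1].astype(float)
    return np.sqrt(np.mean(window ** 2, axis=2))

def time_gradient_map(picked_times, dx=0.1, dy=0.2):
    """Magnitude of the horizontal gradient of picked reflection times (map view)."""
    gx, gy = np.gradient(picked_times, dx, dy)     # picked_times: (n_x, n_y)
    return np.hypot(gx, gy)
```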


2014 ◽  
Vol 1 (1) ◽  
pp. 81-98 ◽  
Author(s):  
Renu Vashist ◽  
M. L. Garg

Rough Set Theory (RST) is a relatively new and powerful mathematical tool for dealing with imperfect data (i.e., data with uncertainty and vagueness), used primarily for classification and decision-making problems. Logistic regression (Logit), on the other hand, is mainly used in the social sciences when the dependent variable takes limited, categorical value ranges. Both RST and Logit regression are powerful predictive models used in a wide range of applications such as medicine, the military, banking, and financial markets. RST uses approximations and implications as two formal tools to deal with vagueness, whereas Logit regression is severely constrained in dealing with vague and imprecise data. Yet both methodologies are used to classify objects, which is the key issue in decision making. This research paper compares the two tools on a common dataset. SPSS 17.0 software is used to run the Logit regression, and Rose 2 software is used for the rough-set analysis. One important finding of this comparison is that the attributes in the core of the data set under the rough-set approach are similar to the most significant predictors of the logistic regression model, indicating that the significant attributes identified by the two methodologies agree. It is demonstrated that rough set is a superior tool for classifying objects compared to logistic regression. Another important outcome of this research is that the degree of accuracy is much higher for rough set than for logistic regression, establishing rough set as the better decision-making tool.
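
The "core" mentioned above has a standard rough-set definition: an attribute belongs to the core if removing it shrinks the positive region of the decision table. A minimal sketch, assuming the decision table is a list of dictionaries; this is generic RST machinery, not the Rose 2 implementation.

```python
def positive_region(rows, attrs, decision):
    """Objects whose indiscernibility class (w.r.t. attrs) has a unique decision."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), []).append(i)
    pos = set()
    for members in classes.values():
        if len({rows[i][decision] for i in members}) == 1:   # consistent class
            pos.update(members)
    return pos

def core(rows, attrs, decision):
    """Attributes whose removal shrinks the positive region (the RST core)."""
    full = positive_region(rows, attrs, decision)
    return [a for a in attrs
            if positive_region(rows, [b for b in attrs if b != a], decision) != full]

# Tiny decision table: 'd' is the decision attribute.
table = [{"a": 1, "b": 0, "d": "yes"},
         {"a": 1, "b": 1, "d": "no"},
         {"a": 0, "b": 1, "d": "no"}]
print(core(table, ["a", "b"], "d"))   # -> ['b'] (only b is indispensable here)
```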


Author(s):  
J. Golze ◽  
U. Feuerhake ◽  
C. Koetsier ◽  
M. Sester

Abstract. The wide usage of GPS-equipped devices enables the mass recording of vehicle movement trajectories describing the movement behavior of traffic participants. An important aspect of road traffic is the impact of anomalies, such as accidents, on traffic flow. Accidents are especially important as they affect safety and also influence travel time estimations. In this paper, the impact of accidents is determined based on a massive GPS trajectory and accident dataset. Because the data set used lacks the precise dates of the accidents, the date of each accident is first estimated based on the speed profile at the accident time. The temporal impact of the accident is then estimated using the speed profile of the whole day. The approach is applied in an experiment on a one-month subset of the datasets. The results show that more than 72% of the accident dates are identified and that the impact in the temporal dimension is approximated. Moreover, it can be seen that accidents during rush hours and on high-frequency road types (e.g., motorways, trunks, or primaries) increase the duration of the impact on traffic flow.
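
A sketch of the second step, estimating the temporal impact from a day's speed profile: the impact is taken to be the contiguous stretch of time bins around the accident in which speed stays below a fraction of the day's baseline. The drop threshold and the median baseline are our own illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def impact_duration(speeds, accident_idx, baseline=None, drop=0.8):
    """Estimate how long an accident depresses traffic speed.

    speeds       : mean speed per time bin over the whole day (1D array).
    accident_idx : bin index of the (estimated) accident time.
    The impact spans the contiguous bins around the accident where speed
    stays below `drop` times the day's baseline; `drop` is illustrative.
    """
    speeds = np.asarray(speeds, dtype=float)
    if baseline is None:
        baseline = np.median(speeds)
    below = speeds < drop * baseline
    if not below[accident_idx]:
        return 0
    start = accident_idx
    while start > 0 and below[start - 1]:
        start -= 1
    end = accident_idx
    while end + 1 < len(speeds) and below[end + 1]:
        end += 1
    return end - start + 1                         # duration in time bins
```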

