Manipulation for self-Identification, and self-Identification for better manipulation

2021 ◽  
Vol 6 (54) ◽  
pp. eabe1321
Author(s):  
Kaiyu Hang ◽  
Walter G. Bircher ◽  
Andrew S. Morgan ◽  
Aaron M. Dollar

The process of modeling a series of hand-object parameters is crucial for precise and controllable robotic in-hand manipulation because it enables the mapping from the hand’s actuation input to the object’s motion to be obtained. Without assuming that most of these model parameters are known a priori or can be easily estimated by sensors, we focus on equipping robots with the ability to actively self-identify necessary model parameters using minimal sensing. Here, we derive algorithms, on the basis of the concept of virtual linkage-based representations (VLRs), to self-identify the underlying mechanics of hand-object systems via exploratory manipulation actions and probabilistic reasoning and, in turn, show that the self-identified VLR can enable the control of precise in-hand manipulation. To validate our framework, we instantiated the proposed system on a Yale Model O hand without joint encoders or tactile sensors. The passive adaptability of the underactuated hand greatly facilitates the self-identification process, because it naturally secures stable hand-object interactions during random exploration. Relying solely on an in-hand camera, our system can effectively self-identify the VLRs, even when some fingers are replaced with novel designs. In addition, we show in-hand manipulation applications of handwriting, marble maze playing, and cup stacking to demonstrate the effectiveness of the VLR in precise in-hand manipulation control.
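The identify-then-control loop the abstract describes can be illustrated with a deliberately simplified sketch. Here a linear actuation-to-motion map stands in for the VLR (which in the paper is a richer kinematic representation identified by probabilistic reasoning); the map, the noise level, and all variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Toy identify-then-control loop (assumed linear stand-in for the VLR):
# 1) explore with random actuation, 2) self-identify the actuation-to-object-
# motion map from observations, 3) invert the identified map for control.

rng = np.random.default_rng(0)
J_true = np.array([[0.8, 0.1], [-0.2, 0.9]])  # unknown hand-object mechanics

# 1) Exploration: random actuation deltas; object motions observed (e.g. by
#    an in-hand camera), with a little measurement noise.
U = rng.normal(size=(50, 2))                         # actuation inputs
X = U @ J_true.T + 0.01 * rng.normal(size=(50, 2))   # observed object motions

# 2) Self-identification: least-squares estimate of the map.
B, *_ = np.linalg.lstsq(U, X, rcond=None)
J_hat = B.T

# 3) Control: solve for the actuation that yields a desired object motion.
dx_desired = np.array([0.05, -0.03])
u = np.linalg.solve(J_hat, dx_desired)
dx_achieved = J_true @ u   # close to dx_desired if identification succeeded
```

The same pattern (explore, fit, invert) underlies the paper's pipeline, with the VLR replacing the linear map.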

2019 ◽  
Author(s):  
Joseph John Pyne Simons ◽  
Ilya Farber

Not all transit users have the same preferences when making route decisions. Understanding the factors driving this heterogeneity enables better tailoring of policies, interventions, and messaging. However, existing methods for assessing these factors require extensive data collection. Here we present an alternative approach: an easily administered, single-item measure of overall preference for speed versus comfort. Scores on the self-report item predict decisions in a choice task and account for a proportion of the differences in model parameters between people (n=298). This single item can easily be included on existing travel surveys, and provides an efficient method both to anticipate the choices of users and to gain more general insight into their preferences.


2021 ◽  
Vol 47 (4) ◽  
pp. 392-401
Author(s):  
Volker Kaul

Liberalism believes that individuals are endowed a priori with reason, or at least agency, and that it is up to that reason and agency to make choices, commitments and so on. Communitarianism criticizes liberalism’s explicit and deliberate neglect of the self and insists that we attain a self and identity only through the effective recognition of significant others. However, personal autonomy does not seem to be a default position: neither reason nor community inevitably provides it. It is therefore important to go beyond the liberal–communitarian divide. This article analyses various proposals in this direction, asks about the place of communities and the individual in times of populism and the pandemic, and provides a global perspective on the liberal–communitarian debate.


2011 ◽  
Vol 64 (S1) ◽  
pp. S3-S18 ◽  
Author(s):  
Yuanxi Yang ◽  
Jinlong Li ◽  
Junyi Xu ◽  
Jing Tang

Integrated navigation using multiple Global Navigation Satellite Systems (GNSS) is beneficial to increase the number of observable satellites, alleviate the effects of systematic errors and improve the accuracy of positioning, navigation and timing (PNT). When multiple constellations and multiple frequency measurements are employed, the functional and stochastic models as well as the estimation principle for PNT may be different. Therefore, the commonly used definition of “dilution of precision (DOP)”, based on least squares (LS) estimation and unified functional and stochastic models, will no longer be applicable. In this paper, three types of generalised DOPs are defined. The first type of generalised DOP is based on the error influence function (IF) of pseudo-ranges, which reflects the geometry strength of the measurements, the error magnitude and the estimation risk criteria. When least squares estimation is used, the first type of generalised DOP is identical to the one commonly used. In order to define the first type of generalised DOP, an IF of signal-in-space (SIS) errors on the parameter estimates of PNT is derived. The second type of generalised DOP is defined based on the functional model with additional systematic parameters induced by the compatibility and interoperability problems among different GNSS systems. The third type of generalised DOP is defined based on Bayesian estimation, in which the a priori information of the model parameters is taken into account. This is suitable for evaluating the precision of kinematic positioning or navigation. Different types of generalised DOPs are suitable for different PNT scenarios, and an example of the calculation of these DOPs for multi-GNSS systems including GPS, GLONASS, Compass and Galileo is given. New observation equations of Compass and GLONASS that may contain additional parameters for interoperability are specifically investigated. The analysis shows that if the interoperability of multi-GNSS is not fulfilled, the increased number of satellites will not significantly reduce the generalised DOP value. Furthermore, outlying measurements will not change the original DOP, but will change the first type of generalised DOP, which includes a robust error IF. A priori information of the model parameters will also reduce the DOP.
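The classical least-squares DOP that the paper generalises can be sketched in a few lines; a simple Bayesian variant (the function and parameter names here are illustrative assumptions, not the paper's notation) also shows why a priori parameter information lowers the DOP, as the abstract states.

```python
import numpy as np

# Classical GDOP: with unit line-of-sight vectors e_i, the design matrix is
# A = [e_i, 1] (position + receiver clock bias) and GDOP = sqrt(tr((A^T A)^-1)).
# Adding an a priori precision term to the normal matrix (a crude Bayesian
# variant) can only shrink the inverse, hence the DOP.

def gdop(los, prior_precision=0.0):
    """los: (n, 3) array of unit line-of-sight vectors to satellites."""
    A = np.hstack([los, np.ones((los.shape[0], 1))])  # 3 coords + clock bias
    N = A.T @ A + prior_precision * np.eye(4)         # normal matrix (+ prior)
    return np.sqrt(np.trace(np.linalg.inv(N)))

# Four satellites in a plausible geometry (one at zenith, three lower).
los = np.array([[0.0, 0.0, 1.0],
                [0.9, 0.0, 0.436],
                [-0.45, 0.78, 0.436],
                [-0.45, -0.78, 0.436]])
los /= np.linalg.norm(los, axis=1, keepdims=True)

g_ls = gdop(los)            # commonly used least-squares GDOP
g_bayes = gdop(los, 0.5)    # with a priori information: strictly smaller
```

The generalised DOPs in the paper additionally account for robust error influence functions and inter-system bias parameters, which this sketch omits.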


2020 ◽  
Vol 24 (9) ◽  
pp. 4567-4574
Author(s):  
Daniel Erdal ◽  
Olaf A. Cirpka

Abstract. In global sensitivity analysis and ensemble-based model calibration, it is essential to create a large enough sample of model simulations with different parameters that all yield plausible model results. This can be difficult if a priori plausible parameter combinations frequently yield non-behavioral model results. In a previous study (Erdal and Cirpka, 2019), we developed and tested a parameter-sampling scheme based on active-subspace decomposition. While in principle this scheme worked well, it still implied testing a substantial fraction of parameter combinations that ultimately had to be discarded because of implausible model results. This technical note presents an improved sampling scheme and illustrates its simplicity and efficiency by a small test case. The new sampling scheme can be tuned either to outperform the original implementation by improving the sampling efficiency while maintaining the accuracy of the result, or to improve the accuracy of the result while maintaining the sampling efficiency.
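The core idea of pre-screening parameter combinations in a low-dimensional active subspace can be illustrated with a toy stand-in (the linear "model", threshold, and all names are assumptions for illustration, not the authors' scheme): cheap pilot runs identify the sensitive direction, and candidates predicted to be non-behavioral are discarded before the expensive model is run.

```python
import numpy as np

# Toy active-subspace pre-screening: the model depends strongly on only a few
# parameter directions, so a cheap surrogate built from pilot runs can reject
# most non-behavioral candidates before the full model is evaluated.

rng = np.random.default_rng(1)
w = np.array([2.0, -1.0, 0.5, 0.0, 0.0])   # model sensitive to few dimensions

def model(x):
    """Stand-in for an expensive simulation; behavioral if output > 0.5."""
    return x @ w

# Pilot runs: estimate the dominant (active) direction by linear regression.
X_pilot = rng.uniform(-1, 1, size=(200, 5))
w_hat, *_ = np.linalg.lstsq(X_pilot, model(X_pilot), rcond=None)

# Candidates: keep only those whose active-subspace projection predicts a
# behavioral result, then verify the kept ones with the real model.
X_cand = rng.uniform(-1, 1, size=(2000, 5))
accepted = X_cand[X_cand @ w_hat > 0.5]
screened_rate = np.mean(model(accepted) > 0.5)   # hit rate after screening
naive_rate = np.mean(model(X_cand) > 0.5)        # hit rate without screening
```

In the technical note the screening surrogate is tunable, trading sampling efficiency against accuracy; here the surrogate is exact only because the toy model is linear.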


Geophysics ◽  
2005 ◽  
Vol 70 (1) ◽  
pp. J1-J12 ◽  
Author(s):  
Lopamudra Roy ◽  
Mrinal K. Sen ◽  
Donald D. Blankenship ◽  
Paul L. Stoffa ◽  
Thomas G. Richter

Interpretation of gravity data warrants uncertainty estimation because of its inherent nonuniqueness. Although the uncertainties in model parameters cannot be completely reduced, they can aid in the meaningful interpretation of results. Here we have employed a simulated annealing (SA)-based technique in the inversion of gravity data to derive multilayered earth models consisting of two- and three-dimensional bodies. In our approach, we assume that the density contrast is known, and we solve for the coordinates or shapes of the causative bodies, resulting in a nonlinear inverse problem. We attempt to sample the model space extensively so as to estimate several equally likely models. We then use all the models sampled by SA to construct an approximate, marginal posterior probability density function (PPD) in model space, together with moments of several orders. The correlation matrix clearly shows the interdependence of different model parameters and the corresponding trade-offs. Such correlation plots are used to study the effect of a priori information in reducing the uncertainty in the solutions. We also investigate the use of derivative information to obtain better depth resolution and to reduce underlying uncertainties. We applied the technique to two synthetic data sets and an airborne-gravity data set collected over Lake Vostok, East Antarctica, for which a priori constraints were derived from available seismic and radar profiles. The inversion results produced depths of the lake in the survey area along with the thickness of sediments. The resulting uncertainties are interpreted in terms of the experimental geometry and data error.
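The idea of using SA samples to approximate a marginal PPD can be sketched on a toy one-dimensional problem (the quadratic forward model, cooling schedule, and all names are illustrative assumptions, not the gravity forward model of the paper): Metropolis acceptance at a slowly cooled temperature explores the model space, and the collected models yield approximate posterior moments.

```python
import numpy as np

# Toy SA sampling of a 1-D model space: observations of m^2 with noise, a
# sum-of-squares misfit, Metropolis acceptance with geometric cooling, and
# posterior moments estimated from the collected samples.

rng = np.random.default_rng(2)
m_true = 2.0
data = m_true**2 + 0.05 * rng.normal(size=20)   # noisy observations of m^2

def misfit(m):
    return np.sum((data - m**2) ** 2)

m, T, samples = 1.0, 1.0, []
for _ in range(5000):
    m_new = m + 0.1 * rng.normal()              # random perturbation
    d = misfit(m_new) - misfit(m)
    if d < 0 or rng.random() < np.exp(-d / T):  # Metropolis acceptance
        m = m_new
    samples.append(m)
    T = max(0.05, T * 0.999)                    # slow geometric cooling

post = np.array(samples[1000:])                 # discard burn-in
m_est, m_std = post.mean(), post.std()          # approximate marginal PPD moments
```

In the paper the same sample collection additionally feeds correlation matrices between parameters; with a single parameter this sketch only recovers the mean and spread.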


2020 ◽  
Vol 25 (1) ◽  
pp. 71-88
Author(s):  
Robert Farrugia

Michel Henry radicalises phenomenology by putting forward the idea of a double manifestation: the “Truth of Life” and “truth of the world.” For Henry, the world turns out to be empty of Life. To find its essence, the self must dive completely inward, away from the exterior movements of intentionality. Hence, Life, or God, for Henry, lies in non‑intentional, immanent self-experience, which is felt and yet remains invisible, in an absolutist sense, as an a priori condition of all conscious experience. In Christian theology, the doctrine of the Trinity illuminates the distinction between the immanent Trinity (God’s self‑relation) and the economic workings of the Trinity (God‑world relation). However, the mystery of God’s inmost being and the economy of salvation are here understood as inseparable. In light of this, the paper aims to: 1) elucidate the significance of Henry’s engagement with the phenomenological tradition and his proposal of a phenomenology of Life which advocates an immanent auto‑affection, radically separate from the ek‑static nature of intentionality, and 2) confront the division between Life and world in Henry’s Christian phenomenology and its discordancy with the doctrine of the Trinity, as the latter attests to the harmonious unity that subsists between inner life and the world.


2013 ◽  
Vol 8 (No. 4) ◽  
pp. 186-194
Author(s):  
M. Heřmanovský ◽  
P. Pech

This paper demonstrates an application of a previously published method for selecting optimal catchment descriptors, by which similar catchments can be identified for the purpose of estimating the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters for a set of tested catchments, based on the physical-similarity approach. For the purpose of the analysis, the following data from the Model Parameter Estimation Experiment (MOPEX) project were taken: a priori model parameter sets used as reference values for comparison with the newly estimated parameters, and catchment descriptors of four categories (climatic descriptors, soil properties, land cover and catchment morphology). The inverse clustering method, with Andrews’ curves for a homogeneity check, was used for the catchment grouping process. The optimal catchment descriptors were selected on the basis of two criteria, one comparing different subsets of catchment descriptors of the same size (MIN), the other evaluating the improvement after the addition of another catchment descriptor (MAX). The results suggest that the proposed method and the two criteria used may lead to the selection of a subset of conditionally optimal catchment descriptors from a broader set of them. As expected, the quality of the resulting subset of optimal catchment descriptors is mainly dependent on the number and type of the descriptors in the broader set. In the presented case study, six to seven catchment descriptors (two climatic, two soil and at least two land-cover descriptors) were identified as optimal for regionalisation of the SAC-SMA model parameters for a set of MOPEX catchments.
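The subset-evaluation idea can be sketched with a toy stand-in (synthetic data and a crude two-group split; not the published inverse-clustering implementation, the Andrews-curve check, or the MOPEX data): each candidate descriptor subset is scored by how homogeneous the reference model parameters are within the groups it induces, and the best-scoring subset is retained.

```python
import numpy as np
from itertools import combinations

# Toy descriptor-subset search: cluster catchments on each candidate subset
# of descriptors, score within-group homogeneity of reference parameters,
# and keep the subset with the lowest (best) score.

rng = np.random.default_rng(3)
n = 60
D = rng.normal(size=(n, 4))               # 4 candidate catchment descriptors
params = 2.0 * D[:, 0] + 0.5 * D[:, 1]    # parameters driven by descriptors 0, 1
params += 0.05 * rng.normal(size=n)

def score(subset):
    """Mean within-group parameter variance after splitting the catchments
    on the subset's first principal direction (a crude 2-group clustering)."""
    centered = D[:, list(subset)] - D[:, list(subset)].mean(axis=0)
    u = np.linalg.svd(centered, full_matrices=False)[2][0]
    labels = centered @ u > 0
    return np.mean([params[labels == g].var() for g in (True, False)])

# Exhaustively score all subsets of size 1 and 2 (the MIN-style comparison).
best = min((s for k in (1, 2) for s in combinations(range(4), k)), key=score)
```

The informative descriptor (index 0 here) ends up in the winning subset because grouping on it makes the parameters homogeneous within groups; descriptors 2 and 3 carry no parameter information and are rejected.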


1997 ◽  
Vol 43 (143) ◽  
pp. 180-191 ◽  
Author(s):  
E. M. Morris ◽  
H. -P. Bader ◽  
P. Weilenmann

A physics-based snow model has been calibrated using data collected at Halley Bay, Antarctica, during the International Geophysical Year. Variations in snow temperature and density are well-simulated using values for the model parameters within the range reported from other polar field experiments. The effect of uncertainty in the parameter values on the accuracy of the predictions is no greater than the effect of instrumental error in the input data. Thus, this model can be used with parameters determined a priori rather than by optimization. The model has been validated using an independent data set from Halley Bay and then used to estimate 10 m temperatures on the Antarctic Peninsula plateau over the last half-century.


2016 ◽  
Vol 541 ◽  
pp. 421-433 ◽  
Author(s):  
Humberto Vergara ◽  
Pierre-Emmanuel Kirstetter ◽  
Jonathan J. Gourley ◽  
Zachary L. Flamig ◽  
Yang Hong ◽  
...  
