scale granularity
Recently Published Documents

TOTAL DOCUMENTS: 10 (FIVE YEARS: 4)
H-INDEX: 3 (FIVE YEARS: 1)

2021 · Vol 2021 (09) · pp. 0926
Author(s): Terry Bollinger

This paper provides a reference copy of one highly informal comment from a multiweek Academia.edu discussion of the paper Randomness in Relational Quantum Mechanics by Gary Gordon. The other main participants in this thread of the discussion were Doug Marman, Conrad Dale Johnson, Ruth Kastner, and the author. In the comment, the author argues that the only self-consistent way to reconcile Feynman path integrals with Maxwell's experimentally well-proven theory of electromagnetic wave pressure is to introduce a new spin-0 particle, the vacuum or space phonon (sonon), which conveys linear momentum. The path histories of QED become the always-expanding structure of the sonon field, which, like a bubble, becomes increasingly unstable as it expands. The collection of all sonon fields around well-defined bundles of conserved quantum properties creates xyz space by defining the complete set of relational information for those entities. Spacetime in the sonon model is granular, multi-scale, and entirely mass-energy dependent. Implications of the sonon model are discussed, including the need for a drastic update to general relativity to take the multi-scale granularity of spacetime directly into account, rather than explaining it obliquely via models such as dark matter, dark energy, or MOND.


2021 · Vol 2021 · pp. 1-9
Author(s): Yang Qiao, Yunjie Tian, Yue Liu, Jianbin Jiao

Object skeleton detection requires convolutional neural networks to recognize objects and their parts against cluttered backgrounds, overcome the resolution degradation introduced by pooling layers, and predict the locations of skeleton pixels at different scale granularities. Most existing object skeleton detection methods devote great effort to designing side-output networks for multiscale feature fusion. Despite the progress these methods have achieved, several problems still hinder the development of object skeleton detection: manual network design is labor-intensive, and network initialization depends on models pretrained on large-scale datasets. To alleviate these issues, we propose a genetic NAS method that automatically searches a newly designed architecture search space for adaptive multiscale feature fusion. Furthermore, we introduce a symmetric encoder-decoder search space based on reversing the VGG network, in which the decoder can reuse the ImageNet-pretrained VGG model. The searched networks improve on the performance of state-of-the-art methods on commonly used skeleton detection benchmarks, demonstrating the efficacy of our method.
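As an illustration of the overall search strategy only, here is a minimal Python sketch of a genetic search over binary "fusion genomes" that encode which scales each stage fuses; the encoding, the placeholder fitness function (standing in for "train a candidate briefly and score it on a skeleton benchmark"), and all hyperparameters are assumptions for illustration, not the search space or evaluation used in the paper.

# Hypothetical genetic search over multiscale feature-fusion configurations.
# The genome encoding and fitness proxy are illustrative assumptions only.
import random

N_STAGES = 5          # e.g. one gene group per stage of a VGG-style encoder-decoder
GENES_PER_STAGE = 4   # which other scales each stage fuses features from

def random_genome():
    return [random.randint(0, 1) for _ in range(N_STAGES * GENES_PER_STAGE)]

def fitness(genome):
    # Placeholder for "decode the genome, train the candidate network briefly,
    # and evaluate its skeleton-detection F-score"; here we just reward
    # moderate connectivity plus noise so the loop is runnable on its own.
    return -abs(sum(genome) - N_STAGES * GENES_PER_STAGE // 2) + random.random()

def crossover(a, b):
    cut = random.randrange(1, len(a))     # single-point crossover
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=20, generations=10):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]                 # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

print("best fusion genome:", evolve())

In a real search, decoding a genome would instantiate one candidate side-output/fusion topology, and the fitness call would be replaced by short training runs on the target dataset.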


2020
Author(s): Evgenia Titova, Rashmi Mittal

In this study, we present a methodology to create a synthetic multi-year wind generation dataset at minute-scale granularity for existing and future Australian wind farms. The purpose of the dataset is to support studies of the penetration of large-scale and distributed renewable generation into the electricity system and its impact on power system security in the National Energy Market (NEM).

The synthetic historical records are based on a spatial and temporal blend of reanalysis datasets with minute-scale wind speed observations from the Bureau of Meteorology weather station network. Strengths and weaknesses of reanalysis data are illustrated and a correction methodology is discussed. A method to introduce the minute-scale and sub-hourly fluctuations absent from the reanalysis records is presented. The expected statistical properties of sub-hourly fluctuations in the wind generation records are derived from the characteristics of the background atmospheric state in the vicinity of the wind farms.

The accuracy of the dataset is validated by comparing power spectra and ramping frequencies in the simulated time series against existing minute-scale observations of wind generation at Australian wind farms. The statistical properties of the observed and simulated time series match reasonably well, making the dataset suitable for investigating the implications of wind ramping on energy demand and generation for the existing and foreseeable infrastructure build in the NEM.
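As a hedged illustration of one validation step mentioned above, the sketch below computes rolling ramp rates and their threshold-exceedance frequencies for minute-scale generation time series; the 10-minute ramp window, the thresholds, and the toy data are assumptions for illustration, not the authors' procedure or data.

# Illustrative comparison of ramping behaviour in observed vs. simulated
# minute-scale wind generation; window, thresholds, and data are assumed.
import numpy as np
import pandas as pd

def ramp_rates(series: pd.Series, window_minutes: int = 10) -> pd.Series:
    # Change in generation (MW) over a trailing window, at minute resolution.
    return series.diff(window_minutes).dropna()

def ramp_exceedance(ramps: pd.Series, thresholds_mw) -> pd.Series:
    # Fraction of windows whose absolute ramp exceeds each threshold.
    return pd.Series({t: (ramps.abs() > t).mean() for t in thresholds_mw})

# Toy stand-in data: one day of 1-minute output for a single wind farm.
idx = pd.date_range("2020-01-01", periods=24 * 60, freq="1min")
observed = pd.Series(50 + 10 * np.sin(np.linspace(0, 12, len(idx)))
                     + np.random.normal(0, 2, len(idx)), index=idx)
simulated = observed + np.random.normal(0, 1, len(idx))

for name, ts in [("observed", observed), ("simulated", simulated)]:
    print(name, ramp_exceedance(ramp_rates(ts), thresholds_mw=[5, 10, 20]).to_dict())

A spectral check along the same lines would compare, for example, Welch power spectra of the two series over the sub-hourly frequency band.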


2020 · Vol 38 (1) · pp. 40-61
Author(s): Marie-Lena Frech, David D. Loschelder, Malte Friese

2016 · Vol 27 (12) · pp. 1573-1587
Author(s): David D. Loschelder, Malte Friese, Michael Schaerer, Adam D. Galinsky

Past research has suggested a fundamental principle of price precision: The more precise an opening price, the more it anchors counteroffers. The present research challenges this principle by demonstrating a too-much-precision effect. Five experiments (involving 1,320 experts and amateurs in real-estate, jewelry, car, and human-resources negotiations) showed that increasing the precision of an opening offer had positive linear effects for amateurs but inverted-U-shaped effects for experts. Anchor precision backfired because experts saw too much precision as reflecting a lack of competence. This negative effect held unless first movers gave rationales that boosted experts’ perception of their competence. Statistical mediation and experimental moderation established the critical role of competence attributions. This research disentangles competing theoretical accounts (attribution of competence vs. scale granularity) and qualifies two putative truisms: that anchors affect experts and amateurs equally, and that more precise prices are linearly more potent anchors. The results refine current theoretical understanding of anchoring and have significant implications for everyday life.


2015 · Vol 24 · pp. 514
Author(s): Stephanie Solt

Numerical expressions are often used imprecisely or approximately. This paper defends a novel analysis of numerical imprecision based on the notion of scale granularity, construed here in terms of sets of alternatives. I apply this approach to account for new facts relating to the interaction of (im)precision and comparison, in particular the necessarily precise interpretation of measure expressions in comparatives, and the negative polarity status of overt approximators in comparatives (e.g. Mabel owns *(no) more than about one hundred sheep.)


2012 · Vol 548 · pp. 740-743
Author(s): Yi Lan Chen, Huan Bao Wang

In this paper, we present a novel hybrid classification model with fuzzy clustering and design a new combinatorial classifier for error data in joining processes with diverse-granular computing; the classifier is an ensemble of a naïve Bayes classifier and fuzzy c-means clustering. We apply it to improve the classification performance of traditional hard classifiers in more complex real-world situations. Fuzzy c-means clustering is used to build a fuzzy partition, based on a given propositional function, that augments the combinatorial classifier. This strategy works better than a conventional hard classifier without fuzzy clustering. A proper scale granularity of objects contributes to higher classification performance of the combinatorial classifier. Our experimental results show that the new combinatorial classifier improves the accuracy and stability of classification.
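A minimal sketch of one plausible reading of such an ensemble follows, assuming that fuzzy c-means memberships are used both to weight per-cluster naïve Bayes training and to blend the per-cluster predictions; the clustering details, weighting scheme, and hyperparameters are illustrative assumptions, not the combinatorial classifier specified in the paper.

# Hypothetical fuzzy-c-means + naive Bayes ensemble (illustrative only).
import numpy as np
from sklearn.naive_bayes import GaussianNB

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    # Returns cluster centers and an (n_samples, c) membership matrix.
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # random initial memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

def memberships(X, centers, m=2.0):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
    U = 1.0 / (d ** (2.0 / (m - 1.0)))
    return U / U.sum(axis=1, keepdims=True)

class FuzzyNBEnsemble:
    def fit(self, X, y, c=3):
        self.centers, U = fuzzy_cmeans(X, c)
        # One naive Bayes model per fuzzy cluster, trained with membership weights.
        self.models = [GaussianNB().fit(X, y, sample_weight=U[:, k]) for k in range(c)]
        self.classes_ = self.models[0].classes_
        return self

    def predict(self, X):
        U = memberships(X, self.centers)
        proba = sum(U[:, k:k + 1] * self.models[k].predict_proba(X)
                    for k in range(len(self.models)))
        return self.classes_[np.argmax(proba, axis=1)]

With hypothetical arrays X_train, y_train, and X_test, usage would look like FuzzyNBEnsemble().fit(X_train, y_train, c=3).predict(X_test); the soft memberships are what let the ensemble behave less brittly than a single hard classifier near cluster boundaries.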


Author(s): Neil Zuckerman, Jennifer R. Lukes

Dependent scattering of acoustic phonons by multiple nanometer-scale inclusions in anisotropic media is investigated using a new molecular dynamics simulation technique. The spectral-directional characteristics of the scattering are found by calculation of three-dimensional scattering phase functions and cross sections for inclusions of varying sizes in various spatial arrangements. The technique enables computation of the effects of reflected wave interference and sequential scattering, mode conversion, lattice strain, elastic anisotropy, and atomic-scale granularity on acoustic phonon scattering from structured inclusions. The results will improve understanding and prediction of heat transfer in quantum-dot superlattices and other engineered thermal materials with nanometer-scale structures.


1994 · Vol 351
Author(s): R.L. Holtz, E.V. Barrera, J. Milliken, V. Provenzano

Nanocomposites of copper with low concentrations of dispersed fullerenes were synthesized by simultaneous sputtering of copper and sublimation of fullerenes. Post-deposition heat treatments at 400 and 800 °C were performed to assess the thermal stability of the microstructure and the effect on Vickers microhardness. The as-deposited copper-fullerene composite has submicron-scale granularity, in contrast to pure copper, which shows conventional columnar growth. Grain growth in the heat-treated fullerene-containing specimens is suppressed and the microhardness is enhanced relative to pure copper.

