implicit assumption
Recently Published Documents

TOTAL DOCUMENTS: 384 (five years: 123)
H-INDEX: 37 (five years: 5)

eLife ◽  
2022 ◽  
Vol 11 ◽  
Author(s):  
Raimund Schlüßler ◽  
Kyoohyun Kim ◽  
Martin Nötzel ◽  
Anna Taubenberger ◽  
Shada Abuhattum ◽  
...  

Quantitative measurements of physical parameters are becoming increasingly important for understanding biological processes. Brillouin microscopy (BM) has recently emerged as a technique that provides the 3D distribution of viscoelastic properties inside biological samples, so far relying on the implicit assumption that refractive index (RI) and density can be neglected. Here, we present a novel method (FOB microscopy) combining BM with optical diffraction tomography and epifluorescence imaging to explicitly measure the Brillouin shift, RI and absolute density with specificity to fluorescently labeled structures. We show that neglecting the RI and density may lead to erroneous conclusions. Investigating the nucleoplasm of wild-type HeLa cells, we find that it has a lower density but a higher longitudinal modulus than the cytoplasm. Thus, the longitudinal modulus is not merely sensitive to the water content of the sample, a postulate vividly discussed in the field. We demonstrate the further utility of FOB on various biological systems, including adipocytes and intracellular membraneless compartments. FOB microscopy can provide unexpected scientific discoveries and shed quantitative light on processes such as phase separation and transition inside living cells.
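The dependence of the longitudinal modulus on RI and density follows from the standard backscattering relation between the Brillouin shift and the acoustic velocity. A minimal sketch of that textbook relation (not the authors' FOB pipeline; the wavelength, shift, RI and density values below are illustrative assumptions, not data from the paper):

```python
# Backscattering geometry: nu_B = 2 * n * V / lam, and M = rho * V**2,
# so M = rho * (nu_B * lam / (2 * n))**2. All numbers here are illustrative.

def longitudinal_modulus(nu_b_hz, n, rho_kg_m3, lam_m=780e-9):
    """Longitudinal modulus M = rho * (nu_B * lam / (2 n))**2, in Pa."""
    v = nu_b_hz * lam_m / (2.0 * n)   # acoustic velocity in m/s
    return rho_kg_m3 * v ** 2

nu_b = 7.5e9  # hypothetical Brillouin shift of 7.5 GHz
# Same measured shift, but assumed vs measured RI and density:
m_assumed = longitudinal_modulus(nu_b, n=1.337, rho_kg_m3=1000.0)
m_measured = longitudinal_modulus(nu_b, n=1.360, rho_kg_m3=1050.0)
print(m_assumed, m_measured)
```

Because the shift enters squared together with RI and density, even percent-level errors in the assumed n and ρ shift the inferred modulus noticeably, which is the bias the abstract warns about.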


2022 ◽  
Vol 2022 (01) ◽  
pp. 001
Author(s):  
Sarvesh Kumar Yadav ◽  
Rajib Saha

Abstract In the era of precision cosmology, accurate estimation of cosmological parameters rests upon the implicit assumption of the Gaussian nature of the Cosmic Microwave Background (CMB) radiation. An important scientific question, therefore, is whether the observed CMB map is consistent with the Gaussian prediction. In this work, we extend previous studies based on CMB spherical harmonic phases (SHP) to examine the validity of the hypothesis that the temperature field of the CMB is consistent with a Gaussian random field (GRF). The null hypothesis is that the corresponding CMB SHP are independent and identically distributed according to a uniform distribution in the interval [0, 2π] [1,2]. We devise a new model-independent method in which we use the ordered, non-parametric Rao's statistic, based on sample arc-lengths, to comprehensively test the uniformity and independence of the SHP for a given ℓ mode and the independence of nearby ℓ-mode SHP. We perform our analysis on scales limited to spherical harmonic modes ℓ ≤ 128, to restrict ourselves to signal-dominated regions. To find non-uniform or dependent sets of SHP, we calculate the statistic for the data and for 10,000 Monte Carlo simulated uniformly random sets of SHP, and use the 0.05 and 0.001 α levels to distinguish between statistically significant and highly significant detections. We first establish the performance of our method using simulated Gaussian and non-Gaussian CMB temperature maps, along with the observed non-Gaussian 100 and 143 GHz Planck channel maps. We find that our method performs efficiently and accurately in detecting phase correlations generated in all of the non-Gaussian simulations and in the observed foreground-contaminated 100 and 143 GHz Planck channel temperature maps. We apply our method to the Planck mission's final released CMB temperature anisotropy maps (COMMANDER, SMICA, NILC, and SEVEM), along with the WMAP 9-year ILC map.
We report that the SHP corresponding to some of the m modes are non-uniform, and that some ℓ-mode SHP and neighboring-mode-pair SHP are correlated in the cleaned CMB maps. The detection of non-uniformity or correlation in the SHP indicates the presence of non-Gaussian signals in the foreground-minimized CMB maps.
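Rao's spacing statistic with a Monte Carlo calibration, as described above, can be sketched as follows. This is a simplified illustration of the uniformity test on a single set of phases; the sample sizes, seeds and reduced simulation count are assumptions, not the paper's exact setup:

```python
import numpy as np

def rao_spacing(phases):
    """Rao's spacing statistic U = 0.5 * sum |T_i - 2*pi/n| over circular gaps T_i."""
    x = np.sort(np.asarray(phases) % (2 * np.pi))
    gaps = np.diff(np.concatenate([x, [x[0] + 2 * np.pi]]))  # include wrap-around gap
    return 0.5 * np.sum(np.abs(gaps - 2 * np.pi / len(x)))

def mc_pvalue(phases, n_sim=2000, seed=0):
    """Monte Carlo p-value of U against i.i.d. uniform phases in [0, 2*pi)."""
    rng = np.random.default_rng(seed)
    u_obs = rao_spacing(phases)
    u_sim = np.array([rao_spacing(rng.uniform(0, 2 * np.pi, len(phases)))
                      for _ in range(n_sim)])
    return float(np.mean(u_sim >= u_obs))

rng = np.random.default_rng(1)
p_uniform = mc_pvalue(rng.uniform(0, 2 * np.pi, 64))                   # uniform phases
p_clustered = mc_pvalue(0.1 * rng.standard_normal(64) % (2 * np.pi))   # clustered phases
print(p_uniform, p_clustered)
```

For clustered phases one circular gap is enormous, so the observed statistic exceeds essentially all uniform simulations and the p-value collapses toward zero; thresholding such p-values at the α levels quoted above flags non-uniform SHP sets.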


2021 ◽  
Vol 54 (1) ◽  
Author(s):  
Pau Amaro Seoane ◽  
Manuel Arca Sedda ◽  
Stanislav Babak ◽  
Christopher P. L. Berry ◽  
Emanuele Berti ◽  
...  

Abstract The science objectives of the LISA mission have been defined under the implicit assumption of a 4-year continuous data stream. Based on the performance of LISA Pathfinder, it is now expected that LISA will have a duty cycle of ≈ 0.75, which would reduce the effective span of usable data to 3 years. This paper reports the results of a study by the LISA Science Group, which was charged with assessing the additional science return of increasing the mission lifetime. We explore various observational scenarios to assess the impact of mission duration on the main science objectives of the mission. We find that the science investigations most affected by mission duration concern the search for seed black holes at cosmic dawn, as well as the study of stellar-origin black holes and of their formation channels via multi-band and multi-messenger observations. We conclude that an extension to 6 years of mission operations is recommended.


Author(s):  
Kerstin Erfurth ◽  
Marcus Groß ◽  
Ulrich Rendtel ◽  
Timo Schmid

Abstract Composite spatial data on the administrative-area level are often presented as maps. The aim is to detect regional differences in the concentration of subpopulations, such as elderly persons, ethnic minorities, low-educated persons, voters of a political party or persons with a certain disease. Thematic collections of such maps are presented in different atlases. The standard presentation is the choropleth map, where each administrative unit is represented by a single value. These maps can be criticized under three aspects: the implicit assumption of a uniform distribution within the area, the instability of the resulting map with respect to a change of the reference area, and the discontinuities of the map at the borderlines of the reference areas, which inhibit the detection of regional clusters.

In order to address these problems, we use a density approach in the construction of maps. This approach does not enforce a local uniform distribution, does not depend on a specific choice of area reference system, and produces no discontinuities in the displayed maps. A standard estimation procedure for densities is the kernel density estimate. However, these estimates need the geo-coordinates of the single units, which are not at our disposal, as we only have access to the aggregates of some area system. To overcome this hurdle, we use a statistical simulation concept, which can be interpreted as a Simulated Expectation Maximisation (SEM) algorithm of Celeux et al (1996). We simulate observations from the current density estimate which are consistent with the aggregation information (S-step). Then we apply the kernel density estimator to the simulated sample, which gives the next density estimate (E-step).

This concept was first applied to grid data with rectangular areas for the display of ethnic minorities, see Groß et al (2017). In a second application we demonstrated the use of this approach for the so-called "change of support" problem (Bradley et al 2016). Groß et al (2020) used the SEM algorithm to recalculate case numbers between non-hierarchical administrative area systems. Recently, Rendtel et al (2021) applied the SEM algorithm to display spatial-temporal clusters of Corona infections in Germany.

Here we present three modifications of the basic SEM algorithm: 1) we introduce a boundary correction which removes the underestimation of kernel density estimates at the borders of the population area; 2) we recognize unsettled areas, such as lakes, parks and industrial areas, in the computation of the kernel density; 3) we adapt the SEM algorithm to the computation of local percentages, which are especially important in voting analysis.

We evaluate our approach against several standard maps by means of the local voting register with known addresses. In the empirical part, we apply our approach to the display of voting results for the 2016 election of the Berlin parliament. We contrast our results against choropleth maps and show new possibilities for reporting spatial voting results.
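The S-step/kernel-density cycle described above can be sketched in one dimension. This illustrates the basic SEM idea only, not the authors' implementation; the areas, counts, bandwidth, grid and iteration count are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D setting: three adjacent areas for which only aggregate
# counts are known; the individual unit locations are unobserved.
edges = np.array([0.0, 1.0, 2.0, 3.0])   # area boundaries
counts = np.array([50, 120, 30])         # aggregate count per area

grid = np.linspace(0.0, 3.0, 301)
density = np.full_like(grid, 1.0 / 3.0)  # initial guess: uniform over [0, 3]

def kde(sample, grid, h=0.15):
    """Plain Gaussian kernel density estimate on `grid` (bandwidth h is an assumption)."""
    z = (grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(sample) * h * np.sqrt(2 * np.pi))

for _ in range(20):
    # S-step: simulate unit locations area by area from the current density
    # restricted to that area, so the draw is consistent with the aggregates
    sample = []
    for a, b, n in zip(edges[:-1], edges[1:], counts):
        in_area = (grid >= a) & (grid < b)
        p = density[in_area] / density[in_area].sum()
        sample.append(rng.choice(grid[in_area], size=n, p=p))
    sample = np.concatenate(sample)
    # Density step: kernel density estimate on the simulated sample
    density = kde(sample, grid)

peak = grid[np.argmax(density)]  # concentration peak lands in the middle area
print(peak)
```

The resulting density is smooth across the area borders (no choropleth discontinuities) while each iteration keeps the per-area mass tied to the known aggregate counts.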


2021 ◽  
Author(s):  
Bernard Fischli

Abstract Relativity has been based on the implicit assumption that it exclusively describes interactions. Relativistic view effects are included as well, and they act with no force and no energy exchange. The Ehrenfest paradox is resolved: view effects specific to each point of view are the solution. The calculation of the deflection of light by the Sun explains in detail why the deflection angle must be almost double the value obtained with Newton's laws. The compatibility of General Relativity with the new interpretation is discussed. An object has no speed limit due to gravitation, but its speed is limited under electromagnetism. Inertial behavior is examined. The equivalence principle does not introduce gravitation into General Relativity. Relativity impacts the energy formula of electromagnetism through the Lorentz factor, which also introduces view effects: optical illusions with no impact on energy.


2021 ◽  
Author(s):  
◽  
Geoff Harrison

This thesis is a study of business accelerators and their efficacy as learning environments. Accelerators are becoming an increasingly popular strategy for delivering a more authentic entrepreneurial learning experience. They provide a time-bound suite of highly structured educational and business development activities that support cohorts of competitively selected, high-potential entrepreneurial teams. Participants face considerable uncertainty and are exposed to complex learning and business development processes associated with rapidly building, validating, and scaling investable business models. Intense mentorship and entrepreneurial education are core features by which accelerators support this journey. Thus, an implicit assumption embedded in accelerator programme logic is that the accelerator learning environment positively shapes learning and development outcomes. Yet little research has investigated how accelerators influence participant learning and development. This gap motivates the current research.

A multilevel quantitative and qualitative mixed-methods approach was adopted to examine participant learning and development at the three levels of participation embedded within accelerator programme design: cohort, team and participant. Concepts and measures from academic work on accelerators, learning agility, and individual performance behaviour were assembled into a coherent set of investigative tools and lenses. Taken together, they frame the accelerator learning environment as a whole system of actors and elements that operate both independently and interdependently. The research setting is a Global Accelerator Network affiliate programme based in New Zealand. Three strands of data were collected on 29 participants associated with 10 venture teams participating in a single accelerator programme cohort.

Strand 1 applied a multiphase quantitative survey approach to capture a longitudinal understanding of how accelerators influence participant learning and development at the cohort level. Patterns of relationships between the key constructs were identified for each phase. Strand 2 utilised a qualitative observation method to investigate the quantitative findings through a team lens, given the central role teams play in the accelerator programme logic. Each of these strands occurred during the accelerator. Strand 3 used interviews to explore how the accelerator learning environment influenced learning and development at the level of individual participants. Interview data were collected six months after the accelerator to capture participant perceptions in retrospect.

The research findings show accelerators do more than shelter emerging organisations: they actively support the development of the new venture, provide an authentic learning environment for the entrepreneurs, and foster the development of entrepreneurship capacity. However, the findings also suggest participant response to the learning environment is dynamic and unpredictable. Specifically, participants perceived the learning and development benefits they received from: a) mentors, as low across all phases; b) managers, as strongest during the middle and last phases of the programme; c) the cohort of participants, as very helpful during all three phases; and d) accelerator instructional programming, as tied closely to the relevance, quality and timing of the resources provided to them. Further, the evidence suggests team composition matters more than the team’s business idea, and that task-oriented accelerator programme design negatively influences learning and development by limiting the amount of ‘free’ time participants have for creative interactions, experimentation and reflection. Thus, the availability of accelerator learning opportunities, such as education and mentorship, can both enable and hinder participant learning and development.

This study provides insights for entrepreneurship research focused on supporting the development and success of early-stage enterprises. The findings and interpretations offer scholars, organisers and stakeholders a greater appreciation of the importance of participant learning and development in accelerators. They also suggest the utility of applying learning agility and individual performance concepts as lenses for understanding individual learning processes and their effects in entrepreneurial contexts beyond accelerators. Research limitations, implications for policy and practice, and future research are discussed.



2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Rongbo Chen ◽  
Haojun Sun ◽  
Lifei Chen ◽  
Jianfei Zhang ◽  
Shengrui Wang

Abstract Markov models are extensively used for categorical sequence clustering and classification due to their inherent ability to capture complex chronological dependencies hidden in sequential data. Existing Markov models are based on the implicit assumption that the probability of the next state depends on a preceding context/pattern that consists of consecutive states. This restriction hampers such models, since some patterns, disrupted by noise, may not be frequent enough in a consecutive form yet frequent in a sparse form, so the models cannot exploit the information hidden in the sequential data. A sparse pattern is a pattern in which one or more of the states between the first and the last are replaced by wildcards that can be matched by a subset of values in the state set. In this paper, we propose a new model that generalizes the conventional Markov approach, making it capable of dealing with sparse patterns and handling the length of the sparse patterns adaptively, i.e., allowing variable-length patterns with variable wildcards. The model, named the Dynamic order Markov model (DOMM), allows deriving a new similarity measure between a sequence and a set of sequences/a cluster. DOMM builds a sparse pattern from sub-frequent patterns that contain significant statistical information veiled by the noise. To implement DOMM, we propose a sparse pattern detector (SPD) based on the probabilistic suffix tree (PST), capable of discovering both sparse and consecutive patterns, and we develop a divisive clustering algorithm, named DMSC, for Dynamic order Markov model-based categorical sequence clustering. Experimental results on real-world datasets demonstrate the promising performance of the proposed model.
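The notion of a sparse pattern can be illustrated in a few lines of code. This is a toy sketch of wildcard matching only, not the paper's PST-based detector; the sequence and patterns are made up:

```python
def count_pattern(seq, pattern, wildcard="*"):
    """Count occurrences of `pattern` in `seq`; the wildcard matches any state."""
    k = len(pattern)
    return sum(
        all(p == wildcard or p == s for p, s in zip(pattern, seq[i:i + k]))
        for i in range(len(seq) - k + 1)
    )

# A consecutive pattern disrupted by noise is rare, but its sparse form is frequent:
seq = list("abcd" "abxd" "abyd" "abzd" "qqqq")
n_consecutive = count_pattern(seq, list("abcd"))  # only the one unbroken occurrence
n_sparse = count_pattern(seq, list("ab*d"))       # the wildcard absorbs the noisy state
print(n_consecutive, n_sparse)
```

Here the consecutive pattern "abcd" occurs once while the sparse pattern "ab*d" occurs four times, which is precisely the frequency information a consecutive-context Markov model would miss.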


2021 ◽  
Vol 923 (1) ◽  
pp. 39
Author(s):  
Pushkar Kopparla ◽  
Russell Deitrick ◽  
Kevin Heng ◽  
João M. Mendonça ◽  
Mark Hammond

Abstract General circulation models (GCMs) are often used to explore exoclimate parameter spaces and classify atmospheric circulation regimes. Models are tuned to give reasonable climate states for standard test cases, such as the Held–Suarez test, and are then used to simulate diverse exoclimates by varying input parameters such as rotation rate, instellation, atmospheric optical properties, frictional timescales, and so on. In such studies, there is an implicit assumption that a model that works reasonably well for the standard test case will be credible at all points in an arbitrarily wide parameter space. Here, we test this assumption using the open-source GCM THOR to simulate atmospheric circulation on tidally locked Earth-like planets with rotation periods of 0.1–100 days. We find that the model error, as quantified by the ratio between physical and spurious numerical contributions to the angular momentum balance, is extremely variable across this range of rotation periods, with some cases in which numerical errors are the dominant component. Increasing the model grid resolution does reduce errors, but using a higher-order numerical diffusion scheme can sometimes magnify errors for finite-volume dynamical solvers. We further show that, to minimize error and make the angular momentum balance more physical within our model, the surface friction timescale must be smaller than the rotational timescale.
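One possible form of such an error diagnostic (a schematic sketch with synthetic data, not THOR's actual diagnostics) compares the time derivative of total axial angular momentum with the resolved physical torques and attributes the residual to spurious numerical sources:

```python
import numpy as np

def am_error_ratio(t, L, torque_phys):
    """Mean |spurious| / mean |physical| contribution in the AM balance."""
    dLdt = np.gradient(L, t)          # total angular momentum tendency
    spurious = dLdt - torque_phys     # part not explained by physical torques
    return np.mean(np.abs(spurious)) / np.mean(np.abs(torque_phys))

# Synthetic check: L is the exact integral of the torque plus a small
# spurious drift, so the diagnostic should recover a small nonzero ratio.
t = np.linspace(0.0, 100.0, 1001)
torque = np.cos(0.3 * t)                  # illustrative physical torque
L = np.sin(0.3 * t) / 0.3 + 0.01 * t      # integral of torque + 0.01*t drift
ratio = am_error_ratio(t, L, torque)
print(ratio)
```

When this ratio approaches or exceeds unity, the numerical contribution dominates the balance, which is the regime the abstract flags for some rotation periods.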


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Bruno Varella Miranda ◽  
Brent Ross ◽  
Jason Franken ◽  
Miguel Gómez

Purpose
The purpose of this study is to disentangle the drivers of adoption of procurement strategies in situations where small agri-food firms face constrained organizational choices. More specifically, the authors investigate the role of transaction costs, capabilities and networks in the definition of feasible "make-or-buy" choices in emerging wine regions.

Design/methodology/approach
This article analyzes a unique dataset of small wineries from five US states: Illinois, Michigan, Missouri, New York and Vermont. The reported results derive from both a hurdle model (i.e. a probit model and a truncated regression model) and a tobit model.

Findings
The results suggest the importance of trust as a replacement for formal governance structures whenever small firms deal with highly constrained sets of organizational choices. On the other hand, the level of dependence on a limited mix of winegrape varieties, and the perception that these varieties are fundamental in building legitimacy, help to explain higher rates of vertical integration.

Originality/value
This study is important because it sheds light on organizational constraints that affect millions of farmers across the globe. The study of "make-or-buy" decisions in agri-food supply chains has mostly relied on the implicit assumption that all organizational choices are available to every firm. Nevertheless, limited capabilities and participation in low-density networks may constrain the ability of a firm to adopt a governance mechanism. Stated organizational preferences and actual organizational choices may thus differ.

