nested hierarchy
Recently Published Documents


TOTAL DOCUMENTS: 46 (five years: 10)
H-INDEX: 9 (five years: 1)

Author(s): Christopher P. Kempes, David C. Krakauer

Abstract: We argue for multiple forms of life realized through multiple different historical pathways. From this perspective, there have been multiple origins of life on Earth; life is not a universal homology. By broadening the class of originations, we significantly expand the data set for searching for life. Through a computational analogy, the origin of life describes both the origin of hardware (physical substrate) and software (evolved function). Like all information-processing systems, adaptive systems possess a nested hierarchy of levels: a level of function optimization (e.g., fitness maximization), a level of constraints (e.g., energy requirements), and a level of materials (e.g., DNA or RNA genome and cells). The functions essential to life are realized by different substrates with different efficiencies. The functional level allows us to identify multiple origins of life by searching for key principles of optimization in different material forms, including the prebiotic origin of proto-cells, the emergence of culture, economic and legal institutions, and the reproduction of software agents.
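A minimal sketch of the three nested levels named in the abstract (function optimization, constraints, materials), purely as an illustration of the computational analogy; the class names and example values are hypothetical and not taken from the paper.

```python
# Illustrative sketch of the nested hierarchy of levels named in the abstract.
# Class names and example values are hypothetical, not taken from the paper.
from dataclasses import dataclass

@dataclass
class MaterialLevel:        # physical substrate ("hardware")
    substrate: str

@dataclass
class ConstraintLevel:      # bounds on the optimization, e.g. an energy budget
    energy_budget: float
    materials: MaterialLevel

@dataclass
class FunctionLevel:        # evolved function ("software")
    objective: str
    constraints: ConstraintLevel

# The same functional level can be realized by different material substrates:
protocell = FunctionLevel("fitness maximization",
                          ConstraintLevel(1.0, MaterialLevel("RNA proto-cell")))
software_agent = FunctionLevel("fitness maximization",
                               ConstraintLevel(1.0, MaterialLevel("software agent")))
```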


Erkenntnis, 2021
Author(s): Joe Dewhurst, Alistair M. C. Isaac

Abstract: Mechanism realists assert the existence of mechanisms as objective structures in the world, but their exact metaphysical commitments are unclear. We introduce Local Hierarchy Realism (LHR) as a substantive and plausible form of mechanism realism. The limits of LHR reveal a deep tension between two aspects of mechanists’ explanatory strategy. Functional decomposition identifies locally relevant entities and activities, while these same entities and activities are also embedded in a nested hierarchy of levels. In principle, a functional decomposition may identify entities engaging in causal interactions that crosscut the hierarchical structure of composition relations, violating the mechanist’s injunction against interlevel causation. We argue that this possibility is realized in the example of ephaptic coupling, a subsidiary process of neural computation that crosscuts the hierarchy derived from synaptic transmission. These considerations undermine the plausibility of LHR as a general view, yet LHR has the advantages that (i) its metaphysical implications are precisely stateable; (ii) the structure it identifies is not reducible to mere aggregate causation; and (iii) it clearly satisfies intuitive and informal definitions of mechanism. We conclude by assessing the prospects for a form of mechanism realism weaker than LHR that nevertheless satisfies all three of these requirements.


Author(s): Marcos Antônio Mattos dos Reis, Umberto César Corrêa

Abstract: Sports science has shown benefits of the use of small-sided games in the teaching-learning and training processes of football. We propose that such benefits occur because small-sided games are holons of a hierarchically organized game that maintain the same characteristics of the full game, regardless of their reduced complexity. The hierarchical model of football considers the numerical relations of cooperation and opposition in specific spaces of play. It characterizes a nested hierarchy model because it deals with both the parts and the different processes of the game. This hierarchical model contains five levels, in which the upper level is the football game and the elementary level is a game situation, that is, a small-sided game. As in any open system of hierarchical organization, the small-sided games simultaneously present invariant characteristics of the whole and specificities of the parts, according to the context and level of analysis. Adopting such a hierarchical perspective allows setting goals, as well as selecting teaching-learning and training contents, at different levels of analysis by considering the autonomy-dependency at each one.
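A minimal sketch of the kind of five-level nesting described in the abstract. Only the top level (the full game) and the elementary level (a small-sided game situation) are named in the abstract, so the intermediate level names and player counts below are hypothetical.

```python
# Minimal sketch (not from the article): a five-level nested hierarchy whose
# top level is the full game and whose elementary level is a small-sided game
# situation. Intermediate level names and player counts are hypothetical.
from dataclasses import dataclass, field

@dataclass
class GameLevel:
    name: str
    players_per_side: int
    sublevels: list = field(default_factory=list)

small_sided = GameLevel("game situation (small-sided game)", 3)
sector      = GameLevel("sector play", 5, [small_sided])        # hypothetical
half_pitch  = GameLevel("half-pitch game", 7, [sector])         # hypothetical
large_sided = GameLevel("large-sided game", 9, [half_pitch])    # hypothetical
full_game   = GameLevel("football game", 11, [large_sided])

def depth(level: GameLevel) -> int:
    """Count the nested levels below (and including) this one."""
    return 1 + max((depth(s) for s in level.sublevels), default=0)

print(depth(full_game))  # -> 5
```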


Author(s): Prakhar Tripathi

Abstract: In the era of Big Data, we hear a lot about machine learning for working with such data. Machine learning helps us predict and analyze data with better accuracy and minimal human intervention. Machine learning is autonomous but susceptible to errors, owing to biased predictions when a model has previously been trained on small data. This leads to a chain of errors that cannot be detected easily for long periods of time and, when recognized, takes considerable time to trace back to its source. Here lies the appeal of deep learning, which achieves its flexibility by using a nested hierarchy of concepts to define the world. Deep learning, however, has the drawback of taking a very long time to train, which can be reduced by using transfer learning.
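A minimal transfer-learning sketch along the lines the abstract suggests, not from the paper: an ImageNet-pretrained torchvision backbone is frozen and only a small new classification head is trained, which is what cuts the training time. The class count, batch, and hyperparameters are placeholders.

```python
# Minimal transfer-learning sketch (illustrative only; class count, batch and
# hyperparameters are placeholders). Assumes the torchvision >= 0.13 weights API.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical target task

# Load an ImageNet-pretrained backbone and freeze its weights.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer with a new head for the target task;
# only this head is trained, so training is far faster.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random batch (stand-in for real data).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
loss = criterion(backbone(images), labels)
loss.backward()
optimizer.step()
```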


2020, Vol 8 (2), pp. 235
Author(s): Peter Wyer, Michael Loughlin

Effective person-centred care requires recognition of the personhood not only of patients but of practitioners. This chapter explores the consequences of this recognition for major debates in medical epistemology regarding clinical reasoning and the relationship between research and practice. For too long these debates have been dominated by false dichotomies: subjectivity versus objectivity, judgement versus evidence, reason versus emotion. Based on flawed understandings of such core concepts as “objectivity” and “engagement”, this distorted dissection of the subject-object relationship has served to depersonalise practice. The costs of this depersonalisation include over-regulation and micromanagement of healthcare processes by administrators and payers, at the same time that information from clinical research remains under-utilized and the personhood of patients risks being ignored. Science is a human practice, founded in a broader conception of human reasoning and ontologically dependent on human beings living and engaging with the world in social, emotional and ethical contexts. After looking at different conceptions of epistemic hierarchies and their uses in the analysis and evaluation of reasoning in a range of practice contexts, we propose a “nested hierarchy” that effectively turns upside-down the flawed evidence hierarchies that have helped to depersonalise care. T. S. Eliot’s “wisdom, knowledge, information” scheme (to which we add “data” below “information”) provides a model for a person-centred epistemic hierarchy. This crucial, person-centred inversion represents levels of awareness that characterize more or less developed thinking and judgment on the part of the particular practitioner.


KronoScope, 2020, Vol 20 (1), pp. 121-134
Author(s): Steve Ostovich

Abstract: The title is lifted from an essay by J. T. Fraser in his book Time and Time Again (2007). It conveys Fraser’s conviction, a conviction shared here, that understanding time and reality requires us to redirect our thinking process. Plato describes a path out of the dark cave of confusion into the realm of truth and light, that is, from time towards the timeless. But we should “reverse course” along this path and move from the timeless into the complexity of time. Time is not one thing foundational to reality; rather, reality is a series of temporal levels developed through evolution and related in a nested hierarchy, driven by conflict and towards increasing complexity. This theory makes possible critical and fruitful reflection on issues like entropy, indeterminacy, and mind/body dualism. It entails embracing our position as knowers in time and the complexity of truth as temporal rather than timeless.


2020, Vol 493 (4), pp. 5693-5712
Author(s): Philipp Busch, Simon D. M. White

ABSTRACT: We use the Millennium and Millennium-II simulations to illustrate the Tessellation-Level-Tree (TLT), a hierarchical tree structure linking density peaks in a field constructed by Voronoi tessellation of the particles in a cosmological N-body simulation. The TLT uniquely partitions the simulation particles into disjoint subsets, each associated with a local density peak. Each peak is a subpeak of a unique higher peak. The TLT can be persistence-filtered to suppress peaks produced by discreteness noise. Thresholding a peak’s particle list at $\sim 80\langle\rho\rangle$ results in a structure similar to a standard friends-of-friends (FoF) halo and its subhaloes. For thresholds below $\sim 7\langle\rho\rangle$, the largest structure percolates and is much more massive than other objects; it may be considered as defining the cosmic web. For a threshold of $5\langle\rho\rangle$, it contains about half of all cosmic mass and occupies $\sim 1$ per cent of all cosmic volume; a typical external point is then $\sim 7\,h^{-1}\,\mathrm{Mpc}$ from the web. We investigate the internal structure and clustering of TLT peaks. Defining the saddle-point density $\rho_{\mathrm{lim}}$ as the density at which a peak joins its parent peak, we show the median value of $\rho_{\mathrm{lim}}$ for FoF-like peaks to be similar to the density threshold at percolation. Assembly bias as a function of $\rho_{\mathrm{lim}}$ is stronger than for any known internal halo property. For peaks of group mass and below, the lowest quintile in $\rho_{\mathrm{lim}}$ has $b \approx 0$ and is thus uncorrelated with the mass distribution.
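A minimal sketch of the density-peak-linking idea behind such a hierarchy, not the authors' TLT code: for simplicity it uses a k-nearest-neighbour density estimate in place of the Voronoi tessellation and attaches each particle to the densest of its neighbours, so particles that are their own densest neighbour become peaks. The particle positions, neighbour count, and array sizes are placeholders.

```python
# Minimal sketch of a density-peak hierarchy (not the authors' TLT code).
# A kNN density proxy replaces the Voronoi tessellation; each particle is
# attached to the densest of its neighbours, and peaks are their own attractor.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
pos = rng.random((2000, 3))            # stand-in for simulation particle positions

k = 16
tree = cKDTree(pos)
dist, idx = tree.query(pos, k=k)       # idx[i]: indices of the k nearest particles
density = k / (4.0 / 3.0 * np.pi * dist[:, -1] ** 3)   # kNN density estimate

# Link each particle to its highest-density neighbour (itself if it is a peak).
neighbour_density = density[idx]                        # shape (N, k)
best = idx[np.arange(len(pos)), np.argmax(neighbour_density, axis=1)]

# Follow the links uphill until every particle points at its peak.
peak = best.copy()
for _ in range(100):
    new_peak = best[peak]
    if np.array_equal(new_peak, peak):
        break
    peak = new_peak

print("number of density peaks:", len(np.unique(peak)))
```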


2020, Vol 5 (1)
Author(s): Clémentine Cottineau, Elsa Arcaute

Abstract: Although the cluster theory literature is bountiful in economics and regional science, there is still a lack of understanding of how the geographical scales of analysis (neighbourhood, city, region) relate to one another and affect the observed phenomenon, and of the extent to which clusters are industrially coherent or geographically consistent. In this paper, we cluster spatial economic activities through a multi-scalar approach making use of percolation theory. We consider both the industrial similarity and the geographical proximity between firms, through their joint probability function, which is constructed as a copula. This gives rise to an emergent nested hierarchy of geoindustrial clusters, which enables us to analyse the relationships between the different scales and specific industrial sectors. Using longitudinal business microdata from the Office for National Statistics, we look at the evolution of clusters, which span from very local groups of businesses to the metropolitan level, in 2007 and in 2014, so that changes stemming from the financial crisis can be observed.
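A minimal sketch of percolation-style clustering on a combined geographical and industrial measure, not the authors' method or data: firms are linked whenever a toy closeness score exceeds a threshold, and clusters are the connected components of the resulting link graph. The way the two measures are combined (a simple product rather than a copula), the synthetic data, and the threshold are all placeholders.

```python
# Minimal percolation-clustering sketch (not the authors' method or data).
# Firms are linked when a combined geographical/industrial closeness exceeds
# a threshold; clusters are the connected components of the link graph.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(1)
n_firms = 500
xy = rng.random((n_firms, 2))                  # stand-in firm locations
sector = rng.integers(0, 10, n_firms)          # stand-in industrial codes

# Pairwise geographical closeness and (toy) industrial similarity.
geo = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
geo_closeness = 1.0 - geo / geo.max()
ind_similarity = (sector[:, None] == sector[None, :]).astype(float)

# Combine the two measures (a plain product here; the paper builds a copula).
closeness = geo_closeness * ind_similarity

threshold = 0.9                                # percolation control parameter
links = csr_matrix(closeness >= threshold)
n_clusters, labels = connected_components(links, directed=False)
print("clusters at threshold", threshold, ":", n_clusters)
```

Sweeping the threshold from high to low merges small local clusters into progressively larger ones, which is what produces the nested hierarchy of scales discussed in the abstract.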


Author(s): Nourhan Mohamed Zayed, Heba A. Elnemr

Deep learning (DL) is a special type of machine learning that attains great power and flexibility by learning to represent raw input data as a nested hierarchy of concepts and representations. DL uses more layers than conventional machine learning, permitting higher levels of abstraction and improved prediction from data: more abstract representations are computed in terms of less abstract ones. The goal of this chapter is to present an intensive survey of the existing literature on DL techniques of recent years, especially in the field of medical image analysis. All of these techniques and algorithms have their strengths and limitations. Thus, various techniques and transformations proposed previously in the literature for the design and use of DL methods are analysed from a medical image analysis perspective. The authors provide future research directions in the DL area, set out trends, and identify challenges in the medical imaging field. Furthermore, as the demands of medical applications increase, extended study and investigation of DL becomes ever more significant.
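A minimal sketch, not from the chapter, of a small convolutional network in which each successive block computes a more abstract representation of the one below it, i.e. the nested hierarchy of representations described above. The input size, channel counts, and number of output classes are placeholders.

```python
# Illustrative only (not from the chapter): a tiny CNN whose stacked blocks
# form a nested hierarchy of representations, from edges to global structure.
# Input size, channel counts and the number of classes are placeholders.
import torch
import torch.nn as nn

class TinyMedicalCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # edges, textures
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # local patterns
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),  # global structure
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = TinyMedicalCNN()
scan = torch.randn(4, 1, 128, 128)   # stand-in for a batch of grayscale scans
print(model(scan).shape)             # torch.Size([4, 2])
```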

