reductionist approach
Recently Published Documents

TOTAL DOCUMENTS: 177 (FIVE YEARS: 51)
H-INDEX: 20 (FIVE YEARS: 4)

Author(s):  
Nigel J. Mason ◽  
Perry A. Hailey ◽  
Duncan V. Mifsud ◽  
James S. Urquhart

Laboratory experiments play a key role in deciphering the chemistry of the interstellar medium (ISM) and the formation of complex organic molecules (COMs) relevant to life. To date, however, most studies in experimental astrochemistry have made use of a reductionist approach to experimental design, in which chemical responses to variations in a single parameter are investigated while all other parameters are held constant. Although such work does afford insight into the chemistry of the ISM, it is likely that several important points (e.g., the possible influence of interactions between experimental parameters) remain ambiguous. In light of this, we propose the adoption of a new “systems astrochemistry” approach for experimental studies and present the basic tenets and advantages of this approach in this perspective article. Such an approach has already been used for some time, and to great effect, in the field of prebiotic chemistry, and so we anticipate that its application to experimental astrochemistry will uncover hitherto unknown data that could help to better link laboratory work to observations and models.


Acta Naturae ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 52-64
Author(s):  
Danila V. Kolesov ◽  
Elena L. Sokolinskaya ◽  
Konstantin A. Lukyanov ◽  
Alexey M. Bogdanov

In modern life sciences, the issue of specific, exogenously directed manipulation of a cell’s biochemistry is a highly topical one. In the case of electrically excitable cells, the aim of the manipulation is to control the cell’s electrical activity, with the result being either excitation with subsequent generation of an action potential or inhibition and suppression of the excitatory currents. The techniques of electrical activity stimulation are of particular significance in tackling the most challenging basic problem: figuring out how the nervous system of higher multicellular organisms functions. At this juncture, when neuroscience is gradually abandoning the reductionist approach in favor of the direct investigation of complex neuronal systems, minimally invasive methods for brain tissue stimulation are becoming the basic element in the toolbox of those involved in the field. In this review, we describe three approaches that are based on the delivery of exogenous, genetically encoded molecules sensitive to external stimuli into the nervous tissue. These approaches include optogenetics (Part I) as well as chemogenetics and thermogenetics (Part II), which differ significantly not only in the nature of the stimuli and the structure of the corresponding effector proteins, but also in the details of experimental application. The latter circumstance indicates that these are complementary rather than competing techniques.


Author(s):  
Paulan Korenhof ◽  
Vincent Blok ◽  
Sanneke Kloppenburg

Digital Twins are conceptualised in the academic technical discourse as real-time, realistic digital representations of physical entities. Originating in product engineering, the Digital Twin quickly advanced into other fields, including the life sciences and earth sciences. Digital Twins are seen by the tech sector as a promising new tool for efficiency and optimisation, while governmental agencies see them as a fruitful means of improving decision-making to meet sustainability goals. A striking example of the latter is the European Commission, which wishes to delegate a significant role to Digital Twins in addressing climate change and supporting Green Deal policy. As Digital Twins give rise to high expectations and ambitions and are being entrusted with important societal roles, it is crucial to reflect critically on their nature. In this article, we therefore reflect philosophically on Digital Twins by critically analysing dominant conceptualisations, the assumptions underlying them, and their normative implications. We dissect the concept and argue that a Digital Twin does not merely fulfil the role of a representation, but is in fact a steering technique used to direct a physical entity towards certain goals by means of multiple representations. Currently, this steering seems mainly fuelled by a reductionist approach focused on efficiency and optimisation. However, this is not the only way in which a Digital Twin can be conceived and, consequently, designed and deployed. We therefore set out an agenda based on a critical understanding of Digital Twins that helps to draw out their beneficial potential while addressing their potential issues.


2021 ◽  
pp. 147612702110386
Author(s):  
Sylvia Grewatsch ◽  
Steve Kennedy ◽  
Pratima (Tima) Bansal

Strategy scholars are increasingly attempting to tackle complex global social and environmental issues (i.e., wicked problems); yet many approach these wicked problems in the same way they approach business problems: by building causal models that seek to optimize some form of organizational success. Strategy scholars seek to reduce complexity, focusing on the significant variables that explain the salient outcomes. This approach to wicked problems, ironically, divorces firms from the very social-ecological context that makes the problem “wicked.” In this essay, we argue that strategy research into wicked problems can benefit from systems thinking, which deviates radically from the reductionist approach to analysis taken by many strategy scholars. We review some of the basic tenets of systems thinking and describe how they differ from reductionist thinking. Furthermore, we ask strategy scholars to widen their theoretical lens by (1) investigating co-evolutionary dynamics rather than focusing primarily on static models, (2) advancing processual insights rather than favoring causal identification, and (3) recognizing tipping points and transformative change rather than assuming linear, monotonic change.


2021 ◽  
pp. 98-120
Author(s):  
Robert W. Batterman

This chapter begins with a discussion of Julian Schwinger’s “engineering approach” to particle physics. Schwinger argued from a number of perspectives that the fundamental theory itself (Quantum Electrodynamics, for which he won a Nobel Prize) was inadequate. Further, he claimed that an intermediate theory between the fundamental and the phenomenological was superior. Such a theory focuses on a few parameters at intermediate, or meso-, scales that we employ to organize the world. Schwinger’s motivations were avowedly pragmatic, although he did offer nonpragmatic reasons for preferring such a mesoscale approach. This engineering approach fits well with the idea that the introduction of order parameters in condensed matter physics introduced a natural foliation of the world into microscopic, mesoscopic, and macroscopic levels. It further suggests that a middle-out approach to many-body systems is superior to a bottom-up reductionist approach. The chapter also discusses a middle-out approach to multiscale modeling in biology.


2021 ◽  
Author(s):  
Jennifer S Borchardt ◽  
Lucas M Blecker ◽  
Kenneth A Satyshur ◽  
Cynthia Czajkowski

First synthesized in the 1950s, benzodiazepines are widely prescribed drugs that exert their anxiolytic, sedative, and anticonvulsant actions by binding to GABA-A receptors, the main inhibitory ligand-gated ion channels in the brain. Scientists have long theorized that an endogenous benzodiazepine, or endozepine, exists in the brain. While there is indirect evidence suggesting that a peptide, the diazepam binding inhibitor, is capable of modulating the GABA-A receptor, direct evidence of its modulatory effects is limited. Here we take a reductionist approach to understand how purified diazepam binding inhibitor interacts with and affects GABA-A receptor activity. We used two-electrode voltage-clamp electrophysiology to study how the effects of the diazepam binding inhibitor vary with GABA-A receptor subunit composition, and found that GABA-evoked currents from α3-containing GABA-A receptors are weakly inhibited by the diazepam binding inhibitor, while currents from α5-containing receptors are positively modulated. We also used in silico protein-protein docking to visualize potential diazepam binding inhibitor/GABA-A receptor interactions, which revealed the diazepam binding inhibitor bound at the benzodiazepine α/γ binding-site interface and provides a structural framework for understanding its effects on GABA-A receptors. Our results provide novel insights into the mechanisms by which the diazepam binding inhibitor modulates GABA-mediated inhibition in the brain.


Author(s):  
Vladimir N. Uversky ◽  
Mohammed F. Alghamdi ◽  
Elrashdy M. Redwan

Modern protein science is broadening its horizons by moving toward the systemic description of proteins in their natural habitats. This implies a transition from a classical reductionist approach, associated with consideration of the unique structure and specific biological activity of an individual protein in purified form, to studying entire proteomes and their functions. This mini-review provides a brief description of structural, functional, and expression proteomics, the dark proteome (or unfoldome), and some of the tools utilized in the analyses of proteomes.


Semiotica ◽  
2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Sergio Torres-Martínez

This paper forges links between early analytic philosophy and the posits of semiotics. I show that there are some striking and potentially quite important, but perhaps unrecognized, connections between three key concepts in Wittgenstein’s middle and later philosophy, namely, complex (Philosophical Grammar), rule-following (Philosophical Investigations), and language games (Philosophical Investigations). This reveals a conceptual continuity between Wittgenstein’s “early” and “later” philosophy that can be applied to the analysis of the iterability of representation in computer-generated images. Methodologically, this paper clarifies, to at least some degree, the nature, progress, and promise of an approach to doing philosophy and semiotics from a modally modest perspective that sees the future of scientific development in the intellectual products of the humanities rather than in unreflective empiricism. This hybrid, non-reductionist approach shows, among other things, that semiotic processes are encoded by specific types of complexes in computer-generated images that display iterability in time and space.

