Computing, Philosophy and Reality

Author(s):  
Joseph Brenner

The conjunction of the disciplines of computing and philosophy implies that discussion of computational models and approaches should include explicit statements of their underlying worldview, given that reality includes both computational and non-computational domains. As outlined at ECAP08, both domains of reality can be characterized by the different logics applicable to them. A new “Logic in Reality” (LIR) was proposed as best describing the dynamics of real, non-computable processes. The LIR process view of the real macroscopic world is compared here with recent computational and information-theoretic models. Proposals that the universe can be described as a mathematical structure equivalent to a computer or by simple cellular automata are deflated. A new interpretation of quantum superposition as supporting a concept of paraconsistent parallelism in quantum computing and an appropriate ontological commitment for computational modeling are discussed.

Author(s):  
William B. Rouse

This book discusses the use of models and interactive visualizations to explore designs of systems and policies in determining whether such designs would be effective. Executives and senior managers are very interested in what “data analytics” can do for them and, quite recently, what the prospects are for artificial intelligence and machine learning. They want to understand and then invest wisely. They are reasonably skeptical, having experienced overselling and under-delivery. They ask about reasonable and realistic expectations. Their concern is with the futurity of decisions they are currently entertaining. They cannot fully address this concern empirically. Thus, they need some way to make predictions. The problem is that one rarely can predict exactly what will happen, only what might happen. To overcome this limitation, executives can be provided predictions of possible futures and the conditions under which each scenario is likely to emerge. Models can help them to understand these possible futures. Most executives find such candor refreshing, perhaps even liberating. Their job becomes one of imagining and designing a portfolio of possible futures, assisted by interactive computational models. Understanding and managing uncertainty is central to their job. Indeed, doing this better than competitors is a hallmark of success. This book is intended to help them understand what fundamentally needs to be done, why it needs to be done, and how to do it. The hope is that readers will discuss this book and develop a “shared mental model” of computational modeling in the process, which will greatly enhance their chances of success.
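The abstract's point that one can predict only what *might* happen, not what *will* happen, is the core idea behind Monte Carlo scenario exploration. As an illustration only (the book's own models are not shown here, and every parameter below is a made-up assumption), a minimal sketch of generating a portfolio of possible futures and reporting a range rather than a point forecast:

```python
import random
import statistics

def simulate_future(years=5, growth_mu=0.05, growth_sigma=0.10, start=100.0):
    """One hypothetical scenario: compound yearly growth drawn from a normal
    distribution. All parameters are illustrative assumptions, not data."""
    value = start
    for _ in range(years):
        value *= 1.0 + random.gauss(growth_mu, growth_sigma)
    return value

random.seed(42)  # reproducible run
outcomes = sorted(simulate_future() for _ in range(10_000))

# Report possible futures as a range, not a single prediction.
print(f"median outcome: {statistics.median(outcomes):.1f}")
print(f"5th-95th percentile: {outcomes[500]:.1f} .. {outcomes[9500]:.1f}")
```

The output is a distribution of outcomes; the executive's job, in the book's framing, is to reason about which conditions push the realized future toward one tail or the other.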


2021, Vol 11 (6), pp. 2696
Author(s):  
Aritra Sarkar ◽  
Zaid Al-Ars ◽  
Koen Bertels

Inferring algorithmic structure in data is essential for discovering causal generative models. In this research, we present a quantum computing framework using the circuit model, for estimating algorithmic information metrics. The canonical computation model of the Turing machine is restricted in time and space resources, to make the target metrics computable under realistic assumptions. The universal prior distribution for the automata is obtained as a quantum superposition, which is further conditioned to estimate the metrics. Specific cases are explored where the quantum implementation offers a polynomial advantage, in contrast to the exhaustive enumeration needed in the corresponding classical case. The unstructured output data and the computational irreducibility of Turing machines make this algorithm impossible to approximate using heuristics. Thus, exploring the space of program-output relations is one of the most promising problems for demonstrating quantum supremacy using Grover search that cannot be dequantized. Experimental use cases for quantum acceleration are developed for self-replicating programs and algorithmic complexity of short strings. With quantum computing hardware rapidly attaining technological maturity, we discuss how this framework will offer a significant advantage for various genomics applications in meta-biology, phylogenetic tree analysis, protein-protein interaction mapping and synthetic biology. This is the first time experimental algorithmic information theory is implemented using quantum computation. Our implementation on the Qiskit quantum programming platform is copyleft-licensed and publicly available on GitHub.
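The "exhaustive enumeration needed in the corresponding classical case" can be made concrete with a toy sketch (this is our own illustration, not the authors' Qiskit implementation): approximate the algorithmic probability m(x) = Σ 2^(-|p|) over programs p whose output is x, by brute-force enumeration of a deliberately tiny, made-up machine. The quantum framework described in the abstract replaces this enumeration with a superposition over programs.

```python
from itertools import product

def toy_machine(program):
    """A made-up toy 'machine' (illustrative only, not a universal Turing
    machine): the first two bits of the program select an output pattern,
    and the remaining bits set its length."""
    if len(program) < 3:
        return None  # treated as non-halting
    head, body = program[:2], program[2:]
    n = len(body) + 1
    if head == (0, 0):
        return "0" * n
    if head == (0, 1):
        return "1" * n
    if head == (1, 0):
        return ("01" * n)[:n]
    return None  # head (1, 1): 'does not halt'

def algorithmic_probability(x, max_len=10):
    """Approximate m(x) = sum of 2^-|p| over halting programs p with
    output x, by classically enumerating all programs up to max_len bits."""
    m = 0.0
    for length in range(1, max_len + 1):
        for p in product((0, 1), repeat=length):
            if toy_machine(p) == x:
                m += 2.0 ** -length
    return m

print(algorithmic_probability("000"))  # four length-4 programs: 4 * 2**-4 = 0.25
```

Even this toy search is exponential in program length, which is exactly why the abstract targets Grover-style quadratic speedups over the program space.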


2021, Vol 376 (1821), pp. 20190765
Author(s):  
Giovanni Pezzulo ◽  
Joshua LaPalme ◽  
Fallon Durant ◽  
Michael Levin

Nervous systems’ computational abilities are an evolutionary innovation, specializing and speed-optimizing ancient biophysical dynamics. Bioelectric signalling originated in cells' communication with the outside world and with each other, enabling cooperation towards adaptive construction and repair of multicellular bodies. Here, we review the emerging field of developmental bioelectricity, which links the field of basal cognition to state-of-the-art questions in regenerative medicine, synthetic bioengineering and even artificial intelligence. One of the predictions of this view is that regeneration and regulative development can restore correct large-scale anatomies from diverse starting states because, like the brain, they exploit bioelectric encoding of distributed goal states—in this case, pattern memories. We propose a new interpretation of recent stochastic regenerative phenotypes in planaria, by appealing to computational models of memory representation and processing in the brain. Moreover, we discuss novel findings showing that bioelectric changes induced in planaria can be stored in tissue for over a week, thus revealing that somatic bioelectric circuits in vivo can implement a long-term, re-writable memory medium. A consideration of the mechanisms, evolution and functionality of basal cognition makes novel predictions and provides an integrative perspective on the evolution, physiology and biomedicine of information processing in vivo. This article is part of the theme issue ‘Basal cognition: multicellularity, neurons and the cognitive lens’.


Author(s):  
György Darvas

The paper attempts to resolve two conceptual conflations: (a) the conflation of the two interpretations of the concept of orderedness applied in statistical thermodynamics and in symmetrology, and (b) the conflation of two interpretations of evolution applied to global and local processes. In conclusion, it formulates a new interpretation of the relation between the emergence of new material qualities in self-organizing processes, on the one hand, and the evolution of the universe, on the other. The process of evolution is a sequence of emergences of new material qualities by self-organization processes, which happen in negligibly small segments of the universe. Although thermodynamics treats the universe as a closed (isolated) system, this holds for its outside boundaries only, while the universe has many subsystems inside, which are not isolated (closed), since they are in permanent exchange of matter, energy, etc. with their environment (with the rest of the universe) through their open boundaries. Any “emergence” takes place, i.e., all new qualities come into being, just in these small open segments of the universe. The conditions for applying the second law of thermodynamics are not present here. Therefore, global evolution of the universe is the consequence of local symmetry decreases, local decreases of orderedness, and possible local decreases of entropy.
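The abstract's central claim can be stated as a standard entropy balance (added here for illustration; it is textbook thermodynamics, not the paper's own formalism). For an isolated universe partitioned into an open subsystem and its environment, the second law constrains only the total:

dS_universe = dS_subsystem + dS_environment ≥ 0

so dS_subsystem < 0 is permitted whenever dS_environment ≥ −dS_subsystem; that is, orderedness can increase locally as long as enough entropy is exported through the subsystem's open boundary to the rest of the universe.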


2021, Vol 12
Author(s):  
Hae Deok Jung ◽  
Yoo Jin Sung ◽  
Hyun Uk Kim

Chemotherapy is a mainstream cancer treatment, but has a constant challenge of drug resistance, which consequently leads to poor prognosis in cancer treatment. For better understanding and effective treatment of drug-resistant cancer cells, omics approaches have been widely conducted in various forms. A notable use of omics data beyond routine data mining is to use them for computational modeling that allows generating useful predictions, such as drug responses and prognostic biomarkers. In particular, an increasing volume of omics data has facilitated the development of machine learning models. In this mini review, we highlight recent studies on the use of multi-omics data for studying drug-resistant cancer cells. We put a particular focus on studies that use computational models to characterize drug-resistant cancer cells, and to predict biomarkers and/or drug responses. Computational models covered in this mini review include network-based models, machine learning models and genome-scale metabolic models. We also provide perspectives on future research opportunities for combating drug-resistant cancer cells.
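As a minimal sketch of the "machine learning models" the review covers (this toy is our own illustration, with fabricated two-feature data standing in for real omics profiles; none of the reviewed studies are reproduced here), a k-nearest-neighbours classifier predicting drug response from feature vectors:

```python
import math

def knn_predict(train, query, k=3):
    """Predict a drug-response label for one sample from its (toy) omics
    feature vector, by majority vote among the k nearest training samples."""
    dists = sorted((math.dist(features, query), label) for features, label in train)
    top_labels = [label for _, label in dists[:k]]
    return max(set(top_labels), key=top_labels.count)

# Hypothetical toy data: (expression features, observed response).
train = [
    ((0.1, 0.2), "sensitive"), ((0.0, 0.3), "sensitive"), ((0.2, 0.1), "sensitive"),
    ((0.9, 0.8), "resistant"), ((1.0, 0.7), "resistant"), ((0.8, 0.9), "resistant"),
]
print(knn_predict(train, (0.85, 0.75)))  # → resistant
```

Real studies of this kind use thousands of multi-omics features and far more expressive models (the network-based, machine learning, and genome-scale metabolic models the review discusses), but the supervised prediction setup is the same.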


Author(s):  
Eleonora Bilotta ◽  
Pietro Pantano

The ingenuity of nature and the power of DNA have generated an infinite range of languages, including human language. The existence of these languages inspires us to design artificial cognitive systems whose dynamic interaction with the environment is grounded, at least to some extent, on the same basic laws. Modern scientific knowledge provides us with new opportunities to investigate and understand the logic underlying biological life. We can then use this logic to derive design principles and computational models for artificial systems. The technologies we apply in these studies provide us with new insights into the complexity of the processes underlying the evolutionary success of modern species. We have yet to fully penetrate the mysteries of these natural languages. Nonetheless, the literature suggests (Chomsky, 1957; Aronoff & Rees-Miller, 2003; Bilotta & Pantano, 2006) that while the superficial features of different languages depend on different physical supports and different mechanisms, their deep structures share common rules. These constitute linguistic universals, organized at different levels of complexity, where each level has its own rules of composition. At all levels, we can consider these rules as “production rules” or even as rules of reproduction.
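The "production rules" the abstract invokes are the rewrite rules of formal grammars. As a hedged illustration (the two-rule grammar below is the classic Lindenmayer "algae" system, chosen by us as an example rather than taken from the authors' work), a few lines suffice to show rules of composition generating structure from a single symbol:

```python
def rewrite(axiom, rules, steps):
    """Apply context-free production rules in parallel to every symbol
    (L-system style); symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical two-rule grammar: A -> AB, B -> A.
rules = {"A": "AB", "B": "A"}
print(rewrite("A", rules, 4))  # → ABAABABA
```

Each derivation step is a level of composition; the string lengths grow as Fibonacci numbers, a simple case of deep structural regularity emerging from surface-level rewriting.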


2019, pp. 210-229
Author(s):  
Michael Weisberg

Michael Weisberg’s book Simulation and Similarity argued that although mathematical models are sometimes described in narrative form, they are best understood as interpreted mathematical structures. But how can a mathematical structure be causal, as many models described in narrative seem to be? This chapter argues that models with apparently narrative form are actually computational structures. It explores this suggestion in detail, examining what computational structure consists of, the resources it offers modelers, and why attempting to re-describe computational models as imaginary concrete systems fails even more dramatically than it does for mathematical models.


1970, Vol 1 (1)
Author(s):  
Michael L. Benedikt

Published in 1996* but not widely read, this article argues that space and information are so deeply related that the universe at every moment is exactly and only as large as it needs to be to “contain” the information it in fact is. Using three thought experiments (one about data visualization, one about cellular automata and consciousness, and one about the analysis of architectural space using isovists), each blurring, or rather uniting, the phenomena of psychological and physical space, the article argues that what we experience as “space” is that set of dimensions which provides the largest capacity for the world’s other qualities, objects, and events to express their variety most fully. The natural universe is incompressible, expanding only as, and because, it becomes richer in information (i.e. cools and evolves). Imaginary and virtual worlds obey the same rule: they are “naturally” as big as they are rich in information. But the possibility exists in cyberspace—as it does not in nature—to choose which dimensions will serve as the spatial framework, and which will become/appear as properties of the things themselves. Data visualizers know this well. One wonders why virtual worlds to this day look so similar to ours, then, rather than to the one envisaged by William Gibson in 1984 and 1986 and which he called “cyberspace.” A failure of architectural nerve? A constraint upon computation? Or has cyberspace proper yet to evolve?
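The cellular-automaton thought experiment the article mentions rests on a concrete, well-known object: an elementary CA, where one dimension of rich structure unfolds from a local update rule. A minimal sketch (our illustration, not the article's; rule 110 is chosen here as the standard example of a computationally rich rule):

```python
def ca_step(cells, rule=110):
    """One step of an elementary cellular automaton on a ring: each cell's
    next state is the bit of `rule` indexed by its 3-cell neighbourhood."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Evolve from a single live cell and print the space-time diagram.
row = [0] * 15 + [1] + [0] * 15
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = ca_step(row)
```

The printed space-time diagram is itself an instance of the article's theme: a one-dimensional "world" whose apparent spatial extent is exactly the capacity needed to express the information the rule generates.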


Leonardo, 2019, Vol 52 (3), pp. 230-235
Author(s):  
Libby Heaney

The author draws on her research experience in quantum computing to discuss the conception and form of an interactive installation, CLOUD. CLOUD explores complexity in the postdigital by referencing the principles of quantum superposition, quantum entanglement and quantum measurement.

