Towards Strong AI

2021 ◽  
Vol 35 (1) ◽  
pp. 91-101 ◽  
Author(s):  
Martin V. Butz

Abstract: Strong AI—artificial intelligence that is in all respects at least as intelligent as humans—is still out of reach. Current AI lacks common sense, that is, it is not able to infer, understand, or explain the hidden processes, forces, and causes behind data. Mainstream machine learning research on deep artificial neural networks (ANNs) may even be characterized as behavioristic. In contrast, various sources of evidence from cognitive science suggest that human brains engage in the active development of compositional generative predictive models (CGPMs) from their self-generated sensorimotor experiences. Guided by evolutionarily-shaped inductive learning and information-processing biases, they exhibit the tendency to organize the gathered experiences into event-predictive encodings. Meanwhile, they infer and optimize behavior and attention by means of both epistemic- and homeostasis-oriented drives. I argue that AI research should set a stronger focus on learning CGPMs of the hidden causes that lead to the registered observations. Endowed with suitable information-processing biases, AI may develop into a system able to explain the reality it is confronted with, reason about it, and find adaptive solutions, making it Strong AI. Seeing that such Strong AI can be equipped with a mental capacity and computational resources that exceed those of humans, the resulting system may have the potential to guide our knowledge, technology, and policies into sustainable directions. Clearly, though, Strong AI may also be used to manipulate us even more. Thus, it will be on us to put good, far-reaching, long-term, homeostasis-oriented purpose into these machines.

Author(s):  
C.L. Woodcock

Despite the potential of the technique, electron tomography has yet to be widely used by biologists. This is in part related to the rather daunting list of equipment and expertise that is required. Thanks to continuing advances in theory and instrumentation, tomography is now more feasible for the non-specialist. One barrier that has essentially disappeared is the expense of computational resources. In view of this progress, it is time to give more attention to the practical issues that need to be considered when embarking on a tomographic project. The following recommendations and comments are derived from experience gained during two long-term collaborative projects. Tomographic reconstruction results in a three-dimensional description of an individual EM specimen, most commonly a section, and is therefore applicable to problems in which ultrastructural details within the thickness of the specimen are obscured in single micrographs. Information that can be recovered using tomography includes the 3D shape of particles, and the arrangement and disposition of overlapping fibrous and membranous structures.
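To make the reconstruction step concrete, the following is a minimal sketch of parallel-beam reconstruction by simple (unfiltered) back-projection on a 2D slice. The phantom, the tilt range, and the use of scipy are illustrative assumptions, not the workflow discussed in the abstract.

```python
# Sketch: simulate a tilt series from a 2D slice, then recover the slice by
# unfiltered back-projection. Illustrative only; real EM tomography uses
# filtered/iterative reconstruction and alignment of the tilt series.
import numpy as np
from scipy.ndimage import rotate

def project(image, angles_deg):
    """Simulate a tilt series: rotate the slice and sum along the beam axis."""
    return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def back_project(sinogram, angles_deg, size):
    """Smear each projection back across the slice and average."""
    recon = np.zeros((size, size))
    for row, a in zip(sinogram, angles_deg):
        smear = np.tile(row, (size, 1))            # constant along the beam
        recon += rotate(smear, -a, reshape=False, order=1)
    return recon / len(angles_deg)

if __name__ == "__main__":
    size = 64
    phantom = np.zeros((size, size))
    phantom[24:40, 28:36] = 1.0                    # a simple block "particle"
    angles = np.linspace(-60, 60, 41)              # limited tilt range, as in EM
    sino = project(phantom, angles)
    recon = back_project(sino, angles, size)
    print("reconstruction peaks near the particle:", recon[30, 32] > recon[5, 5])
```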


2018 ◽  
Vol 30 (2) ◽  
pp. 77-94
Author(s):  
Hwee Cheng Tan ◽  
Ken T. Trotman

ABSTRACT We investigate the effect of regulatory requirements on impairment decisions and on managers' search for and evaluation of impairment information. We manipulate the reversibility of impairment losses (“can be reversed” versus “cannot be reversed”) and the transparency of impairment-assumption disclosures (more transparent versus less transparent) in a 2 × 2 experiment. We find that managers are more willing to impair when impairment losses can be reversed than when they cannot be reversed, but this effect does not vary with disclosure transparency. We also find that managers display information search bias in all four experimental conditions; however, regulatory requirements do not result in differences in the level of information search bias across the conditions. In contrast, regulatory requirements do affect differences in the level of information evaluation bias across conditions. We find that when impairment losses cannot be reversed, information evaluation bias is higher when disclosures are more transparent than when they are less transparent. JEL Classification: M40; M41.


2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Mingxue Ma ◽  
Yao Ni ◽  
Zirong Chi ◽  
Wanqing Meng ◽  
Haiyang Yu ◽  
...  

Abstract: The ability to emulate multiplexed neurochemical transmission is an important step toward mimicking complex brain activities. Glutamate and dopamine are neurotransmitters that regulate thinking and impulse signals independently or synergistically. However, emulation of such simultaneous neurotransmission is still challenging. Here we report the design and fabrication of a synaptic transistor that emulates the multiplexed neurochemical transmission of glutamate and dopamine. The device can perform glutamate-induced long-term potentiation, dopamine-induced short-term potentiation, or co-release-induced depression under particular stimulus patterns. More importantly, a balanced ternary system that uses our ambipolar synaptic device can backtrack input ‘true’, ‘false’, and ‘unknown’ logic signals; this process is more similar to the information processing in human brains than a traditional binary neural network. This work provides new insight for neuromorphic systems, establishing new principles for reproducing the complexity of a mammalian central nervous system from simple basic units.
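For readers unfamiliar with balanced ternary logic, here is a minimal sketch of how ‘true’, ‘unknown’, and ‘false’ signals can be encoded as +1, 0, and -1 and combined with Kleene-style operators. The encoding and the operator definitions are illustrative assumptions, not the measured behavior of the reported device.

```python
# Sketch: Kleene-style three-valued (balanced ternary) logic on {+1, 0, -1}.
TRUE, UNKNOWN, FALSE = 1, 0, -1

def t_not(a):
    return -a              # negation flips true/false, leaves unknown fixed

def t_and(a, b):
    return min(a, b)       # false dominates; unknown beats true

def t_or(a, b):
    return max(a, b)       # true dominates; unknown beats false

if __name__ == "__main__":
    for a in (TRUE, UNKNOWN, FALSE):
        for b in (TRUE, UNKNOWN, FALSE):
            print(f"a={a:+d} b={b:+d}  AND={t_and(a, b):+d}  OR={t_or(a, b):+d}")
```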


2011 ◽  
Vol 20 (02) ◽  
pp. 271-295 ◽  
Author(s):  
Víctor Sánchez-Anguix ◽  
Soledad Valero ◽  
Ana García-Fornes

An agent-based Virtual Organization is a complex entity in which dynamic collections of agents agree to share resources in order to accomplish a global goal or offer a complex service. An important problem for the performance of the Virtual Organization is the distribution of its agents across the available computational resources: the final distribution should provide good load balancing for the organization. In this article, a genetic algorithm is applied to calculate a proper distribution across hosts in an agent-based Virtual Organization. Additionally, an abstract multi-agent system architecture that provides the infrastructure for Virtual Organization distribution is introduced. The genetic solution employs an elitist crossover operator in which one of the children inherits the most promising genetic material from the parents with higher probability. In order to validate the proposal, the designed genetic algorithm has been successfully compared to several heuristics in different scenarios.
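The sketch below illustrates the general idea: a chromosome assigns each agent to a host, fitness rewards even load, and an elitist crossover biases one child toward the fitter parent's genes. The per-agent loads, population size, and rates are illustrative assumptions, not the authors' configuration.

```python
# Sketch: genetic algorithm for assigning agents to hosts with load balancing.
import random

AGENT_LOAD = [3, 1, 4, 1, 5, 9, 2, 6]      # hypothetical per-agent costs
N_HOSTS = 3

def fitness(assignment):
    loads = [0.0] * N_HOSTS
    for agent, host in enumerate(assignment):
        loads[host] += AGENT_LOAD[agent]
    return -(max(loads) - min(loads))       # smaller imbalance -> higher fitness

def elitist_crossover(better, worse, bias=0.75):
    # each gene is taken from the fitter parent with probability `bias`
    return [b if random.random() < bias else w for b, w in zip(better, worse)]

def mutate(assignment, rate=0.1):
    return [random.randrange(N_HOSTS) if random.random() < rate else h
            for h in assignment]

def evolve(generations=200, pop_size=30):
    pop = [[random.randrange(N_HOSTS) for _ in AGENT_LOAD] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # elitism: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            better, worse = sorted((p1, p2), key=fitness, reverse=True)
            children.append(mutate(elitist_crossover(better, worse)))
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("assignment:", best, "load imbalance:", -fitness(best))
```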


2003 ◽  
Vol 17 (4) ◽  
pp. 347-358 ◽  
Author(s):  
Jennifer A. Steinberg ◽  
Brandon E. Gibb ◽  
Lauren B. Alloy ◽  
Lyn Y. Abramson

Previous work has established a relationship between reports of childhood emotional maltreatment and cognitive vulnerability to depression, as well as an association between cognitive vulnerability and self-referent information-processing biases. Findings from this study of individuals at low (LR) and high (HR) cognitive risk for depression revealed a relationship between reports of childhood emotional maltreatment and current information-processing biases. Specifically, individuals with greater childhood emotional maltreatment exhibited more negative self-referent information processing. Moreover, cognitive risk mediated the relationship between childhood emotional maltreatment and these information-processing biases. In an alternate model, information-processing biases also mediated the relationship between childhood emotional maltreatment and cognitive risk.


2017 ◽  
pp. 89-98
Author(s):  
Eli Pavlova-Traykova ◽  
Ivan Marinov ◽  
Petar Dimov

This investigation was carried out in the Badinska River watershed, one of the most famous torrents in Bulgaria. The purpose of the survey is to analyse the main erosion factors and the erosion potential of the territories, with a view to assessing soil erosion risk and the possibility of high-water formation from the watershed. A methodical approach for determining and mapping the erosion risk class of the territories with the use of GIS is applied. Assessments are made according to the "Methodology for preparing the national long-term programme for protection against erosion and flooding in forestlands". The total assessment for the Badinska River watershed is "low to moderate" potential risk and "very low to low" actual erosion risk. About 5% of the forest stock territory carries "moderate" and "moderate to high" actual risk, and the biggest part of this territory (about 63%) lies in the main stream watershed above the Yaloviko tributary.


2018 ◽  
Vol 28 (09) ◽  
pp. 1831-1856 ◽  
Author(s):  
Alessandro Ciallella ◽  
Emilio N. M. Cirillo ◽  
Petru L. Curşeu ◽  
Adrian Muntean

We present modeling strategies that describe the motion and interaction of groups of pedestrians in obscured spaces. We start off with an approach based on balance equations in terms of measures and then exploit the descriptive power of a probabilistic cellular automaton model. Based on a variation of the simple symmetric random walk on the square lattice, we test the interplay between population size and an interpersonal attraction parameter for the evacuation of confined and darkened spaces. We argue that information overload and coordination costs associated with information processing in small groups are two key processes that influence the evacuation rate. Our results show that substantial computational resources are necessary to compensate for incomplete information — the more individuals in (information-processing) groups, the higher the exit rate for low population sizes. For simple social systems, it is likely that the individual representations are not redundant, and large group sizes ensure that this non-redundant information is actually available to a substantial number of individuals. For complex social systems, information redundancy makes information evaluation and transfer inefficient and, as such, group size becomes a drawback rather than a benefit. The effect of group size on outgoing fluxes, evacuation times, and wall effects is carefully studied within a Monte Carlo framework that also accounts for the presence of an internal obstacle.
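A minimal sketch of the kind of lattice model referred to above: pedestrians perform a simple symmetric random walk on a square grid in the dark and are removed when they reach the exit cell. The grid size, exit position, and the omission of the attraction parameter and internal obstacle are simplifying assumptions relative to the paper's model.

```python
# Sketch: Monte Carlo evacuation of random walkers from a dark square lattice.
import random

def evacuate(n_peds=20, size=20, exit_cell=(0, 10), max_steps=5000):
    peds = [(random.randrange(size), random.randrange(size)) for _ in range(n_peds)]
    evac_times = []
    for t in range(1, max_steps + 1):
        remaining = []
        for (x, y) in peds:
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            nx = min(max(x + dx, 0), size - 1)   # walls reflect the walker
            ny = min(max(y + dy, 0), size - 1)
            if (nx, ny) == exit_cell:
                evac_times.append(t)             # pedestrian leaves the room
            else:
                remaining.append((nx, ny))
        peds = remaining
        if not peds:
            break
    return evac_times

if __name__ == "__main__":
    times = evacuate()
    if times:
        print(len(times), "evacuated; mean exit time:", sum(times) / len(times))
    else:
        print("no evacuations within the step limit")
```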

