Technological Singularity
Recently Published Documents


TOTAL DOCUMENTS

91
(FIVE YEARS 42)

H-INDEX

5
(FIVE YEARS 2)

2022 ◽  
pp. 17-49
Author(s):  
Donna L. Panucci ◽  
Theresa Bullard-Whyke

The so-called “singularity” postulates that artificial intelligence (AI) will soon outstrip human intelligence and seize (Terminator-like) planetary authority from humanity. It may be stipulated that this science-fiction scenario is already becoming scientific reality. However, an alternative to the threatening brain-based technological singularity is not being considered: the potential for creating a positive non-dual reality through “coherent heart entrainment.” This “heart-based coherency” is the desirable alternative, and it can begin with healing the media-sphere. The technological singularity, perhaps the salient issue facing the “healing” of the global media-sphere, is a hypothetical point in time (very soon) at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.


2021 ◽  
pp. 1-3
Author(s):  
Laurie A. Schintler ◽  
Connie L. McNeely

2021 ◽  
pp. 394-408
Author(s):  
Roger Bradbury

This chapter considers the problem of educating for cybersecurity from the perspective of complex systems science. It argues that education is a process that has evolved in human social systems to curate, increase, and transmit the information needed for system survival. Education creates an increase in the negentropy (or useful information) of those systems as they seek to maximize the acquisition and throughput of energy—a physical principle known as maximum entropy production (MaxEP). Civilizations have responded to this principle over time by finding new solutions to the Earth’s MaxEP and becoming more complex in the process. A key part of this complexification is education. And in the present cyber age it is, as in previous ages, a lagging process cobbled together from the structures and processes of previous ages. The current education responses may soon be superseded as a new solution to the Earth’s MaxEP—the technological singularity—looms.


2021 ◽  
Author(s):  
Deep Bhattacharjee ◽  
Sanjeevan Singha Roy

If highly intelligent machines come to control the world in the future, what would be the advantages and disadvantages? Will AI-powered superintelligent machines become an anathema for humanity, or will they ease human work by guiding humans through complicated tasks, extending a helping hand that makes that work comfortable? Recent studies in theoretical computer science, especially in artificial intelligence, predict a ‘technological singularity’ or ‘intelligence explosion’. If this happens, a further stage may follow in which machine intelligence and human intelligence are transfused: machines, being immensely powerful, with a cognitive capacity greater than that of humans for solving immensely complicated tasks, could overtake humans and in turn be overtaken by still more intelligent machines of superhuman intelligence. It is therefore troubling to consider the case in which machines turn against humans in pursuit of dominance over this planet. Do humans have any chance of avoiding the inevitable ‘hard singularity’ by passing instead through a series of ‘soft singularities’? This paper discusses these questions in detail, with calculations showing humanity how to avoid the hard singularity when the progress of intelligence is inevitable.



