Computing Science
Recently Published Documents


TOTAL DOCUMENTS

257
(FIVE YEARS 38)

H-INDEX

14
(FIVE YEARS 1)

2022 ◽  
Author(s):  
Juliao Braga ◽  
Jeferson Campos Nobre ◽  
Lisandro Zambenedetti Granville ◽  
Marcelo Santos

The IETF is responsible for the standardization and development of Internet protocols, a process based on the voluntary participation of professionals, academics, and researchers from around the world. Volunteers work together through mailing lists and at three face-to-face meetings each year. This proposal considers the importance of identifying multidisciplinary opportunities around the Internet Engineering Task Force (IETF) in the process of creating or improving innovative standards for the Internet. We discuss the organization of working groups, highlighting discussions ranging from core protocols such as the Internet Protocol (IP) to research groups such as the Thing-to-Thing Research Group (T2TRG), which discusses standards for the Internet of Things (IoT). The opportunity to discuss theoretical and practical challenges and ways of collaborating at the IETF opens up a vast prospect of inclusion for the Brazilian community, as it becomes aware of how the IETF is constituted and remains active, vigilant, and prepared for the changes necessary for the smooth functioning of the Internet. The multidisciplinary character of the computing-science volunteer work around the IETF is evident, and it needs the active help of people with diverse knowledge, including areas other than networking. This chapter therefore covers everything from basic foundations of the Internet, the functioning of the IETF, and the process of developing new protocols, to the tools and rules necessary for writing an Internet-Draft (I-D).
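
As an illustrative aside (not from the chapter itself): plain-text Internet-Drafts must follow strict layout rules, one well-known example being a maximum line width of 72 characters. The short Python sketch below checks only that single rule; the file name is hypothetical, and the IETF's actual checker is the idnits tool, which verifies far more.

```python
# Minimal sketch: flag overlong lines in a plain-text Internet-Draft.
# Only the classic 72-character width rule is checked here; this is
# an illustration, not a replacement for the IETF's idnits tool.

MAX_WIDTH = 72

def check_line_width(path: str) -> list[str]:
    """Return a report entry for every overlong line in a draft."""
    problems = []
    with open(path, encoding="ascii") as draft:
        for number, line in enumerate(draft, start=1):
            stripped = line.rstrip("\n")
            if len(stripped) > MAX_WIDTH:
                problems.append(
                    f"line {number}: {len(stripped)} chars (max {MAX_WIDTH})"
                )
    return problems

if __name__ == "__main__":
    # "draft-example-00.txt" is a placeholder file name.
    for issue in check_line_width("draft-example-00.txt"):
        print(issue)
```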


2022 ◽  
pp. 677-715
Author(s):  
Amy Eguchi

The chapter aims to help educators and classroom teachers who are new to using educational robotics as a learning tool in their classrooms. It discusses approaches to using robotics as a learning tool, one well suited to enabling constructionist learning in the classroom, and how educational robotics can motivate 'all' students to learn STEM and computing science concepts. Educational robotics as a learning tool requires teachers as well as students to shift from traditional pedagogical approaches to learner-centered active learning approaches. The chapter discusses how this shift can be made successfully and provides guidance to pre- and in-service teachers on how to implement educational robotics as a learning tool that reaches and attracts 'all' students and promotes their learning.


Author(s):  
Zanyar Ameen

As everyday problems contain a great deal of data and ambiguity, it has become necessary to develop new mathematical approaches to address them, and soft set theory is the best tool for dealing with such problems. Hence, in this article, we introduce a non-continuous mapping in soft settings called soft U-continuous. We mainly focus on studying soft U-continuity and its connection to soft continuity. We further show that soft U-continuity preserves soft compact sets and soft connected sets. The latter sets have various applications in computing science and decision-making theory. In the end, we show that if each soft U-continuous mapping f from a soft space X into a soft T0-space Y is soft continuous, then Y is soft T1.
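
For readability, the closing result of the abstract can be restated in standard notation. The LaTeX snippet below only transcribes what the abstract claims, assuming the paper's own definitions of soft U-continuity and the soft separation axioms; it adds no new mathematical content.

```latex
\documentclass{article}
\usepackage{amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
% Closing result of the abstract, restated; "soft U-continuous"
% is as defined in the paper.
\begin{theorem}
Let $X$ be a soft topological space and let $Y$ be a soft $T_0$-space.
If every soft $U$-continuous mapping $f \colon X \to Y$ is soft
continuous, then $Y$ is a soft $T_1$-space.
\end{theorem}
\end{document}
```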


Informatics ◽  
2021 ◽  
Vol 8 (4) ◽  
pp. 71
Author(s):  
János Végh

Today’s computing is based on the classic paradigm proposed by John von Neumann three-quarters of a century ago. That paradigm, however, was justified only for (the timing relations of) vacuum tubes. Technological development has invalidated the classic paradigm (but not the model!), leading to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The model is perfect, but the paradigm is applied outside its range of validity. The classic paradigm is completed here by providing the “procedure” missing from the “First Draft”, which enables computing science to handle cases where the transfer time is not negligible compared to the processing time. The paper reviews whether we can describe the implemented computing processes using an accurate interpretation of the computing model, and whether we can explain the issues experienced in different fields of today’s computing by abandoning the unjustified omissions. Furthermore, it discusses some consequences of improper technological implementations, from shared media to parallelized operation, suggesting ideas on how computing performance could be improved to meet growing societal demands.
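
As a back-of-the-envelope illustration (my own sketch, not taken from the paper): the toy model below shows how a fixed per-operation transfer time erodes the fraction of wall-clock time spent on useful processing as processing itself gets faster. The function and parameter values are arbitrary assumptions chosen only to make the abstract's point visible.

```python
# Toy model (illustrative only): when the transfer time t_transfer is
# not negligible compared to the processing time t_proc, the fraction
# of total time spent on useful processing drops accordingly.

def utilization(t_proc: float, t_transfer: float) -> float:
    """Fraction of total time spent processing (payload vs. non-payload)."""
    return t_proc / (t_proc + t_transfer)

if __name__ == "__main__":
    t_transfer = 1.0  # arbitrary units; fixed per-operation transfer cost
    for t_proc in (100.0, 10.0, 1.0, 0.1):
        u = utilization(t_proc, t_transfer)
        print(f"t_proc={t_proc:6.1f}  utilization={u:6.1%}")
```

In this sketch, speeding up processing a thousandfold while the transfer cost stays fixed drops utilization from about 99% to under 10%, which mirrors the abstract's claim that the paradigm breaks down once transfer time is no longer negligible.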


2021 ◽  
Author(s):  
Quintin Cutts ◽  
Joseph Maguire ◽  
Sally Fincher ◽  
Jack Parkinson

Author(s):  
Janos Vegh

Classic science seemed to be complete more than a century ago, facing only a few (but a growing number of!) unexplained issues. Introducing time-dependence into classic science explained those issues, and its consistent use led to the birth of a series of modern sciences, including relativistic and quantum physics. Classic computing is based on the paradigm proposed by von Neumann for vacuum tubes only, and it seems to be complete in the same sense. Von Neumann warned, however, that implementing computers under more advanced technological conditions, using the paradigm without considering the transfer time (and especially attempting to imitate neural operation), would be unsound. Nevertheless, classic computing science persists in neglecting the transfer time; it faces a few (but a growing number of!) unexplained issues, and its development has stalled in most of its fields. Introducing time-dependence into classic computing science explains those issues and uncovers the reasons for the stalling it has experienced. It can lead to a revolution in computing, resulting in a modern computing science, in the same way as time-dependence led to the birth of modern science.

