Elements of Experimental Computing

Author(s):  
Giuseppe Primiero

This chapter starts by considering the third foundation: computing as an experimental science. It reviews the origin of the term and early positions on experimental computer science, and introduces two basic notions, the computational hypothesis and the computational experiment, that are essential to understanding computing as an experimental discipline.

2018
pp. 4-7
Author(s):  
S. I. Zenko

The article raises the problem of classifying the concepts of computer science and informatics studied at secondary school; the effectiveness of techniques for teaching pupils these concepts depends on its solution. The author proposes considering classifications of the concepts of school informatics from four positions: the cross-subject basis; the content lines of the school subject "Informatics"; the logical and structural interrelations and interactions of the studied concepts; and the etymology of foreign-language and translated words in the definitions of informatics concepts. The first classification distinguishes general and special concepts; the second, inter-content and intra-content concepts; the third, stable (steady), expanding, key, and auxiliary concepts; the fourth, concept-nouns, concept-verbs, concept-adjectives, and concepts combining parts of speech.


1992
Vol 267
Author(s):  
W. Stanley Taft
James W. Mayer

At Cornell University we are in the third year of teaching an interdisciplinary undergraduate course, Art, Isotopes, and Analysis, on the physical properties and structures of works of art and the modern analytical methods used to investigate them. The challenge is to explain concepts familiar to museum scientists and conservators to a group of 150 undergraduate students whose backgrounds range from art history to computer science. Painting techniques (fresco, tempera, oil, etc.) are demonstrated to the class. The analytical techniques involve the interactions of electrons, photons, ions, and neutrons with pigments and other materials. This instructional approach serves as an introduction to published analyses of works of art.


Author(s):  
Scott Selisker

The third chapter considers how technological ideas about “programming” in cybernetics, computer science, and genetics broadened the applications of automaton discourse and images. Such ideas were used to represent devalued forms of labor, Asian Americans, and post-Cold War fundamentalism. Selisker analyses the history and discourses that inform stereotypes such as the hypertechnological Oriental, as well as new philosophical concepts like posthumanism.


1994
Vol 26 (1)
pp. 1-25
Author(s):  
Barbara Shapiro

Facts are something we take for granted, at least most of the time. As ordinary individuals we assume that there are knowable facts, for instance, that the dog chewed the drapes, that England exists, that it rained yesterday, or that babies cry. If, as scholars, that is, as historians, social scientists, and natural scientists, we are more aware of the problematical nature of “facts,” we nevertheless tend to establish and use facts rather unselfconsciously in our work. On this occasion I want to look at the evolution of the concept of “fact,” and in particular the way “fact” entered English natural philosophy. I will attempt to show that the concept of “fact” or “matter of fact,” so prominent in the English empirical tradition, is an adaptation or borrowing from another discipline, jurisprudence, and that many of the assumptions and much of the technology of fact-finding in law were carried over into the experimental science of the seventeenth century.

My paper has three parts. The first discusses the nature of legal facts and fact-finding in the early modern period, focusing on the distinction between “matters of fact” and “matters of law,” the emphasis on first-hand testimony by credible witnesses, the preference for direct testimony over inference, and legal efforts to create and maintain impartial proceedings. The second portion attempts to show how legal methods and assumptions were adopted by early modern historiographers and other fact-oriented reporters. The third section attempts to show how the legally constructed concept of “fact” or “matter of fact” was transferred to natural history and natural philosophy and generalized in Locke's empirical philosophy.


1996
Vol 11 (3)
pp. 281-288
Author(s):  
Luca Chittaro
Angelo Montanari

Time is one of the most relevant topics in AI. It plays a major role in several AI research areas, ranging from logical foundations to applications of knowledge-based systems. Despite the ubiquity of time in AI, researchers tend to specialise and focus on time in particular contexts or applications, overlooking meaningful connections between different areas. In an attempt to promote cross-fertilisation and reduce isolation, the Temporal Representation and Reasoning (TIME) workshop series was started in 1994. The third edition of the workshop was held on May 19–20, 1996, in Key West, FL, with S. D. Goodwin and H. J. Hamilton as General Chairs and L. Chittaro and A. Montanari as Program Chairs. Particular emphasis was given to the foundational aspects of temporal representation and reasoning through an investigation of the relationships between different approaches to temporal issues in AI, computer science, and logic.


1982
Vol 9 (1)
pp. 65-70
Author(s):  
E. Barton Worthington

The historical perspective is becoming ever more important in scientific research and development, especially in regions of rapid political and social change, such as former colonial empires, where the past is readily forgotten. Therefore this essay attempts to reconstruct the evolution of ecology as the scientific basis for environmental conservation and human progress, as seen through the eyes of a biologist who has exercised that science during a number of tasks in various parts of the world over most of the twentieth century.

From its beginnings in evolutionary thinking during the nineteenth century, ecology emerged from natural history at the beginning of the twentieth. At first the running was made by botanists, but this was soon followed by zoologists, who dealt with more mobile communities. The first quarter-century was mainly exploratory; the second was mainly descriptive (although biological exploration was still dominant in the tropics). The third quarter saw ecology developing into an experimental science, and, as the environmental revolution got into its stride, ecology became organized both nationally and internationally.

Although the term is now often misused and sometimes misunderstood by laymen, the last quarter-century is seeing the wide application of ecology in environmental and human affairs, and this gives some assurance that the twenty-first century will not become one of chaos. The author expresses the hope that experienced practising ecologists will in future give higher priority to applying what they already know than to learning more and more about less and less.


2016
Vol 23 (3)
pp. 145-149
Author(s):  
Marek Żukowicz
Michał Markiewicz

The aim of the article is to present a mathematical definition of the object model known in computer science as TreeList, and to show how this model can be applied to design an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of presenting data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used to determine the utility of structures created through this model, as well as the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.
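As a rough illustration of the idea, the sketch below models a TreeList-like structure as labelled nodes with ordered children and runs one selection step of an evolutionary loop that generates candidate structures for testing. The node class, the random generator, and the size/depth-based utility function are illustrative assumptions, not the article's formal model or its parameters.

```python
import random

class TreeNode:
    """One node of a TreeList-like structure: a labelled node with an
    ordered list of children, each of which may carry its own subtree."""
    def __init__(self, label):
        self.label = label
        self.children = []

def random_tree(max_depth, max_children):
    """Randomly generate a candidate structure for testing purposes."""
    node = TreeNode(label=random.randint(0, 99))
    if max_depth > 1:
        for _ in range(random.randint(0, max_children)):
            node.children.append(random_tree(max_depth - 1, max_children))
    return node

def size(node):
    """Total number of nodes in the structure."""
    return 1 + sum(size(child) for child in node.children)

def depth(node):
    """Length of the longest root-to-leaf path."""
    return 1 + max((depth(child) for child in node.children), default=0)

def utility(node, target_size=20, target_depth=4):
    # Toy utility: candidates close to a target size and depth score higher.
    # The article's utility parameters would replace this placeholder.
    return -abs(size(node) - target_size) - abs(depth(node) - target_depth)

# One selection step: generate random candidates and keep the fittest. A
# full evolutionary strategy would also mutate and recombine the survivors.
population = [random_tree(max_depth=5, max_children=3) for _ in range(20)]
best = max(population, key=utility)
print(size(best), depth(best), utility(best))
```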


2021
Vol 12 (1)
pp. 4
Author(s):  
Andrea C. Burrows
Mike Borowczak
Bekir Mugayitoglu

Computer science, cybersecurity education, and microcredentials are becoming more pervasive at all levels of the educational system. The purpose of this study, conducted in partnership with precollegiate teachers, was (1) to investigate the self-efficacy of 30 precollegiate teacher participants towards computer science before, during, and after three iterations of a cybersecurity microcredential, and (2) to make changes to the cybersecurity microcredential to improve its effectiveness. The authors explored what teachers need in a microcredential. The first cohort (n = 5) took the microcredential sequence over 28 days in the summer of 2020, the second cohort (n = 16) over 42 days in the fall of 2020, and the third cohort (n = 9) over 49 days in the summer of 2021. The authors investigated three research questions and used a systems thinking approach while developing, evaluating, and implementing the research study. The researchers used quantitative methods, collecting a self-efficacy subscale survey to assess whether the precollegiate teachers' beliefs about computer science changed, and qualitative methods, conducting semi-structured teacher participant interviews to address the research questions. The findings show that the precollegiate teachers' self-efficacy scores towards computer science increased and that there are areas in need of attention, such as resources and implementation, when creating microcredentials. The implications of this research include the importance of purposefully crafting microcredentials and professional development, including aspects of creating effective partnerships.


2020
Vol 20 (6)
pp. 880-894
Author(s):  
Simon Marynissen
Bart Bogaerts
Marc Denecker

Justification theory is a unifying semantic framework. While it has its roots in non-monotonic logics, it can be applied to various areas in computer science, especially in explainable reasoning; its most central concept is a justification: an explanation of why a property holds (or does not hold) in a model.

In this paper, we continue the study of justification theory by means of three major contributions. The first is studying the relation between justification theory and game theory; we show that justification frameworks can be seen as a special type of games. The established connection provides the theoretical foundations for our next two contributions. The second contribution is studying under which conditions two different dialects of justification theory (graphs as explanations vs. trees as explanations) coincide. The third contribution is establishing a precise criterion for when a semantics induced by justification theory yields consistent results. In the past, proving that such semantics were consistent required cumbersome and elaborate proofs; we show that these criteria are indeed satisfied for all common semantics of logic programming.
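To make the central concept concrete, the sketch below builds a tree-shaped justification for an atom of a tiny logic program. The program, the Justification class, and the justify procedure are simplified illustrations under an acyclicity assumption, not the formal definitions of the paper; cyclic programs are exactly where the graph-based and tree-based dialects begin to differ.

```python
from dataclasses import dataclass, field

# Toy logic program, one list of rule bodies per head atom:
#   a :- b, c.    b.    c.
RULES = {"a": [["b", "c"]], "b": [[]], "c": [[]]}

@dataclass
class Justification:
    """A tree-shaped explanation of why an atom holds: the node is the
    atom, the children justify the body atoms of the rule deriving it."""
    atom: str
    children: list = field(default_factory=list)

def justify(atom):
    """Return a justification tree for `atom`, or None if no rule
    supports it (assumes the program is acyclic)."""
    for body in RULES.get(atom, []):
        children = [justify(b) for b in body]
        if all(child is not None for child in children):
            return Justification(atom, children)
    return None

print(justify("a"))  # nested tree: a is explained via b and c
print(justify("d"))  # None: nothing supports d, so it does not hold
```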

