El pensamiento bayesiano, un pensamiento computacional omnipresente

2021 ◽  
Vol 21 (68) ◽  
Author(s):  
Miguel Zapata Ros ◽  
Yamil Buenaño Palacios

In its simplest sense, computational thinking is understood as a set of specific skills that programmers use to do their work, but which are also useful to people in their professional and personal lives as a way of organizing how they solve problems and of representing the reality around them. In a more elaborate scheme, this complex of skills constitutes a new literacy, or the most substantial part of one, and an enculturation into a new culture: the digital culture of the knowledge society. We have seen how Bayesian probability is used in epidemiological models to model the evolution of data on COVID contagion and deaths, and in natural language processing. We could likewise see it in a multitude of cases across the most varied fields of science and process analysis. In this way, with the automation of Bayesian methods and the use of probabilistic graphical models, it is possible to identify patterns and anomalies in voluminous data sets in fields as diverse as linguistic corpora and astronomical maps, to add functionality to magnetic resonance imaging, or to analyse purchasing habits with cards, online or on smartphones. In this new way of proceeding, big data analysis and Bayesian theory come together. If we regard Bayesian thinking, this way of proceeding, as one more relevant element of computational thinking, then to what has been said on previous occasions we must now add the idea of generalized computational thinking, which goes beyond education. It is no longer a matter of aspects purely associated with ordinary professional or everyday practice for getting along in life and the world of work, as has been the case with what we have called computational thinking until now, but of preparation for basic research and for research methodology in almost all disciplines. Defined in this way, computational thinking is influencing research in almost all areas, in the sciences as well as the humanities. Instruction centred on this component of computational thinking, Bayesian thinking, or that included it at an early stage, in secondary education (K-12), including the inverse probability formula, would make it possible, drawing on Merrill's First Principles of Instruction and in particular the activation principle, to activate this learning as a very valuable and very complex component at a later stage of professional or research activity, or in the training phase, undergraduate and postgraduate, of those professions or of programmes that prepare for those activities and professions.
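The "inverse probability formula" the abstract proposes teaching in secondary school is Bayes' theorem, P(H|E) = P(E|H)·P(H) / P(E). As a minimal sketch of the kind of worked example such instruction could use, the Python snippet below computes the probability of infection given a positive screening test; the prevalence, sensitivity, and specificity values are invented for illustration and are not taken from the article.

```python
# A minimal sketch of the "inverse probability" (Bayes') formula:
#   P(H | E) = P(E | H) * P(H) / P(E)
# illustrated with a hypothetical epidemiological screening question:
# "What is the probability of infection given a positive test?"

def posterior(prior: float, sensitivity: float, specificity: float) -> float:
    """P(infected | positive) via Bayes' theorem."""
    p_pos_given_infected = sensitivity            # P(E | H)
    p_pos_given_healthy = 1.0 - specificity       # false-positive rate
    p_pos = prior * p_pos_given_infected + (1.0 - prior) * p_pos_given_healthy  # P(E)
    return prior * p_pos_given_infected / p_pos

# Hypothetical values: 2% prevalence, 90% sensitivity, 95% specificity.
print(round(posterior(prior=0.02, sensitivity=0.90, specificity=0.95), 3))  # ~0.269
```

The counter-intuitively low posterior, roughly 27% despite a "positive" test, is exactly the kind of result that motivates introducing Bayesian reasoning early.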

2021 ◽  
Author(s):  
Miguel Zapata-Ros

In its simplest sense, computational thinking is considered a series of specific skills that help programmers do their work, but that are also useful to people in their professional and personal lives as a way of organizing the resolution of their problems and of representing the reality that surrounds them. In a more elaborate scheme, this complex of skills constitutes a new literacy, or the most substantial part of it, and an inculturation to deal with a new culture, the digital culture in the knowledge society. We have seen how Bayesian probability is used in epidemiology models to model the evolution of data on contagion and deaths in COVID, and in natural language processing. We could also see it in a multitude of cases in the most varied scientific and process analysis fields. In this way, with the automation of Bayesian methods and the use of probabilistic graphical models, it is possible to identify patterns and anomalies in voluminous data sets in fields as diverse as linguistic corpora, astronomical maps, added functionality in magnetic resonance imaging, or card, online, and smartphone purchasing habits. In this new way of proceeding, big data analysis and Bayesian theory are associated. If we consider Bayesian thinking, this way of proceeding, as one more relevant element of computational thinking, then to what has been said on previous occasions we must now add the idea of generalized computational thinking, which goes beyond education. It is no longer about aspects purely associated with ordinary professional or everyday practice to deal with life and the world of work, as what we have called computational thinking has been until now, but about preparation for basic research and research methodology in almost all disciplines. Because, thus defined, computational thinking is influencing research in almost all areas, both in the sciences and in the humanities. Instruction focused on this component of computational thinking, Bayesian thinking, or including it at an early stage, in Secondary (K-12), including the inverse probability formula, would allow, based on Merrill's First Principles of Instruction and in particular on the activation principle, these learnings to be activated as very valuable and very complex components at a later stage of professional or research activity, or in the training phase, undergraduate and postgraduate degrees, of these professions or of those that train for these activities and professions.
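As a hedged illustration of the "automation of Bayesian methods" both abstracts mention, the sketch below scores card purchases by the posterior odds of an "anomalous" versus a "typical" hypothesis under simple Gaussian likelihoods, computed in log space. The prior, means, and standard deviations are assumptions made up for this example; a real system would estimate them from data and would typically use richer probabilistic graphical models.

```python
# A sketch (not from the article) of Bayesian anomaly scoring for purchase amounts.
from math import log, pi

def gaussian_logpdf(x: float, mean: float, std: float) -> float:
    """Log-density of a normal distribution (log space avoids underflow)."""
    return -((x - mean) ** 2) / (2 * std ** 2) - log(std) - 0.5 * log(2 * pi)

def log_posterior_odds(amount: float,
                       prior_anomalous: float = 0.01,
                       typical=(40.0, 15.0),        # assumed mean/std of typical purchases
                       anomalous=(400.0, 200.0)):   # assumed mean/std of anomalous ones
    """log[ P(anomalous | amount) / P(typical | amount) ] via Bayes' rule."""
    log_prior_odds = log(prior_anomalous) - log(1.0 - prior_anomalous)
    log_likelihood_ratio = (gaussian_logpdf(amount, *anomalous)
                            - gaussian_logpdf(amount, *typical))
    return log_prior_odds + log_likelihood_ratio

for amount in (35.0, 70.0, 650.0):
    verdict = "ANOMALY" if log_posterior_odds(amount) > 0.0 else "typical"
    print(f"{amount:7.2f} -> {verdict}")   # 35 and 70 come out typical, 650 anomalous
```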


2021 ◽  
Vol 8 (1) ◽  
pp. 49-74
Author(s):  
Mona Emara ◽  
Nicole Hutchins ◽  
Shuchi Grover ◽  
Caitlin Snyder ◽  
Gautam Biswas

The integration of computational modelling in science classrooms provides a unique opportunity to promote key 21st century skills including computational thinking (CT) and collaboration. The open-ended, problem-solving nature of the task requires groups to grapple with the combination of two domains (science and computing) as they collaboratively construct computational models. While this approach has produced significant learning gains for students in both science and CT in K–12 settings, the collaborative learning processes students use, including learner regulation, are not well understood. In this paper, we present a systematic analysis framework that combines natural language processing (NLP) of collaborative dialogue, log file analyses of students’ model-building actions, and final model scores. This analysis is used to better understand students’ regulation of collaborative problem solving (CPS) processes over a series of computational modelling tasks of varying complexity. The results suggest that the computational modelling challenges afford opportunities for students to a) progress from resource-intensive processes, such as trial and error, to more systematic processes, such as debugging model errors by leveraging data tools, and b) learn from each other using socially shared regulation (SSR) and productive collaboration. The use of such SSR processes correlated positively with their model-building scores. Our paper aims to advance our understanding of collaborative, computational modelling in K–12 science to better inform classroom applications.
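As a minimal sketch of the final step in such an analysis pipeline (not the authors' actual code), the snippet below relates the number of SSR-coded utterances per group, as might be produced by an NLP coder applied to dialogue transcripts, to the groups' final model scores. The per-group numbers and the choice of SciPy's Spearman correlation are illustrative assumptions.

```python
# Hypothetical per-group features: SSR-coded utterance counts (from dialogue coding)
# and final model-building scores (from a rubric applied to the log-reconstructed models).
from scipy.stats import spearmanr

ssr_utterance_counts = [4, 11, 7, 15, 2, 9]      # invented counts per group
final_model_scores   = [52, 78, 64, 90, 40, 85]  # invented rubric scores per group

rho, p_value = spearmanr(ssr_utterance_counts, final_model_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")  # rho is high for these made-up data
```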


Author(s):  
José Miguel Merino-Armero ◽  
José Antonio González-Calero ◽  
Ramón Cózar-Gutiérrez

2021 ◽  
Vol 54 (2) ◽  
pp. 1-36
Author(s):  
Sameen Maruf ◽  
Fahimeh Saleh ◽  
Gholamreza Haffari

Machine translation (MT) is an important task in natural language processing (NLP), as it automates the translation process and reduces the reliance on human translators. With the resurgence of neural networks, the translation quality surpasses that of the translations obtained using statistical techniques for most language pairs. Up until a few years ago, almost all of the neural translation models translated sentences independently, without incorporating the wider document context and inter-dependencies among the sentences. The aim of this survey article is to highlight the major works that have been undertaken in the space of document-level machine translation after the neural revolution, so researchers can recognize the current state and future directions of this field. We provide an organization of the literature based on novelties in modelling and architectures as well as training and decoding strategies. In addition, we cover evaluation strategies that have been introduced to account for the improvements in document MT, including automatic metrics and discourse-targeted test sets. We conclude by presenting possible avenues for future exploration in this research field.
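As a toy illustration of the gap the survey addresses, the sketch below contrasts sentence-independent translation with the simplest document-context strategy, concatenating the previous source sentence with the current one before decoding. It assumes the Hugging Face transformers package and the sentence-level Helsinki-NLP/opus-mt-en-de checkpoint; because that model was not trained with document context, the snippet only shows the interface difference, not the quality gains reported for genuinely document-level models.

```python
# Sentence-level vs. naive concatenation-based document-context translation.
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-en-de"
tokenizer = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name)

document = [
    "The committee rejected the proposal.",
    "It said the costs were too high.",   # "It" is ambiguous without the previous sentence
]

def translate(texts):
    batch = tokenizer(texts, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

# 1) Sentence-independent translation: each sentence is decoded with no context.
print(translate(document))

# 2) Concatenation-based context: prepend the previous source sentence to the current one.
with_context = [document[0], document[0] + " " + document[1]]
print(translate(with_context))
```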


Author(s):  
Emily C. Bouck ◽  
Phil Sands ◽  
Holly Long ◽  
Aman Yadav

Increasingly in K–12 schools, students are gaining access to computational thinking (CT) and computer science (CS). This access, however, is not always extended to students with disabilities. One way to increase CT and CS (CT/CS) exposure for students with disabilities is through preparing special education teachers to do so. In this study, researchers explore exposing special education preservice teachers to the ideas of CT/CS in the context of a mathematics methods course for students with disabilities or those at risk of disability. Through analyzing lesson plans and reflections from 31 preservice special education teachers, the researchers learned that overall emerging promise exists with regard to the limited exposure of preservice special education teachers to CT/CS in mathematics. Specifically, preservice teachers demonstrated the ability to include CT/CS in math lesson plans and showed understanding of how CT/CS might enhance instruction with students with disabilities via reflections on these lessons. The researchers, however, also found a need for increased experiences and opportunities for preservice special education teachers with CT/CS to more positively impact access for students with disabilities.


Author(s):  
Michael Lodi ◽  
Simone Martini

The pervasiveness of Computer Science (CS) in today’s digital society and the extensive use of computational methods in other sciences call for its introduction in the school curriculum. Hence, Computer Science Education is becoming more and more relevant. In CS K-12 education, computational thinking (CT) is one of the abused buzzwords: different stakeholders (media, educators, politicians) give it different meanings, some more oriented to CS, others more linked to its interdisciplinary value. The expression was introduced by two leading researchers, Jeannette Wing (in 2006) and Seymour Papert (much earlier, in 1980), each of them stressing different aspects of a common theme. This paper will use a historical approach to review, discuss, and put in context these first two educational and epistemological approaches to CT. We will relate them to today’s context and evaluate what aspects are still relevant for CS K-12 education. Of the two, particular interest is devoted to “Papert’s CT,” which is the lesser-known and the lesser-studied. We will conclude that “Wing’s CT” and “Papert’s CT,” when correctly understood, are both relevant to today’s computer science education. From Wing, we should retain computer science’s centrality, CT being the (scientific and cultural) substratum of the technical competencies. Under this interpretation, CT is a lens and a set of categories for understanding the algorithmic fabric of today’s world. From Papert, we should retain the constructionist idea that only a social and affective involvement of students with the technical content will make programming an interdisciplinary tool for learning (also) other disciplines. We will also discuss the often quoted (and often unverified) claim that CT automatically “transfers” to other broad 21st century skills. Our analysis will be relevant for educators and scholars to recognize and avoid misconceptions and build on the two core roots of CT.


2021 ◽  
pp. 0013189X2110579
Author(s):  
Yasmin B. Kafai ◽  
Chris Proctor

Over the past decade, initiatives around the world have introduced computing into K–12 education under the umbrella of computational thinking. While initial implementations focused on skills and knowledge for college and career readiness, more recent framings include situated computational thinking (identity, participation, creative expression) and critical computational thinking (political and ethical impacts of computing, justice). This expansion reflects a revaluation of what it means for learners to be computationally literate in the 21st century. We review the current landscape of K–12 computing education, discuss interactions between different framings of computational thinking, and consider how an encompassing framework of computational literacies clarifies the importance of computing for broader K–12 educational priorities as well as key unresolved issues.

