Think big: learning contexts, algorithms and data science

2016 ◽  
Vol 8 (2) ◽  
pp. 69-83 ◽  
Author(s):  
Michele Baldassarre

Abstract Due to the rapid growth of available data in recent years, all areas of research, as well as the management of institutions and organisations, specifically schools and universities, feel the need to give meaning to this abundance of data. After a brief reference to the definition of big data, this article focuses attention and reflection on their typology in order to extend their characterisation. One prerequisite for making Big Data usable in operational contexts is a theoretical basis to refer to. The Data, Information, Knowledge and Wisdom (DIKW) model correlates these four aspects and culminates in Data Science, which in many ways could revolutionise the established pattern of scientific investigation. Learning Analytics applications on online learning platforms can serve as tools for evaluating the quality of teaching, and that is where some problems arise: the available data must be handled with care. Finally, a criterion for deciding whether an analysis based on Big Data makes sense is to consider its interpretability and relevance in relation to both institutional and personal processes.
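The DIKW progression referred to in the abstract can be illustrated with a minimal sketch. This is an assumption-laden toy example (the learning-analytics records, thresholds and decision rule are invented for illustration, not taken from the article):

```python
# Minimal illustration of the DIKW (Data, Information, Knowledge, Wisdom)
# hierarchy applied to hypothetical learning-analytics records.

raw_data = [72, 65, 88, 91, 58]  # data: raw quiz scores, no context

# information: data given context (which course, which scale)
information = {"course": "Statistics 101", "scale": "0-100", "scores": raw_data}

# knowledge: patterns extracted from information
average = sum(information["scores"]) / len(information["scores"])
knowledge = {
    "mean_score": average,
    "below_60": sum(s < 60 for s in information["scores"]),
}

# wisdom: an actionable, interpretable decision based on knowledge
wisdom = ("offer targeted tutoring"
          if knowledge["below_60"] > 0 or knowledge["mean_score"] < 70
          else "no intervention needed")

print(knowledge["mean_score"], wisdom)  # → 74.8 offer targeted tutoring
```

Each layer adds interpretation to the one below it, which is exactly the concern the abstract raises: the final "wisdom" step is only as trustworthy as the care taken with the underlying data.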

TEME ◽  
2019 ◽  
pp. 1161
Author(s):  
Zvezdan Savić ◽  
Marija M Jovanović

The existing organization and direct implementation of all types and forms of teaching at the Faculty of Sport and Physical Education in Niš certainly provide an opportunity to improve their quality. Good-quality contemporary teaching presumes that students are in an active role, and that the preconditions are met for them to participate in the teaching process actively and creatively, progressing and developing according to their individual potentials. In order to systematically present possible ways of achieving good-quality teaching, this paper provides the theoretical basis and a pedagogical/didactic presentation of two contemporary teaching systems: cooperative and integrative learning. By explaining the essential features of these teaching systems, their values and possibilities, their articulation and the challenges of implementing them, the paper offers a theoretical concept of how to ensure the quality of teaching through these modern teaching systems.


ASALIBUNA ◽  
2019 ◽  
Vol 2 (2) ◽  
Author(s):  
Nur Qomari

Bilingualism in teaching is a means of making teaching more effective, especially in a world-class university such as the Maulana Malik Ibrahim State Islamic University of Malang. This article provides information about the implementation of bilingualism in teaching: the definition of bilingualism and which kinds of bilingualism can be implemented, especially in teaching Arabic to foreigners. Besides identifying the kind of bilingualism suitable for language teaching, the article also reports how much of the Arabic language is acquired when it is taught by the bilingual method and, of course, discusses the relationship between bilingual language teaching and language skills.


2021 ◽  
Author(s):  
Ivan Triana ◽  
LUIS PINO ◽  
Dennise Rubio

The bio- and infotech revolutions, including data management, are global tendencies with a relevant impact on healthcare. Concepts such as Big Data, Data Science and Machine Learning are now topics of interest in the medical literature; all of them are encompassed in what has recently been named digital epidemiology. The purpose of this article is to propose our definition of digital epidemiology with the inclusion of a further aspect, innovation, that is, Digital Epidemiology of Innovation (DEI), and to show the importance of this new branch of epidemiology for the management and control of diseases. To this end, we describe the characteristics of the topic, its current uses within medical practice and its applications for the future, concluding with the applicability of DEI.


2018 ◽  
Vol 14 (1) ◽  
pp. 15-39 ◽  
Author(s):  
Francesco Di Tria ◽  
Ezio Lefons ◽  
Filippo Tangorra

This article describes how the evaluation of modern data warehouses must consider the new solutions adopted to face radical changes: the need to reduce storage volume while increasing the velocity of multidimensional design and data elaboration, even in the presence of unstructured data that are useful for providing qualitative information. The aim is to set up a framework for evaluating the physical and methodological characteristics of a data warehouse, built by considering the factors that affect the data warehouse's lifecycle when the Big Data issues (Volume, Velocity, Variety, Value, and Veracity) are taken into account. The contribution is the definition of a set of criteria for classifying Big Data Warehouses on the basis of their methodological characteristics. Based on these criteria, the authors define a set of metrics for measuring the quality of Big Data Warehouses with reference to their design specifications, and they show through a case study how the proposed metrics can check the eligibility of methodologies falling into different classes in the Big Data context.
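As a rough illustration of the kind of criteria-based classification the abstract describes, a design methodology could be scored against the five Vs. The methodology names, score values and aggregation rule below are hypothetical, not taken from the paper:

```python
# Hypothetical sketch: scoring data-warehouse design methodologies
# against the five Big Data Vs (Volume, Velocity, Variety, Value, Veracity).
# Scores run from 0 (criterion not addressed) to 2 (fully addressed).

CRITERIA = ("volume", "velocity", "variety", "value", "veracity")

methodologies = {
    "demand-driven": {"volume": 1, "velocity": 1, "variety": 0, "value": 2, "veracity": 1},
    "data-driven":   {"volume": 2, "velocity": 1, "variety": 2, "value": 1, "veracity": 1},
}

def big_data_score(scores: dict) -> float:
    """Fraction of the maximum attainable score across the five Vs."""
    return sum(scores[c] for c in CRITERIA) / (2 * len(CRITERIA))

for name, scores in methodologies.items():
    print(f"{name}: {big_data_score(scores):.2f}")
```

A threshold on such a score would then decide whether a methodology is "eligible" in the Big Data context, which is the role the paper's metrics play against its design specifications.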


2022 ◽  
Vol 11 (2) ◽  
pp. 711-737
Author(s):  
Carina Spreitzer ◽  
Samuel Hafner ◽  
Konrad Krainer ◽  
Andreas Vohns

<p style="text-align: justify;">Research on instructional quality has been of great interest for several decades, leading to an immense and diverse body of literature. However, due to different definitions and operationalisations, the picture of what characteristics are important for instructional quality is not entirely clear. Therefore, in this paper, a scoping review was performed to provide an overview of existing evidence of both generic and subject-didactic characteristics with regard to student performance. More precisely, this paper aims to (a) identify both generic and subject-didactic characteristics affecting student performance in mathematics in secondary school, (b) cluster these characteristics into categories to show areas for quality teaching, and (c) analyse and assess the effects of these characteristics on student performance to rate the scientific evidence in the context of the articles considered. The results reveal that teaching characteristics, and not just the instruments for recording the quality of teaching as described in previous research, can be placed on a continuum ranging from generic to subject-didactic. Moreover, on account of the inconsistent definition of subject-didactic characteristics, the category of ‘subject-didactic specifics’ needs further development to establish it as a separate category in empirical research. Finally, this study represents a further step toward understanding the effects of teaching characteristics on student performance by providing an overview of teaching characteristics and their effects and evidence.</p>


2013 ◽  
Vol 791-793 ◽  
pp. 427-430
Author(s):  
Miao Yu ◽  
Bin Wang ◽  
Lin Ni ◽  
Peng Hui Li ◽  
Chun Ming Xue

The flow of liquid metal in the die cavity was simulated in this article using the finite element method. The article fully introduces the construction of the die cavity, the definition of the material parameters, the meshing of the model and the setting of the boundary conditions. The conclusions include the velocity and pressure distributions of the liquid. The causes of the casting defects were identified from the analysis, providing a theoretical basis for practical production in order to improve the quality of the products.


Author(s):  
George Tzanis ◽  
Ourania-Ioanna Fotopoulou

Undoubtedly the IoT is the future of technology and can provide manifold benefits to health care; however, the challenges it poses are also great. For the analysis of healthcare data, various tools have been introduced to deal efficiently with its large volume as well as its various peculiarities; the most popular representative of these modern tools is data mining. Although the KDD process has provided many solutions, these techniques have to be scaled to deal with the new challenges posed by the big data paradigm. Cloud computing and edge computing are the modern infrastructures that can provide the means to manage big data efficiently. Both cloud/edge computing and the IoT are very promising technologies, and their complementary characteristics ensure that their integration, Cloud-IoT, offers great potential for applications. The introduction of the Cloud-IoT paradigm in the healthcare domain can offer manifold benefits and opportunities that will considerably improve the quality of health care.


Author(s):  
Elham Nazari ◽  
Elias Ameli ◽  
Hamed Tabesh

Background & Aim: Today, with the advent of technology and the growing volume of data in the field of health care, it is difficult to manage and analyze this type of data, known as Big Data. Big Data analysis has great potential to improve the quality of care, reduce errors and reduce costs in care services. Methods: This study is based on a search of databases (PubMed, Google Scholar, Science Direct, and Scopus), together with websites and specialized books, using standard keywords. After careful study, 50 sources were included in the final article. Results: Since Big Data analysis in the field of health has been growing and attracting attention in recent years, this survey identified the necessity of these analyses and the definition of Big Data, as well as its benefits, resources, architecture, applications, analysis methods, platforms, examples and challenges in the field of health care. Conclusions: Familiarity with Big Data concepts in the field of healthcare can help researchers conduct applied research and thus improve the quality of health care services and reduce costs.


2019 ◽  
Vol 8 (1) ◽  
pp. 20
Author(s):  
Elham Nazari ◽  
Marziyeh Afkanpour ◽  
Hamed Tabesh

The rapid development of technology over the past 20 years has led to explosive data growth in various industries, including defense and healthcare. The analysis of the generated Big Data has recently been addressed by many researchers, because Big Data analysis is today one of the most important and most profitable areas of development in Data Science, and companies able to extract valuable knowledge from massive amounts of data in a reasonable time can gain significant advantages. Accordingly, in this survey we investigate the definition of Big Data and its data sources, and also look at the advantages, challenges, applications, analysis methods and platforms used in Big Data.


2019 ◽  
Vol 70 (1) ◽  
pp. 9-28 ◽  
Author(s):  
Patrick Haggard

Volition refers to a capacity for endogenous action, particularly goal-directed endogenous action, shared by humans and some other animals. It has long been controversial whether a specific set of cognitive processes for volition exist in the human brain, and much scientific thinking on the topic continues to revolve around traditional metaphysical debates about free will. At its origins, scientific psychology had a strong engagement with volition. This was followed by a period of disenchantment, or even outright hostility, during the second half of the twentieth century. In this review, I aim to reinvigorate the scientific approach to volition by, first, proposing a range of different features that constitute a new, neurocognitively realistic working definition of volition. I then focus on three core features of human volition: its generativity (the capacity to trigger actions), its subjectivity (the conscious experiences associated with initiating voluntary actions), and its teleology (the goal-directed quality of some voluntary actions). I conclude that volition is a neurocognitive process of enormous societal importance and susceptible to scientific investigation.

