Generating metadata to study and teach about African issues

2014 ◽  
Vol 27 (3) ◽  
pp. 341-365 ◽  
Author(s):  
Faleh Alshameri ◽  
Abdul Karim Bangura

Purpose – After almost three centuries of employing western educational approaches, many African societies are still characterized by low western literacy rates, civil conflicts, and underdevelopment. It is obvious that these western educational paradigms, which are not indigenous to Africans, have done relatively little good for Africans. Thus, the purpose of this paper is to argue that the salvation for Africans hinges upon employing indigenous African educational paradigms, which can be subsumed under the rubric of ubuntugogy, which the authors define as the art and science of teaching and learning undergirded by humanity toward others. Design/methodology/approach – Ubuntugogy therefore transcends pedagogy (the art and science of teaching), andragogy (the art and science of helping adults learn), ergonagy (the art and science of helping people learn to work), and heutagogy (the study of self-determined learning). Many great African minds, realizing the debilitating effects of the western educational systems that have been forced upon Africans, have called for different approaches. Findings – One of the biggest challenges for studying and teaching about Africa in Africa at the higher education level, however, is the paucity of published material. Automated generation of metadata is one way of mining massive data sets to compensate for this shortcoming. Originality/value – Thus, the authors address the following major research question in this paper: What is automated generation of metadata, and how can the technique be employed from an African-centered perspective? After addressing this question, conclusions and recommendations are offered.
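The abstract does not specify which mining technique the authors use for automated metadata generation; a common, simple approach is TF-IDF keyword extraction, which ranks each document's terms by how distinctive they are within the collection. The sketch below is a minimal illustration under that assumption (the function name and example documents are illustrative, not from the paper):

```python
import math
from collections import Counter

def extract_keywords(docs, top_n=3):
    """Generate subject-keyword metadata for each document via TF-IDF.

    docs: list of raw text strings. Returns one ranked keyword list per
    document. Terms appearing in every document score zero (log(1) = 0),
    so collection-wide filler words drop out automatically.
    """
    tokenized = [d.lower().split() for d in docs]
    n_docs = len(tokenized)
    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))
    metadata = []
    for toks in tokenized:
        tf = Counter(toks)  # raw term counts within this document
        scores = {t: (tf[t] / len(toks)) * math.log(n_docs / df[t])
                  for t in tf}
        ranked = sorted(scores, key=scores.get, reverse=True)
        metadata.append(ranked[:top_n])
    return metadata
```

A production pipeline would add stemming, stop-word removal and a controlled vocabulary, but the ranking principle is the same.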

2019 ◽  
Vol 11 (1) ◽  
pp. 90-101 ◽  
Author(s):  
Mark Angolia ◽  
April Helene Reed

Purpose – The purpose of this paper is to encourage the use of simulations early in a semester, rather than as a course capstone activity, in an effort to utilize simulations as a foundational experience. The intent is to support teaching and learning, as opposed to using simulations as a capstone assignment or assessment tool. Design/methodology/approach – A comprehensive literature review synthesizing higher education business simulation effectiveness and evaluation methods provides support for the analysis of 60 undergraduate supply chain management students and 96 surveys conducted over two years. The research question explores effectiveness based on the point in the semester at which a simulation was used. Findings – The analysis of simulation effectiveness, based on the impact on course enjoyment and assistance with learning key course competencies, showed no significant differences between simulations used early in a semester and those used as an end-of-semester capstone event. Practical implications – Simulations are effective tools regardless of when they are employed, but there may be significant benefits to using a simulation early in a semester by capitalizing on the tool's inherent experiential learning functionality, active learning theory and the Kolb Experiential Learning Cycle. Early use of simulations provides common student experiences and creates a foundation for educators to develop a deeper understanding of course concepts. Additional instructor effort is needed to develop external, course-specific student work to supplement and enhance the simulation experience. Early use also creates post-simulation debriefing benefits that may be precluded by end-of-semester simulation events. Originality/value – Evidence suggests that simulations are primarily utilized as course capstone events and/or serve as comprehensive tools to integrate/assess a semester's worth of conceptual learning. This work fills a gap in the research concerning the time frames within a semester when simulations are traditionally employed, presenting a paradigm shift toward early utilization.


2013 ◽  
Vol 3 (1) ◽  
pp. 100-124 ◽  
Author(s):  
Chrissie Harrington

Purpose – The purpose of this paper is to explore the inter-relationship between choreography and pedagogy. It refers specifically to a Participatory Action Research (PAR) project that dealt with investigations into performance making and the design of a teaching and learning model. Shifts from making performance from a pre-determined starting point to a participatory and interactive process are traced to reveal a “choreographic pedagogy” informed and transformed by the experience of its actors. Design/methodology/approach – The paper includes a brief explanation of the terms and shared features of choreography and pedagogy, and how PAR facilitated a cyclic generation of new findings that drove the research forward. The research question is tackled through concepts, practices and tasks within the four cycles of research, each year with new participants, questions and expanding contexts. Findings – The experience of the research participants reveals unexpected and “unfolding phenomena” that open up spaces for imagining, creating and interpreting, as a “choreographic pedagogy” in action. Research limitations/implications – The research might appear to be limited to the areas of performance and teaching and learning, although it could provide a model for other subjects, especially for those that engage with creative processes. Practical implications – The research is a “practice as research” model and has implications for research in education as a practice of knowledge exploration and generation. Originality/value – It is original and has the potential to inform the ways in which educators explore and expand their disciplines through teaching and learning investigations.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Sathyaraj R ◽  
Ramanathan L ◽  
Lavanya K ◽  
Balasubramanian V ◽  
Saira Banu J

Purpose – Innovation in big data is increasing day by day, to the point that conventional software tools face several problems in managing big data. Moreover, the occurrence of imbalanced data in massive data sets is a major constraint on the research industry. Design/methodology/approach – The purpose of the paper is to introduce a big data classification technique using the MapReduce framework based on an optimization algorithm. The big data classification is enabled using the MapReduce framework, which utilizes the proposed optimization algorithm, named the chicken-based bacterial foraging (CBF) algorithm. The proposed algorithm is generated by integrating the bacterial foraging optimization (BFO) algorithm with the cat swarm optimization (CSO) algorithm. The proposed model executes the process in two stages, namely, training and testing phases. In the training phase, the big data produced from different distributed sources is subjected to parallel processing by the mappers, which perform preprocessing and feature selection based on the proposed CBF algorithm. The preprocessing step eliminates redundant and inconsistent data, whereas the feature selection step extracts the significant features from the preprocessed data to provide improved classification accuracy. The selected features are fed into the reducer for data classification using the deep belief network (DBN) classifier, which is trained with the proposed CBF algorithm so that the data are classified into various classes; at the end of the training process, the individual reducers present the trained models. Thus, incremental data are handled effectively based on the model built in the training phase. In the testing phase, the incremental data are split into different subsets and fed into the different mappers for classification. Each mapper contains a trained model obtained from the training phase, which is used to classify the incremental data. After classification, the outputs from the mappers are fused and fed into the reducer for the final classification. Findings – The maximum accuracy and Jaccard coefficient are obtained using the epileptic seizure recognition database. The proposed CBF-DBN produces a maximal accuracy value of 91.129%, whereas the accuracy values of the existing neural network (NN), DBN and naive Bayes classifier-term frequency–inverse document frequency (NBC-TFIDF) are 82.894%, 86.184% and 86.512%, respectively. The proposed CBF-DBN produces a maximal Jaccard coefficient value of 88.928%, whereas the Jaccard coefficient values of the existing NN, DBN and NBC-TFIDF are 75.891%, 79.850% and 81.103%, respectively. Originality/value – In this paper, a big data classification method is proposed for categorizing massive data sets under big data constraints. The classification is performed on the MapReduce framework, with training and testing phases, such that the data are handled in parallel. In the training phase, the big data is partitioned into different subsets and fed into the mappers, where feature selection extracts the significant features. The obtained features are passed to the reducers, which classify the data using a DBN classifier trained with the proposed CBF algorithm; the trained models are the output of this phase. In the testing phase, new incremental data are split into subsets and fed into the mappers, which classify them using the trained models from the training phase. The classified results from each mapper are fused and fed into the reducer for the final classification of the big data.


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Jeanne Ho ◽  
Trivina Kang ◽  
Imran Shaari

Purpose – The purpose of this paper is to examine leading from the middle, which is consistent with calls to distribute leadership, while expanding the direction of influence from the normal top-down to include a bottom-up or lateral direction. The paper proposes that the position of the vice-principal enables the role incumbent to lead from the middle as a boundary spanner. The research question was what leadership from the middle looks like for vice-principals. Design/methodology/approach – The study consisted of interviews of 28 vice-principals and 10 principals. A mixed case- and theme-oriented strategy was adopted, with member checking with each vice-principal. Findings – The findings indicate that in leading from the middle, vice-principals play the boundary-spanning roles of connecting, translating and brokering: (1) connecting between organisational levels, (2) translating between vision/direction and actualisation, (3) connecting between middle managers and (4) brokering and translating between the ministry and the school. Originality/value – Leading from the middle is a nascent concept which is worth exploring, given the complexity of educational systems with multiple ecological levels, and the need for leadership to create coherence between the levels.


2021 ◽  
Vol 13 (17) ◽  
pp. 9829 ◽  
Author(s):  
Viknesh Nair ◽  
Melor Md Yunus

Educational systems frequently employ technological equipment in a variety of ways to make lessons in an English Language classroom fun and meaningful. For both students and instructors, digital storytelling (DST) has evolved into a useful instructional tool that can be utilised in the teaching and learning process. To answer the research question on the role of digital storytelling in improving students' speaking skills, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework was used to systematically review 45 articles sourced from Google Scholar and ERIC; most of these articles highlight the importance of digital storytelling as a contemporary teaching methodology. These articles showed that digital storytelling can be used as a useful tool by educators to improve students' speaking skills at various levels of education, ranging from primary to tertiary. Most of the authors of these research papers provided empirical proof that substantiated the advantages of employing digital storytelling in the classroom to help pupils communicate and speak more effectively.


2020 ◽  
Vol 5 (3/4) ◽  
pp. 295-305
Author(s):  
Trista Hollweck ◽  
Armand Doucet

Purpose – This thinking piece examines, from the viewpoint of two Canadian pracademics in the pandemic, the role of pedagogy and professionalism in crisis teaching and learning. The purpose of the paper is to highlight some of the tensions that have emerged and offer possible considerations to disrupt the status quo and catalyze transformation in public education during the pandemic and beyond. Design/methodology/approach – This paper considers the current context of COVID-19 and education and uses the professional capital framework (Hargreaves and Fullan, 2012) to examine pandemic pedagogies and professionalism. Findings – The COVID-19 pandemic has catapulted educational systems into emergency remote teaching and learning. This rapid shift to crisis schooling has massive implications for pedagogy and professionalism during the pandemic and beyond. Despite the significant challenges for educators, policymakers, school leaders, students and families, the pandemic is a critical opportunity to rethink the future of schooling. A key to transformational change will be for schools and school systems to focus on their professional capital and find ways to develop teachers' individual knowledge and skills, support effective collaborative networks that include parents and the larger school community and, ultimately, trust and include educators in the decision-making and communication process. Originality/value – This thinking piece offers the perspective of two Canadian pracademics who do not wish for a return to "normal" public education, which has never served all children well or equitably. Instead, they believe the pandemic is an opportunity to disrupt the status quo and build the education system back better. Using the professional capital framework, they argue that educators' professionalism and pandemic pedagogies will be required to catalyze meaningful transformational change.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Riddhi Thavi ◽  
Rujuta Jhaveri ◽  
Vaibhav Narwane ◽  
Bhaskar Gardas ◽  
Nima Jafari Navimipour

Purpose This paper aims to provide a literature review of cloud-based platforms for the education sector. Several aspects of cloud computing adoption in education, remote/distance learning and the application of cloud-based design and manufacturing (CBDM) have been studied and theorised. Design/methodology/approach A four-step methodology was adopted to analyse and categorise the papers obtained through various search engines. Out of 429 research articles, 72 papers were shortlisted for detailed analysis. Findings Many factors that influence cloud computing technology adoption in the education sector have been identified in this paper. The research findings on several research items have been tabulated and discussed. Based on the theoretical research done on cloud computing for education, cloud computing for remote/distance learning and CBDM, cloud computing could enhance educational systems, particularly in developing countries, and improve the scope for remote/distance learning. Research limitations/implications This study is limited to papers published in the past decade, from 2011 to 2020. In addition, the review was unable to include journal articles published in other languages. Nevertheless, this paper could help in understanding the importance of cloud computing concepts, and in improving their adoption, in educational universities and platforms for an effective teaching and learning process. Originality/value This study is novel as a research review constituting cloud computing applications in education, extended to remote/distance learning and CBDM, which have not been studied in the existing knowledge base.


2017 ◽  
Vol 25 (1) ◽  
pp. 26-42 ◽  
Author(s):  
Paul G. LeMahieu ◽  
Lee E. Nordstrum ◽  
Ashley Seidel Potvin

Purpose This paper is the second of seven in this volume elaborating different approaches to quality improvement in education. It delineates a methodology called design-based implementation research (DBIR). The approach used in this paper is aimed at iteratively improving the quality of classroom teaching and learning practices in defined problem areas through collaborations among researchers, practitioners and other education stakeholders. Design/methodology/approach The paper describes the origins of the approach in US education, along with its foundations, core principles and a case application of DBIR in practice. The case focuses on the specific problem of teaching science and genetics in primary and secondary schools in a district. Findings The guiding principles of DBIR are: a focus on persistent problems of classroom educational practice; iterative and collaborative design and testing of innovations through partnerships between researchers and practitioners, involving multiple stakeholders’ perspectives; a concern with developing theory related to both implementation processes and classroom learning outcomes, using systematic inquiry; and development of the capacity of both researchers and practitioners to sustain changes in educational systems. Originality/value Few theoretical treatments and demonstration cases are currently available in US education that examine common models of quality improvement, particularly DBIR. By engaging practitioners with researchers in designing, testing and implementing reforms meaningfully, DBIR shows promise in offering significant on-the-ground benefits. This paper adds value by allowing readers to compare the DBIR method with the other improvement approaches explicated in this volume.


2004 ◽  
Vol 101 (Supplement3) ◽  
pp. 326-333 ◽  
Author(s):  
Klaus D. Hamm ◽  
Gunnar Surber ◽  
Michael Schmücking ◽  
Reinhard E. Wurm ◽  
Rene Aschenbach ◽  
...  

Object. Innovative new software solutions may enable image fusion to produce the desired data superposition for precise target definition and follow-up studies in radiosurgery/stereotactic radiotherapy in patients with intracranial lesions. The aim is to integrate the anatomical and functional information completely into the radiation treatment planning and to achieve an exact comparison for follow-up examinations. Special conditions and advantages of BrainLAB's fully automatic image fusion system are evaluated and described for this purpose. Methods. In 458 patients, the radiation treatment planning and some follow-up studies were performed using an automatic image fusion technique involving the use of different imaging modalities. Each fusion was visually checked and corrected as necessary. The computerized tomography (CT) scans for radiation treatment planning (slice thickness 1.25 mm), as well as stereotactic angiography for arteriovenous malformations, were acquired using head fixation with stereotactic arc or, in the case of stereotactic radiotherapy, with a relocatable stereotactic mask. Different magnetic resonance (MR) imaging sequences (T1, T2, and fluid-attenuated inversion-recovery images) and positron emission tomography (PET) scans were obtained without head fixation. Fusion results and the effects on radiation treatment planning and follow-up studies were analyzed. The precision level of the results of the automatic fusion depended primarily on the image quality, especially the slice thickness and the field homogeneity when using MR images, as well as on patient movement during data acquisition. Fully automated image fusion of different MR, CT, and PET studies was performed for each patient. Only in a few cases was it necessary to correct the fusion manually after visual evaluation. These corrections were minor and did not materially affect treatment planning. 
High-quality fusion of thin slices of a region of interest with a complete head data set could be performed easily. The target volume for radiation treatment planning could be accurately delineated using multimodal information provided by CT, MR, angiography, and PET studies. The fusion of follow-up image data sets yielded results that could be successfully compared and quantitatively evaluated. Conclusions. Depending on the quality of the originally acquired image, automated image fusion can be a very valuable tool, allowing for fast (∼1–2 minutes) and precise fusion of all relevant data sets. Fused multimodality imaging improves the target volume definition for radiation treatment planning. High-quality follow-up image data sets should be acquired for image fusion to provide exactly comparable slices and volumetric results that will contribute to quality control.
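The abstract does not state which similarity metric the automatic fusion system optimizes; multimodal registration of CT, MR and PET is commonly driven by maximizing mutual information, which rewards consistent intensity correspondence without assuming the modalities share a gray-level mapping. A minimal sketch of the metric itself (not of the registration search, and not of BrainLAB's implementation), over flattened intensity lists:

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Mutual information between two equally sized intensity lists.

    A registration algorithm would evaluate this repeatedly while
    searching over rigid transforms, keeping the transform that
    maximizes it. Intensities are quantized into joint histogram bins.
    """
    def quantize(img):
        lo, hi = min(img), max(img)
        span = (hi - lo) or 1.0  # guard against constant images
        return [min(int((v - lo) / span * bins), bins - 1) for v in img]

    qa, qb = quantize(img_a), quantize(img_b)
    n = len(qa)
    pa, pb = Counter(qa), Counter(qb)      # marginal histograms
    pab = Counter(zip(qa, qb))             # joint histogram
    # MI = sum p(a,b) * log( p(a,b) / (p(a) p(b)) )
    return sum((c / n) * math.log((c / n) / ((pa[i] / n) * (pb[j] / n)))
               for (i, j), c in pab.items())
```

Identical images maximize the metric (MI equals the image entropy), while statistically independent intensity patterns score near zero.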


2019 ◽  
Vol 45 (9) ◽  
pp. 1183-1198
Author(s):  
Gaurav S. Chauhan ◽  
Pradip Banerjee

Purpose Recent papers on target capital structure show that debt ratios seem to vary widely in space and time, implying that functional specifications of target debt ratios are of little empirical use. Further, target behavior cannot be adjudged correctly using debt ratios, as they could revert for mechanical reasons. The purpose of this paper is to develop an alternative testing strategy to test the target capital structure. Design/methodology/approach The authors make use of a major “shock” to the debt ratios as an event and treat a subsequent reversion as a movement toward a mean or target debt ratio. By doing this, the authors no longer need to identify target debt ratios as a function of firm-specific variables or any other rigid functional form. Findings Similar to the broad empirical evidence in developed economies, there is no perceptible and systematic mean reversion by Indian firms. However, unlike in developed countries, proportionate usage of debt to finance firms’ marginal financing deficits is extensive; equity is used rather sparingly. Research limitations/implications The trade-off theory could be convincingly refuted, at least for the emerging market of India. The paper may stimulate further research into the reasons for the specific financing behavior of emerging-market firms. Practical implications The results show that firms’ financing choices depend not only on firm-specific variables but also on the financial markets in which they operate. Originality/value This study attempts to assess mean reversion in debt ratios in a unique but reassuring manner. The results are confirmed by extensive calibration of the testing strategy using simulated data sets.
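The shock-and-reversion idea can be illustrated on simulated data in the spirit of the authors' calibration exercise. The sketch below is a minimal illustration, not the paper's procedure: all parameter values and function names are assumptions. It contrasts a partial-adjustment (mean-reverting) debt-ratio series with a random walk, measuring how much of a leverage shock is reversed after a fixed horizon:

```python
def reversion_after_shock(series, shock_idx, horizon):
    """Fraction of a debt-ratio shock reversed `horizon` periods later.

    Near 1 indicates full reversion toward the pre-shock (target)
    level; near 0 indicates the shock persists, as under a random walk.
    """
    pre = series[shock_idx - 1]
    shock = series[shock_idx] - pre
    later = series[shock_idx + horizon]
    return (series[shock_idx] - later) / shock

def simulate_debt_ratios(target=0.4, phi=0.6, shock=0.2, n=30):
    """Deterministic partial-adjustment series (noise omitted for
    clarity): d_t = target + phi * (d_{t-1} - target), with a one-time
    leverage shock injected at period 10. phi = 1 gives a random-walk
    limit in which the shock never decays."""
    d = [target] * 10
    d.append(target + shock)  # the "event": a major leverage shock
    for _ in range(11, n):
        d.append(target + phi * (d[-1] - target))
    return d
```

With phi = 0.6, the fraction reversed after k periods is 1 − phi**k, so the test statistic separates targeting behavior from a random walk without ever specifying the target as a function of firm characteristics.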

