Towards efficient mechanisms for resource, energy, and data management in fog computing: a review

2021 ◽  
Author(s):  
Sorush Niknamian

Fog computing is an architecture that uses collaborative end-user edge devices to carry out a substantial amount of storage, transmission, configuration, and module functionality. In this computing environment, management is the process of monitoring and optimizing the interrelated components to improve performance, availability, security, and other fundamental operational requirements. Management strategies have a great impact on fog computing, but, to the best of our knowledge, there is no comprehensive and systematic study of this field. Hence, this paper classifies management strategies into three main categories: resource, energy, and data management. In addition, it identifies the new challenges in each of these categories. Finally, the differences between the reviewed strategies are examined in terms of scalability, reliability, time, and query attributes, along with the main directions for future research.

2019 ◽  
Vol 3 (1) ◽  
pp. 18-26 ◽  
Author(s):  
Brandi Williams ◽  
Brian Haines ◽  
Kathy Roper ◽  
Eunhwa Yang

The purpose of this paper is to discover the current methods that facility managers use to manage and track assets and to identify a set of attributes for Building Information Modelling (BIM) that can improve the efficiency of current facility management (FM) practice. A survey of over 100 facility management professionals addressed demographics such as industry sector, number of buildings managed, and use of industry standards or internally developed guidelines for data management. This information is correlated with their current asset management strategies to identify minimum sets of attributes that may be used for an FM-specific BIM. In addition, the survey asked the FM professionals their opinion on the importance of specific asset attributes and data management information that could be included in a BIM for FM. The findings of this paper indicate that there is a consensus on the basic information (asset type, unique identification, manufacturer, model number, serial number) needed for asset management, and that there is no generally accepted system for how this is done in current practice. FM professionals and software providers may use the information in this paper to establish baseline sets of data to include in BIM during the design phase of projects. This paper provides insight and data as to the current practice of asset management by facility managers. Understanding the actual needs of the FM industry will assist future research to implement BIM for FM.
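The consensus attribute set reported here maps naturally onto a simple record structure. The sketch below is a minimal illustration in Python; the field names are hypothetical and are not a schema proposed by the survey (in practice a specification such as COBie or an organisation's own data dictionary would define the real schema).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AssetRecord:
    """Baseline asset attributes the surveyed FM professionals agreed on.

    Field names are illustrative only, not a standard.
    """
    asset_type: str        # e.g. "Air Handling Unit"
    asset_id: str          # unique identification within the portfolio
    manufacturer: str
    model_number: str
    serial_number: str
    location: Optional[str] = None       # optional extras beyond the consensus set
    install_date: Optional[str] = None

# Example: a record as it might be exported from a BIM at handover to FM
ahu = AssetRecord(
    asset_type="Air Handling Unit",
    asset_id="AHU-02-001",
    manufacturer="Acme HVAC",
    model_number="AH-4500",
    serial_number="SN-99231",
)
```

Keeping the consensus fields mandatory and everything else optional mirrors the paper's point that a small, agreed baseline is what current practice lacks.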


2020 ◽  
Vol 9 (8) ◽  
pp. 487
Author(s):  
Zhenlong Li ◽  
Wenwu Tang ◽  
Qunying Huang ◽  
Eric Shook ◽  
Qingfeng Guan

The convergence of big data and geospatial computing has brought both challenges and opportunities to GIScience with regard to geospatial data management, processing, analysis, modeling, and visualization. This special issue highlights recent advances in integrating new computing approaches, spatial methods, and data management strategies to tackle geospatial big data challenges, while also demonstrating the opportunities that big data offers for geospatial applications. Crucial to the advances highlighted here is the integration of computational thinking and spatial thinking and the transformation of abstract ideas and models into concrete data structures and algorithms. This editorial first introduces the background and motivation of the special issue, followed by an overview of the ten included articles. Conclusions and future research directions are provided in the last section.


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3633 ◽  
Author(s):  
Hyun-Jong Cha ◽  
Ho-Kyung Yang ◽  
You-Jin Song

It is expected that the number of devices connecting to the Internet of Things (IoT) will increase geometrically in the future as their functions improve. Such devices may create a huge amount of data to be processed in a limited time. In the IoT environment, data management should play the role of an intermediate layer between the objects and devices that generate data and the applications that access the data for analysis and the provision of services. IoT interactively connects all communication devices and allows global access to the data generated by a device. Fog computing manages data and computation at the edge of the network, near the end user, and enables new types of applications and services with low latency, high bandwidth, and geographical distribution. In this paper, we propose a fog computing architecture for efficiently and reliably delivering IoT data to the corresponding IoT applications while ensuring time sensitivity. Based on fog computing, the proposed architecture provides efficient power management in IoT device communication between sensors and secure management of data that are decrypted based on user attributes. The functional effectiveness and secure data management of the proposed method are evaluated through experiments.
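As a rough illustration of the attribute-gated data delivery this abstract describes, the sketch below models a fog node that releases buffered sensor readings only to users whose attribute set satisfies a reading's policy. The plain set comparison stands in for a real attribute-based encryption scheme, and all names here are hypothetical rather than taken from the proposed architecture.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str
    payload: dict
    required_attributes: frozenset  # attributes a consumer must hold to read this data

class FogNode:
    """Toy fog node: buffers readings at the network edge and releases them
    only to users whose attributes satisfy a reading's policy. A real system
    would enforce this with attribute-based encryption, not a plaintext check."""

    def __init__(self):
        self._buffer = []

    def ingest(self, reading: SensorReading) -> None:
        self._buffer.append(reading)

    def deliver(self, user_attributes: set) -> list:
        # A reading is released only if its policy is a subset of the user's attributes.
        return [r for r in self._buffer
                if r.required_attributes <= user_attributes]

node = FogNode()
node.ingest(SensorReading("thermo-17", {"temp_c": 21.4},
                          frozenset({"role:clinician", "site:ward-3"})))
print(node.deliver({"role:clinician", "site:ward-3", "shift:night"}))  # delivered
print(node.deliver({"role:visitor"}))                                  # withheld
```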


2015 ◽  
Vol 25 (1) ◽  
pp. 15-23 ◽  
Author(s):  
Ryan W. McCreery ◽  
Elizabeth A. Walker ◽  
Meredith Spratford

The effectiveness of amplification for infants and children can be mediated by how much the child uses the device. Existing research suggests that establishing hearing aid use can be challenging. A wide range of factors can influence hearing aid use in children, including the child's age, degree of hearing loss, and socioeconomic status. Audiological interventions, including using validated prescriptive approaches and verification, performing ongoing training and orientation, and communicating with caregivers about hearing aid use, can also increase hearing aid use by infants and children. Case examples are used to highlight the factors that influence hearing aid use. Potential management strategies and future research needs are also discussed.


2020 ◽  
Author(s):  
Anthony Pease ◽  
Clement Lo ◽  
Arul Earnest ◽  
Velislava Kiriakova ◽  
Danny Liew ◽  
...  

Background: Time-in-range is a key glycaemic metric, and comparisons of management technologies for this outcome are critical to guide device selection.
Purpose: We conducted a systematic review and network meta-analysis to compare and rank technologies for time in glycaemic ranges.
Data sources: We searched All Evidence-Based Medicine Reviews, CINAHL, EMBASE, MEDLINE, MEDLINE In-Process and other non-indexed citations, PROSPERO, PsycINFO, PubMed, and Web of Science up to 24 April 2019.
Study selection: We included randomised controlled trials of ≥2 weeks' duration comparing technologies for the management of type 1 diabetes in adults (≥18 years of age), excluding pregnant women.
Data extraction: Data were extracted using a predefined template. Outcomes were percent time with sensor glucose levels 3.9–10.0 mmol/L (70–180 mg/dL), >10.0 mmol/L (>180 mg/dL), and <3.9 mmol/L (<70 mg/dL).
Data synthesis: We identified 16,772 publications, of which 14 eligible studies compared eight technologies comprising 1,043 participants. Closed-loop systems led to a greater percent time-in-range than any other management strategy, 17.85 (95% predictive interval [PrI] 7.56–28.14) percentage points higher than usual care with multiple daily injections and capillary glucose testing. Closed-loop systems ranked best for percent time in range and time above range using the surface under the cumulative ranking curve (SUCRA 98.5 and 93.5, respectively), and also ranked highly for time below range (SUCRA 62.2).
Limitations: Overall risk-of-bias ratings were moderate for all outcomes. Certainty of evidence was very low.
Conclusions: In the first integrated comparison of multiple management strategies considering time-in-range, we found that the efficacy of closed-loop systems appeared better than all other approaches.
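For readers unfamiliar with the outcome metric: percent time-in-range is simply the share of sensor readings (or of monitored time) falling within 3.9–10.0 mmol/L. The sketch below is a minimal illustration with hypothetical, equally spaced CGM samples; it is not the review's analysis code.

```python
def percent_time_in_ranges(glucose_mmol_l, low=3.9, high=10.0):
    """Percent of equally spaced CGM samples in, above, and below the
    3.9-10.0 mmol/L (70-180 mg/dL) target range."""
    n = len(glucose_mmol_l)
    in_range = sum(low <= g <= high for g in glucose_mmol_l)
    above = sum(g > high for g in glucose_mmol_l)
    below = sum(g < low for g in glucose_mmol_l)
    return {name: 100 * count / n
            for name, count in (("in_range", in_range), ("above", above), ("below", below))}

# Example with hypothetical readings taken every 5 minutes
print(percent_time_in_ranges([4.2, 6.8, 10.5, 3.5, 7.9, 9.1, 11.2, 5.4]))
# {'in_range': 62.5, 'above': 25.0, 'below': 12.5}
```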


2020 ◽  
Author(s):  
Kurt D Shulver ◽  
Nicholas A Badcock

We report the results of a systematic review and meta-analysis investigating the relationship between perceptual anchoring and dyslexia. Our goal was to assess the direction and degree of the effect between perceptual anchoring and reading ability in typical and atypical (dyslexic) readers. We performed a literature search for experiments explicitly assessing perceptual anchoring and reading ability using PsycInfo (Ovid, 1860 to 2020), MEDLINE (Ovid, 1860 to 2019), EMBASE (Ovid, 1883 to 2019), and PubMed for all available years up to June 2020. Our eligibility criteria were English-language articles and, at minimum, one experimental group identified as dyslexic, either by reading assessment at the time or by previous diagnosis. We assessed risk of bias using an adapted version of the Newcastle-Ottawa scale. Six studies were included in this review, but only five (n = 280 participants) were included in the meta-analysis (we were unable to access the necessary data for one study). The overall effect was negative, large, and statistically significant, g = -0.87, 95% CI [-1.47, -0.27], with the negative effect size indicating less perceptual anchoring in dyslexic versus non-dyslexic groups. Visual assessment of the funnel plot and Egger's test suggest minimal bias but with significant heterogeneity; Q(4) = 9.70, PI (prediction interval) [-2.32, -0.58]. The primary limitation of the current review is the small number of included studies. We discuss methodological limitations, such as limited power, and how future research may redress these concerns. The variability of effect sizes appears consistent with the inherent variability within subtypes of dyslexia. This level of dispersion seems indicative of how we define cut-off thresholds between typical reading and dyslexic populations, but also of the methodological tools we use to investigate individual performance.
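The pooled effect above is reported as Hedges' g, the standardized mean difference with a small-sample correction. As a minimal illustration (not the authors' analysis code), a single study's g could be computed from hypothetical group summaries as follows:

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd          # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # correction factor for small samples
    return d * j

# Hypothetical anchoring scores: dyslexic group scoring lower than the comparison group
print(hedges_g(mean1=0.45, sd1=0.20, n1=30, mean2=0.62, sd2=0.18, n2=32))
```

In the meta-analysis, study-level values like this are then pooled under a random-effects model, which is what yields the prediction interval reported above.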


2020 ◽  
Author(s):  
Tom Joseph Barry ◽  
David John Hallford ◽  
Keisuke Takano

Decades of research have examined the difficulty that people with psychiatric diagnoses, such as Major Depressive Disorder, Schizophrenia Spectrum Disorders, and Posttraumatic Stress Disorder, have in recalling specific autobiographical memories of events that lasted less than a day. Instead, they seem to retrieve general events that have occurred many times or that extended over longer periods of time, termed overgeneral memory. We present the first transdiagnostic meta-analysis of memory specificity/overgenerality, and the first meta-regression of proposed causal mechanisms. A keyword search of the Embase, PsycARTICLES and PsycINFO databases yielded 74 studies that compared people with and without psychiatric diagnoses on the retrieval of specific (k = 85) or general memories (k = 56). Multi-level meta-analysis confirmed that people with psychiatric diagnoses typically recall fewer specific (g = -0.864, 95% CI [-1.030, -0.698]) and more general (g = 0.712, 95% CI [0.524, 0.900]) memories than diagnosis-free people. The size of these effects did not differ between diagnostic groups. There were no consistent moderators; effect sizes were not explained by methodological factors such as cue valence or by demographic variables such as participants' age. There was also no support for the contribution of underlying processes thought to be involved in specific/general memory retrieval (e.g., rumination). Our findings confirm that deficits in autobiographical memory retrieval are a transdiagnostic factor associated with a broad range of psychiatric problems, but future research should explore novel causal mechanisms such as encoding deficits and the social processes involved in memory sharing and rehearsal.


2019 ◽  
Author(s):  
Emily L. Dennis ◽  
Karen Caeyenberghs ◽  
Robert F. Asarnow ◽  
Talin Babikian ◽  
Brenda Bartnik-Olson ◽  
...  

Traumatic brain injury (TBI) is a major cause of death and disability in children in both developed and developing nations. Children and adolescents suffer from TBI at a higher rate than the general population; however, research in this population lags behind research in adults. This may be due, in part, to the smaller number of investigators engaged in research with this population and may also be related to changes in safety laws and clinical practice that have altered length of hospital stays, treatment, and access to this population. Specific developmental issues also warrant attention in studies of children, and the ever-changing context of childhood and adolescence may require larger sample sizes than are commonly available to adequately address remaining questions related to TBI. The ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Pediatric Moderate-Severe TBI (msTBI) group aims to advance research in this area through global collaborative meta-analysis. In this paper we discuss important challenges in pediatric TBI research and opportunities that we believe the ENIGMA Pediatric msTBI group can provide to address them. We conclude with recommendations for future research in this field of study.


Author(s):  
E. E. Akimkina

The problems of structuring indicators in multidimensional data cubes and of their subsequent processing with end-user tools that provide multidimensional visualization and data management are analyzed. The capabilities of multidimensional data-processing technologies for managing and supporting decision making at a design and technological enterprise are shown, and practical recommendations are given on the use of domestic software environments for structuring and visualizing multidimensional data cubes.
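As a concrete, hypothetical illustration of structuring indicators into a multidimensional cube for end-user analysis, the sketch below aggregates a small fact table along department and quarter dimensions with pandas; the indicator names and values are invented for the example and are not drawn from the article.

```python
import pandas as pd

# Hypothetical fact table: one row per reported indicator value
facts = pd.DataFrame({
    "department": ["design", "design", "tooling", "tooling", "tooling"],
    "quarter":    ["Q1", "Q2", "Q1", "Q1", "Q2"],
    "indicator":  ["defect_rate", "defect_rate", "defect_rate", "cycle_days", "cycle_days"],
    "value":      [0.031, 0.027, 0.045, 12.0, 11.5],
})

# Two-dimensional slice of the cube: indicators aggregated by department x quarter
cube = facts.pivot_table(index="department",
                         columns=["quarter", "indicator"],
                         values="value",
                         aggfunc="mean")
print(cube)
```

The same fact table can be re-pivoted along any combination of dimensions, which is the "slice and dice" behaviour the end-user tools described in the article expose interactively.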


2019 ◽  
Vol 25 (3) ◽  
pp. 378-396 ◽  
Author(s):  
Arian Razmi-Farooji ◽  
Hanna Kropsu-Vehkaperä ◽  
Janne Härkönen ◽  
Harri Haapasalo

Purpose: The purpose of this paper is twofold: first, to understand data management challenges in e-maintenance systems from a holistic viewpoint by summarizing the earlier, scattered research in the field, and second, to present a conceptual approach for addressing these challenges in practice. Design/methodology/approach: The study combines a literature review with an analysis of the practices of an industry leader in manufacturing and maintenance services. Findings: This research provides a general understanding of data management challenges in e-maintenance and summarizes their associated proposed solutions. In addition, this paper lists and exemplifies the different types and sources of data that can be collected in e-maintenance across different organizational levels. Analyzing the data management practices of an e-maintenance industry leader provides a conceptual approach to addressing the identified challenges in practice. Research limitations/implications: Since this paper is based on studying the practices of a single company, the generalizability of the results may be limited. Future research can focus on each of the identified data management challenges and validate the applicability of the presented model in other companies and industries. Practical implications: Understanding the e-maintenance-related challenges helps maintenance managers and other stakeholders involved in e-maintenance systems to better address those challenges. Originality/value: Data and data management in e-maintenance remain among the less studied topics in the literature. This paper contributes to e-maintenance by addressing that gap, examining the common data management challenges and listing the different types of data that need to be acquired in e-maintenance systems.

