Ronda: Real-Time Data Provision, Processing and Publication for Open Data

2021
pp. 165-177
Author(s):  
Fabian Kirstein ◽  
Dario Bacher ◽  
Vincent Bohlen ◽  
Sonja Schimmler

Abstract: The provision and dissemination of Open Data is a flourishing concept, which is highly recognized and established in the government and public administration domains. Typically, the actual data is served as static file downloads, such as CSV or PDF, and the established software solutions for Open Data are mostly designed to manage this kind of data. However, the rising popularity of the Internet of Things and smart devices in the public and private domains leads to an increase in available real-time data, such as public transportation schedules, weather forecasts, or power grid data. Such timely and extensive data cannot be used to its full potential when published in a static, file-based fashion. Therefore, we designed and developed Ronda, an open source platform for gathering, processing and publishing real-time Open Data based on industry-proven and established big data and data processing tools. Our solution enables Open Data publishers to easily provide real-time interfaces for heterogeneous data sources, fostering more sophisticated and advanced Open Data use cases. We have evaluated our work through a practical application in a production environment.
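
The abstract contrasts static file downloads with streaming interfaces but does not specify Ronda's client-facing protocol. As a hedged illustration of the difference, the sketch below consumes a hypothetical newline-delimited JSON stream the way a Ronda-style real-time endpoint might expose it; the URL and field names are invented for illustration only.

```python
import json
import requests

# Hypothetical real-time Open Data endpoint (not Ronda's actual API);
# a static-file workflow would instead download a CSV snapshot once.
STREAM_URL = "https://opendata.example.org/streams/power-grid"

def consume_stream(url: str) -> None:
    # Keep the HTTP connection open and read one JSON record per line.
    with requests.get(url, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue  # skip keep-alive blank lines
            record = json.loads(line)
            # Field names are assumptions for illustration only.
            print(record.get("timestamp"), record.get("load_mw"))

if __name__ == "__main__":
    consume_stream(STREAM_URL)
```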

Author(s):  
Panagiota Papadopoulou ◽  
Kostas Kolomvatsos ◽  
Stathes Hadjiefthymiades

E-government can greatly benefit from the use of IoT, enabling the creation of new, innovative services or the transformation and enhancement of current ones, informed by smart devices and real-time data. The adoption of IoT in e-government encompasses several challenges of a technical as well as organizational, political and legal nature, which should be addressed in order to develop efficient government-to-citizen and government-to-society applications. This article examines IoT adoption in e-government using a holistic approach. It provides an overview of the potential of IoT in e-government across several application domains, highlighting the specific issues that require attention in each of them. The article also investigates the challenges that should be considered and managed for IoT in e-government to reach its full potential. With the application of IoT in e-government being at an early stage, the article contributes to the theoretical and practical understanding of how IoT can be leveraged for e-government purposes.


Author(s):  
Panagiota Papadopoulou ◽  
Kostas Kolomvatsos ◽  
Stathes Hadjiefthymiades

The Internet of Things (IoT) brings unprecedented changes to all contexts of our lives, as these contexts can be informed by smart devices and real-time data. Among the various IoT application settings, e-government seems to be one that can benefit greatly from the use of IoT, transforming and augmenting public services. This chapter aims to contribute to a better understanding of how IoT can be leveraged to enhance e-government. IoT adoption in e-government encompasses several challenges of a technical as well as organizational, political, and legal nature, which should be addressed in order to develop efficient applications. With the application of IoT in e-government being at an early stage, it is imperative to investigate these challenges and the ways they could be tackled. The chapter provides an overview of IoT in e-government across several application domains and explores the aspects that should be considered and managed before it can reach its full potential.


2021
Author(s):  
Kayo Vanderheggen ◽  
Joost Janssen ◽  
Nate Meredith

When a wind turbine installation jack-up performs a heavy lifting operation with the crane, it affects the loads on the foundation. For these units the crane typically encircles a leg or is positioned close to one. Consequently, that leg attracts most of the loads due to crane operations. At each location, jack-ups prove the capacity of the foundation by applying a controlled, high load at each of the footings before commencing operations. This process is known as preloading. The preload achieved at the jack-up’s foundation determines the operational limit. Exceeding the preload value may result in foundation instability. Depending on the site’s foundation characteristics, the consequences of such an exceedance range from negligible to catastrophic failure. GustoMSC has developed Operator Support System (OSS) software with the purpose of making the operator aware of the limitations imposed by the preloaded foundation. The application outlines operational limits based on real-time data from the jack-up, jacking system and crane, which enables operators to safely unlock the full potential of their wind turbine installation jack-up.
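
The abstract does not disclose the OSS algorithms. As a simplified, hedged illustration of the underlying check, the sketch below compares an estimated leg reaction during a crane lift against the proven preload, using a rigid three-legged moment balance about the centroid of the leg pattern. All variable names, the load model and the safety margin are assumptions for illustration, not GustoMSC's method.

```python
# Simplified feasibility check for a jack-up crane lift (illustrative only).
# Assumes a rigid hull on three legs equidistant from the leg-pattern
# centroid; for that geometry the moment balance gives an extra reaction of
# P * d / (1.5 * L) on the leg nearest an eccentric load P.

def crane_leg_reaction(hull_weight_t: float, hook_load_t: float,
                       crane_radius_m: float, leg_to_centroid_m: float) -> float:
    """Estimate the vertical reaction on the leg nearest the crane (tonnes)."""
    base_share = hull_weight_t / 3.0
    eccentric_share = hook_load_t * (1.0 / 3.0
                                     + crane_radius_m / (1.5 * leg_to_centroid_m))
    return base_share + eccentric_share

def lift_is_within_preload(reaction_t: float, preload_t: float,
                           margin: float = 0.9) -> bool:
    # Keep the predicted reaction below a fraction of the proven preload;
    # the 0.9 margin is an arbitrary placeholder.
    return reaction_t <= margin * preload_t

reaction = crane_leg_reaction(hull_weight_t=9000, hook_load_t=1200,
                              crane_radius_m=30, leg_to_centroid_m=40)
print(f"Predicted crane-leg reaction: {reaction:.0f} t, "
      f"within preload: {lift_is_within_preload(reaction, preload_t=7000)}")
```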


2015
Vol 2015
pp. 1-11
Author(s):  
Supun Kamburugamuve ◽  
Leif Christiansen ◽  
Geoffrey Fox

We describe IoTCloud, a platform that connects smart devices to cloud services for real-time data processing and control. A device connected to IoTCloud can communicate with real-time data analysis frameworks deployed in the cloud via messaging. The platform design is scalable in connecting devices as well as in transferring and processing data. With IoTCloud, a user can develop real-time data processing algorithms in an abstract framework without concern for the underlying details of how the data is distributed and transferred. For this platform, we primarily consider real-time robotics applications such as autonomous robot navigation, where there are strict requirements on processing latency and a demand for scalable processing. To demonstrate the effectiveness of the system, a robotics application is developed on top of the framework. The characteristics of the system and the robotics application are measured to show that data processing in central servers is feasible for real-time sensor applications.
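
The abstract describes device-to-cloud communication via messaging without naming a broker or protocol. The sketch below uses MQTT via the paho-mqtt library as a stand-in to show the pattern: a device publishes sensor readings that a cloud-side analysis framework would subscribe to. The broker host, topic layout and payload fields are assumptions, not IoTCloud's actual interface.

```python
import json
import time
import paho.mqtt.client as mqtt  # stand-in transport; IoTCloud's actual
                                 # messaging layer is not specified here

BROKER_HOST = "cloud.example.org"    # hypothetical cloud gateway
TOPIC = "robots/robot-1/laser_scan"  # hypothetical topic layout

# paho-mqtt 1.x constructor; 2.x additionally takes a callback API version.
client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()

# Publish a sensor reading periodically; a cloud-side framework subscribed
# to the same topic would run the navigation algorithm on these messages.
for _ in range(10):
    reading = {"t": time.time(), "ranges": [1.2, 0.8, 2.5]}  # illustrative
    client.publish(TOPIC, json.dumps(reading), qos=0)
    time.sleep(0.1)

client.loop_stop()
client.disconnect()
```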


2020
Vol 5 (7)
pp. e002203
Author(s):  
Faisal Shuaib ◽  
Abdullahi Bulama Garba ◽  
Emmanuel Meribole ◽  
Samuel Obasi ◽  
Adamu Sule ◽  
...  

In 2010, Nigeria adopted the web-based District Health Information System, version 2 (DHIS2), as the platform for its National Health Management Information System. The platform supports real-time data reporting and promotes government ownership and accountability. To strengthen its routine immunisation (RI) component, the US Centers for Disease Control and Prevention (CDC), through its implementing partner, the African Field Epidemiology Network-National Stop Transmission of Polio, in collaboration with the Government of Nigeria, developed the RI module and dashboard and piloted them in Kano state in 2014. The module was scaled up nationally over the next 4 years with funding from the Bill & Melinda Gates Foundation and CDC. One implementation officer was deployed per state for 2 years to support operations. Over 60 000 RI healthcare workers were trained on data collection, entry and interpretation, and each local immunisation officer in the 774 local government areas (LGAs) received a laptop and a stock of RI paper data tools. Templates for national-level and state-level RI bulletins and LGA quarterly performance tools were developed to promote real-time data use for feedback and decision making and to enhance the performance of RI services. By December 2017, the DHIS2 RI module had been rolled out in all 36 states and the Federal Capital Territory, and all states now report their RI data through the RI module. All states identified at least one government DHIS2 focal person to oversee the system’s reporting and management operations. Government officials routinely collect RI data and use them to improve RI vaccination coverage. This article describes the implementation process, including planning and implementation activities, achievements, lessons learnt, challenges and innovative solutions, and reports the achievements in improving timeliness and completeness rates.
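
The article describes routine data entry into DHIS2 by trained officers; for context, DHIS2 also exposes a Web API through which data values can be submitted programmatically. The sketch below posts a monthly RI data value set, assuming the standard /api/dataValueSets endpoint; the server URL, credentials and all UIDs are placeholders, not values from the Nigerian deployment.

```python
import requests

# Placeholder server and credentials; DHIS2 instances protect the Web API
# with basic auth or personal access tokens.
BASE_URL = "https://dhis2.example.org"
AUTH = ("reporting_user", "secret")

payload = {
    "dataSet": "RI_MONTHLY_UID",          # hypothetical data set UID
    "period": "202101",                   # monthly period in DHIS2 format
    "orgUnit": "HEALTH_FACILITY_UID",     # hypothetical facility UID
    "dataValues": [
        {"dataElement": "PENTA1_DOSES_UID", "value": "143"},
        {"dataElement": "MEASLES1_DOSES_UID", "value": "118"},
    ],
}

resp = requests.post(f"{BASE_URL}/api/dataValueSets", json=payload, auth=AUTH)
resp.raise_for_status()
print(resp.json())  # DHIS2 responds with an import summary
```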


Author(s):  
Mpoki Mwabukusi ◽  
Esron D. Karimuribo ◽  
Mark M. Rweyemamu ◽  
Eric Beda

A paper-based disease reporting system is associated with a number of challenges. These include difficulties in submitting hard copies of disease surveillance forms because of poor road infrastructure, weather conditions or challenging terrain, particularly in developing countries. The system also demands re-entry of the data at data processing and analysis points, making it prone to the introduction of errors. All these challenges contribute to delayed acquisition of, processing of and response to disease events occurring in remote, hard-to-reach areas. Our study piloted the use of mobile phones to transmit near-real-time data from remote districts in Tanzania (Ngorongoro and Ngara), Burundi (Muyinga) and Zambia (Kazungula and Sesheke). Two technologies, namely digital and short messaging services, were used to capture and transmit disease event data in the animal and human health sectors in the study areas, based on a server-client model. Smartphones running the Android operating system (minimum required version: Android 1.6) and supporting the open source application Epicollect, as well as the Open Data Kit application, were used in the study. These phones allowed the collection of geo-tagged data, with the option of including static and moving images related to disease events. The project supported routine disease surveillance systems in the ministries responsible for animal and human health in Burundi, Tanzania and Zambia, as well as data collection for researchers at the Sokoine University of Agriculture, Tanzania. During the project implementation period between 2011 and 2013, a total of 1651 disease event-related forms were submitted, with reporters able to include GPS coordinates and photographs related to the events captured. It was concluded that the new technology-based surveillance system is useful in providing near-real-time data, with the potential to enhance timely response in remote rural areas of Africa. We recommend adoption of the proven technologies to improve disease surveillance, particularly in developing countries.
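
Epicollect and Open Data Kit both follow the server-client pattern the study describes, in which a phone posts a completed, geo-tagged form to a central server. The sketch below imitates that flow against a generic HTTP endpoint; the URL, form fields and response shape are placeholders, not the actual Epicollect or ODK submission APIs.

```python
import requests

# Placeholder surveillance server; the real deployments used Epicollect/ODK
# endpoints, whose exact form schemas are not given in the abstract.
SUBMIT_URL = "https://surveillance.example.org/api/disease-events"

event = {
    "district": "Ngorongoro",      # illustrative values
    "species": "cattle",
    "syndrome": "foot lesions",
    "latitude": -3.24,             # geo-tag captured by the phone's GPS
    "longitude": 35.49,
}

# Attach a photograph of the observed case alongside the form data.
with open("case_photo.jpg", "rb") as photo:
    resp = requests.post(
        SUBMIT_URL,
        data=event,
        files={"photo": ("case_photo.jpg", photo, "image/jpeg")},
    )
resp.raise_for_status()
print("Submitted, server-assigned id:", resp.json().get("id"))
```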


Author(s):  
Katherine Shea

Abstract: Global Forest Watch (GFW) is an online platform that distills satellite imagery into near-real-time forest change information that anyone can access and act on. Like other open-data platforms, GFW is based on the idea that transparent, publicly available data can support the greater good—in this case, reducing deforestation. By its very nature, the use of freely available data can be difficult to track and its impact difficult to measure. This chapter explores four approaches for measuring the reach and impact of GFW, including quantitative and qualitative approaches for monitoring outcomes and measuring impact. The recommendations can be applied to other transparency initiatives, especially those providing remote-sensing data.


2021
Author(s):  
Simone Mancini ◽  
Margarita Segou ◽  
Maximilian J. Werner

Artificial intelligence methods are revolutionizing modern seismology by offering unprecedentedly rich seismic catalogs. Recent developments in short-term aftershock forecasting show that Coulomb rate-and-state (CRS) models hold the potential to achieve operational skill comparable to standard statistical Epidemic-Type Aftershock Sequence (ETAS) models, but only when near-real-time data quality allows a more detailed representation of sources and receiver fault populations to be incorporated. In this framework, the high-resolution reconstructions of seismicity patterns introduced by machine-learning-derived earthquake catalogs represent a unique opportunity to test whether they can be exploited to improve the predictive power of aftershock forecasts.

Here, we present a retrospective forecast experiment on the first year of the 2016-2017 Central Italy seismic cascade, where seven M5.4+ earthquakes occurred between a few hours and five months after the initial Mw 6.0 event, migrating over a 60-km-long normal fault system. As the target dataset, we employ the best available high-density machine learning catalog recently released for the sequence, which reports ~1 million events in total (~22,000 with M ≥ 2).

First, we develop a CRS model featuring (1) rate-and-state variables optimized on 30 years of pre-sequence regional seismicity, (2) finite fault slip models for the seven mainshocks of the sequence, (3) spatially heterogeneous receivers informed by pre-existing faults, and (4) receiver fault populations updated using focal planes gradually revealed by aftershocks. We then test the effect of considering stress perturbations from the M2+ events. Using the same high-precision catalog, we produce a standard ETAS model to benchmark the stress-based counterparts. All models are developed on a 3D spatial grid with 2 km spacing; they are updated daily and seek to forecast the space-time occurrence of M2+ seismicity over a total forecast horizon of one year. We formally rank the forecasts with the statistical scoring metrics introduced by the Collaboratory for the Study of Earthquake Predictability and compare their performance to a generation of CRS and ETAS models previously published for the same sequence by Mancini et al. (2019), who used solely real-time data and a minimum triggering magnitude of M=3.

We find that considering secondary triggering effects from events down to M=2 slightly improves model performance. While this result highlights the importance of better seismic catalogs for modelling local triggering mechanisms, it also suggests that, to realize their full potential, future modelling efforts will likely have to incorporate fine-scale rupture characterizations (e.g., smaller source fault geometries retrieved from enhanced focal mechanism catalogs) and introduce denser spatial model discretizations.
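
The ETAS benchmark referenced above rests on a standard conditional intensity of the form λ(t) = μ + Σ_{t_i < t} K · 10^(α(m_i − M_c)) · (t − t_i + c)^(−p). As a hedged illustration with invented parameter values (not those fitted in this study, and temporal-only rather than the study's 3D space-time grid), the sketch below evaluates this rate over a small synthetic catalog.

```python
# Illustrative ETAS parameters (invented, not the study's fitted values).
MU = 0.2      # background rate (events/day)
K = 0.05      # aftershock productivity
ALPHA = 1.0   # magnitude scaling exponent (base 10)
C = 0.01      # Omori time offset (days)
P = 1.1       # Omori decay exponent
M_C = 2.0     # minimum triggering magnitude, as in the experiment

def etas_rate(t: float, catalog: list[tuple[float, float]]) -> float:
    """Temporal ETAS conditional intensity at time t (days).

    catalog: list of (event_time_days, magnitude) pairs.
    """
    rate = MU
    for t_i, m_i in catalog:
        if t_i < t and m_i >= M_C:
            # Each past event above M_C adds an Omori-decaying contribution
            # scaled by its magnitude-dependent productivity.
            rate += K * 10 ** (ALPHA * (m_i - M_C)) * (t - t_i + C) ** (-P)
    return rate

# Synthetic mini-catalog: a mainshock followed by two aftershocks.
catalog = [(0.0, 6.0), (0.3, 4.5), (1.2, 3.1)]
for t in (0.5, 1.0, 5.0, 30.0):
    print(f"day {t:5.1f}: lambda = {etas_rate(t, catalog):8.3f} events/day")
```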


2016
Vol 8 (2)
pp. 1-20
Author(s):  
Teresa Scassa ◽  
Alexandra Diebel

This paper explores how real-time data are made available as “open data”, using municipal transit data as a case study. Many transit authorities in North America and elsewhere have installed technology to gather GPS data in real time from transit vehicles. These data are in high demand in app developer communities because of their use in communicating predicted, rather than scheduled, transit vehicle arrival times. While many municipalities have chosen to treat real-time GPS data as “open data”, the particular nature of real-time GPS data requires a different mode of access for developers than what is needed for static data files. This, in turn, has created a conflict between the “openness” of the underlying data and the sometimes restrictive terms of use that govern access to the real-time data through transit authority Application Programming Interfaces (APIs). This paper explores the implications of these terms of use and considers whether real-time data require a separate standard for openness. While the focus is on the transit data context, the lessons from this area have broader implications, particularly for open real-time data in the emerging ‘smart cities’ environment.
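
Many of the transit APIs the paper refers to publish vehicle positions in the GTFS-realtime format, a protocol-buffer standard. The sketch below parses such a feed with the gtfs-realtime-bindings Python package; the feed URL is a placeholder, and real feeds typically sit behind exactly the API keys and terms of use the paper discusses.

```python
import requests
from google.transit import gtfs_realtime_pb2  # pip install gtfs-realtime-bindings

# Placeholder feed URL; each agency publishes its own endpoint, often
# requiring registration and acceptance of terms of use.
FEED_URL = "https://transit.example.org/gtfs-realtime/vehicle-positions"

feed = gtfs_realtime_pb2.FeedMessage()
resp = requests.get(FEED_URL, timeout=10)
resp.raise_for_status()
feed.ParseFromString(resp.content)

# Print the live position of each vehicle in the feed; apps use these
# positions to predict, rather than merely schedule, arrival times.
for entity in feed.entity:
    if entity.HasField("vehicle"):
        pos = entity.vehicle.position
        print(entity.vehicle.trip.route_id, pos.latitude, pos.longitude)
```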

