Open or Closed? Open Licensing of Real-time Public Sector Transit Data

2016, Vol 8 (2), pp. 1-20
Author(s): Teresa Scassa, Alexandra Diebel

This paper explores how real-time data are made available as “open data” using municipal transit data as a case study. Many transit authorities in North America and elsewhere have installed technology to gather GPS data in real-time from transit vehicles. These data are in high demand in app developer communities because of their use in communicating predicted, rather than scheduled, transit vehicle arrival times. While many municipalities have chosen to treat real-time GPS data as “open data”, the particular nature of real-time GPS data requires a different mode of access for developers than what is needed for static data files. This, in turn, has created a conflict between the “openness” of the underlying data and the sometimes restrictive terms of use which govern access to the real-time data through transit authority Application Programming Interfaces (APIs). This paper explores the implications of these terms of use and considers whether real-time data require a separate standard for openness. While the focus is on the transit data context, the lessons from this area will have broader implications, particularly for open real-time data in the emerging ‘smart cities’ environment.
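To make the access-mode difference concrete, here is a minimal sketch of how a developer might poll a transit authority's real-time API for predicted arrivals. The endpoint, API key, and JSON field names are hypothetical illustrations only; real agencies typically expose GTFS-realtime protobuf feeds and require developer registration under their own terms of use.

```python
import requests

# Hypothetical real-time transit API endpoint and key; actual agencies
# usually require developer registration and acceptance of terms of use.
API_URL = "https://api.example-transit.example/v1/predictions"
API_KEY = "YOUR_API_KEY"

def fetch_predicted_arrivals(stop_id: str) -> list[dict]:
    """Poll the (hypothetical) API for predicted arrival times at a stop."""
    resp = requests.get(
        API_URL,
        params={"stop": stop_id, "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    # Field names below are illustrative; real feeds (e.g. GTFS-realtime)
    # use a protobuf schema rather than plain JSON.
    return [
        {"route": p["route_id"], "eta_seconds": p["eta"]}
        for p in payload.get("predictions", [])
    ]

if __name__ == "__main__":
    for prediction in fetch_predicted_arrivals("stop-1234"):
        print(prediction)
```

Unlike downloading a static schedule file once, this polling pattern keeps an open channel to the authority's servers, which is precisely why API terms of use become part of the openness question.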

2020, Vol 10 (17), pp. 5882
Author(s): Federico Desimoni, Sergio Ilarri, Laura Po, Federica Rollo, Raquel Trillo-Lado

Modern cities face pressing problems with transportation systems, including but not limited to traffic congestion, safety, health, and pollution. To tackle them, public administrations have implemented roadside infrastructures such as cameras and sensors to collect data about environmental and traffic conditions. In the case of traffic sensor data, not only are the real-time data essential, but historical values also need to be preserved and published. When real-time and historical data of smart cities become available, everyone can join an evidence-based debate on the city’s future evolution. The TRAFAIR (Understanding Traffic Flows to Improve Air Quality) project seeks to understand how traffic affects urban air quality. The project develops a platform to provide real-time and predicted values on air quality in several cities in Europe, encompassing tasks such as the deployment of low-cost air quality sensors, data collection and integration, modeling and prediction, the publication of open data, and the development of applications for end-users and public administrations. This paper explicitly focuses on the modeling and semantic annotation of traffic data. We present the tools and techniques used in the project and validate our strategies for data modeling and semantic enrichment in two cities: Modena (Italy) and Zaragoza (Spain). An experimental evaluation shows that our approach to publishing Linked Data is effective.
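As a rough illustration of what semantic annotation of a traffic observation can look like, the sketch below describes a single traffic-flow reading as RDF with rdflib. The SOSA vocabulary and the base URI are assumptions for illustration; the abstract does not name the exact ontology or URI scheme used by TRAFAIR.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# SOSA/SSN is a common vocabulary for sensor observations; the project's
# actual ontology is not specified in the abstract, so this mapping is
# illustrative only. The base URI is hypothetical.
SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("https://example.org/traffic/")

g = Graph()
g.bind("sosa", SOSA)

# One traffic-flow reading from a (hypothetical) roadside sensor.
obs = EX["observation/sensor42/2020-06-01T08-00-00"]
g.add((obs, RDF.type, SOSA.Observation))
g.add((obs, SOSA.madeBySensor, EX["sensor/42"]))
g.add((obs, SOSA.observedProperty, EX["property/vehicleFlow"]))
g.add((obs, SOSA.hasSimpleResult, Literal(318, datatype=XSD.integer)))
g.add((obs, SOSA.resultTime,
       Literal("2020-06-01T08:00:00", datatype=XSD.dateTime)))

# Serialize as Turtle, ready to be published as Linked Data.
print(g.serialize(format="turtle"))
```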


The convergence of cloud computing, IoT, networking devices, and data science has ushered in a new era of smart cities all around us. The backbone of any smart city is the underlying infrastructure involving thousands of IoT devices connected to work together in real time. Data analytics can play a crucial role in extracting valuable insights from the volumes of data generated by these devices. The objective of this paper is to apply some of the most commonly used classification algorithms to a real-time dataset and compare their performance on IoT data. A performance summary of the algorithms under test is also tabulated.
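A minimal sketch of such a comparison is shown below, assuming a scikit-learn workflow and a synthetic stand-in for an IoT dataset; the paper's actual data and chosen algorithms are not detailed in the abstract.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for an IoT sensor dataset (hypothetical).
X, y = make_classification(n_samples=5000, n_features=12, random_state=0)

classifiers = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "k-NN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
}

# Tabulate mean cross-validated accuracy per algorithm.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name:20s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```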


Sensors, 2018, Vol 18 (9), pp. 2994
Author(s): Bhagya Silva, Murad Khan, Changsu Jung, Jihun Seo, Diyan Muhammad, ...

The Internet of Things (IoT), inspired by the tremendous growth of connected heterogeneous devices, has pioneered the notion of the smart city. Various components, e.g., smart transportation, smart community, smart healthcare, and smart grid, integrated within the smart city architecture aim to enrich the quality of life (QoL) of urban citizens. However, real-time processing requirements and exponential data growth hold back smart city realization. Therefore, herein we propose a Big Data analytics (BDA)-embedded experimental architecture for smart cities. Two major aspects are served by the BDA-embedded smart city. Firstly, it facilitates exploitation of urban Big Data (UBD) in planning, designing, and maintaining smart cities. Secondly, it employs BDA to manage and process voluminous UBD to enhance the quality of urban services. Three tiers of the proposed architecture are responsible for data aggregation, real-time data management, and service provisioning. Moreover, offline and online data processing tasks are further expedited by integrating data normalizing and data filtering techniques into the proposed work. By analyzing authentic datasets, we obtained the threshold values required for urban planning and city operation management. Performance metrics in terms of online and offline data processing for the proposed dual-node Hadoop cluster are obtained using the aforementioned authentic datasets. Throughput and processing-time analyses performed against existing works demonstrate the performance superiority of the proposed work. Hence, we can claim the applicability and reliability of implementing the proposed BDA-embedded smart city architecture in the real world.
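The data normalizing and filtering stage can be pictured with the short sketch below: out-of-range readings are dropped and the remaining values are min-max scaled before offline or online processing. The record fields and thresholds are hypothetical; the paper's actual pipeline is not described in the abstract.

```python
from typing import Iterable, Iterator

def filter_and_normalize(readings: Iterable[dict],
                         lo: float, hi: float) -> Iterator[dict]:
    """Drop out-of-range readings, then min-max scale the value to [0, 1]."""
    for r in readings:
        value = r.get("value")
        if value is None or not (lo <= value <= hi):
            continue  # filtered out before offline/online processing
        yield {**r, "value_norm": (value - lo) / (hi - lo)}

if __name__ == "__main__":
    # Hypothetical urban Big Data records (e.g. smart-grid meter readings).
    sample = [
        {"sensor": "power-meter-7", "value": 412.0},
        {"sensor": "power-meter-7", "value": -5.0},   # spurious reading
        {"sensor": "power-meter-9", "value": 910.0},
    ]
    for row in filter_and_normalize(sample, lo=0.0, hi=1000.0):
        print(row)
```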


Electronics, 2020, Vol 9 (7), pp. 1101
Author(s): Iván García-Magariño, Moustafa M. Nasralla, Shah Nazir

Real-time data management analytics involves capturing data in real time while processing it in a lightweight way to provide effective real-time support. Real-time data management analytics is key to supporting business intelligence decisions. The proposed approach covers all these phases by (a) monitoring online information from websites with Selenium-based software and incrementally building a database, and (b) incrementally updating summarized information to support real-time decisions. We have illustrated this approach for the investor–company field with the particular fields of Bitcoin cryptocurrency and Internet-of-Things (IoT) smart-meter sensors in smart cities. The results of 40 simulations on historic data showed that one of the proposed investor strategies achieved average profits of 7.96% in less than two weeks. However, these and other simulations of up to 69 days showed that the benefits were highly variable across the two sets of simulations (with respective standard deviations of 24.6% and 19.2%).
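The monitor-and-summarize loop can be sketched as follows: Selenium scrapes a page, each reading is appended to a small database, and a running summary is refreshed so a real-time decision rule always has up-to-date figures. The URL, CSS selector, and polling interval are hypothetical; the paper's monitored sites and summary statistics are not specified in the abstract.

```python
import sqlite3
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical price page and CSS selector for illustration only.
URL = "https://example.org/btc-price"
PRICE_SELECTOR = "span.price"

conn = sqlite3.connect("prices.db")
conn.execute("CREATE TABLE IF NOT EXISTS prices (ts REAL, price REAL)")

driver = webdriver.Chrome()
try:
    for _ in range(10):  # poll a few times for illustration
        driver.get(URL)
        raw = driver.find_element(By.CSS_SELECTOR, PRICE_SELECTOR).text
        price = float(raw.replace(",", "").lstrip("$"))

        # Incrementally grow the database and refresh a running summary
        # that a real-time decision rule could read at any moment.
        conn.execute("INSERT INTO prices VALUES (?, ?)", (time.time(), price))
        conn.commit()
        avg, n = conn.execute(
            "SELECT AVG(price), COUNT(*) FROM prices").fetchone()
        print(f"samples={n} latest={price:.2f} running_avg={avg:.2f}")
        time.sleep(60)
finally:
    driver.quit()
    conn.close()
```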


Author(s): Mpoki Mwabukusi, Esron D. Karimuribo, Mark M. Rweyemamu, Eric Beda

A paper-based disease reporting system has been associated with a number of challenges. These include difficulties in submitting hard copies of disease surveillance forms because of poor road infrastructure, weather conditions, or challenging terrain, particularly in developing countries. The system also demands re-entry of the data at processing and analysis points, making it prone to errors. All these challenges contribute to delayed acquisition, processing, and response to disease events occurring in remote, hard-to-reach areas. Our study piloted the use of mobile phones to transmit near real-time data from remote districts in Tanzania (Ngorongoro and Ngara), Burundi (Muyinga) and Zambia (Kazungula and Sesheke). Two technologies, namely digital and short messaging services, were used to capture and transmit disease event data in the animal and human health sectors in the study areas, based on a server–client model. Smartphones running the Android operating system (minimum required version: Android 1.6), which supported the open-source application Epicollect as well as the Open Data Kit application, were used in the study. These phones allowed collection of geo-tagged data, with the option of including static and moving images related to disease events. The project supported routine disease surveillance systems in the ministries responsible for animal and human health in Burundi, Tanzania and Zambia, as well as data collection for researchers at the Sokoine University of Agriculture, Tanzania. During the project implementation period between 2011 and 2013, a total of 1651 disease event-related forms were submitted, allowing reporters to include GPS coordinates and photographs related to the events captured. It was concluded that the new technology-based surveillance system is useful in providing near real-time data, with the potential to enhance timely response in remote rural areas of Africa. We recommended adoption of the proven technologies to improve disease surveillance, particularly in developing countries.
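A minimal sketch of the server–client idea behind this kind of mobile surveillance is shown below: a geo-tagged disease event record submitted from the field to a central server over HTTP. The endpoint, field names, and values are hypothetical; the pilot itself relied on the existing Epicollect and Open Data Kit applications rather than custom client code.

```python
import requests

# Hypothetical central surveillance server endpoint.
SERVER_URL = "https://surveillance.example.org/api/events"

# Illustrative geo-tagged disease event record; field names and values
# are invented for this sketch.
event = {
    "district": "Ngorongoro",
    "sector": "animal",
    "syndrome": "suspected anthrax",
    "cases": 3,
    "latitude": -3.2415,        # GPS coordinates captured by the phone
    "longitude": 35.4870,
    "reported_at": "2012-07-14T10:32:00Z",
    "photo_ids": ["IMG_0041"],  # optional images linked to the event
}

resp = requests.post(SERVER_URL, json=event, timeout=30)
resp.raise_for_status()
print("submitted, server assigned id:", resp.json().get("id"))
```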


Author(s): Yohee Han, Youngchan Kim, Jisun Ku, Yeonghun Jung, Jeongrae Roh

2021, Vol 111 (12), pp. 2133-2140
Author(s): Farida B. Ahmad, Robert N. Anderson, Karen Knight, Lauren M. Rossen, Paul D. Sutton

The National Center for Health Statistics’ (NCHS’s) National Vital Statistics System (NVSS) collects, processes, codes, and reviews death certificate data and disseminates the data in annual data files and reports. With the global rise of COVID-19 in early 2020, the NCHS mobilized to rapidly respond to the growing need for reliable, accurate, and complete real-time data on COVID-19 deaths. Within weeks of the first reported US cases, NCHS developed certification guidance, adjusted internal data processing systems, and stood up a surveillance system to release daily updates of COVID-19 deaths to track the impact of the COVID-19 pandemic on US mortality. This report describes the processes that NCHS took to produce timely mortality data in response to the COVID-19 pandemic. (Am J Public Health. 2021;111(12):2133–2140. https://doi.org/10.2105/AJPH.2021.306519 )


2016, Vol 27 (4), pp. 24-38
Author(s): Salwa M'barek, Leila Baccouche, Henda Ben Ghezala

Real-time applications managing large volumes of real-time data require the use of Real-Time Database Management Systems (RTDBMSs) to meet the temporal constraints of both real-time data and transactions. However, an RTDBMS has a dynamic workload and may frequently be overloaded, since the arrival times and workloads of user transactions are unpredictable. Therefore, Quality of Service (QoS) management solutions have been proposed to guarantee the stability of an RTDBMS even during unpredictable overload periods. While effective, these solutions are challenging to design and reuse because they are not formally modeled and there is neither a tool nor a methodology to support their design. To address these issues, the authors propose a design framework based on the Model-Driven Engineering approach, providing a modeling architecture, a strategic methodology, and a software tool to support modeling and reusing such solutions. The framework is implemented and tested on a real QoS management solution.
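As a generic illustration of the kind of QoS policy such solutions model, the sketch below implements feedback-based admission control that sheds low-priority transactions when the estimated load exceeds a target utilization. This is not the authors' MDE framework, only a sketch of the overload behaviour it is meant to capture; the threshold and priority scheme are assumptions.

```python
import random

TARGET_UTILIZATION = 0.8  # assumed target load for this sketch

class AdmissionController:
    def __init__(self) -> None:
        self.estimated_utilization = 0.0

    def admit(self, priority: int, cost: float) -> bool:
        """Admit a transaction unless the system is overloaded and the
        transaction is low priority (priority 0 = lowest)."""
        overloaded = self.estimated_utilization > TARGET_UTILIZATION
        if overloaded and priority == 0:
            return False
        self.estimated_utilization += cost
        return True

    def complete(self, cost: float) -> None:
        """Release capacity when a transaction finishes."""
        self.estimated_utilization = max(0.0, self.estimated_utilization - cost)

if __name__ == "__main__":
    ctrl = AdmissionController()
    running: list[float] = []
    for i in range(20):
        cost = random.uniform(0.05, 0.15)
        admitted = ctrl.admit(priority=i % 2, cost=cost)
        if admitted:
            running.append(cost)
        if running and random.random() < 0.5:
            ctrl.complete(running.pop(0))  # a transaction finishes
        print(f"txn {i:02d} prio={i % 2} admitted={admitted} "
              f"load={ctrl.estimated_utilization:.2f}")
```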

