Want the games industry to share data? Share yours

Nature ◽  
2021 ◽  
Author(s):  
Veli-Matti Karhulahti
2019 ◽  
Vol 1 ◽  
pp. 1-1
Author(s):  
Alan M. MacEachren

<p><strong>Abstract.</strong> This presentation will provide an overview of a workshop-based effort on ethics in location-based data, organized by the Scientific Responsibility, Human Rights, and Law Program of the American Association for the Advancement of Science (AAAS). More specifically, the AAAS organized three workshops during 2017 and 2018 directed at exploring the ethical implications of collecting, analysing, and acting upon location-based data in crisis situations – “Developing Ethical Guidelines and Best Practices for the Use of Volunteered Geographic Information and Remotely Sensed Imagery in Crisis Situations.” The outcome of those workshops and follow-up efforts was a document detailing principles and guidelines with the objective of empowering crisis response actors to use location-based data responsibly and ethically.</p><p>On behalf of all those involved, as a participant in all three workshops and an AAAS Fellow, I will present an overview of the results of this effort. The presentation will outline the five principles developed and provide examples of their motivation and use:</p><ol><li>Do No Harm: Identify and minimize potential risks, particularly as they may affect the vulnerability of individuals and populations</li><li>Define Your Purpose: Ensure action is mission-driven and goal-oriented</li><li>Do Good Science: Employ scientifically rigorous and responsible methods</li><li>Collaborate and Consult: Engage with local partners</li><li>Give Access to Your Data: Share data openly, when safe and practicable</li></ol><p>The presentation will also reflect on (a) the specific relevance of this effort and its outcome for the international cartographic community and (b) our obligation as academic/professional cartographers to address the dual challenges of leveraging locational data cartographically to support crisis management and humanitarian efforts while also guarding against misuse of the data collected and maps generated. I will conclude by reflecting on my experience in working with a diverse, interdisciplinary, international group on this hard problem.</p>


2015 ◽  
Author(s):  
Armin Günther ◽  
Ina Dehnhard

During the last years, the call for publishing and sharing research data has become ubiquitous. Besides moral appeals to transparency as a basic principle of science, more and more policies and regulations push researchers to make their research data available to the scientific community and the general public. In addition to research funding organizations, publishers are most influential in this regard, as they can boost the practice of data sharing through incentives that are highly attractive for researchers: publications, and hence reputation in academia. In consequence, more and more datasets are published by researchers – in very different ways, using the quite heterogeneous tools and publishing solutions currently available.

However, these data publications do not necessarily increase transparency in research. Publishing research data might even contribute to an increase in noise and opacity. Mere disclosure of data has very little value per se, as The Royal Society noticed in its report “Science as an open enterprise” (2012). The report asks for “intelligent openness”, where data are not just published but effectively communicated. To accomplish intelligent openness, data have to be accessible, intelligible, assessable and usable (p. 14).

The presentation will take up this consideration and explore the challenges involved. Communicating data aims at enabling receivers (i.e. data users) to correctly interpret and appropriately use the published research data. As will be shown by examples of published datasets from the field of behavioral and social sciences, this aim is currently by no means generally achieved. In seeking to comply with the data policies of funders and publishers, researchers may be inclined to publish datasets that are hardly intelligible and not usable.

Additionally, there is a lack of data publishing infrastructures (including technical tools as well as standards and social practices) supporting researchers in communicating data in a meaningful way to their different audiences. Obviously, it is much easier to share data within one's own research community than across research fields, disciplines or even cultures. In general, the less the researchers who publish data and the audience who wants to use these data share a common context (increasing “distance-from-data-origin”, Baker & Yarmey 2009), the more demanding communicating research data will become.

Thus, publishers face considerable challenges when trying to advance from publishing to communicating research data. Developing solutions in this direction should nevertheless be of primary concern, as publishing without communicating might ultimately be just a waste of resources. Therefore, besides exploring the challenges, the presentation will also try to identify steps towards the ambitious goal of communicating research data.
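The four "intelligent openness" criteria can be made concrete with a machine-readable descriptor published alongside a dataset. The following sketch is purely illustrative (the field names and values are hypothetical, not a proposed standard), with each block targeting one criterion:

```python
import json

# Hypothetical metadata descriptor accompanying a published dataset;
# each block addresses one "intelligent openness" criterion.
dataset_metadata = {
    "title": "Survey on data-sharing attitudes (illustrative example)",
    "access": {                      # accessible: where, and under which terms
        "license": "CC-BY-4.0",
        "landing_page": "https://example.org/datasets/sharing-survey",
    },
    "codebook": {                    # intelligible: what each variable means
        "age": {"type": "integer", "unit": "years"},
        "share_willing": {
            "type": "integer",
            "coding": {"1": "willing to share", "0": "not willing"},
        },
    },
    "provenance": {                  # assessable: how the data came to be
        "collected": "2014-06",
        "instrument": "online questionnaire, n = 312",
    },
    "distribution": {                # usable: a format standard tools can read
        "format": "CSV",
        "encoding": "UTF-8",
    },
}

descriptor = json.dumps(dataset_metadata, indent=2, sort_keys=True)
```

A data user who receives the CSV file together with such a descriptor can interpret the columns without contacting the original researchers, which is the gap the presentation identifies.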


2020 ◽  
Vol 91 (1) ◽  
pp. 41-45 ◽  
Author(s):  
Virginia. E. Wotring ◽  
LaRona K. Smith

INTRODUCTION: There are knowledge gaps in spaceflight pharmacology, with insufficient in-flight data to inform future planning. This effort directly addressed in-mission medication use and also informed open questions regarding spaceflight-associated changes in pharmacokinetics (PK) and/or pharmacodynamics (PD).

METHODS: An iOS application was designed to collect medication use information relevant for research from volunteer astronaut crewmembers: medication name, dose, dosing frequency, indication, perceived efficacy, and side effects. Leveraging the limited medication choices aboard allowed a streamlined questionnaire. There were 24 subjects approved for participation.

RESULTS: Six crewmembers completed flight data collection and five completed ground data collection before NASA’s early study discontinuation. There were 5766 medication use entries, averaging 20.6 ± 8.4 entries per subject per flight week. Types of medications and their indications were similar to previous reports, with sleep disturbances and muscle/joint pain as primary drivers. Two subjects treated prolonged skin problems. Subjects also used the application in unanticipated ways: to note drug tolerance testing or medication holidays per research protocols, and to share data with flight surgeons. Subjects also provided usability feedback on application design and implementation.

DISCUSSION: The volume of data collected (20.6 ± 8.4 entries per subject per flight week) is much greater than was collected previously (<12 per person per entire mission), despite user criticisms regarding app usability. It seems likely that improvements in a software-based questionnaire application could result in a robust data collection tool that astronauts find more acceptable, while simultaneously providing researchers and clinicians with useful data.

Wotring VE, Smith LK. Dose tracker application for collecting medication use data from International Space Station crew. Aerosp Med Hum Perform. 2020; 91(1):41–45.
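As a rough illustration of the kind of record such a questionnaire app might store, here is a sketch of one medication-use entry. The field names are inferred from the data items listed in the abstract (name, dose, frequency, indication, perceived efficacy, side effects) and are not the actual app schema:

```python
from dataclasses import dataclass, field, asdict
from typing import List, Optional

# Hypothetical structure for one medication-use entry; fields mirror the
# data items named in the abstract, not the real application's schema.
@dataclass
class MedicationEntry:
    medication: str
    dose: str
    frequency: str
    indication: str
    perceived_efficacy: str
    side_effects: List[str] = field(default_factory=list)
    flight_day: Optional[int] = None   # mission day, if logged in flight

entry = MedicationEntry(
    medication="ibuprofen",
    dose="400 mg",
    frequency="every 8 h as needed",
    indication="muscle/joint pain",
    perceived_efficacy="symptoms relieved",
    flight_day=12,
)
record = asdict(entry)   # plain dict, ready to serialize for researchers
```

Keeping each entry this small is consistent with the streamlined questionnaire the study describes: a constrained onboard formulary means most fields can be picked from short lists rather than typed.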


2019 ◽  
pp. 5-22
Author(s):  
Szymon Buczyński

Recent technological revolutions in data and communication systems enable us to generate and share data much faster than ever before. Sophisticated data tools aim to improve knowledge and boost confidence. As technological tools will only get better and more user-friendly over the years, big data can be considered an important tool for the arts and culture sector. Statistical analysis, econometric methods and data mining techniques could pave the way towards a better understanding of the mechanisms at work in the art market. Moreover, crime reduction and prevention challenges in today’s world are becoming increasingly complex and call for new techniques that can handle the vast amount of information being generated. This article examines a wide range of new technological innovations (IT) with applications in the areas of culture preservation and heritage protection. The author describes recent technological innovations, summarizes the available research on the extent of their adoption using selected examples, and then reviews the available research on each form of new technology. Furthermore, this paper explores and discusses how big data analytics affect innovation and value creation in cultural organizations and shape consumer behavior in cultural heritage, the arts and the cultural industries. The paper also discusses the likely impact of big data analytics on criminological research and theory: digital criminology draws on huge databases, in contrast to conventional data processing techniques, which are not only insufficient but also outdated. This paper aims at closing a gap in the academic literature by showing the contribution of a big data approach to cultural economics, policy and management, from both a theoretical and a practice-based perspective. This work is also a starting point for further research.


2020 ◽  
Vol 46 (1) ◽  
pp. 55-75
Author(s):  
Ying Long ◽  
Jianting Zhao

This paper examines how mass ridership data can help describe cities from the bikers' perspective. We explore the possibility of using the data to reveal general bikeability patterns in 202 major Chinese cities. This process is conducted by constructing a bikeability rating system, the Mobike Riding Index (MRI), to measure bikeability in terms of usage frequency and the built environment. We first investigated mass ridership data and relevant supporting data; we then established the MRI framework and calculated MRI scores accordingly. This study finds that people tend to ride shared bikes at speeds close to 10 km/h for an average distance of 2 km roughly three times a day. The MRI results show that at the street level, the weekday and weekend MRI distributions are analogous, with an average score of 49.8 (range 0–100). At the township level, high-scoring townships are those close to the city centre; at the city level, the MRI is unevenly distributed, with high-MRI cities along the southern coastline or in the middle inland area. These patterns have policy implications for urban planners and policy-makers. This is the first and largest-scale study to incorporate mobile bike-share data into bikeability measurements, thus laying the groundwork for further research.
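A composite index of this kind is typically a weighted sum of normalized components. The sketch below shows only the general shape; the actual MRI weighting and inputs are not described in the abstract, and the 50/50 weighting and min-max scaling here are hypothetical:

```python
def min_max(values):
    """Scale a list of raw values to the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def bikeability_scores(usage_freq, built_env, w_usage=0.5):
    """Combine normalized usage frequency and built-environment quality
    into a 0-100 score per street; the weight is illustrative only."""
    u = min_max(usage_freq)
    e = min_max(built_env)
    return [round(100 * (w_usage * ui + (1 - w_usage) * ei), 1)
            for ui, ei in zip(u, e)]

# Three streets: daily shared-bike trips and a raw built-environment rating.
scores = bikeability_scores([120, 300, 45], [0.6, 0.9, 0.3])
```

Because both components are rescaled before combining, a street can score highly either through heavy observed ridership or through a bike-friendly built environment, which matches the abstract's two-part definition of bikeability.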


Author(s):  
Mazen Odish ◽  
Cassia Yi ◽  
Juliann Eigner ◽  
Amelia Kenner Brininger ◽  
Kristi L. Koenig ◽  
...  

Abstract In March 2020, at the onset of the coronavirus disease 2019 (COVID-19) pandemic in the United States, the Southern California Extracorporeal Membrane Oxygenation (ECMO) Consortium was formed. The consortium included physicians and coordinators from the four ECMO centers in San Diego County. Guidelines were created to ensure that ECMO was delivered equitably and in a resource-effective manner across the county during the pandemic. A biomedical ethicist reviewed the guidelines to ensure ECMO utilization would provide maximal community benefit from this limited resource. The San Diego County Health and Human Services Agency further incorporated the guidelines into its plans for the allocation of scarce resources. The consortium held weekly video conferences to review countywide ECMO capacity (including census and staffing), share data, and discuss clinical practices and difficult cases. Equipment exchanges between ECMO centers maximized regional capacity. From March 1 to November 30, 2020, consortium participants placed 97 patients on ECMO. No eligible patients were denied ECMO due to lack of resources or capacity. The Southern California ECMO Consortium may serve as a model for other communities seeking to optimize ECMO resources during the current COVID-19 pandemic or future pandemics.


2021 ◽  
Vol 13 (13) ◽  
pp. 7156
Author(s):  
Kyoung Jun Lee ◽  
Yu Jeong Hwangbo ◽  
Baek Jeong ◽  
Ji Woong Yoo ◽  
Kyung Yang Park

Many small and medium enterprises (SMEs) want to introduce recommendation services to boost sales, but they need to have sufficient amounts of data to introduce these recommendation services. This study proposes an extrapolative collaborative filtering (ECF) system that does not directly share data among SMEs but improves recommendation performance for small and medium-sized companies that lack data through the extrapolation of data, which can provide a magical experience to users. Previously, recommendations were made utilizing only data generated by the merchant itself, so it was impossible to recommend goods to new users. However, our ECF system provides appropriate recommendations to new users as well as existing users based on privacy-preserved payment transaction data. To accomplish this, PP2Vec using Word2Vec was developed by utilizing purchase information only, excluding personal information from payment company data. We then compared the performances of single-merchant models and multi-merchant models. For the merchants with more data than SMEs, the performance of the single-merchant model was higher, while for the SME merchants with fewer data, the multi-merchant model’s performance was higher. The ECF System proposed in this study is more suitable for the real-world business environment because it does not directly share data among companies. Our study shows that AI (artificial intelligence) technology can contribute to the sustainability and viability of economic systems by providing high-performance recommendation capability, especially for small and medium-sized enterprises and start-ups.
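The abstract does not specify PP2Vec's internals beyond its basis in Word2Vec over purchase data. As a toy stand-in for the underlying idea (item relatedness learned from baskets alone, with no personal information), the sketch below uses cosine-normalized co-occurrence counts instead of learned embeddings; all names and data are hypothetical:

```python
from collections import defaultdict
from itertools import combinations
import math

def cooccurrence_similarity(transactions):
    """Build an item-to-item similarity function from purchase baskets.
    This is a simple co-occurrence baseline, not Word2Vec itself."""
    item_counts = defaultdict(int)
    pair_counts = defaultdict(int)
    for basket in transactions:
        items = set(basket)                 # ignore duplicates within a basket
        for item in items:
            item_counts[item] += 1
        for a, b in combinations(sorted(items), 2):
            pair_counts[(a, b)] += 1

    def sim(a, b):
        # Cosine-style normalization of the raw co-occurrence count.
        key = tuple(sorted((a, b)))
        return pair_counts.get(key, 0) / math.sqrt(item_counts[a] * item_counts[b])

    return sim

# Hypothetical anonymized baskets from several merchants pooled together.
baskets = [["coffee", "milk"], ["coffee", "milk", "sugar"], ["tea", "sugar"]]
sim = cooccurrence_similarity(baskets)
```

Pooling baskets across merchants, as in the multi-merchant model, gives small merchants similarity signals for items their own customers have rarely bought together; a Word2Vec-style model generalizes further by also relating items that share neighbors rather than only items that co-occur directly.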


Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 621
Author(s):  
Giuseppe Psaila ◽  
Paolo Fosci

Internet technology and mobile technology have enabled producing and diffusing massive data sets concerning almost every aspect of day-to-day life. Remarkable examples are social media and apps for volunteered information production, as well as Open Data portals on which public administrations publish authoritative and (often) geo-referenced data sets. In this context, JSON has become the most popular standard for representing and exchanging possibly geo-referenced data sets over the Internet.

Analysts wishing to manage, integrate and cross-analyze such data sets need a framework that allows them to access possibly remote storage systems for JSON data sets, and to retrieve and query data sets by means of a unique query language (independent of the specific storage technology), exploiting possibly remote computational resources (such as cloud servers) while comfortably working on the PC in their office, more or less unaware of the real location of the resources. In this paper, we present the current state of the J-CO Framework, a platform-independent and analyst-oriented software framework to manipulate and cross-analyze possibly geo-tagged JSON data sets. The paper presents the general approach behind the J-CO Framework, illustrating the query language by means of a simple, yet non-trivial, example of geographical cross-analysis. The paper also presents the novel features introduced by the re-engineered version of the execution engine and the most recent components, i.e., the storage service for large single JSON documents and the user interface that allows analysts to comfortably share data sets and computational resources with other analysts possibly working in different parts of the world. Finally, the paper reports the results of an experimental campaign, which shows that the execution engine performs in a more than satisfactory way, proving that our framework can actually be used by analysts to process JSON data sets.
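The J-CO query language itself is not shown in the abstract. As a language-neutral illustration of the kind of geographical selection such a framework expresses, here is a plain-Python sketch that filters a small GeoJSON FeatureCollection by bounding box (the place names and coordinates are made up for the example):

```python
import json

def within_bbox(feature, min_lon, min_lat, max_lon, max_lat):
    """True if a GeoJSON Point feature falls inside the bounding box.
    GeoJSON stores Point coordinates as [longitude, latitude]."""
    lon, lat = feature["geometry"]["coordinates"]
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

# A tiny geo-referenced JSON document of the kind J-CO-style tools query.
doc = json.loads("""
{"type": "FeatureCollection", "features": [
  {"type": "Feature",
   "geometry": {"type": "Point", "coordinates": [9.66, 45.69]},
   "properties": {"name": "Bergamo"}},
  {"type": "Feature",
   "geometry": {"type": "Point", "coordinates": [2.35, 48.85]},
   "properties": {"name": "Paris"}}
]}
""")

# Keep only features inside a bounding box over northern Italy.
hits = [f["properties"]["name"] for f in doc["features"]
        if within_bbox(f, 6.0, 44.0, 14.0, 47.0)]
```

A dedicated query language, as in J-CO, lets an analyst state this selection declaratively and have the engine run it against remote JSON stores, instead of hand-writing traversal code as above.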


Author(s):  
Toshiaki Takigawa

ABSTRACT This article examines antitrust issues concerning digital platforms equipped with big data. Recent initiatives by the Japanese competition agency are highlighted and compared with those of the US and EU competition authorities. First examined is whether competition among platforms would result in a select few super platforms with market power; the article concludes that AI with machine learning has augmented the power of super platforms with strong AI capability, increasing the importance of merger control over acquisitions by platforms. Next scrutinized is the argument for utility-style regulation of super platforms; here the article concludes that wide support is limited to data portability, leaving competition law as the key tool for addressing super platforms. Its core tool is the provision against exclusionary conduct, enforcement of which initially concerns whether to order super platforms to render their data accessible to their rivals. Passive refusal to share data needs to be scrutinized under the essential facility doctrine. Beyond passive refusal, platforms’ exclusionary conduct requires competition agencies to weigh the conduct’s exclusionary effects against its efficiency effects. Finally addressed is exploitative abuse and its relation to consumer protection; the article concludes that competition law enforcement against exploitative abuse should be minimized, since it carries a risk of over-enforcement.

