Data science ethics

2021 ◽  
pp. 159-180
Author(s):  
Benjamin S. Baumer ◽  
Daniel T. Kaplan ◽  
Nicholas J. Horton

Author(s):  
Cat Drew

Data science can offer huge opportunities for government. With the ability to process larger and more complex datasets than ever before, it can provide better insights for policymakers and make services more tailored and efficient. As with all new technologies, there is a risk that we fail to take up its opportunities and miss out on its enormous potential. We want people to feel confident to innovate with data. So, over the past 18 months, the Government Data Science Partnership has taken an open, evidence-based and user-centred approach to creating an ethical framework. It is a practical document that brings all the legal guidance together in one place, and is written in the context of new data science capabilities. As part of its development, we ran a public dialogue on data science ethics, including deliberative workshops, an experimental conjoint survey and an online engagement tool. The research supported the principles set out in the framework and provided useful insight into how we need to communicate about data science. It found that people had a low awareness of the term ‘data science’, but that showing data science examples can increase broad support for government exploring innovative uses of data. But people's support is highly context driven. People consider acceptability on a case-by-case basis, first thinking about the overall policy goals and likely intended outcome, and then weighing up privacy and unintended consequences. The ethical framework is a crucial start, but it does not solve all the challenges it highlights, particularly as technology is creating new challenges and opportunities every day. Continued research is needed into data minimization and anonymization, robust data models, algorithmic accountability, and transparency and data security. It has also revealed the need to set out a renewed deal between citizen and state on data, to maintain and solidify trust in how we use people's data for social good.
This article is part of the themed issue ‘The ethical impact of data science’.


2020 ◽  
Vol 12 (3) ◽  
Author(s):  
Christophe Olivier Schneble ◽  
Bernice Simone Elger ◽  
David Martin Shaw

2019 ◽  
Vol 3 (2) ◽  
pp. 180-190
Author(s):  
Emily S Rempel ◽  
Julie Barnett ◽  
Hannah Durrant

This study examines the hidden assumptions around running public-engagement exercises in government. We study an example of public engagement on the ethics of combining and analysing data in national government – often called data science ethics. We study hidden assumptions, drawing on hidden curriculum theories in education research, as this allows us to identify conscious and unconscious underlying processes related to conducting public engagement that may impact results. Through participation in the 2016 Public Dialogue for Data Science Ethics in the UK, four key themes were identified that exposed underlying public-engagement norms. First, that organizers had constructed a strong imagined public as neither overly critical nor supportive, which they used to find and engage participants. Second, that official aims of the engagement, such as including publics in developing ethical data regulations, were overshadowed by underlying meta-objectives, such as counteracting public fears. Third, that advisory group members, organizers and publics understood the term 'engagement' in varying ways, from creating interest to public inclusion. And finally, that stakeholder interests, particularly government hopes for a positive report, influenced what was written in the final report. Reflection on these underlying mechanisms, such as the development of meta-objectives that seek to benefit government and technical stakeholders rather than publics, suggests that the practice of public engagement can, in fact, shut down opportunities for meaningful public dialogue.


2021 ◽  
Vol 25 (4) ◽  
Author(s):  
Rebecca M. Quintana ◽  
Juan D. Pinto ◽  
Yuanru Tan

We compared discussion posts from a data science ethics MOOC that was hosted on two platforms. We characterized one platform as “open” because learners can respond to discussion prompts while viewing and responding to others. We characterized the other platform as “locked” because learners must respond to a discussion prompt before they can view and respond to others. Our objective was to determine whether these platform differences are consequential and have the potential to impact learning. We analyzed direct responses to two discussion prompts, located in modules two and six of an eight-module course. We used conventional content analysis to derive codes directly from the data. Posts on the “open” platform were characterized by failure to completely address the prompt and showed evidence of persuasion tactics and reflective activity. Posts on the “locked” platform were characterized by an apparent intent to complete the task and an assertive tone. Posts on the “locked” platform also showed a diversity of ideas across the corpus of responses. Our findings show that MOOC platform interfaces can lead to qualitative differences in discussion posts in ways that have the potential to impact learning. Our study provides insight into how “open” and “locked” platform designs have the potential to shape the ways that learners respond to discussion prompts in MOOCs. Our study offers guidance for instructors making decisions about MOOC platform choice and the activities situated within a learning experience.


Author(s):  
Charles Bouveyron ◽  
Gilles Celeux ◽  
T. Brendan Murphy ◽  
Adrian E. Raftery
