Innovation Policies and Big Data: Opportunities and Challenges

Author(s):  
Eva Arrilucea ◽  
Miren Nekane Bilbao ◽  
Javier Herrera ◽  
Javier Del Ser

Innovation policies are considered one of the main tools for turning innovation into wealth, well-being and competitiveness in territories worldwide. However, given the ever-growing data-centered ecosystem in which such policies now operate, there is a well-founded suspicion that traditional methods for policy analysis, design and evaluation are beginning to fail, particularly when faster and more effective answers to societal challenges are demanded in a context of sharp technological change and unprecedented economic, scientific, political and social conditions. This chapter addresses the question of whether Big Data analytics can become a tool capable of overcoming these obstacles and adapting the public policy cycle to the new reality, as appears to be happening in the private sector. We also explore whether Big Data analytics can be the definitive tool for developing the best policy solutions in a subjective, uncertain and dynamic environment shaped by competing interests, and assess its degree of maturity for this application. To this end, the chapter examines the role played to date by data in the design of innovation policies, concluding with a reasoned assessment of the practical issues and unsolved research challenges that must be overcome before innovation policy-making processes can be empowered with Big Data analytics.


2020 ◽  
pp. 217-230
Author(s):  
Philip Garnett ◽  
Sarah M. Hughes

In this chapter, Garnett and Hughes focus on the role of big data in accessing information from public inquiries. Looking at the Chelsea Manning court martial in the US and the Leveson Inquiry in the UK, they argue that the manner in which information pertaining to inquiries is made public is, at best, unsatisfactory. They propose a variety of means to make this information more accessible and hence more transparent to the public through employing big data techniques.



2018 ◽  
Vol 331 ◽  
pp. 301-311
Author(s):  
Gergely László Szőke

Big Data is clearly one of the most-used buzzwords of our time, but the phenomenon genuinely appears set to have a huge effect on many different fields, and may be regarded as the latest wave of the information revolution that began in the 1960s. Exploiting Big Data promises significant benefits (and new challenges) in both the private and the public sector; this essay focuses on the latter. After a short introduction to Big Data, the paper first summarizes the potential uses of Big Data analytics in the public sector. It then focuses on a specific issue within this scope: how the use of Big Data and algorithm-based decision-making may affect transparency and access to these data. In particular, it asks why the transparency of algorithms is an issue at all, and what the current legal framework provides for potential access to them.



Stroke ◽  
2015 ◽  
Vol 46 (suppl_1) ◽  
Author(s):  
Sara J Kavanagh ◽  
Benjamin Bray ◽  
Lizz Paley ◽  
James T Campbell ◽  
Emma Vestesson ◽  
...  

Introduction: The Sentinel Stroke National Audit Programme (SSNAP) is the new national stroke register of England and Wales. It has been designed to harness the power of “Big Data” to deliver near real-time data collection, analysis and reporting. Sophisticated data visualisation is used to provide customised analytics for clinical teams, administrators, healthcare funders, and stroke survivors and carers. Methods: A portfolio of cutting-edge data visualisation outputs was produced, including team-level slidedecks, performance charts, dashboards, and interactive maps. Visualisations for patients and the public were co-designed with stroke survivors. Stakeholder feedback on the accessibility and usefulness of the resources was sought via online polls. Results: Key SSNAP results are made accessible electronically every three months in a range of bespoke graphical formats. Individualised slidedecks and data summaries are produced for every hospital, funding group and region to enable provider-level performance and quality reporting alongside regional and national benchmarking. Dynamic maps enhance the dissemination and use of results. Real-time root-cause analysis tools help teams identify areas for improvement. Feedback indicates that these resources are highly useful to clinical teams, funders, regional and national health bodies, patients and the public in identifying areas of good practice and areas requiring improvement, highlighting variation, and driving change. Conclusion: SSNAP is a potential new model of healthcare quality measurement that uses recent developments in big data analytics and visualisation to provide information on stroke care quality that is more useful to stakeholders. Similar approaches could be used in other healthcare settings and populations.
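A minimal sketch of the kind of quarterly, team-level aggregation the abstract describes, assuming a simple tabular extract of audit records. All column names, teams and metrics below (scan_minutes, thrombolysed, and so on) are hypothetical stand-ins for illustration, not SSNAP's actual schema:

```python
# Hypothetical sketch: quarterly per-team summaries of the sort that could
# feed slidedecks and dashboards. Synthetic data; illustrative fields only.
import pandas as pd

# One row per stroke admission (synthetic audit records).
records = pd.DataFrame({
    "team":         ["Hosp A", "Hosp A", "Hosp B", "Hosp B", "Hosp B"],
    "admitted":     pd.to_datetime(["2015-01-10", "2015-02-03",
                                    "2015-01-21", "2015-02-14", "2015-03-02"]),
    "scan_minutes": [42, 75, 30, 120, 55],          # door-to-scan time
    "thrombolysed": [True, False, True, False, True],
})

# Group by team and calendar quarter, then compute summary metrics.
records["quarter"] = records["admitted"].dt.to_period("Q")
summary = records.groupby(["team", "quarter"]).agg(
    admissions=("team", "size"),
    median_scan_minutes=("scan_minutes", "median"),
    thrombolysis_rate=("thrombolysed", "mean"),
)
print(summary)
```

In a live register, the same grouping would run against the incoming database each reporting cycle, with the summaries rendered as charts and maps rather than printed.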



2017 ◽  
Author(s):  
Robert Brauneis ◽  
Ellen P. Goodman

Emerging across many disciplines are questions about algorithmic ethics – about the values embedded in artificial intelligence and big data analytics that increasingly replace human decisionmaking. Many are concerned that an algorithmic society is too opaque to be accountable for its behavior. An individual can be denied parole or denied credit, fired or not hired, for reasons she will never know and that cannot be articulated. In the public sector, the opacity of algorithmic decisionmaking is particularly problematic both because governmental decisions may be especially weighty, and because democratically-elected governments bear special duties of accountability. Investigative journalists have recently exposed the dangerous impenetrability of algorithmic processes used in the criminal justice field – dangerous because the predictions they make can be both erroneous and unfair, with none the wiser. We set out to test the limits of transparency around governmental deployment of big data analytics, focusing our investigation on local and state government use of predictive algorithms. It is here, in local government, that algorithmically-determined decisions can be most directly impactful. And it is here that stretched agencies are most likely to hand over the analytics to private vendors, which may make design and policy choices out of the sight of the client agencies, the public, or both. To see just how impenetrable the resulting “black box” algorithms are, we filed 42 open records requests in 23 states seeking essential information about six predictive algorithm programs. We selected the most widely-used and well-reviewed programs, including those developed by for-profit companies, nonprofits, and academic/private sector partnerships. The goal was to see if, using the open records process, we could discover what policy judgments these algorithms embody, and could evaluate their utility and fairness. To do this work, we identified what meaningful “algorithmic transparency” entails. We found that in almost every case, it wasn’t provided. Over-broad assertions of trade secrecy were a problem. But contrary to conventional wisdom, they were not the biggest obstacle. It will not usually be necessary to release the code used to execute predictive models in order to dramatically increase transparency. We conclude that publicly-deployed algorithms will be sufficiently transparent only if (1) governments generate appropriate records about their objectives for algorithmic processes and subsequent implementation and validation; (2) government contractors reveal to the public agency sufficient information about how they developed the algorithm; and (3) public agencies and courts treat trade secrecy claims as the limited exception to public disclosure that the law requires. Although it would require a multi-stakeholder process to develop best practices for record generation and disclosure, we present what we believe are eight principal types of information that such records should ideally contain.



2019 ◽  
Vol 3 (2) ◽  
pp. 80-93
Author(s):  
Haviluddin Haviluddin ◽  
Rayner Alfred

Big Data is usually characterized by its size, yet it also creates new opportunities and has the potential to transform corporations, governments, and their interactions with the public. This paper attempts to offer a broader definition of Big Data that captures its other unique and defining characteristics, presenting a consolidated description that integrates definitions from practitioners and academics. In addition, we summarize the issues, trends, problems and controversies related to Big Data (technology, applications, and people) from the perspectives of infrastructure (i.e., hardware and software), technology for Big Data Analytics (BDA), management, education and research, and government policy, in order to support the ASEAN Economic Community (AEC) era.



Industry 4.0 confronts everyone with change, and the auditing profession is no exception. Auditors no longer conduct audits manually but use computerized systems such as big data analytics, because data have become too large, too complicated and too varied to be audited by hand. Our research aims to find out how auditors must change when facing new technology-based audit approaches. The research method is descriptive and qualitative, with data collected through interviews with informants, namely audit partners at public accounting firms. Big data brings both advantages and disadvantages to the audit and fraud detection profession. The results of this study indicate that audit firms must be aware of the internal and external obstacles to implementing big data analytics in audits: such obstacles hinder auditors' adoption of the technology, even as auditors must change rapidly to keep pace with technological development.





Traffic jams are among the country's worst problems, consuming enormous amounts of time, energy and money. Even though various traffic-avoidance techniques have been implemented, traffic has not been reduced because of the growing number of vehicles, so an alternative method of overcoming congestion is required. In this paper, we propose allotting a separate lane to public transport and monitoring it with an Automatic Number Plate Recognition (ANPR) camera, which captures vehicle number plates and stores them in a database that can also be used for real-time traffic monitoring. The public will also be notified by message, encouraging them to use public transport so that they can save time and money.
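The pipeline the abstract outlines (capture plate → store in database → notify) can be sketched as follows. Plate recognition itself would come from the ANPR camera feed, so recognise_plate() below is a hypothetical placeholder, and the table layout, "public-transport" lane name and "BUS" prefix rule are illustrative assumptions only:

```python
# Minimal sketch of the storage/monitoring side of the proposed system.
# recognise_plate() stands in for the ANPR step; in this sketch the
# "frame" argument is already the plate text so the example runs as-is.
import sqlite3
from datetime import datetime

def recognise_plate(frame) -> str:
    return frame  # placeholder for OCR on a detected plate region

conn = sqlite3.connect(":memory:")  # a real deployment would use a server DB
conn.execute("CREATE TABLE sightings (plate TEXT, lane TEXT, seen_at TEXT)")

def log_sighting(frame, lane: str) -> None:
    """Store one camera capture so traffic can be monitored in real time."""
    plate = recognise_plate(frame)
    conn.execute("INSERT INTO sightings VALUES (?, ?, ?)",
                 (plate, lane, datetime.now().isoformat()))
    # Illustrative rule: flag private vehicles in the public-transport lane
    # and notify them by message (e.g., via an SMS gateway, not shown here).
    if lane == "public-transport" and not plate.startswith("BUS"):
        print(f"Lane violation: notify owner of {plate}")

log_sighting("BUS1234", "public-transport")
log_sighting("KA01AB99", "public-transport")
for row in conn.execute("SELECT plate, lane, seen_at FROM sightings"):
    print(row)
```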


