Linking data analytics to real-world business issues: The power of the pivot table

2021 ◽  
Vol 57 ◽  
pp. 100744
Author(s):  
Madeline A. Domino ◽  
Daniel Schrag ◽  
Mariah Webinger ◽  
Carmelita Troy
IEEE Access ◽  
2017 ◽  
Vol 5 ◽  
pp. 22760-22774 ◽  
Author(s):  
Jiachen Sun ◽  
Liang Shen ◽  
Guoru Ding ◽  
Rongpeng Li ◽  
Qihui Wu

2020 ◽  
Vol 117 (46) ◽  
pp. 28582-28588
Author(s):  
Thomas Gessey-Jones ◽  
Colm Connaughton ◽  
Robin Dunbar ◽  
Ralph Kenna ◽  
Pádraig MacCarron ◽  
...  

Network science and data analytics are used to quantify static and dynamic structures in George R. R. Martin's epic novels, A Song of Ice and Fire, works noted for their scale and complexity. By tracking the network of character interactions as the story unfolds, it is found that structural properties remain approximately stable and comparable to real-world social networks. Furthermore, the degrees of the most connected characters reflect a cognitive limit on the number of concurrent social connections that humans tend to maintain. We also analyze the distribution of time intervals between significant deaths measured with respect to the in-story timeline. These are consistent with power-law distributions commonly found in interevent times for a range of nonviolent human activities in the real world. We propose that structural features in the narrative that are reflected in our actual social world help readers to follow and to relate to the story, despite its sprawling extent. It is also found that the distribution of intervals between significant deaths in chapters differs from that for the in-story timeline; it is geometric rather than power law. Geometric distributions are memoryless, in that the time since the last death gives no information about the time to the next. This provides measurable support for the widely held view that significant deaths in A Song of Ice and Fire are unpredictable chapter by chapter.
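The memoryless property of the geometric distribution invoked in the abstract can be checked numerically. The sketch below simulates chapter-level intervals between deaths with an illustrative parameter p (not fitted to the novels) and compares the conditional survival probability P(T > s+t | T > s) with the unconditional P(T > t):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate chapter intervals between significant deaths as geometric.
# The parameter p below is illustrative, not estimated from the books.
p = 0.2
intervals = rng.geometric(p, size=100_000)

# Memorylessness: P(T > s + t | T > s) should equal P(T > t).
s, t = 3, 4
p_uncond = np.mean(intervals > t)
p_cond = np.mean(intervals[intervals > s] > s + t)

# Both estimates approximate (1 - p)**t, so they agree closely.
print(round(p_uncond, 3), round(p_cond, 3))
```

A power-law interevent distribution would fail this check: having already waited a long time would change the expected remaining wait, which is why the geometric finding supports chapter-by-chapter unpredictability.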


Author(s):  
Gopala Krishna Behara

This chapter covers the essentials of big data analytics ecosystems, primarily from the business and technology context. It delivers insight into key concepts and terminology that define the essence of big data and the promise it holds to deliver sophisticated business insights. The various characteristics that distinguish big data datasets are articulated. The chapter also describes a conceptual and logical reference architecture for managing the huge volumes of data generated by an enterprise's various data sources, and covers the drivers, opportunities, and benefits of big data analytics implementations applicable to the real world.


Author(s):  
Amanuel Fekade Tadesse ◽  
Nishani Vincent

This advisory case is designed to develop data analytics skills using multiple large real-world datasets based on eXtensible Business Reporting Language (XBRL). This case can also be used to introduce students to XBRL concepts such as extension taxonomies. Students are asked to recommend an XBRL preparation software for a hypothetical company (ViewDrive) that is adopting XBRL to satisfy the financial report filing requirements imposed by the Securities and Exchange Commission (SEC). Students perform data cleansing (extract, transform, load) procedures to prepare large datasets for data analytics. Students are encouraged to think critically, specify assumptions before performing data analytics (using analytic software such as Tableau), and generate visualizations that support their written recommendations. The case is easy to implement, promotes active learning, and has received favorable student and instructor feedback. This case can be used to introduce technology and data analytics topics into the accounting curriculum to help satisfy AACSB’s objectives.
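The extract-transform-load work the case asks students to perform can be illustrated with a minimal pandas sketch. The data layout, column names, and the extension-taxonomy flag below are hypothetical, not taken from the case materials:

```python
import pandas as pd

# Hypothetical raw XBRL facts: an element tag and a reported value.
raw = pd.DataFrame({
    "element": ["us-gaap:Revenues", "vd:CustomRevenue", None],
    "value": ["1,200", "350", "90"],
})

# Transform: drop rows missing an element tag, parse the numeric values,
# and flag extension-taxonomy elements (anything outside the us-gaap namespace).
clean = raw.dropna(subset=["element"]).copy()
clean["value"] = clean["value"].str.replace(",", "").astype(float)
clean["is_extension"] = ~clean["element"].str.startswith("us-gaap:")

print(clean[["element", "value", "is_extension"]])
```

A cleaned table of this shape is what would then be loaded into a visualization tool such as Tableau for the analytics and recommendation steps.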


2022 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Sepanta Sharafuddin ◽  
Ivan Belik

Purpose: The present study provides a comprehensive review of the evolution of data analytics using real-world cases. The purpose is to provide a distinct overview of where the phenomenon came from, where it currently stands and where it is heading.
Design/methodology/approach: Three case studies were selected to represent three different eras of data analytics: Yesterday (1950s–1990s), Today (2000s–2020s) and Tomorrow (2030s–2050s).
Findings: Rapid changes in information technologies are likely moving us towards a more cyber-physical society, in which an increasing number of devices, people and corporations are connected. We can expect the development of a more connected cyber society, more open to data exchange than ever before.
Social implications: The analysis of technological trends through the lens of representative real-world cases helps to clarify where data analytics came from, where it currently stands and where it is heading. The presented case studies accentuate that data analytics is constantly evolving, with no signs of stagnation.
Originality/value: As the field of data analytics is constantly evolving, studying its evolution through representative real-life cases aims to better understand the paradigm shift in data analytics and the resulting technological advances in the IT business.


2018 ◽  
Vol 108 (07-08) ◽  
pp. 543-548
Author(s):  
T. Pschybilla ◽  
D. Baumann ◽  
S. Manz ◽  
W. Wenger ◽  

With progressing digitization in manufacturing, continuously increasing amounts of data are being generated. The field of data analytics plays an important role in this context by advancing the acquisition of knowledge from data and thus supporting decision-making. This paper presents a maturity model for the classification of data analytics use cases in manufacturing and applies it to an example from the Smart Services of Trumpf GmbH + Co. KG.


2021 ◽  
Author(s):  
Ashish Naidu ◽  
Archak Mittal ◽  
Rebecca Kreucher ◽  
Alice Chen Zhang ◽  
Walter Ortmann ◽  
...  

2021 ◽  
Vol 73 (03) ◽  
pp. 34-37
Author(s):  
Judy Feder

The time needed to eliminate complications and accidents accounts for 20–25% of total well construction time, according to a 2020 SPE paper (SPE 200740). The same paper notes that digital twins have proven to be a key enabler in improving sustainability during well construction, shrinking the carbon footprint by reducing overall drilling time and bringing confidence to contactless advisory and collaboration. The paper also points out the potential application of digital twins to activities such as geothermal drilling. Advanced data analytics and machine learning (ML) can potentially reduce engineering hours by up to 70% during field development, according to Boston Consulting Group; increased field automation, remote operations, lower sensor costs, digital twins, machine learning, and improved computational speed are responsible. It is no surprise, then, that digital twins are taking on a greater sense of urgency for operators, service companies, and drilling contractors working to improve asset and enterprise safety, productivity, and performance management. For 2021, digital twins appear among the oil and gas industry's top 10 digital spending priorities. DNV GL said in its Technology Outlook 2030 that this could be the decade when cloud computing and advanced simulation see virtual system testing, virtual/augmented reality, and machine learning progressively merge into full digital twins that combine data analytics with real-time and near-real-time data for installations, subsurface geology, and reservoirs, bringing about significant advancements in upstream asset performance, safety, and profitability. The biggest challenges to these advancements, according to the firm, will be establishing confidence in the data and computational models that a digital twin uses, and user organizations' readiness to work with and evolve alongside the digital twin.
JPT looked at publications from inside and outside the upstream industry and at several recent SPE papers to get a snapshot of where the industry stands regarding uptake of digital twins in well construction and how the technology is affecting operations and outcomes.

Why Digital Twins
Gartner defines a digital twin as a digital representation of a real-world entity or system. "The implementation of a digital twin," Gartner writes, "is an encapsulated software object or model that mirrors a unique physical object, process, organization, person or other abstraction." Data from multiple digital twins can be aggregated for a composite view across several real-world entities and their related processes. In upstream oil and gas, digital twins focus on the well (and, ultimately, the field) and its lifecycle. Unlike a digital simulation, which produces scenarios based on what could happen in the physical world but whose scenarios may not be actionable, a digital twin represents actual events from the physical world, making it possible to visualize and understand real-life scenarios and make better decisions. Digital well construction twins can pertain to single assets or processes and to the reservoir/subsurface or the surface. Ultimately, when process and asset sub-twins are connected, the result is an integrated digital twin of the entire asset or well. Massive sensor technology and the ability to store and handle huge amounts of data from the asset will enable the full digital twin to age throughout the lifecycle of the asset, along with the asset itself (Fig. 1).
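The idea of connecting process and asset sub-twins into one integrated twin, and aggregating them into a composite view, can be sketched as a toy data structure. All class, attribute, and sensor names below are hypothetical illustrations, not any vendor's API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each SubTwin mirrors one physical sub-system,
# and a CompositeTwin aggregates their latest state into one view.
@dataclass
class SubTwin:
    name: str
    readings: dict  # latest sensor values mirrored from the physical asset

@dataclass
class CompositeTwin:
    sub_twins: list = field(default_factory=list)

    def snapshot(self) -> dict:
        # Composite view: merge the current state of every sub-twin.
        return {t.name: t.readings for t in self.sub_twins}

well = CompositeTwin([SubTwin("drillstring", {"rpm": 120}),
                      SubTwin("mud_system", {"flow_lpm": 2500})])
print(well.snapshot())
```

In a real deployment the readings would stream in from field sensors and the snapshot would feed analytics and visualization, which is where the data-confidence challenge DNV GL describes arises.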

