Automating the audience commodity: The unacknowledged ancestry of programmatic advertising

2019 ◽  
Vol 21 (11-12) ◽  
pp. 2366-2385 ◽  
Author(s):  
Lee McGuigan

Programmatic advertising describes techniques for automating and optimizing transactions in the audience marketplace. Facilitating real-time bidding for audience impressions and personalized targeting, programmatic technologies are at the leading edge of digital, data-driven advertising. But almost no research considers programmatic advertising within a general history of information technology in commercial media industries. The computerization of advertising and media buying remains curiously unexamined. Using archival sources, this study situates programmatic advertising within a longer trajectory, focusing on the incorporation of electronic data processing into the spot television business, starting in the 1950s. The article makes three contributions: it illustrates that (1) demands for information, data processing, and rapid communications have long been central to advertising and media buying; (2) automated “ad tech” developed gradually through efforts to coordinate and accelerate transactions; and (3) the use of computers to increase efficiency and approach mathematical optimization reformatted calculative resources for media and marketing decisions.

2013 ◽  
Vol 2 (3) ◽  
pp. 63-78 ◽  
Author(s):  
Ruben Xing ◽  
John Wang ◽  
Qiyang Chen

The authors critically review the history of information technology innovations, from a national competitive advantage perspective. Definitions of key terms are grounded in a thorough literature review, to inform a future meta-analysis. The authors identify the most significant US-based innovations, which in turn are driving future IT development. Propositions are generated for future IT-related studies.


Geophysics ◽  
2020 ◽  
pp. 1-91 ◽  
Author(s):  
Ruijia Wang ◽  
Brian Hornby ◽  
Kristoffer Walker ◽  
Chung Chang ◽  
Gary Kainer ◽  
...  

Real-time open-hole wireline sonic logging presents a nontrivial data-processing task: both the compressional and shear slowness of a borehole rock formation must be evaluated accurately, automatically, and efficiently when human interaction is not possible and signal-processing time is limited to the interval between transmitter firings. To address these real-time challenges, we present self-adaptive, data-driven methods that accurately measure formation compressional and shear slowness from both monopole and dipole waveforms in all types of formations. These new real-time processing techniques take advantage of the fact that advanced wireline sonic logging tools have wide frequency responses and little to no detectable tool-body arrivals. These improvements make it possible to implement a first-motion-detection technique that detects the onset of compressional waves in the monopole array waveforms. The compressional arrival time and its corresponding slowness are then used to project an appropriate slowness-time window, within which the monopole refracted shear wave and its slowness are identified based on the range of physically possible Vp/Vs ratios for earth rock formations. To process borehole dipole flexural waves, we provide a new, data-driven frequency-domain method that evaluates the full flexural-wave dispersion response and its corresponding low-frequency shear-slowness asymptote. Field-data processing results show that our methods provide high-quality compressional slowness (DTC) and shear slowness (DTS) measurements that are unaffected by other borehole modes or dispersion complications in all formation types.
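The slowness-picking workflow described above — detect the compressional first motion, then project a shear search window bounded by the physical range of Vp/Vs — can be sketched as follows. The abstract does not disclose the exact first-motion detector, so this is a minimal illustration using a generic STA/LTA (short-term average / long-term average) energy-ratio onset picker; the window lengths, trigger threshold, and Vp/Vs bounds are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sta_lta_onset(waveform, dt, sta_win=0.0005, lta_win=0.005, threshold=3.0):
    """Return the sample index of the first-motion onset in a monopole
    waveform, using an STA/LTA energy ratio (illustrative stand-in for
    the paper's undisclosed first-motion-detection technique)."""
    energy = waveform ** 2
    n_sta = max(1, int(sta_win / dt))  # short-term window, in samples
    n_lta = max(1, int(lta_win / dt))  # long-term window, in samples
    # running mean of signal energy over the short and long windows
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    ratio = sta / (lta + 1e-12)
    # skip the start-up region where the long window is not yet filled
    above = np.flatnonzero(ratio[n_lta:] > threshold)
    return None if above.size == 0 else int(above[0]) + n_lta

def shear_search_window(t_comp, vp_vs_min=1.4, vp_vs_max=3.5):
    """Project a time window for the refracted shear arrival from the
    compressional arrival time t_comp. Since shear slowness scales with
    Vp/Vs times compressional slowness, the shear arrival is expected
    between t_comp * vp_vs_min and t_comp * vp_vs_max. The Vp/Vs bounds
    here are assumed, generic values for rocks."""
    return t_comp * vp_vs_min, t_comp * vp_vs_max
```

In a real-time implementation the onset index would be converted to an arrival time and slowness via the transmitter-receiver geometry before projecting the shear window; that geometry is tool-specific and omitted here.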


Author(s):  
Peter O’Donovan ◽  
Ken Bruton ◽  
Dominic T.J. O’Sullivan

Integrated, real-time, and open approaches to developing industrial analytics capabilities are needed to support smart manufacturing. However, adopting industrial analytics can be challenging due to its multidisciplinary and cross-departmental (e.g. Operational Technology and Information Technology) nature. These challenges stem from the significant effort needed to coordinate and manage teams and technologies in a connected enterprise. To address these challenges, this research presents a formal industrial analytics methodology that may be used to inform the development of industrial analytics capabilities. The methodology classifies the operational teams that comprise the industrial analytics ecosystem, and presents a technology-agnostic reference architecture to facilitate the industrial analytics lifecycle. Finally, the proposed methodology is demonstrated in a case study, where an industrial analytics platform is used to identify an operational issue in a large-scale Air Handling Unit (AHU).


2003 ◽  
Vol 48 (S11) ◽  
pp. 225-261 ◽  
Author(s):  
Greg Downey

As co-editor of this IRSH supplement “Uncovering Labour in Information Revolutions”, I have to begin this commentary with a confession. Before I entered the world of abstract knowledge production, commodification, and consumption known as academia, I was myself a worker in a world of much more concrete information processing: I was a computer programmer in the US from the mid-1980s to the mid-1990s, a time we might now consider the nostalgic heyday of desktop-office information technology (IT). In the spirit of full disclosure, before I leap into an analysis of how we might more broadly conceptualize information technology together with information labor in different historical contexts, I have decided to work through my own historical narrative a bit. After all, if historical practice teaches us nothing else, it teaches that each of us makes sense of the world through the lens of personal experience, leaving historians (among others) with the daunting task of interpreting, translating, and finding patterns of meaning in those experiences. Thus I offer this candid admission: “I was a teenage information worker!”


2022 ◽  
pp. 8-22

This chapter defines the scope of informing science. The chapter begins by examining whether informing science is a discipline or a field of knowledge. Next, the development of software engineering and informing science is discussed. The chapter then analyzes four key periods in the history of information processing models: (1) machine-centric computing, (2) application-oriented data processing, (3) service-oriented utility environments, and (4) interactive approaches. Next, the concept of informing science is analyzed, and a matrix model of informing science is presented. The chapter concludes by considering some contemporary issues in informing science, including (1) the relationship between ICT as applied in business and ICT as developed as a science in higher education, and (2) the strategies used by universities for educating students in this field.

