How Datafication Drives Legacy Newspapers to Change Their Advertising Model for Business Survival

2019 ◽  
Vol 1 (2) ◽  
pp. 62-74
Author(s):  
Luis Sangil

Technological advances have changed digital media business and funding models. Traditional "legacy" newspapers are reacting to the superior business performance of digital intermediaries such as Google and Facebook, which capture a large share of total digital advertising revenue. This work describes the change of focus at Unidad Editorial, publisher of a set of leading digital newspapers in Spain, including elmundo.es. The company stopped treating other digital newspapers as its main competitors and instead sought to learn from the advertising revenue models of the major digital platforms. The study argues that the management of big data is deeply transforming legacy newspapers' advertising regime: their advertising model increasingly relies on sophisticated audience-segmentation tools and programmatic advertising techniques. It finds that a revenue strategy based on learning from the competitive models of the big platforms is efficient and logical, and that the ability to market the value of individual users in real time is a key factor in the success of this model.
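To illustrate the real-time valuation of individual users that this abstract points to, the following is a minimal, hypothetical sketch of a publisher-side pricing step in programmatic advertising. The UserProfile fields, segment values, and multipliers are illustrative assumptions, not details from the study.

```python
from dataclasses import dataclass

# Hypothetical user profile built from first-party segmentation data.
@dataclass
class UserProfile:
    segments: list[str]      # e.g. ["sports", "finance"]
    recency_days: int        # days since last visit
    is_subscriber: bool

# Illustrative per-segment values (EUR CPM); real systems learn these from data.
SEGMENT_VALUE = {"sports": 1.2, "finance": 2.5, "lifestyle": 0.8}

def price_impression(user: UserProfile, floor_price: float = 0.5) -> float:
    """Return an asking price (CPM) for one ad impression, computed in real time."""
    value = sum(SEGMENT_VALUE.get(s, 0.3) for s in user.segments)
    # Recent, loyal readers are assumed to be worth more to advertisers.
    if user.recency_days <= 7:
        value *= 1.3
    if user.is_subscriber:
        value *= 1.5
    return max(round(value, 2), floor_price)

if __name__ == "__main__":
    reader = UserProfile(segments=["finance", "sports"], recency_days=3, is_subscriber=False)
    print(price_impression(reader))  # 4.81 (CPM) for this illustrative profile
```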

2020 ◽  
Vol 7 (1) ◽  
Author(s):  
Tian J. Ma ◽  
Rudy J. Garcia ◽  
Forest Danford ◽  
Laura Patrizi ◽  
Jennifer Galasso ◽  
...  

The amount of data produced by sensors, social and digital media, and the Internet of Things (IoT) is increasing rapidly each day. Decision makers often need to sift through a sea of Big Data and combine information from a variety of sources to determine a course of action. This can be a difficult and time-consuming task: for each data source encountered, the information can be redundant, conflicting, and/or incomplete, and in near-real-time applications there is insufficient time for a human to interpret all the information from the different sources. In this project, we developed a near-real-time, data-agnostic software architecture that can use several disparate sources to autonomously generate Actionable Intelligence with a human in the loop. We demonstrated our solution on a traffic prediction exemplar problem.
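As an illustration of the multi-source fusion idea with a human in the loop (the abstract does not publish the implementation, so the source names, readings, and thresholds below are assumptions), a minimal sketch might fuse disparate traffic feeds into one estimate and escalate only when the sources disagree:

```python
import statistics

# Hypothetical readings from disparate sources (speed in km/h on one road segment).
# Any source may be missing, redundant, or conflicting.
sources = {
    "loop_sensor": 42.0,
    "gps_probes": 47.5,
    "social_media": None,   # no usable reports this interval
    "camera_feed": 12.0,    # conflicts with the other sources
}

def fuse(readings: dict) -> tuple:
    """Fuse available readings; return (estimate, spread), ignoring missing sources."""
    values = [v for v in readings.values() if v is not None]
    return statistics.median(values), statistics.pstdev(values)

estimate, spread = fuse(sources)
# Human in the loop: only escalate when the sources disagree strongly.
if spread > 10.0:
    print(f"Estimate {estimate:.1f} km/h, spread {spread:.1f}: flag for analyst review")
else:
    print(f"Estimate {estimate:.1f} km/h: publish automatically")
```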


Healthcare ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 234 ◽  
Author(s):  
Hyun Yoo ◽  
Soyoung Han ◽  
Kyungyong Chung

Recently, massive amounts of bioinformation big data are collected by sensor-based IoT devices, and the collected data are classified into different types of health big data using various techniques. Personalized analysis is the basis for judging the risk factors of an individual's cardiovascular disorders in real time. The objective of this paper is to provide a model for personalized heart-condition classification that combines a fast, effective preprocessing technique with a deep neural network in order to process biosensor input data accumulated in real time. The model learns the input data and develops an approximation function, helping users recognize risk situations. For the analysis of the pulse frequency, a fast Fourier transform is applied during preprocessing, and data reduction is performed using the frequency-by-frequency ratios of the extracted power spectrum. A neural network algorithm is then applied to analyze the meaning of the preprocessed data; in particular, a deep neural network is used to analyze and evaluate the linear data. A deep neural network stacks multiple layers and trains the weights of its nodes using gradient descent. The completed model was trained by classifying previously collected ECG signals into normal, control, and noise groups; thereafter, ECG signals input in real time through the trained system were classified into the same three groups. To evaluate the performance of the proposed model, this study used the data-operation cost-reduction ratio and the F-measure. With the fast Fourier transform and the cumulative frequency percentage, the ECG data were reduced to 1/32 of their original size, and the F-measure analysis showed that the deep neural network achieved 83.83% accuracy. Given these results, the modified deep neural network technique can reduce the computational size of big data and is an effective system for reducing operation time.
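A minimal sketch of the pipeline this abstract describes, under stated assumptions (the paper's exact sampling rate, window size, reduction rule, and network shape are not given here, so those values are illustrative): compute an FFT power spectrum of an ECG window, reduce it to 32 frequency-ratio features, and feed them to a small gradient-descent-trained network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

RATE = 250          # assumed ECG sampling rate (Hz)
WINDOW = 1024       # samples per analysis window
N_FEATURES = 32     # reduced feature count (~1/32 of the window, per the abstract)

def fft_ratio_features(signal: np.ndarray) -> np.ndarray:
    """FFT power spectrum, binned and normalized to frequency-by-frequency ratios."""
    power = np.abs(np.fft.rfft(signal, n=WINDOW)) ** 2
    bins = power[: WINDOW // 2].reshape(N_FEATURES, -1).sum(axis=1)
    return bins / bins.sum()          # each feature = share of total spectral power

# Toy training data: synthetic "normal", "control", and "noise" windows.
rng = np.random.default_rng(0)
t = np.arange(WINDOW) / RATE

def make(kind: str) -> np.ndarray:
    base = np.sin(2 * np.pi * 1.2 * t)               # ~72 bpm rhythm
    if kind == "control":
        base = np.sin(2 * np.pi * 2.5 * t)           # faster rhythm
    elif kind == "noise":
        base = rng.normal(size=WINDOW)               # no cardiac structure
    return fft_ratio_features(base + 0.05 * rng.normal(size=WINDOW))

X = np.array([make(k) for k in ["normal", "control", "noise"] for _ in range(50)])
y = np.array([k for k in ["normal", "control", "noise"] for _ in range(50)])

# Small multilayer network trained with stochastic gradient descent.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), solver="sgd",
                    learning_rate_init=0.05, max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([make("noise")]))   # expected: ['noise']
```

The reduction step mirrors the 1:32 figure reported in the abstract: a 1,024-sample window collapses to 32 spectral-ratio features before the network ever sees it, which is where the operation-cost saving comes from.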


2021 ◽  
pp. 100489
Author(s):  
Paul La Plante ◽  
P.K.G. Williams ◽  
M. Kolopanis ◽  
J.S. Dillon ◽  
A.P. Beardsley ◽  
...  

Molecules ◽  
2020 ◽  
Vol 26 (1) ◽  
pp. 20
Author(s):  
Reynaldo Villarreal-González ◽  
Antonio J. Acosta-Hoyos ◽  
Jaime A. Garzon-Ochoa ◽  
Nataly J. Galán-Freyle ◽  
Paola Amar-Sepúlveda ◽  
...  

Real-time reverse transcription (RT) PCR is the gold standard for detecting Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) owing to its sensitivity and specificity, and it must meet the demand created by the rising number of cases. The scarcity of trained molecular biologists available to analyze PCR results makes data verification a challenge. An artificial intelligence (AI) system was designed to ease verification by detecting atypical profiles in PCR curves caused by contamination or artifacts. Four classes of simulated real-time RT-PCR curves were generated: positive, early, no, and abnormal amplification. Machine learning (ML) models were generated and tested using small amounts of data from each class. The best model was used to classify the big data obtained by the Virology Laboratory of Simon Bolivar University from real-time RT-PCR curves for SARS-CoV-2, and the model was retrained and implemented in software that correlated patient data with test and AI diagnoses. The best AI strategy used two binary classification models generated from simulated data: the first model classified each curve as either positive or negative/abnormal, and the curves in the second group were then reevaluated by the second model to distinguish negative from abnormal. The first model required a combination of preprocessing steps before analysis. The early-amplification class was dropped from the models because the number of such cases in the big data was negligible. ML models can thus be created from simulated data using the minimum available information, and changes or variations can be incorporated by generating new simulated data, avoiding the need for large amounts of experimental data covering every possible change. For diagnosing SARS-CoV-2, this type of AI is critical for optimizing PCR tests because it enables rapid diagnosis and reduces false positives. Our method can also be used for other types of molecular analyses.
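A minimal sketch of the two-stage strategy under stated assumptions (the paper's exact curve model, preprocessing, and classifiers are not reproduced here, so the curve shapes and models below are illustrative): simulate fluorescence curves for each class, then cascade two binary classifiers, positive vs. the rest, then negative vs. abnormal.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

CYCLES = np.arange(1, 41)  # 40 PCR cycles
rng = np.random.default_rng(1)

def simulate(kind: str) -> np.ndarray:
    """Simulated fluorescence curve; the shapes are illustrative assumptions."""
    noise = 0.02 * rng.normal(size=CYCLES.size)
    if kind == "positive":   # sigmoidal amplification
        return 1 / (1 + np.exp(-(CYCLES - 25) / 2)) + noise
    if kind == "negative":   # flat baseline
        return 0.05 + noise
    # "abnormal": linear drift, as caused by contamination or artifacts
    return 0.02 * CYCLES + noise

kinds = ["positive", "negative", "abnormal"]
X = np.array([simulate(k) for k in kinds for _ in range(100)])
y = np.array([k for k in kinds for _ in range(100)])

# Stage 1: positive vs. everything else.
stage1 = LogisticRegression(max_iter=1000).fit(X, y == "positive")
# Stage 2: negative vs. abnormal, trained only on the non-positive curves.
mask = y != "positive"
stage2 = LogisticRegression(max_iter=1000).fit(X[mask], y[mask])

def classify(curve: np.ndarray) -> str:
    if stage1.predict([curve])[0]:
        return "positive"
    return stage2.predict([curve])[0]

print(classify(simulate("abnormal")))  # expected: abnormal
```

Because both stages train on simulated curves, new artifact patterns can be handled by extending simulate() rather than by collecting new experimental data, which is the core advantage the abstract claims.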


Author(s):  
Hina Jamil ◽  
Tariq Umer ◽  
Celal Ceken ◽  
Fadi Al-Turjman
Keyword(s):  
Big Data ◽  
