Online-to-offline advertisements as field experiments

Author(s):  
Akira Matsui ◽  
Daisuke Moriwaki

Abstract Online advertisements have become one of today’s most widely used tools for promoting businesses, partly because of their compatibility with A/B testing. A/B testing allows sellers to find effective advertisement strategies, such as ad creatives or segmentations. Although several studies propose techniques to maximize the effect of an advertisement, the offline shopping behavior of customers drawn in by online advertisements remains poorly understood. Herein, we study the difference in offline behavior between customers who received online advertisements and regular customers (i.e., customers who visit the target shop voluntarily), and how long this difference lasts. We analyze the offline behavior of approximately three thousand users through their 23.5 million location records across 31 A/B tests. We first demonstrate the externality that customers who received advertisements traverse larger areas than those who did not, and that this spatial difference persists for several days after the shopping day. We then find a long-run effect of this advertising externality: a certain portion of the customers invited to the offline shops revisit them. Finally, based on this revisit-effect finding, we use a causal machine learning model to propose a marketing strategy that maximizes the revisit ratio. Our results suggest that advertisements draw customers whose behavioral traits differ from those of regular customers. This study demonstrates that a simple analysis may underrate the effects of advertisements on businesses, and that an analysis accounting for externalities can attract potentially valuable customers.
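The targeting idea in the abstract can be illustrated with a minimal two-model uplift sketch: per customer segment, compare the revisit rate of ad recipients against that of a control group and target the segments with the largest lift. The data, segment names, and helper functions below are all hypothetical; this is not the paper's causal machine learning model.

```python
# Minimal two-model uplift sketch (toy data; not the paper's causal ML model).
# For each customer segment, estimate the lift in revisit rate caused by the
# ad, then rank segments so a campaign can target the highest-lift ones.

def revisit_rate(flags):
    """flags: list of booleans, True if the customer revisited the shop."""
    return sum(flags) / len(flags) if flags else 0.0

def uplift_by_segment(records):
    """records: list of (segment, received_ad, revisited) tuples."""
    groups = {}
    for segment, treated, revisited in records:
        groups.setdefault(segment, {"treated": [], "control": []})
        groups[segment]["treated" if treated else "control"].append(revisited)
    return {
        seg: revisit_rate(g["treated"]) - revisit_rate(g["control"])
        for seg, g in groups.items()
    }

# Toy log of (segment, received ad?, revisited?).
log = [
    ("young", True, True), ("young", True, True),
    ("young", False, False), ("young", False, True),
    ("old", True, False), ("old", True, True),
    ("old", False, True), ("old", False, True),
]
lift = uplift_by_segment(log)        # {"young": 0.5, "old": -0.5}
best_segment = max(lift, key=lift.get)
```

On this toy log, advertising lifts revisits for the "young" segment and actually lowers them for "old", so the campaign would target "young" customers only.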

2021 ◽  
Vol 73 (02) ◽  
pp. 37-39
Author(s):  
Trent Jacobs

For all that logging-while-drilling has provided since its widespread adoption in the 1980s, there is one thing on the industry’s wish list that it could never offer: an accurate way to tell the difference between oil and gas. A new technology created by petrotechnical specialists at Equinor, however, has made this possible. The innovation could be thought of as a pseudo-log, but Equinor describes it as a reservoir-fluid-identification system. Using an internally developed machine-learning model, it compares a database of more than 4,000 reservoir samples against the real-time analysis of the mud gas that flows up a well as it is drilled. Out of the technology’s various hardware and software components comes a prediction of the gas/oil ratio (GOR) that the rock being drilled through will have once it is producing. Since this happens in real time, it amounts to an alert system for when drillers are tapping into uneconomic pay zones. “This is something people have tried to do for 30 years: using partial information to predict entire oil and gas properties,” said Tao Yang. He added that “the data acquisition is rather cheap compared with all the downhole tools, and it doesn’t cost you rig time,” highlighting that the mud-gas analyzer critical to the process sits on a rig or platform without interfering with drilling operations. Yang is a reservoir technology specialist at Equinor and one of the authors of a technical paper (SPE 201323) about the new digital technology that was presented at the SPE Annual Technical Conference and Exhibition in October. He and his colleagues spent more than 3 years building the system, which began in the Norwegian oil company’s Houston office as a project to improve pressure/volume/temperature (PVT) analysis in tight-oil wells in North America. It has since found a home in the company’s much larger offshore business unit in Stavanger.
Offshore projects designed around certain oil-production targets can face harsh realities when they end up producing more associated gas than expected. It is the difference between drilling an underperforming well full of headaches and one that will pay out hundreds of millions of dollars over its lifetime. By introducing real-time fluid identification, Equinor is trying to impose a new control on that risk by giving drillers the information they need to pull the bit back and start drilling a sidetrack deeper into the formation, where the odds of finding higher proportions of oil or condensates are better. At the conference, Yang shared details about some of the first field implementations, saying that in most cases the GOR predictions made by the fluid-identification system were confirmed by traditional PVT analysis from the trial wells. Unlike other advancements on this front, he also said, the new approach is the first of its kind to combine such a large database of PVT data with a machine-learning model “that is common to any well.” That means “we do not need to know where this well is located” to make a GOR prediction, said Yang.


2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Ayten Kayi Cangir ◽  
Kaan Orhan ◽  
Yusuf Kahya ◽  
Hilal Özakıncı ◽  
Betül Bahar Kazak ◽  
...  

Abstract Introduction Radiomics methods are used to analyze various medical images, including computed tomography (CT), magnetic resonance, and positron emission tomography, to provide information regarding the diagnosis, patient outcome, tumor phenotype, and gene-protein signatures of various diseases. In the low-risk thymoma group, complete surgical resection is typically sufficient, whereas high-risk thymoma usually requires adjuvant therapy. It is therefore important to distinguish between the two. This study evaluated the CT radiomics features of thymomas to discriminate between low- and high-risk thymoma groups. Materials and methods In total, 83 patients with thymoma were included in this study between 2004 and 2019. We used the Radcloud platform (Huiying Medical Technology Co., Ltd.) to manage the imaging and clinical data and to perform the radiomics statistical analysis. The training and validation datasets were separated at a ratio of 2:8 with a random seed of 502. The histopathological diagnosis was taken from the pathology report. Results Four machine-learning radiomics features were identified that differentiate the low-risk thymoma group from the high-risk thymoma group: Energy, Zone Entropy, Long Run Low Gray Level Emphasis, and Large Dependence Low Gray Level Emphasis. Conclusions The results demonstrated that a machine-learning model with a multilayer perceptron classifier can be used on CT images to predict low- and high-risk thymomas. This combination could be a useful preoperative method for determining the surgical approach for thymoma.
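The reported 2:8 split with a fixed random seed can be sketched as a reproducible shuffle-and-cut; the real pipeline ran on the Radcloud platform, so the function below is only an illustration of the splitting step, with patient IDs standing in for feature vectors.

```python
import random

# Sketch of a reproducible 2:8 train/validation split with a fixed seed
# (hypothetical stand-in for the Radcloud platform's internal splitting).

def split_dataset(patients, train_ratio=0.2, seed=502):
    """Shuffle reproducibly, then cut into train and validation sets."""
    rng = random.Random(seed)      # fixed seed -> identical split every run
    shuffled = patients[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

# 83 patients, as in the study; integer IDs stand in for radiomics vectors.
patients = list(range(83))
train, valid = split_dataset(patients)  # 16 train, 67 validation
```

Because the seed is fixed, rerunning the split yields the same partition, which is what makes a reported random seed (here 502) sufficient to reproduce the experiment's data division.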


2021 ◽  
Author(s):  
Ayten KAYICANGIR ◽  
Kaan ORHAN ◽  
Yusuf KAHYA ◽  
Hilal ÖZAKINCI ◽  
Betül Bahar KAZAK ◽  
...  

Abstract Introduction Radiomics has become a hot topic in the medical imaging field, particularly in cancer imaging. Radiomics methods are used to analyze various medical images, including computed tomography (CT), magnetic resonance, and positron emission tomography, to provide information regarding the diagnosis, patient outcome, tumor phenotype, and gene-protein signatures of various diseases. This study evaluated the CT radiomics features of thymomas to discriminate between low- and high-risk thymoma groups. Materials and Methods In total, 83 patients with thymoma were included in this study between 2004 and 2019. We used the Radcloud platform (Huiying Medical Technology Co., Ltd.) to manage the imaging and clinical data and to perform the radiomics statistical analysis. The training and validation datasets were separated at a ratio of 2:8 with a random seed of 502. The histopathological diagnosis was taken from the pathology report. Results Four machine-learning radiomics features were identified that differentiate the low-risk thymoma group from the high-risk thymoma group: Energy, Zone Entropy, Long Run Low Gray Level Emphasis, and Large Dependence Low Gray Level Emphasis. Conclusions The results demonstrated that a machine-learning model with a multilayer perceptron classifier can be used on CT images to predict low- and high-risk thymomas. This combination could be a useful preoperative method for determining the surgical approach for thymoma.


Author(s):  
Osval Antonio Montesinos López ◽  
Abelardo Montesinos López ◽  
Jose Crossa

Abstract The overfitting phenomenon occurs when a statistical machine learning model learns the noise as well as the signal present in the training data. Underfitting, on the other hand, occurs when only a few predictors are included in the statistical machine learning model, so that it represents the complete structure of the data pattern poorly. This problem also arises when the training data set is too small; an underfitted model does a poor job of fitting the training data and unsatisfactorily predicts new data points. This chapter describes the importance of the trade-off between prediction accuracy and model interpretability, as well as the difference between explanatory and predictive modeling: explanatory modeling minimizes bias, whereas predictive modeling seeks to minimize the combination of bias and estimation variance. We assess the importance and different methods of cross-validation, as well as the importance and strategies of tuning, which are key to the successful use of some statistical machine learning methods. We explain the most important metrics for evaluating prediction performance for continuous, binary, categorical, and count response variables.
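The cross-validation procedure the chapter assesses can be sketched in a few lines: partition the sample indices into k folds, and let each fold serve once as the validation set while the rest train the model. This is a generic illustration, not code from the chapter.

```python
# Generic k-fold cross-validation index generator (illustrative sketch,
# not the chapter's code). Each of the k folds is used once for validation
# while the remaining k-1 folds form the training set.

def k_fold_indices(n, k):
    """Yield (train_indices, valid_indices) pairs for k-fold CV over n samples."""
    folds = [list(range(i, n, k)) for i in range(k)]   # k disjoint folds
    for i in range(k):
        valid = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, valid

# 5-fold CV over a 10-sample dataset: 5 splits, each sample validated once.
splits = list(k_fold_indices(10, 5))
```

Averaging a model's error over the k validation folds estimates out-of-sample performance, which is what exposes overfitting (low training error, high validation error) and guides tuning.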


Horticulturae ◽  
2021 ◽  
Vol 7 (8) ◽  
pp. 207
Author(s):  
Kunpeng Zheng ◽  
Yu Bo ◽  
Yanda Bao ◽  
Xiaolei Zhu ◽  
Jian Wang ◽  
...  

Photorespiration consumes a large share of leaf photosynthesis. However, there are few studies on the response of photorespiration to multiple factors. In this study, a machine learning model was established for the response of the photorespiration rate of cucumber leaves to multiple factors, providing a theoretical basis for studies related to photorespiration. Machine learning models built with different methods were designed and compared. The photorespiration rate was expressed as the difference between the photosynthetic rate at 2% and 21% O2 concentrations. The results show that the XGBoost models had the best fit performance, with an explained variance score of 0.970 for both photosynthetic-rate datasets (measured in air and at 2% O2), mean absolute errors of 0.327 and 0.181, root mean square errors of 1.607 and 1.469, respectively, and coefficients of determination of 0.970 for both. In addition, this study indicates the importance of temperature, humidity, and the physiological status of the leaves as features for predicting photorespiration. The model established in this study performed well, with high accuracy and generalization ability. As an early exploration of photorespiration-rate simulation, it has theoretical significance and application prospects.
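The four metrics the abstract reports (explained variance, MAE, RMSE, R²) have simple definitions, shown below on toy numbers rather than the cucumber data; the helper names are our own.

```python
import math

# Definitions of the evaluation metrics reported in the abstract
# (toy numbers, not the cucumber photosynthesis data).

def mean(xs):
    return sum(xs) / len(xs)

def mae(y, yhat):
    """Mean absolute error."""
    return mean([abs(a - b) for a, b in zip(y, yhat)])

def rmse(y, yhat):
    """Root mean square error."""
    return math.sqrt(mean([(a - b) ** 2 for a, b in zip(y, yhat)]))

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean(y)) ** 2 for a in y)
    return 1 - ss_res / ss_tot

def explained_variance(y, yhat):
    """1 - Var(residuals) / Var(y); equals R^2 when residuals have zero mean."""
    resid = [a - b for a, b in zip(y, yhat)]
    var_resid = mean([(r - mean(resid)) ** 2 for r in resid])
    var_y = mean([(a - mean(y)) ** 2 for a in y])
    return 1 - var_resid / var_y

y = [1.0, 2.0, 3.0, 4.0]
yhat = [1.1, 1.9, 3.2, 3.8]
```

On these toy values MAE is 0.15 and R² is 0.98; note that explained variance and R² coincide only when the residuals are unbiased, which is why papers often report both.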


2018 ◽  
Author(s):  
Steen Lysgaard ◽  
Paul C. Jennings ◽  
Jens Strabo Hummelshøj ◽  
Thomas Bligaard ◽  
Tejs Vegge

A machine learning model is used as a surrogate fitness evaluator in a genetic algorithm (GA) optimization of the atomic distribution of Pt-Au nanoparticles. The machine learning accelerated genetic algorithm (MLaGA) yields a 50-fold reduction of required energy calculations compared to a traditional GA.
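The core loop of a surrogate-assisted GA can be sketched as follows: a cheap learned surrogate ranks every candidate each generation, and only the surrogate's top pick receives an expensive true evaluation. Everything here is a toy stand-in (a noisy copy of a bit-count fitness instead of a trained model and DFT energies), not the MLaGA code from the paper.

```python
import random

# Toy sketch of a surrogate-assisted genetic algorithm (not the MLaGA code).
# The surrogate screens the population cheaply; only one candidate per
# generation gets the "expensive" evaluation, which is the mechanism behind
# the reported reduction in required energy calculations.

def true_fitness(bits):
    """Stand-in for an expensive DFT-level energy evaluation."""
    return sum(bits)

def make_surrogate(rng):
    """Stand-in for a trained ML model: a noisy copy of the true fitness."""
    def surrogate(bits):
        return sum(bits) + rng.uniform(-0.4, 0.4)
    return surrogate

def mutate(bits, rng):
    child = bits[:]
    child[rng.randrange(len(child))] ^= 1  # flip one random bit
    return child

def surrogate_ga(n_bits=10, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    surrogate = make_surrogate(rng)
    population = [[rng.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    best_fit, expensive_calls = -1, 0
    for _ in range(generations):
        population.sort(key=surrogate, reverse=True)  # cheap screening
        fit = true_fitness(population[0])             # one costly call/gen
        expensive_calls += 1
        best_fit = max(best_fit, fit)
        parents = population[: pop_size // 2]         # keep the top half
        population = parents + [mutate(rng.choice(parents), rng)
                                for _ in parents]
    return best_fit, expensive_calls

best_fit, calls = surrogate_ga()
```

A plain GA would evaluate every individual in every generation (600 expensive calls here); the surrogate-screened loop spends only one per generation, illustrating how the paper's MLaGA achieves its 50-fold saving.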


2019 ◽  
Author(s):  
Siddhartha Laghuvarapu ◽  
Yashaswi Pathak ◽  
U. Deva Priyakumar

Recent advances in artificial intelligence, along with the development of large datasets of energies calculated using quantum mechanical (QM)/density functional theory (DFT) methods, have enabled the prediction of accurate molecular energies at reasonably low computational cost. However, the machine learning models reported so far require as input the atomic positions obtained from geometry optimizations using high-level QM/DFT methods in order to predict the energies, and do not allow for geometry optimization. In this paper, a transferable and molecule-size-independent machine learning model (BAND NN), based on a chemically intuitive representation inspired by molecular mechanics force fields, is presented. The model predicts the atomization energies of equilibrium and non-equilibrium structures as a sum of energy contributions from bonds (B), angles (A), nonbonds (N), and dihedrals (D) with remarkable accuracy. The robustness of the proposed model is further validated by calculations that span the conformational, configurational, and reaction space. The transferability of this model to systems larger than those in the dataset is demonstrated by performing calculations on select large molecules. Importantly, the BAND NN model makes it possible to perform geometry optimizations starting from non-equilibrium structures, along with predicting their energies.
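The B/A/N/D decomposition behind the model can be sketched as a simple sum over per-term energy contributions. In BAND NN those contributions come from learned neural-network components; the lookup-table energies and the water-like example below are entirely hypothetical toy values.

```python
# Sketch of the BAND idea: total atomization energy as a sum of contributions
# from bonds (B), angles (A), nonbonds (N), and dihedrals (D). The per-term
# energies here are made-up lookup values, not the paper's learned NN outputs.

def band_energy(bonds, angles, nonbonds, dihedrals,
                e_bond, e_angle, e_nonbond, e_dihedral):
    """E_total = sum of B, A, N, and D term contributions."""
    return (sum(e_bond[b] for b in bonds)
            + sum(e_angle[a] for a in angles)
            + sum(e_nonbond[n] for n in nonbonds)
            + sum(e_dihedral[d] for d in dihedrals))

# Toy water-like molecule: two O-H bonds, one H-O-H angle, one H..H nonbond.
e_total = band_energy(
    bonds=["O-H", "O-H"], angles=["H-O-H"],
    nonbonds=["H..H"], dihedrals=[],
    e_bond={"O-H": -110.0}, e_angle={"H-O-H": -8.0},
    e_nonbond={"H..H": -1.0}, e_dihedral={},
)
# e_total == -229.0 in toy units: 2*(-110.0) + (-8.0) + (-1.0)
```

Because the terms are counted per bond, angle, nonbond, and dihedral rather than per molecule, the same learned contributions transfer to molecules of any size, which is the source of the model's size independence.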


Author(s):  
V. Dumych

The purpose of the research: to improve the technology of growing flax in the Western region of Ukraine by introducing minimum-tillage systems, which will increase the yield of flax straw (tresta) and seeds. Research methods: field, laboratory, visual, and comparative-calculation methods. Research results: Field experiments included the study of three tillage systems (traditional, conservation, and mulching) and determination of their impact on growth, development, and the yields of flax straw and seeds. The traditional tillage system included the following operations: plowing with a reversible plow to a depth of 27 cm, cultivation with simultaneous harrowing, and pre-sowing tillage. The conservation system is based on deep non-moldboard loosening of the soil and provided for chiseling to a depth of 40 cm, disking to a depth of 15 cm, cultivation with simultaneous harrowing, and pre-sowing tillage. Under the mulching system, disking to a depth of 15 cm, cultivation with simultaneous harrowing, and pre-sowing tillage with a combined unit were carried out. The following tillage implements and machines were used: disc harrow BDVP-3.6, reversible plow PON-5/4, chisel PCh-3, cultivator KPSP-4, and pre-sowing tillage unit LK-4. The SZ-3.6 ASTPA grain seeder was used for sowing long flax of the Kamenyar variety. Simultaneously with the sowing of flax seeds, local application of mineral fertilizers (nitroammophoska, 2 c/ha) was carried out. Conservation tillage yielded flax straw at 3.5 t/ha, which is 0.4 t/ha (12.9%) more than traditional tillage and 0.7 t/ha (25%) more than mulching. The seed yield was highest in the conservation-tillage plot, at 0.64 t/ha; the difference from the traditional and mulching tillage reaches 0.06 t/ha (10.3%) and 0.10 t/ha (18.5%), respectively. Conclusions.
Conservation tillage, which is based on non-moldboard loosening to a depth of 40 cm and disking to a depth of 15 cm, has a positive effect on plant growth and development, and on the yield and quality of flax.

