A Measurement Model of Mutual Influence for Information Dissemination

Entropy ◽  
2020 ◽  
Vol 22 (7) ◽  
pp. 725
Author(s):  
Liang Zhang ◽  
Yong Quan ◽  
Bin Zhou ◽  
Yan Jia ◽  
Liqun Gao

The recent development of the mobile Internet and the rise of social media have significantly enriched the way people access information. Accurately modeling the probability of information propagation between users is essential for studying information dissemination in social networks. Because dissemination is inseparable from the interactions between users, the propagation probability can be characterized by those interactions. Moreover, in a real social network, information on different topics spreads in different ways. Building on these observations, we propose a method (TMIVM) to measure the mutual influence between users at the topic level. The method associates two vector parameters with each user, an influence vector and a susceptibility vector, whose dimensions correspond to topic categories. The magnitude of the mutual influence between two users on a given topic is then the product of the corresponding elements of their vectors. Specifically, in this article, we fit historical information cascade data from a social network via survival analysis to learn the influence and susceptibility vectors. Experimental results on a synthetic data set and a real Microblog data set show that the method measures propagation probability and predicts information cascades better than competing methods.
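The core quantity in the abstract, mutual influence as the element-wise product of one user's influence vector and another user's susceptibility vector, can be sketched as follows. The three-topic vectors here are made-up illustrations; in TMIVM they would be learned from cascade data via survival analysis.

```python
import numpy as np

def mutual_influence(influence_u, susceptibility_v):
    """Topic-level influence of user u on user v: the element-wise
    product of u's influence vector and v's susceptibility vector."""
    return np.asarray(influence_u) * np.asarray(susceptibility_v)

# Hypothetical 3-topic example: user u is highly influential on topic 0,
# user v is mainly susceptible on topics 0 and 2.
inf_u = [0.9, 0.1, 0.4]
sus_v = [0.8, 0.2, 0.7]
print(mutual_influence(inf_u, sus_v))  # per-topic propagation strengths
```

Each output element characterizes the propagation strength for one topic category, matching the paper's per-topic measurement.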

2019 ◽  
Vol 63 (11) ◽  
pp. 1689-1703 ◽  
Author(s):  
Xiaoyang Liu ◽  
Daobing He

Abstract This paper proposes a new information dissemination and opinion evolution model, IPNN (Information Propagation Neural Network), based on an artificial neural network. Feedforward, feedback, and dynamic evolution algorithms are designed and implemented. First, following the 'six degrees of separation' theory of information dissemination, a seven-layer neural network framework is constructed, with an input layer, propagation layers, and a termination layer. Second, the information sharing and interaction processes between nodes are described using an event-information forward-propagation algorithm and an opinion-difference backward-propagation algorithm. Finally, external factors in online social network information dissemination are considered: the impact of external behavior patterns is measured via media public opinion guidance and dynamic updates of the network structure. Simulation results show that the proposed model reveals the relationship between the states of micro-level network nodes and the evolution of macro-level public opinion. It accurately depicts the internal information interaction and diffusion mechanisms in online social networks, and it reveals how network public opinion forms and why it can explode. The model provides a new scientific method and research approach for studying the evolution of public opinion in social networks.
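The seven-layer structure motivated by 'six degrees of separation' can be illustrated with a minimal forward pass. The layer sizes, weights, and activation below are stand-ins, not the IPNN algorithms themselves; the sketch only shows event information passing from an input layer through propagation layers to a termination layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical seven-layer stack: input layer, five propagation layers,
# termination layer, echoing the 'six degrees of separation' depth.
layer_sizes = [10, 8, 8, 8, 8, 8, 1]
weights = [rng.normal(0, 0.1, (m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    h = np.asarray(x, dtype=float)
    for W in weights:
        h = np.tanh(h @ W)   # event information propagates layer by layer
    return h                  # termination layer: aggregated opinion state

state = forward(rng.normal(size=10))
print(state.shape)  # one scalar opinion state per input pattern
```

The paper's opinion-difference backward pass and dynamic network updates would operate on top of a stack like this.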


Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) for adjusting its SPFs to the jurisdictions where they will be applied. Critically, the quality of the calibration procedure must be assessed before the calibrated SPFs are used. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years since the publication of the HSM 1st edition, and the literature suggests multiple ways to assess the goodness of fit (GOF) of a calibrated SPF to a data set from a given jurisdiction. This paper uses the calibration results of multiple intersection SPFs against a large Mississippi safety database to examine the relationships among several GOF metrics. The goal is to develop a sensible single index that leverages the joint information in multiple GOF metrics to assess overall calibration quality. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. The paper also compares the index with alternative scoring strategies and verifies the results using synthetic data. The developed index is recommended for comprehensively assessing the quality of calibrated intersection SPFs.
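The idea of collapsing several GOF metrics into a single calibration-quality index can be sketched as a weighted combination. The metric names, values, and weights below are purely illustrative placeholders, not the paper's fitted factor loadings; the sketch only shows the mechanics of a composite index.

```python
# Hypothetical sketch: combine several GOF metrics into one calibration
# quality index via a weighted sum. Weights are illustrative only.
def calibration_index(metrics, weights):
    """Weighted combination of (already normalized) GOF metrics."""
    return sum(weights[k] * metrics[k] for k in weights)

metrics = {"cure_exceed": 0.10,      # share of CURE plot outside 95% band
           "mad": 0.25,              # mean absolute deviation (normalized)
           "mod_r2": 0.80,           # modified R-squared
           "cal_factor_dev": 0.05}   # |calibration factor - 1|
weights = {"cure_exceed": -0.4, "mad": -0.3,
           "mod_r2": 0.2, "cal_factor_dev": -0.1}

print(calibration_index(metrics, weights))
```

The negative weights penalize deviation-type metrics while the modified R-squared contributes positively, mirroring the descending order of importance reported in the abstract.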


Water ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 107
Author(s):  
Elahe Jamalinia ◽  
Faraz S. Tehrani ◽  
Susan C. Steele-Dunne ◽  
Philip J. Vardon

Climatic conditions and vegetation cover influence water flux in a dike, and potentially its stability. A comprehensive numerical simulation is computationally too expensive to be used for the near real-time analysis of a dike network. Therefore, this study investigates a random forest (RF) regressor as a data-driven surrogate for a numerical model to forecast the temporal macro-stability of dikes. To that end, daily inputs and outputs of a ten-year coupled numerical simulation of an idealised dike (2009–2019) are used to create a synthetic data set, comprising features that can be observed from the dike surface, with the calculated factor of safety (FoS) as the target variable. The data before 2018 are split into training and testing sets to build and train the RF. The predicted FoS is strongly correlated with the numerical FoS for data in the test set (before 2018). However, the trained model performs worse on the evaluation set (after 2018) when further surface cracking occurs. This proof of concept shows that a data-driven surrogate can determine dike stability for conditions similar to the training data, and could be used to identify vulnerable locations in a dike network for further examination.
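The surrogate-modeling workflow described above, train an RF regressor on simulation inputs and outputs and predict the factor of safety, can be sketched on synthetic stand-in data. The feature construction and the linear FoS relation below are invented for illustration and bear no relation to the paper's coupled dike simulation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical stand-in for the dike data set: three surface-observable
# features and a synthetic factor of safety (FoS) as the target.
X = rng.normal(size=(500, 3))
fos = 1.5 - 0.3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 0.05, 500)

X_train, X_test, y_train, y_test = train_test_split(X, fos, random_state=0)
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)

print(rf.score(X_test, y_test))  # R^2 on held-out data
```

As in the paper, a surrogate like this only generalizes to conditions resembling its training data; a regime change (such as new surface cracking) degrades the held-out performance.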


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Because computing the Hessian makes regularization/datuming extremely costly, an efficient approximation is introduced, achieved by computing only a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at roughly two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
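The weighted, damped least-squares step can be written generically as solving (AᵀWA + εI)x = AᵀWb, where AᵀWA is the Hessian the abstract identifies as the cost bottleneck. The small random system below is a stand-in, not a seismic extrapolation operator; the paper's efficiency gain comes from keeping only a few diagonals of these operators, which this dense sketch does not do.

```python
import numpy as np

def weighted_damped_lstsq(A, b, w, eps=1e-3):
    """Solve (A^T W A + eps*I) x = A^T W b for diagonal weights w."""
    W = np.diag(w)
    H = A.T @ W @ A + eps * np.eye(A.shape[1])  # the (dense) Hessian
    return np.linalg.solve(H, A.T @ W @ b)

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))      # stand-in forward operator
x_true = rng.normal(size=5)
b = A @ x_true                    # noise-free synthetic data

x = weighted_damped_lstsq(A, b, w=np.ones(20))
print(np.allclose(x, x_true, atol=1e-2))  # True: model is recovered
```

The damping term εI keeps the system well conditioned when the recording geometry (rows of A) is irregular or incomplete.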


2014 ◽  
Vol 7 (3) ◽  
pp. 781-797 ◽  
Author(s):  
P. Paatero ◽  
S. Eberly ◽  
S. G. Brown ◽  
G. A. Norris

Abstract. The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement of factor elements (BS-DISP). The goal of these methods is to capture the uncertainty of PMF analyses due to random errors and rotational ambiguity. It is shown that the three methods complement each other: depending on characteristics of the data set, one method may provide better results than the other two. Results are presented using synthetic data sets, including interpretation of diagnostics, and recommendations are given for parameters to report when documenting uncertainty estimates from EPA PMF or ME-2 applications.
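The classical bootstrap (BS) idea underlying the first of the three uncertainty methods can be illustrated in isolation: resample the data with replacement, refit, and take the spread of refitted estimates as the uncertainty. A sample mean stands in here for a fitted PMF factor element; DISP and BS-DISP, which additionally displace factor elements to probe rotational ambiguity, are not shown.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.normal(loc=5.0, scale=1.0, size=200)  # synthetic observations

# Classical bootstrap: resample rows, recompute the estimate each time.
boots = [rng.choice(data, size=data.size, replace=True).mean()
         for _ in range(1000)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(lo, hi)  # a 95% bootstrap interval for the estimate
```

As the abstract notes, which method works best depends on the data set; BS of this kind captures random-error uncertainty but not rotational ambiguity.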


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. We quantify the uncertainty in these estimates so that information about pressure- and saturation-related changes can be used in reservoir modeling and simulation. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework, where the solution is represented by a probability density function (PDF), providing estimates of the uncertainties as well as of the properties themselves. A stochastic model for estimating pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock-physics relationships are used to set up a prior stochastic model, and PP reflection-coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between the different variables of the model as well as spatial dependencies for each variable. In addition, possible bottlenecks causing large uncertainties in the estimates can be identified through sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.
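The Bayesian structure described above, a rock-physics prior updated by a likelihood built from observed reflection-coefficient differences, can be shown with a one-parameter grid example. All numbers (prior width, observation, noise level) are invented for illustration; the paper's model is multivariate, with correlations and spatial dependencies this sketch omits.

```python
import numpy as np

# Grid over a single hypothetical reservoir-change parameter.
theta = np.linspace(-1, 1, 401)

# Illustrative Gaussian prior N(0, 0.5) from a rock-physics relation.
prior = np.exp(-0.5 * (theta / 0.5) ** 2)

# Illustrative Gaussian likelihood of one observed AVO difference.
obs, noise = 0.3, 0.1
likelihood = np.exp(-0.5 * ((obs - theta) / noise) ** 2)

# Posterior PDF: prior times likelihood, normalized on the grid.
posterior = prior * likelihood
posterior /= posterior.sum()

map_est = theta[np.argmax(posterior)]
print(round(map_est, 2))
```

The posterior's width is the uncertainty estimate the abstract refers to; the tighter the likelihood (lower noise), the more it dominates the prior.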


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-13 ◽  
Author(s):  
Yanni Liu ◽  
Dongsheng Liu ◽  
Yuwei Chen

With the rapid development of the mobile Internet, social networks have become an important platform for users to receive, release, and disseminate information. To extract more valuable information and effectively monitor public opinion, it is necessary to study public opinion, sentiment tendencies, and the evolution of hot events in the social networks of a smart city. In view of social networks' characteristics, such as short texts, rich topics, diverse sentiments, and timeliness, this paper conducts text modeling with word co-occurrence based on the topic model. In addition, sentiment computing and a time factor are incorporated to construct a dynamic topic-sentiment mixture model (TSTS). Four hot events were then randomly selected from Microblog as data sets to evaluate the TSTS model in terms of topic feature extraction, sentiment analysis, and change over time. The results show that the TSTS model outperforms traditional models in topic extraction and sentiment analysis. Meanwhile, by fitting the time curves of hot events, the patterns of comment activity in the social network are obtained.
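The word co-occurrence step used for short-text topic modeling can be sketched by counting word pairs within each post (the biterm idea commonly used for short texts). The toy posts below are invented; the paper builds its TSTS model on top of statistics like these, combined with sentiment and time factors not shown here.

```python
from collections import Counter
from itertools import combinations

# Hypothetical tokenized microblog posts.
posts = [["traffic", "accident", "city"],
         ["city", "smart", "traffic"],
         ["festival", "city"]]

# Count each unordered word pair once per post.
cooc = Counter()
for words in posts:
    for pair in combinations(sorted(set(words)), 2):
        cooc[pair] += 1

print(cooc[("city", "traffic")])  # 2: the pair co-occurs in two posts
```

Pair counts like these give short texts enough statistical signal for topic inference, where per-document word counts alone would be too sparse.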


2016 ◽  
Vol 40 (7) ◽  
pp. 867-881 ◽  
Author(s):  
Dingguo Yu ◽  
Nan Chen ◽  
Xu Ran

Purpose With the development and spread of mobile Internet access, social media represented by Weibo, WeChat, etc. has become the main channel for releasing and sharing information. High-impact users in social networks are key factors stimulating the large-scale propagation of information within those networks. User influence is usually related to the user's attention rate, activity level, and message content. The paper aims to discuss these issues. Design/methodology/approach The authors focus on Sina Weibo users, center on user behavior and interaction information, and formulate a weighted interactive information network model. They then present a novel computational model of Weibo user influence that combines multiple indexes, such as the user's attention rate, activity level, and message content influence. The model incorporates the time dimension and, through the calculation of users' attribute influence and interaction influence, comprehensively measures the influence of Sina Weibo users. Findings Compared with other models, this model reflects the dynamics and timeliness of user influence more accurately. Extensive experiments on a real-world data set validate the performance of the approach and demonstrate the effectiveness of its dynamic, time-aware design. Owing to the similarity in platform architecture and user behavior between Sina Weibo and Twitter, the computational model is also applicable to Twitter. Originality/value This paper presents a novel computational model of Weibo user influence that combines multiple indexes, such as the user's attention rate, activity level, and message content influence.
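Combining multiple indexes (attention rate, activity level, content influence) with a time dimension can be sketched as a weighted sum with exponential decay. The weights and the half-life form are invented for illustration; the paper's model computes attribute and interaction influence over a weighted interaction network, which this scalar sketch does not attempt.

```python
# Hypothetical sketch: a time-decayed weighted combination of
# normalized influence indexes. Weights and half-life are illustrative.
def user_influence(attention, activity, content, days_since_post,
                   w=(0.4, 0.3, 0.3), half_life=7.0):
    decay = 0.5 ** (days_since_post / half_life)  # time dimension
    base = w[0] * attention + w[1] * activity + w[2] * content
    return base * decay

# Recent activity counts for more than older activity.
print(user_influence(0.8, 0.5, 0.6, days_since_post=0.0))
print(user_influence(0.8, 0.5, 0.6, days_since_post=14.0))
```

The decay term is one simple way to realize the "dynamics and timeliness" the abstract emphasizes: the same behavior contributes less to influence as it ages.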

