Joint Probability Model
Recently Published Documents


TOTAL DOCUMENTS: 20 (five years: 10)
H-INDEX: 5 (five years: 1)

2021, Vol. 112, pp. 102710
Author(s): Xiaoyu Bai, Hui Jiang, Xiaoyu Huang, Guangsong Song, Xinyi Ma

2021, pp. 1-11
Author(s): Jiawei Sheng, Qian Li, Yiming Hei, Shu Guo, Bowen Yu, ...

Abstract: This paper presents a winning solution for the CCKS-2020 financial event extraction task, where the goal is to identify event types, triggers, and arguments in sentences across multiple event types. In this task, we focus on resolving two challenging problems (i.e., low resources and element overlapping) by proposing a joint learning framework named SaltyFishes. We first formulate the event extraction task as a joint probability model. By sharing model parameters across different event types, the model learns to adapt to low-resource events based on high-resource events. We further address the element overlapping problem with a Conditional Layer Normalization mechanism, achieving even better extraction accuracy. The overall approach achieves an F1-score of 87.8%, which ranked first in the task.
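The abstract does not include code, but the Conditional Layer Normalization idea it mentions is concrete enough to sketch: a LayerNorm whose scale and shift are generated from a conditioning vector, such as an event-type or trigger embedding. The sketch below assumes a PyTorch setting; the class name, dimensions, and initialization scheme are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): Conditional Layer Normalization,
# where the gain and bias of a standard LayerNorm are generated from a
# conditioning vector (e.g., an event-type embedding).
import torch
import torch.nn as nn

class ConditionalLayerNorm(nn.Module):
    def __init__(self, hidden_size: int, cond_size: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Condition-dependent gain and bias; initialized so the module
        # starts out as a plain LayerNorm (gain ~ 1, bias ~ 0).
        self.to_gain = nn.Linear(cond_size, hidden_size)
        self.to_bias = nn.Linear(cond_size, hidden_size)
        nn.init.zeros_(self.to_gain.weight)
        nn.init.ones_(self.to_gain.bias)
        nn.init.zeros_(self.to_bias.weight)
        nn.init.zeros_(self.to_bias.bias)

    def forward(self, x: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden); cond: (batch, cond_size)
        mean = x.mean(dim=-1, keepdim=True)
        var = x.var(dim=-1, unbiased=False, keepdim=True)
        x_norm = (x - mean) / torch.sqrt(var + self.eps)
        gain = self.to_gain(cond).unsqueeze(1)   # broadcast over sequence
        bias = self.to_bias(cond).unsqueeze(1)
        return gain * x_norm + bias
```

Conditioning the normalization on, say, a trigger representation is one way to let the same encoder produce different argument extractions for overlapping elements without duplicating the whole network.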


Author(s): Ruda Zhang, Patrick Wingo, Rodrigo Duran, Kelly Rose, Jennifer Bauer, ...

Economic assessment in environmental science means measuring and evaluating environmental impacts, adaptation, and vulnerability. Integrated assessment modeling (IAM) is a unifying framework of environmental economics that attempts to combine key elements of physical, ecological, and socioeconomic systems. The first part of this article reviews the literature on the IAM framework: its components, the relations between them, and examples. For such models to inform environmental decision-making, they must quantify the uncertainties associated with their estimates. Uncertainty characterization in integrated assessment varies by component model: uncertainties associated with mechanistic physical models are often assessed with an ensemble of simulations or Monte Carlo sampling, while uncertainties associated with impact models are evaluated by conjecture or econometric analysis. The second part of this article reviews the literature on uncertainty in integrated assessment, by type and by component. Probabilistic learning on manifolds (PLoM) is a machine learning technique that constructs a joint probability model of all relevant variables, which may be concentrated on a low-dimensional geometric structure. Compared to traditional density estimation methods, PLoM is more efficient, especially when the data are generated by a few latent variables. Using the manifold-constrained joint probability model learned by PLoM from a small initial sample, manifold sampling creates new samples for evaluating converged statistics, helping to answer policy-making questions ranging from prediction to response and prevention. As a concrete example, this article reviews IAMs of offshore oil spills, which integrate environmental models, transport models, spill scenarios, and exposure metrics, and demonstrates the use of manifold sampling in assessing the risk of drilling in the Gulf of Mexico.
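To make the manifold-sampling idea more tangible, here is a deliberately simplified illustration, not the full PLoM algorithm (which couples diffusion maps with a projected Ito-type sampler): learn a low-dimensional representation of a small sample, estimate a density in that reduced space, and draw new samples consistent with the learned structure. All function names, dimensions, and parameter values are illustrative assumptions.

```python
# Simplified illustration of manifold-constrained sampling (not PLoM itself):
# project a small sample onto a few components, fit a density there, and
# generate new samples for converged statistics.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KernelDensity

def manifold_style_sampling(X: np.ndarray, n_new: int, n_components: int = 3,
                            bandwidth: float = 0.3, seed: int = 0) -> np.ndarray:
    """X: (n_samples, n_features) small initial sample of all relevant variables."""
    # 1. Standardize, then project onto a few principal directions
    #    (a stand-in for the geometric structure PLoM finds via diffusion maps).
    mu, sigma = X.mean(axis=0), X.std(axis=0) + 1e-12
    Z = (X - mu) / sigma
    pca = PCA(n_components=n_components).fit(Z)
    latent = pca.transform(Z)

    # 2. Nonparametric density estimate in the reduced coordinates.
    kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(latent)

    # 3. Draw new latent points and map them back to the physical variables,
    #    giving extra samples for statistics such as exposure metrics.
    new_latent = kde.sample(n_samples=n_new, random_state=seed)
    return pca.inverse_transform(new_latent) * sigma + mu

# Example: enlarge a 50-point sample of 10 correlated variables to 5000 points.
rng = np.random.default_rng(1)
X_small = rng.normal(size=(50, 3)) @ rng.normal(size=(3, 10))
X_big = manifold_style_sampling(X_small, n_new=5000)
```

The point of the exercise is the same as in PLoM: when the initial sample is small and expensive to generate, constraining new samples to the learned low-dimensional structure avoids the scatter that a naive density estimate in the full space would produce.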


2020, Vol. 10 (8), pp. 2919
Author(s): Jian Li, Mengmin He, Gaofeng Cui, Xiaoming Wang, Weidong Wang, ...

The detection of seismic signals is vital in seismic data processing and analysis. Many algorithms have been proposed for this problem, such as the ratio of short-term and long-term power averages (STA/LTA), the F detector, and the generalized F detector. However, their detection performance is severely degraded by noise. In this paper, we propose a novel seismic signal detection method based on historical waveform features, which improves detection performance and reduces the influence of noise signals. We use the location information of historical events in a specific area, together with waveform feature information, to build a joint probability model. For a new signal from this area, we can then determine whether it is a seismic signal according to the value of the joint probability. The waveform features used to construct the model include the average spectral energy in a specific frequency band, the energy of the components obtained by decomposing the signal with empirical mode decomposition (EMD), and the peak and ratio of the STA/LTA trace. We use a Gaussian process (GP) to model each feature and combine the feature models into a multi-feature joint probability model. The historical event locations are used as the inputs of the GP kernel, and the historical waveform features are used to train the GP hyperparameters. Beamformed data from the seismic array KSRS of the International Monitoring System are used to train and test the model, and the test results show the effectiveness of the proposed method.
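A minimal sketch of the kind of multi-feature joint probability model described above: one Gaussian-process model per waveform feature, with historical event locations as inputs, combined under an assumption of conditional independence between features. The feature names, kernel choice, and decision rule are illustrative assumptions, not the authors' configuration.

```python
# Sketch: per-feature GP models over event location, combined into a joint
# log probability for deciding whether a new detection is a seismic event.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

FEATURES = ["band_energy", "emd_c1_energy", "sta_lta_peak"]  # assumed names

def fit_feature_gps(locations, feature_table):
    """locations: (n_events, 2) lat/lon; feature_table: dict name -> (n_events,)."""
    gps = {}
    for name in FEATURES:
        kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
        gps[name] = GaussianProcessRegressor(
            kernel=kernel, normalize_y=True).fit(locations, feature_table[name])
    return gps

def joint_log_probability(gps, location, observed):
    """Sum of per-feature predictive log densities at the candidate location,
    assuming the features are conditionally independent given the location."""
    logp = 0.0
    for name in FEATURES:
        mean, std = gps[name].predict(location.reshape(1, -1), return_std=True)
        logp += norm.logpdf(observed[name], loc=mean[0], scale=std[0])
    return logp

# A new detection from the area would be accepted as a seismic event if its
# joint log probability exceeds a threshold calibrated on held-out history.
```

The conditional-independence combination is the simplest way to fuse the per-feature GPs; a fully joint treatment would require a multi-output GP or an explicit feature covariance.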


2019, Vol. 148 (1), pp. 241-257
Author(s): Wentao Li, Quan J. Wang, Qingyun Duan

Abstract: Statistical postprocessing methods can be used to correct bias and dispersion error in raw ensemble forecasts from numerical weather prediction models. Existing postprocessing models generally perform well when they are assessed on all events, but their performance for extreme events still needs to be investigated. Commonly used joint probability postprocessing models are based on the correlation between forecasts and observations. Because the correlation may be lower for extreme events as a result of larger forecast uncertainty, the dependence between forecasts and observations can be asymmetric with respect to the magnitude of the precipitation. However, the constant correlation coefficient in the traditional joint probability model lacks the flexibility to model such asymmetric dependence. In this study, we formulated a new postprocessing model with a decreasing correlation coefficient to characterize the asymmetric dependence. We carried out experiments using Global Ensemble Forecast System reforecasts for daily precipitation in the Huai River basin in China. The results show that, although it performs well in terms of continuous ranked probability score and reliability for all events, the traditional joint probability model suffers from overestimation for extreme events, defined here as the largest 2.5% or 5% of raw forecasts. In contrast, the proposed variable-correlation model alleviates this overestimation and achieves better reliability for extreme events than the traditional model. The variable-correlation model can thus be seen as a flexible extension of the traditional joint probability model that improves performance for extreme events.
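A worked sketch of the idea behind a variable-correlation joint probability model: forecasts f and observations o are treated as bivariate normal in a transformed space, and the correlation is allowed to decay as the forecast grows. The exponential decay form of rho(f) below is an illustrative assumption, not the authors' fitted parameterization.

```python
# Sketch: conditional distribution of the observation given the forecast
# under a bivariate-normal joint model with magnitude-dependent correlation.
import numpy as np

def conditional_obs_distribution(f, mu_f, sigma_f, mu_o, sigma_o,
                                 rho0=0.85, decay=0.02):
    """Return mean and std of o | f when the correlation decreases with f."""
    # The traditional (constant-correlation) model would use rho = rho0
    # for all forecast magnitudes.
    rho = rho0 * np.exp(-decay * np.maximum(f - mu_f, 0.0))  # assumed decay form
    mean = mu_o + rho * (sigma_o / sigma_f) * (f - mu_f)
    std = sigma_o * np.sqrt(1.0 - rho**2)
    return mean, std

# For a very large raw forecast, the reduced correlation pulls the
# postprocessed mean back toward climatology and widens the spread,
# which is how a variable-correlation model can reduce the overestimation
# of extremes seen with a constant correlation.
mean_ext, std_ext = conditional_obs_distribution(
    f=60.0, mu_f=10.0, sigma_f=8.0, mu_o=9.0, sigma_o=7.0)
```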


Algorithmica, 2019, Vol. 82 (5), pp. 1410-1433
Author(s): Haris Aziz, Péter Biró, Serge Gaspers, Ronald de Haan, Nicholas Mattei, ...

Abstract: We consider the two-sided stable matching setting in which there may be uncertainty about the agents' preferences due to limited information or communication. We consider three models of uncertainty: (1) the lottery model, in which for each agent there is a probability distribution over linear preferences; (2) the compact indifference model, in which for each agent a weak preference order is specified and each linear order compatible with the weak order is equally likely; and (3) the joint probability model, in which there is a lottery over preference profiles. For each of the models, we study the computational complexity of computing the stability probability of a given matching as well as of finding a matching with the highest probability of being stable. We also examine more restricted problems, such as deciding whether a certainly stable matching exists. We find a rich complexity landscape for these problems, indicating that the form the uncertainty takes is significant.
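Under the joint probability model, the stability probability of a matching is simply the total probability of the preference profiles in the lottery under which the matching admits no blocking pair. The sketch below makes that definition concrete; the data-structure conventions (complete one-to-one matching, complete preference lists) are illustrative assumptions.

```python
# Sketch: stability probability of a matching when uncertainty is a lottery
# over complete preference profiles (the "joint probability model").
from itertools import product

def has_blocking_pair(matching, men_prefs, women_prefs):
    """matching: dict man -> woman (assumed complete);
    men_prefs/women_prefs: dict agent -> preference list, best first."""
    wife = matching
    husband = {w: m for m, w in matching.items()}
    for m, w in product(men_prefs, women_prefs):
        if wife[m] == w:
            continue
        m_prefers = men_prefs[m].index(w) < men_prefs[m].index(wife[m])
        w_prefers = women_prefs[w].index(m) < women_prefs[w].index(husband[w])
        if m_prefers and w_prefers:
            return True
    return False

def stability_probability(matching, lottery):
    """lottery: list of (probability, men_prefs, women_prefs) triples."""
    return sum(p for p, men_prefs, women_prefs in lottery
               if not has_blocking_pair(matching, men_prefs, women_prefs))
```

This brute-force evaluation is linear in the number of profiles in the lottery; the complexity questions studied in the paper concern what happens when the uncertainty is represented more compactly or when one must search for the most probably stable matching.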

