simplex model
Recently Published Documents

TOTAL DOCUMENTS: 38 (five years: 3)
H-INDEX: 10 (five years: 0)
2021 ◽  
Vol 11 (16) ◽  
pp. 7237
Author(s):  
Pengjun Zhai ◽  
Chen Wang ◽  
Yu Fang

Most existing medical event extraction methods adopt a simplex model based on either pattern matching or deep learning, which ignores the distribution characteristics of entities and events in the medical corpus. They also do not categorize the granularity of event elements, which leads to poor generalization. This paper proposes a diagnosis and treatment event extraction method for Chinese that fuses long- and short-level semantic dependencies of the corpus, LSLSD, to solve these problems. LSLSD can effectively capture different levels of semantic information within and between event sentences in the electronic medical record (EMR) corpus. Moreover, the event arguments are divided into short word-level and long sentence-level arguments, and sequence annotation is combined with pattern matching to realize multi-granularity argument recognition and to improve the generalization ability of the model. Finally, this paper constructs a diagnosis and treatment event data set of Chinese EMRs using a proposed semi-automatic corpus labeling method, and extensive experimental results show that LSLSD improves the F1-value of the event extraction task by 7.1% over several strong baselines.
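The idea of combining token-level sequence labels for short arguments with sentence-level pattern matching for long arguments can be sketched in a few lines of Python. This is an illustrative toy, not the paper's LSLSD implementation; the tokens, BIO tags, and regex pattern are invented for the example.

```python
import re

def decode_bio(tokens, tags):
    """Collect short, word-level arguments from BIO sequence labels."""
    spans, cur = [], []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur:
                spans.append(" ".join(cur))
            cur = [tok]
        elif tag.startswith("I-") and cur:
            cur.append(tok)
        else:
            if cur:
                spans.append(" ".join(cur))
            cur = []
    if cur:
        spans.append(" ".join(cur))
    return spans

def match_long_arguments(sentence, patterns):
    """Collect long, sentence-level arguments via regex pattern matching."""
    return [m.group(1) for p in patterns for m in re.finditer(p, sentence)]

# Hypothetical example record (English stand-in for the Chinese EMR text).
tokens = ["patient", "received", "cisplatin", "chemotherapy"]
tags   = ["O", "O", "B-DRUG", "I-DRUG"]
short_args = decode_bio(tokens, tags)                 # word-level arguments
long_args  = match_long_arguments(
    "Diagnosis: stage II gastric carcinoma.",
    [r"Diagnosis:\s*(.+?)\."],                        # sentence-level pattern
)
print(short_args, long_args)
```

In a real system the BIO tags would come from a trained sequence labeler; here they are fixed so the two recognition granularities can be seen side by side.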



Author(s):  
Johana Chylíková

The aim of this chapter is to illustrate the application of the quasi-simplex model (QSM) for reliability estimation in longitudinal data and to employ it to obtain information about the reliability of the European Union Survey on Income and Living Conditions (EU-SILC) data collected between 2012 and 2017. The reliability of two survey questions is analysed: one which asks respondents about the financial situation in their households, and one which requests information about respondents' health. Employing the QSM on the two items resulted in 80 reliability estimates from 17 and 11 European countries, respectively. Results revealed statistically significant differences in reliability between post-communist Central and Eastern European (CEE) countries and the rest of Europe, and similar patterns in the size of reliability estimates were observed for both items. The highest reliability (i.e. reliability over 0.8) was observed in CEE countries such as Bulgaria, Romania, Czechia, Poland, and Hungary. The lowest reliability (i.e. reliability lower than 0.7) was observed for data from Sweden, Slovenia, Norway, Spain, Portugal, Austria, Italy, and the Netherlands. The remarkable variation in longitudinal reliability across culturally and historically different European regions is discussed from both substantive and methodological perspectives.
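Under a quasi-simplex model with three waves and constant reliability, the reliability of the middle wave can be estimated from the three pairwise correlations as r12·r23/r13 (Heise's estimator). A minimal simulation sketch, with invented stability and reliability values rather than EU-SILC figures:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta, rel = 0.8, 0.75              # illustrative stability and true reliability
# True scores follow a stationary AR(1) with var(T) = 1.
t1 = rng.normal(0, 1, n)
t2 = beta * t1 + rng.normal(0, np.sqrt(1 - beta**2), n)
t3 = beta * t2 + rng.normal(0, np.sqrt(1 - beta**2), n)
# Choose var(e) so that var(T) / var(Y) equals the target reliability.
err_sd = np.sqrt((1 - rel) / rel)
y1, y2, y3 = (t + rng.normal(0, err_sd, n) for t in (t1, t2, t3))

r12 = np.corrcoef(y1, y2)[0, 1]
r23 = np.corrcoef(y2, y3)[0, 1]
r13 = np.corrcoef(y1, y3)[0, 1]
rel_hat = r12 * r23 / r13          # Heise's quasi-simplex reliability estimate
print(round(rel_hat, 2))
```

With a large simulated sample the estimate recovers the reliability used to generate the data; applied work would instead fit the QSM in a structural equation modeling framework.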



Author(s):  
Alexandru Cernat ◽  
Peter Lugtig ◽  
Nicole Watson ◽  
S.C. Noah Uhrig

The quasi-simplex model (QSM) makes use of at least three repeated measures of the same variable to estimate reliability. The model has rather strict assumptions, and ignoring them may bias estimates of reliability. While some previous studies have outlined how several of its assumptions can be relaxed, they have not been exhaustive and systematic. Thus, it is unclear what all the assumptions are and how to test and free them in practice. This chapter addresses this situation by presenting the main assumptions of the quasi-simplex model and the ways in which users can relax these with relative ease when more than three waves are available. Additionally, using data from the British Household Panel Survey, we show how this is done in practice and highlight the potential biases found when the violations of the assumptions are ignored. We conclude that relaxing the assumptions should be implemented routinely when more than three waves of data are available.
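One routine way to decide whether freeing an assumption is warranted is a likelihood-ratio test between the constrained and the relaxed model. A hedged sketch with illustrative numbers (not BHPS results): the log-likelihoods and degrees of freedom below are invented.

```python
from scipy.stats import chi2

# Hypothetical fit results: a constrained QSM (e.g. equal error variances
# across waves) versus a relaxed QSM with those variances freed.
ll_constrained = -1523.4   # illustrative log-likelihood, constrained model
ll_relaxed     = -1518.1   # illustrative log-likelihood, relaxed model
df_diff        = 3         # number of parameters freed

# Likelihood-ratio statistic and its chi-square p-value.
lr_stat = 2 * (ll_relaxed - ll_constrained)
p_value = chi2.sf(lr_stat, df_diff)
print(round(lr_stat, 1), round(p_value, 3))
```

A small p-value indicates the constrained model fits significantly worse, i.e. the assumption should be relaxed; with these invented numbers the test rejects the constraint at the 5% level.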



2020 ◽  
Author(s):  
Noémi Katalin Schuurman

Recent developments in the collection and modeling of intensive longitudinal data have made it possible to fit dynamic genetic twin models, in which within-person processes are separated into a genetic and an environmental component. A relatively well-known dynamic twin model is the genetic simplex model, which is fitted to a few repeated measures for a group of twins. A more recently developed model is the iFACE model, which is fitted to many repeated measures for a single pair of twins. In this paper we introduce a missing link between these two models: a multilevel extension that allows for making both population-level and twin-level inferences. We provide a proof-of-principle simulation study for this model and apply it to an experience sampling data set of 148 monozygotic and 88 dizygotic twins. We use the multilevel model to examine the overlap and differences between the dynamic genetic twin models and the classic twin models, as well as their interpretation.



2020 ◽  
Author(s):  
Vukašin Gligorić ◽  
Ana Vilotijević

Although it began only recently, research on pseudo-profound bullshit has accumulated substantial knowledge about people who are receptive to this kind of bullshit. We built on this individual difference paradigm by investigating the relationship of pseudo-profound bullshit receptivity with disintegration, while also trying to replicate the findings on neoliberals' susceptibility to this kind of bullshit (Sterling et al., 2016) in another cultural context. In the present paper, we report the results of a partially published study. We found an association between pseudo-profound bullshit receptivity and disintegration, which can be explained by the O/I simplex model. We did not replicate the relationship with neoliberalism, possibly due to lower power and/or different cultural settings.



Author(s):  
Ken Kobayashi ◽  
Naoki Hamada ◽  
Akiyoshi Sannai ◽  
Akinori Tanaka ◽  
Kenichi Bannai ◽  
...  

Multi-objective optimization problems require simultaneously optimizing two or more objective functions. Many studies have reported that the solution set of an M-objective optimization problem often forms an (M − 1)-dimensional topological simplex (a curved line for M = 2, a curved triangle for M = 3, a curved tetrahedron for M = 4, etc.). Since the dimensionality of the solution set increases as the number of objectives grows, an exponentially large sample size is needed to cover the solution set. To reduce the required sample size, this paper proposes a Bézier simplex model and its fitting algorithm. These techniques can exploit the simplex structure of the solution set and decompose a high-dimensional surface fitting task into a sequence of low-dimensional ones. An approximation theorem of Bézier simplices is proven. Numerical experiments with synthetic and real-world optimization problems demonstrate that the proposed method achieves an accurate approximation of high-dimensional solution sets with small samples. In practice, such an approximation will be conducted in the postoptimization process and enable a better trade-off analysis.
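For the simplest case, M = 2, the solution set is a curved line and a Bézier simplex reduces to an ordinary Bézier curve, whose control points can be fitted by least squares on the Bernstein basis. The sketch below is a toy illustration of that idea, not the paper's fitting algorithm; the sampled "front" is a synthetic arc chosen so a degree-2 curve can represent it exactly.

```python
import numpy as np
from math import comb

def bernstein_design(t, degree):
    """Design matrix of Bernstein basis polynomials for a Bézier curve."""
    return np.stack([comb(degree, k) * t**k * (1 - t)**(degree - k)
                     for k in range(degree + 1)], axis=1)

# Sample points along a curved, Pareto-front-like arc in R^2.
t = np.linspace(0.0, 1.0, 50)
points = np.stack([t**2, (1 - t)**2], axis=1)

# Fit the control points by ordinary least squares on the Bernstein basis.
A = bernstein_design(t, degree=2)
control, *_ = np.linalg.lstsq(A, points, rcond=None)
fitted = A @ control
err = np.max(np.abs(fitted - points))
print(control.round(3), err)
```

Because (t², (1 − t)²) is itself a degree-2 Bézier curve with control points (0, 1), (0, 0), (1, 0), the fit recovers those control points to machine precision; for M > 2 objectives the same least-squares idea extends to the multivariate Bernstein basis over a simplex.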



2016 ◽  
Vol 7 (2) ◽  
pp. 87-96
Author(s):  
Veronica Stefan ◽  
Valentin Radu

Abstract Identifying, analyzing and using the most appropriate and efficient methods for planning business processes is a key to success for every enterprise. Our paper has two objectives: to identify the most suitable methods used for planning, and to apply them to the same case study. We aim to demonstrate the methodology of using several software tools and to compare the obtained results. The application domain is the planning of production processes, but these methods can be extended to service processes. The methodology is represented by the mathematical models and software applications in this field, such as simplex model theory, the Excel Solver tool, the WinQSE application, and linear programming with PL.exe. The results of the research are highly applicable and can be extended to other business fields as well.
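As a sketch of the kind of linear program these tools solve with the simplex method, here is a small production-planning LP in SciPy. The coefficients are illustrative, not taken from the paper's case study, and `linprog` minimizes, so the profit objective is negated.

```python
from scipy.optimize import linprog

# Maximize profit 40*x1 + 30*x2 for two hypothetical products, subject to
# resource limits; negate the objective because linprog minimizes.
c = [-40, -30]
A_ub = [[2, 1],    # machine hours: 2*x1 + 1*x2 <= 100
        [1, 1]]    # labor hours:   1*x1 + 1*x2 <= 80
b_ub = [100, 80]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x, -res.fun)   # optimal production plan and maximum profit
```

The optimum sits at the vertex where both constraints bind (x1 = 20, x2 = 60, profit 2600), which is exactly the corner-point logic the simplex method exploits; Excel Solver set up on the same data would return the same plan.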



Author(s):  
Mike A. Kheirallah ◽  
Badih Jawad ◽  
Liping Liu

Noise reduction is considered a challenging task in the engineering field. The main objective of this study is to provide an optimal new design of a cooling fan with better performance by minimizing the acoustic signature, using the surface dipole acoustic power as the objective function. The process of designing a new cooling fan with optimal performance and a reduced acoustic signature can be fairly lengthy and expensive. With the use of CFD and specific tools like mesh morphing, in conjunction with state-of-the-art optimization techniques such as the simplex model, a given baseline design can be optimized for performance and acoustics. The Mesh Morpher Optimizer (MMO) in ANSYS Fluent is used in conjunction with a simplex model of the broadband acoustic modeling. The broadband model estimated the acoustic power of the surface dipole sources on the surface of the blade without the need for expensive unsteady simulations. Previous work has shown that such a model can provide reliable design guidance. The new, promising approach reduced the surface dipole intensity to around 46% of its original value. Other acoustic sources (quadrupole noise) are ignored due to the relatively low fan speed considered in this study. As this is a first-attempt study, it is believed that additional, more advanced studies may improve the model by changing the mesh and the objective function.
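A common derivative-free "simplex" technique used in shape-optimization loops of this kind is the Nelder-Mead downhill-simplex method; the abstract does not state the exact algorithm, so the sketch below is a generic illustration. The objective and the two blade-shape parameters are hypothetical stand-ins for the CFD-based evaluation of surface dipole acoustic power.

```python
from scipy.optimize import minimize

# Toy stand-in for the acoustic objective: a smooth function of two
# hypothetical blade-shape parameters. In the real workflow this value
# would come from a CFD run with morphed mesh geometry.
def dipole_power(x):
    sweep, pitch = x
    return (sweep - 1.5)**2 + (pitch + 0.5)**2 + 2.0

res = minimize(dipole_power, x0=[0.0, 0.0], method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-6})
print(res.x.round(3), round(res.fun, 3))
```

Nelder-Mead repeatedly reflects, expands, and contracts a simplex of candidate parameter vectors, which makes it attractive when each objective evaluation is an expensive black-box simulation and gradients are unavailable.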




