Research on Clustering Analysis Based on SOM

2013 ◽  
Vol 475-476 ◽  
pp. 968-971
Author(s):  
Hai Xue Liu ◽  
Rui Jun Yang ◽  
Wen Ju Li ◽  
Wan Jun Yu ◽  
Wei Lu

In this paper, we present an improved text clustering algorithm. It retains the self-organizing properties of the SOM network while compensating for the poor clustering results that the K-means algorithm produces when its initial centers are chosen inadequately. First, the data are preprocessed into a vector space model for subsequent processing. Then, we analyze the characteristics of the original clustering algorithm and the SOM algorithm and design an improved SOM clustering algorithm that overcomes the low stability and poor quality of the original approach. The experimental results indicate that the improved algorithm achieves higher accuracy and better stability than the original algorithm.
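The abstract does not give implementation details of how the SOM and K-means stages are combined. The following minimal sketch illustrates one common pipeline under assumed choices: TF-IDF vectors, the third-party minisom library for the SOM, and scikit-learn's KMeans seeded with the SOM codebook instead of random centers; the documents, grid size, and parameter values are placeholders, not the authors' setup.

```python
# Hypothetical SOM-then-K-means text clustering pipeline (not the paper's exact method).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from minisom import MiniSom

docs = ["sample document one", "another short text",
        "clustering with som", "k-means needs good seeds"]

# Step 1: build the vector space model (TF-IDF weights).
X = TfidfVectorizer().fit_transform(docs).toarray()

# Step 2: train a small SOM; its codebook vectors summarize the data topology.
som = MiniSom(2, 2, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=42)
som.random_weights_init(X)
som.train_random(X, num_iteration=500)
codebook = som.get_weights().reshape(-1, X.shape[1])  # 2x2 grid -> 4 candidate centers

# Step 3: use the SOM codebook as the K-means initial centers instead of random seeds.
kmeans = KMeans(n_clusters=codebook.shape[0], init=codebook, n_init=1)
labels = kmeans.fit_predict(X)
print(labels)
```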

2015 ◽  
Vol 14 (7) ◽  
pp. 5877-5886
Author(s):  
Khalid Kahloot ◽  
Mohammad A. Mikki ◽  
Akram A. ElKhatib

Text in articles reflects the expert opinions of a large number of people, including the views of the authors. These views are shaped by cultural and community factors, which makes extracting information from text very difficult. This paper shows how to use the capabilities of a modified graph-based Self-Organizing Map (SOM) to expose text similarities. Text similarities are extracted from an article using Google's PageRank algorithm: the sentences of an input article are represented as a graph model instead of a vector space model. The resulting graph can be displayed in a visual animation of eight well-known graph algorithms, with animation speed control, and is then used as input to the SOM. The SOM clustering algorithm constructs knowledge from the text data, and, according to the similarity measure, an adjustable number of the most similar sentences is arranged in visual form. In addition, this paper presents a wide variety of text searching. We compared our project with well-known clustering and visualization projects in terms of purity, entropy and F-measure; it showed acceptable results and, in most cases, superiority over the other projects.
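As an illustration of the sentence-graph step described above, the sketch below builds a similarity graph over sentences and scores them with PageRank. The library choices (scikit-learn for TF-IDF similarity, networkx for PageRank), the similarity threshold, and the example sentences are assumptions for demonstration, not the authors' implementation.

```python
# Hypothetical sentence-graph + PageRank scoring sketch.
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "SOM clusters similar sentences together.",
    "PageRank ranks nodes in a graph.",
    "Graph models can replace vector space models.",
    "Self-organizing maps preserve topology.",
]

tfidf = TfidfVectorizer().fit_transform(sentences)
sim = cosine_similarity(tfidf)

# Build a weighted graph, keeping only edges above a similarity threshold.
G = nx.Graph()
G.add_nodes_from(range(len(sentences)))
for i in range(len(sentences)):
    for j in range(i + 1, len(sentences)):
        if sim[i, j] > 0.05:
            G.add_edge(i, j, weight=float(sim[i, j]))

# PageRank over the similarity graph scores each sentence's centrality.
scores = nx.pagerank(G, weight="weight")
for idx in sorted(scores, key=scores.get, reverse=True):
    print(f"{scores[idx]:.3f}  {sentences[idx]}")
```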


The importance of some issues of legal and organizational support of local self-government is highlighted. The research applied general and special methods (systemic, comparative legal, formal-logical, etc.), which, considering the limited volume of the article, allowed the stated research problems to be approached in the most comprehensive way. On the basis of an analysis of the norms of modern legislation, law-enforcement practice, and the author's experience of work in local governments, four problems of the modern legal support of local government and its organization are identified that significantly hinder the development of this form of democracy: the unreasonable dynamism of the legislation on local government, the establishment of excessive and unjustified dispositive norms, the poor quality of municipal law-making, and the imperfect system of selecting personnel for the municipal service. The author proposes legislative and law-enforcement ways to solve these problems. This, in turn, should contribute to raising the level of guarantees of local self-government and the implementation of Article 12 of the Constitution of the Russian Federation.


2014 ◽  
Vol 41 (6) ◽  
pp. 400-405 ◽  
Author(s):  
Maria Elizete de Almeida Araújo ◽  
Anderson da Paz Penha ◽  
Fernando Luiz Westphal ◽  
Marcus Tolentino Silva ◽  
Taís Freire Galvão

Objective: To evaluate the effectiveness and safety of correction of pectus excavatum by the Nuss technique based on the available scientific evidence. Methods: We conducted an evidence synthesis following systematic processes of search, selection, extraction and critical appraisal. Outcomes were classified by importance and had their quality assessed by the Grading of Recommendations Assessment, Development and Evaluation (GRADE). Results: The selection process led to the inclusion of only one systematic review, which synthesized the results of nine observational studies comparing the Nuss and Ravitch procedures. The evidence found was rated as of poor and very poor quality. The Nuss procedure increased the incidence of hemothorax (RR = 5.15; 95% CI: 1.07; 24.89), pneumothorax (RR = 5.26; 95% CI: 1.55; 17.92) and the need for reintervention (RR = 4.88; 95% CI: 2.41; 9.88) when compared with the Ravitch procedure. There was no statistical difference between the two procedures in the outcomes of general complications, blood transfusion, hospital stay and time to ambulation. The Nuss operation was faster than the Ravitch (mean difference [MD] = -69.94 minutes; 95% CI: -139.04, -0.83). Conclusion: In the absence of well-designed prospective studies to clarify the evidence, especially in terms of aesthetics and quality of life, surgical indication should be individualized and the choice of technique based on patient preference and the experience of the team.


2013 ◽  
Vol 655-657 ◽  
pp. 1000-1004
Author(s):  
Chen Guang Yan ◽  
Yu Jing Liu ◽  
Jin Hui Fan

SOM (Self-Organizing Map) is an unsupervised clustering method. This paper introduces an improved algorithm based on SOM neural network clustering. It presents SOM's basic theory of data clustering and, to address practical problems encountered in applications, improves the selection of the initial weights and the range of the neighborhood parameter. Finally, simulation results in Matlab show that the improved clustering algorithm increases the correct rate and computational efficiency of data clustering and speeds up convergence.
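To make the two ingredients named above concrete, the NumPy sketch below trains a small 1-D SOM with data-sampled initial weights and an exponentially shrinking neighborhood radius. These are standard SOM update rules (the paper's own improvements are described only qualitatively and were implemented in Matlab), so treat the formulas and parameters here as illustrative assumptions.

```python
# Minimal 1-D SOM sketch: data-sampled initialization and decaying neighborhood.
import numpy as np

def train_som(data, n_nodes=10, epochs=100, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    # Initialization: seed the codebook with samples drawn from the data itself
    # instead of arbitrary random weights.
    weights = data[rng.choice(len(data), n_nodes, replace=False)].copy()
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighborhood radius
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best matching unit
            dist = np.abs(np.arange(n_nodes) - bmu)               # distance on the node grid
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))           # neighborhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

# Three synthetic 2-D blobs as toy input.
data = np.vstack([np.random.randn(50, 2) + c for c in ([0, 0], [5, 5], [0, 5])])
codebook = train_som(data)
labels = np.argmin(np.linalg.norm(data[:, None] - codebook, axis=2), axis=1)
print(np.bincount(labels, minlength=len(codebook)))
```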


2020 ◽  
Vol 77 (9) ◽  
pp. 974-985
Author(s):  
Sanja Uzelac ◽  
Radica Zivkovic-Zaric ◽  
Milan Radovanovic ◽  
Goran Rankovic ◽  
Slobodan Jankovic

Background/Aim. Although the majority of guidelines recommend triazoles (voriconazole, posaconazole, itraconazole and isavuconazole) as the first-line therapeutic option for treatment of invasive aspergillosis, echinocandins (caspofungin, micafungin and anidulafungin) are also used for this purpose. However, a head-to-head comparison of triazoles and echinocandins for invasive aspergillosis has rarely been the target of clinical trials. The aim of this meta-analysis was to compare the efficacy and safety of triazoles and echinocandins when used for treatment of patients with invasive aspergillosis. Methods. This meta-analysis was based on a systematic search of the literature and selection of high-quality evidence according to pre-set inclusion and exclusion criteria. The literature search was made for comparison of treatment with any of the triazoles (isavuconazole, itraconazole, posaconazole or voriconazole) versus any of the echinocandins (caspofungin, anidulafungin or micafungin). The effects of triazoles (itraconazole, posaconazole or voriconazole) and echinocandins (caspofungin, anidulafungin or micafungin) were summarized using RevMan 5.3.5 software, and heterogeneity was assessed by the Cochrane Q test and I² values. Several types of bias were assessed, and publication bias was shown by the funnel plot and Egger's regression. Results. Two clinical trials and three cohort studies were included in this meta-analysis. Mortality in patients with invasive aspergillosis who were treated with triazoles was significantly lower than in patients treated with echinocandins [odds ratio 0.29 (0.13, 0.67)], and the rate of favorable response (overall treatment success) 12 weeks after the therapy onset was higher in patients treated with triazoles [3.05 (1.52, 6.13)]. On the other hand, the incidence of adverse events was higher with triazoles than with echinocandins in patients treated for invasive aspergillosis [3.75 (0.89, 15.76)], although this difference was not statistically significant. Conclusion. Triazoles (voriconazole in the first place) could be considered a more effective and somewhat less safe therapeutic option than echinocandins for invasive aspergillosis. However, due to the poor quality of the studies included in this meta-analysis, a definite conclusion should await the results of additional, well-designed clinical trials.


2021 ◽  
Vol 37 ◽  
pp. 00110
Author(s):  
Vaselina Lyubomirova ◽  
Elena Romanova ◽  
Vasily Romanov ◽  
Ludmila Shadieva

The work is devoted to the study of reproductive aging of African catfish under the conditions of industrial aquaculture. The problem is urgent because industrial aquaculture changes the biology of African catfish so much that it loses the ability to reproduce naturally; offspring can be obtained only with the use of hormonal inducers of gametogenesis. The questions of age-based selection of breeders and of the age composition of the breeding stock in this species are still open. In practice, one has to face poor quality of sexual products in first-spawning or old females and males. The aim of the study was a comparative assessment of the age-related variability of the reproductive properties of female and male African catfish under the conditions of industrial aquaculture. The results of our study showed the presence of age-related dynamics in the quality of sexual products in African catfish. Age-dependent differences were established in the quality and fertilization of eggs, the viability of embryos and larvae, their size, and the quality of offspring. When studying the properties of sexual products in fish of different ages, differences in morphometric and physiological parameters were found for a complex of indicators such as the size and diameter of eggs, sperm concentration, and the number of viable spermatozoa. Evaluation of the influence of parents' age on the viability of offspring in the embryonic and postembryonic periods revealed that this indicator is lowest in first-spawning fish and highest in middle-aged fish. The Russian Foundation for Basic Research supported our study with grant No. 18-416-730005.


2017 ◽  
Vol 10 (2) ◽  
pp. 474-479
Author(s):  
Ankush Saklecha ◽  
Jagdish Raikwal

Clustering is a well-known unsupervised learning method in which a set of elements is separated into homogeneous groups. K-means is one of the most popular partition-based clustering algorithms in the research area, but in the original K-means the quality of the resulting clusters depends largely on the selection of the initial centroids; a poor selection increases the number of iterations and makes the algorithm computationally expensive. Many methods have been proposed to improve the accuracy, performance and efficiency of the K-means clustering algorithm. This paper proposes an enhanced K-means clustering approach combined with a collaborative filtering approach to recommend quality content to users. This research would help users who otherwise have to scroll through pages of results to find important content.
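The abstract does not specify the enhancement or the filtering scheme, so the sketch below only shows the kind of combination it describes: K-means with a smarter centroid initialization (here k-means++ via scikit-learn, as a stand-in for the paper's unspecified enhancement) followed by a simple user-based collaborative filtering step restricted to each user's cluster. The toy rating matrix and all parameters are placeholders.

```python
# Hypothetical "cluster users, then recommend within the cluster" sketch.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

# Toy user-item rating matrix (rows: users, columns: items); 0 = unrated.
ratings = np.array([
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 0],
    [1, 0, 5, 4, 0],
    [0, 1, 4, 5, 3],
])

# Cluster users; init="k-means++" spreads the initial centroids apart.
kmeans = KMeans(n_clusters=2, init="k-means++", n_init=10, random_state=0)
clusters = kmeans.fit_predict(ratings)

def recommend(user, top_n=2):
    """Recommend unrated items using neighbors from the same cluster."""
    peers = np.where(clusters == clusters[user])[0]
    peers = peers[peers != user]
    sims = cosine_similarity(ratings[user:user + 1], ratings[peers])[0]
    # Weight peer ratings by similarity; exclude items the user already rated.
    scores = sims @ ratings[peers]
    scores[ratings[user] > 0] = -np.inf
    return np.argsort(scores)[::-1][:top_n]

print(recommend(0))  # item indices suggested for user 0
```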


Data mining is the process of extracting useful information: it is about finding new information in pre-existing databases and deals with the kinds of patterns that can be mined. This work uses data mining techniques, mainly clustering, to detect and categorize the illness of people affected by dengue. Clustering is the method of finding related groups of data in a dataset and is used to split related data into sub-classes. In this research the clustering method is used to categorize the age groups of people affected by the mosquito-borne viral infection; the K-means, hierarchical clustering and Kohonen SOM algorithms have been implemented in the Tanagra tool. Scientists use data mining algorithms for preventing and defending against diseases such as dengue. This paper applies these clustering algorithms to dengue fever data in the Tanagra tool to determine which of them gives the best results.
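The study itself runs in the Tanagra GUI, so the short Python/scikit-learn sketch below is only an illustrative stand-in for the comparison it describes: grouping patient ages with K-means and hierarchical clustering. The age values are made up for the example and are not the authors' dataset.

```python
# Illustrative K-means vs. hierarchical clustering comparison on toy age data.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

# Toy feature: ages of dengue-affected patients (hypothetical values).
ages = np.array([[4], [7], [9], [23], [27], [31], [55], [61], [68]])

kmeans_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(ages)
hier_labels = AgglomerativeClustering(n_clusters=3).fit_predict(ages)

for name, labels in [("K-means", kmeans_labels), ("Hierarchical", hier_labels)]:
    groups = {int(c): ages[labels == c].ravel().tolist() for c in np.unique(labels)}
    print(name, groups)
```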


2021 ◽  
Vol 21 (1) ◽  
pp. 96-104
Author(s):  
S. S. Gurskiy ◽  
N. S. Mogilevskaya

Introduction. Error control coding techniques are used in all types of digital communication. Many digital communication standards, such as Wi-Fi and 5G, use low-density parity-check (LDPC) codes. These codes are popular because they allow encoders and decoders to be built with low computational complexity. The objective of this work is to increase the error-correcting capability of the well-known bit-flipping (BF) decoder of LDPC codes. For this purpose, a modification of the decoder is built that dynamically controls one of its main parameters, whose choice significantly affects the quality of decoding. Materials and Methods. The well-known bit-flipping decoder of binary LDPC codes is considered. This decoder has several parameters that are not rigidly bound to the code parameters. The dependence of the decoding quality on the selection of the decoder's input parameters was investigated through simulation modeling; the results show that decoding quality is significantly affected by one input parameter of the decoder, the threshold T. A modification of the BF decoder of binary LDPC codes has been developed in which the threshold is set dynamically during the execution of the algorithm, depending on the error rate. A comparative analysis of the error-correcting capability of the decoders is carried out by simulation modeling. Results. A lemma on the maximum value of the decoder threshold T is formulated and proved. Upper bounds on the number of operations are found for the original and modified decoders. A simulation model implementing a digital noise-immune communication channel has been built. In the model, the initial data is encoded with a given LDPC code, corrupted by additive uniformly distributed errors, and then decoded in turn by the bit-flipping algorithm with different values of the threshold T as well as by the modified decoder. Based on the input and output data, the correction capacity of the decoders is estimated. Experiments have shown that the error-correcting capability of the modified decoder in the range of realistic error rates is higher than that of the original decoder, regardless of the selection of its parameters. Discussion and Conclusions. The lemma proved in the paper sets the upper bound on the threshold value in the original decoder, which simplifies its adjustment. The developed modification of the decoder has a better error-correcting capability than the original decoder, although its complexity is slightly higher than that of the original algorithm. It has been noted that the decoding quality of the modified decoder improves with a decrease in the number of cycles in the Tanner graph and an increase in the code length. Keywords: LDPC codes, error-correcting capability, dynamic threshold, binary symmetric channel, experimental research.
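For readers unfamiliar with the baseline algorithm, the sketch below implements the classic bit-flipping decoder the paper starts from: compute the syndrome, count the unsatisfied parity checks each bit participates in, and flip every bit whose count reaches the threshold T. The parity-check matrix is a toy example, and the `dynamic=True` rule shown (flip only the bits with the maximum unsatisfied-check count each iteration) is a common textbook variant used here to illustrate a dynamically chosen threshold; it is not necessarily the authors' dynamic rule.

```python
# Classic bit-flipping (BF) decoder sketch with a fixed or per-iteration threshold.
import numpy as np

def bit_flip_decode(H, y, T=2, max_iter=20, dynamic=False):
    c = y.copy()
    for _ in range(max_iter):
        syndrome = H @ c % 2                     # unsatisfied parity checks
        if not syndrome.any():
            return c, True                       # valid codeword found
        # For each bit, count the unsatisfied checks it participates in.
        counts = H.T @ syndrome
        threshold = counts.max() if dynamic else T
        flip = counts >= threshold
        if not flip.any():
            break                                # no bit reaches the threshold
        c[flip] ^= 1                             # flip the suspicious bits
    return c, False

# Toy parity-check matrix and a received word with a single bit error.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
codeword = np.zeros(7, dtype=int)                # the all-zero word is always a codeword
received = codeword.copy()
received[2] ^= 1                                 # inject one bit error
print(bit_flip_decode(H, received, dynamic=True))
```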


2013 ◽  
Vol 303-306 ◽  
pp. 1026-1029
Author(s):  
Xue Dong Fan

Abstract. In this paper, a clustering algorithm based on data mining technology is applied. Using noise-pattern feature extraction and pattern recognition algorithms, characteristic quantities of three types of modes are extracted and selected, and clustering analysis of data collected under the same working conditions is carried out, ultimately achieving satisfactory recognition.

