Feature Selection for Multi-label Classification Using the Proportional Feature Rough Selector

2021 ◽  
Vol 8 (4) ◽  
pp. 2084-2094
Author(s):  
Vilat Sasax Mandala Putra Paryoko

The Proportional Feature Rough Selector (PFRS) is a feature selection method developed on the basis of Rough Set Theory (RST). The development refines the partitioning of the data set into its key regions, namely the lower approximation, the upper approximation, and the boundary region. PFRS exploits the boundary region to find two smaller regions, the Member Section (MS) and the Non-Member Section (NMS). However, PFRS has so far only been used for feature selection in binary classification of text data. PFRS was also developed without regard to the relationships between features, so it can potentially be improved by taking the correlation between features in the data set into account. This study therefore aims to adapt PFRS so that it can be applied to multi-label classification on mixed data, that is, text and non-text data, and to take inter-feature correlation into account in order to improve multi-label classification performance. Experiments were carried out on two public data sets, 515k Hotel Reviews and Netflix TV Shows, using four classification methods: DT, KNN, NB, and SVM. The study compares PFRS feature selection on multi-label data with the proposed extension of PFRS that takes correlation into account. The results show that applying PFRS improves classification performance. By taking correlation into account, PFRS yields an accuracy improvement of up to 23.76%. The extended PFRS also shows a significant speed increase for all classification methods, so extending PFRS with correlation contributes to improving multi-label classification performance.
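
The abstract describes PFRS only at the level of the Pawlak regions it refines. The following minimal Python sketch (illustrative names only) shows how a lower approximation, upper approximation, and boundary region can be computed from an indiscernibility relation; the further split of the boundary region into the Member Section (MS) and Non-Member Section (NMS) is specific to PFRS and is not reproduced here.

```python
# Minimal sketch of the Pawlak regions PFRS builds on; not the PFRS method itself.
from collections import defaultdict

def indiscernibility_classes(table, attrs):
    """Group object ids by their values on the chosen attributes."""
    classes = defaultdict(set)
    for obj_id, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(obj_id)
    return list(classes.values())

def pawlak_regions(classes, target):
    """Lower approximation, upper approximation, and boundary region of a target set."""
    lower = set().union(*[c for c in classes if c <= target])
    upper = set().union(*[c for c in classes if c & target])
    return lower, upper, upper - lower

# Toy data: three objects described by two features; target set of labels {1, 3}.
data = {1: {"f1": 1, "f2": 0}, 2: {"f1": 1, "f2": 0}, 3: {"f1": 0, "f2": 1}}
low, up, boundary = pawlak_regions(indiscernibility_classes(data, ["f1", "f2"]), {1, 3})
print(low, up, boundary)  # {3} {1, 2, 3} {1, 2}
```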

Author(s):  
Mona Hosny ◽  
Ali Kandil ◽  
Osama A. El-Tantawy ◽  
Sobhy A. El-Sheikh

This chapter concerns the construction of a new rough set structure for ideal ordered topological spaces and ordered topological filters. The proposed approximation space depends on a general binary relation, a partial order relation, and the concepts of ideal and filter. Properties of the lower and upper approximations are extended to ideal ordered topological approximation spaces. The main aim of rough set theory is to reduce the boundary region by increasing the lower approximation and decreasing the upper approximation, so different methods are proposed in this chapter to reduce the boundary region. Comparisons between the current approximations and the previous approximations (El-Shafei et al., 2013) are introduced, and it is shown that the current approximations are more general and reduce the boundary region by increasing the lower approximation and decreasing the upper approximation. The lower and upper approximations satisfy properties analogous to those of Pawlak's spaces (Pawlak, 1982). Moreover, several examples are given to compare the current approach with that of El-Shafei et al. (2013).
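
For reference, these are the classical Pawlak (1982) approximation operators that the chapter's ideal-based operators generalize; the boundary region below is exactly the set the chapter aims to shrink. The chapter's own ideal-based definitions are not reproduced here.

```latex
% Classical Pawlak approximations with respect to an equivalence relation R on U.
\underline{R}(A) = \{\, x \in U : [x]_R \subseteq A \,\}, \qquad
\overline{R}(A)  = \{\, x \in U : [x]_R \cap A \neq \emptyset \,\}, \qquad
BND_R(A) = \overline{R}(A) \setminus \underline{R}(A).
```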


2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Hengrong Ju ◽  
Huili Dou ◽  
Yong Qi ◽  
Hualong Yu ◽  
Dongjun Yu ◽  
...  

The decision-theoretic rough set is a quite useful rough set model obtained by introducing decision costs into probabilistic approximations of the target. However, Yao's decision-theoretic rough set is based on the classical indiscernibility relation, which may be too strict in many applications. To solve this problem, a δ-cut decision-theoretic rough set is proposed, based on the δ-cut quantitative indiscernibility relation. Furthermore, with respect to the criteria of decision-monotonicity and cost minimization, two different algorithms are designed to compute reducts. The comparison between these two algorithms shows the following: (1) with respect to the original data set, the reducts based on the decision-monotonicity criterion generate more rules supported by the lower approximation region and fewer rules supported by the boundary region, so the uncertainty that comes from the boundary region is decreased; (2) with respect to the reducts based on the decision-monotonicity criterion, the reducts based on the cost-minimum criterion obtain the lowest decision costs and the largest approximation qualities. This study suggests potential application areas and new research trends concerning rough set theory.
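
A minimal sketch of the probabilistic three regions used by decision-theoretic rough sets is given below, assuming thresholds α > β derived from decision costs; the δ-cut quantitative indiscernibility relation proposed in the paper is not reproduced, and the threshold values shown are placeholders.

```python
# Three-way classification of equivalence classes by Pr(X | [x]) against thresholds.
def three_way_regions(classes, target, alpha=0.7, beta=0.3):
    positive, boundary, negative = set(), set(), set()
    for c in classes:
        p = len(c & target) / len(c)  # conditional probability Pr(X | [x])
        if p >= alpha:
            positive |= c             # accept: probabilistic positive region
        elif p > beta:
            boundary |= c             # defer: probabilistic boundary region
        else:
            negative |= c             # reject: probabilistic negative region
    return positive, boundary, negative

# Toy usage with two equivalence classes and a target set.
print(three_way_regions([{1, 2, 3}, {4, 5}], {1, 2, 4}))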


Rough set theory is a mathematical method proposed by Pawlak. It has been developed to manage uncertainty in information containing missing and noisy data, and it is an extension of conventional set theory that supports approximations in the decision-making process. The fundamental assumption of rough set theory is that some information is associated with every object of the universe. Rough set theory associates with every set two crisp sets, called the lower and upper approximations. The lower approximation of a set consists of all elements that surely belong to the set, and the upper approximation consists of all elements that possibly belong to the set. The boundary region of the set consists of all elements that cannot be classified uniquely as belonging to the set or to its complement with respect to the available knowledge. Rough sets are applied in several domains, such as pattern recognition, medicine, finance, intelligent agents, telecommunications, control theory, vibration analysis, conflict resolution, image analysis, the process industry, marketing, and banking risk assessment. This paper gives a detailed survey of rough set theory, its properties, and various applications.
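
As a small illustration of "surely belongs" versus "possibly belongs", the following sketch classifies an object from its indiscernibility class [x] relative to a target set X; the names are illustrative and the function is not taken from the paper.

```python
# Classify an object from its indiscernibility class [x] relative to a target set X.
def membership(x_class, target):
    if x_class <= target:
        return "certainly in X"      # x lies in the lower approximation
    if x_class & target:
        return "possibly in X"       # x lies in the boundary region
    return "certainly not in X"      # x lies outside the upper approximation

print(membership({1, 2}, {1, 2, 3}))  # certainly in X
print(membership({1, 4}, {1, 2, 3}))  # possibly in X
```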


Filomat ◽  
2020 ◽  
Vol 34 (2) ◽  
pp. 287-301
Author(s):  
Mona Hosny

The current work concentrates on generating different topologies by using the concept of the ideal. These topologies are used to make more thorough studies of generalized rough set theory. Rough set theory was first proposed by Pawlak in 1982; its core concepts are the lower and upper approximations. The principal goal of rough set theory is to reduce the vagueness of a concept to the uncertainty area at its boundary by increasing the lower approximation and decreasing the upper approximation. Different methods based on ideals are proposed to achieve this aim. These methods are more accurate than the previous ones and are therefore of interest in the rough set context for removing vagueness (uncertainty).
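
The abstract does not spell out the ideal-based operators used. One common form in the ideal-based rough set literature, given here only as an assumed illustration, treats overlaps that lie in the ideal I as negligible, which enlarges the lower approximation and shrinks the upper one:

```latex
% One common ideal-based variant of the approximations (assumed form, not
% necessarily the operators of this paper): overlaps lying in the ideal I
% on U are treated as negligible.
\underline{R}_{I}(A) = \{\, x \in U : [x]_R \cap A^{c} \in I \,\}, \qquad
\overline{R}_{I}(A)  = \{\, x \in U : [x]_R \cap A \notin I \,\}.
```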


2019 ◽  
Vol 2019 ◽  
pp. 1-8 ◽  
Author(s):  
Ferdaous Bouaziz ◽  
Naveed Yaqoob

This paper concerns the study of hyperfilters of ordered LA-semihypergroups and presents some examples in this respect. Furthermore, we study the combination of rough set theory and hyperfilters of an ordered LA-semihypergroup. We define the concept of rough hyperfilters and provide useful examples of it. A rough hyperfilter is a novel extension of a hyperfilter of an ordered LA-semihypergroup. We prove that the lower approximation of a left (resp. right, bi-) hyperfilter of an ordered LA-semihypergroup is again a left (resp. right, bi-) hyperfilter, and we prove the analogous result for the upper approximation.


2014 ◽  
Vol 1 (1) ◽  
pp. 1-14 ◽  
Author(s):  
Sharmistha Bhattacharya Halder

The concept of the rough set was first developed by Pawlak (1982). Since then it has been successfully applied in many research fields, such as pattern recognition, machine learning, knowledge acquisition, economic forecasting, and data mining. However, the original rough set model cannot effectively deal with data sets that contain noisy data, and latent useful knowledge in the boundary region may not be fully captured. To overcome such limitations, some extended rough set models have been put forward that combine rough set theory with other available soft computing technologies. Many researchers have been motivated to investigate probabilistic approaches to rough set theory. The variable precision rough set model (VPRSM) is one of the most important extensions. The Bayesian rough set model (BRSM) (Slezak & Ziarko, 2002), as a hybrid development of rough set theory and Bayesian reasoning, can deal with many practical problems that could not be effectively handled by the original rough set model. Based on the Bayesian decision procedure with minimum risk, Yao (1990) put forward a new model called the decision-theoretic rough set model (DTRSM), which brings new insights into the probabilistic approaches to rough set theory. In this paper, the concept of the decision-theoretic rough set is studied and a new concept of the Bayesian decision-theoretic rough set is introduced. Lastly, a comparative study is carried out between the Bayesian decision-theoretic rough set and the rough set defined by Pawlak (1982).
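
For context, the thresholds of Yao's decision-theoretic model follow from the Bayesian decision procedure with minimum risk. With the usual loss notation (losses λ for accepting (P), deferring (B), or rejecting (N) an object according to whether it does or does not belong to X), they take the standard form:

```latex
% Standard decision-theoretic rough set thresholds derived from the losses
% \lambda_{\cdot P} (object in X) and \lambda_{\cdot N} (object not in X).
\alpha = \frac{\lambda_{PN} - \lambda_{BN}}
              {(\lambda_{PN} - \lambda_{BN}) + (\lambda_{BP} - \lambda_{PP})},
\qquad
\beta  = \frac{\lambda_{BN} - \lambda_{NN}}
              {(\lambda_{BN} - \lambda_{NN}) + (\lambda_{NP} - \lambda_{BP})}.
```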


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Bin Yang ◽  
Ziqiong Lin ◽  
William Zhu

Rough set theory is an efficient and essential tool for dealing with vagueness and granularity in information systems. Covering-based rough set theory has been proposed as a significant generalization of classical rough sets. Matroid theory is a vital structure with high applicability that borrows extensively from linear algebra and graph theory. In this paper, one type of covering-based approximation is studied from the viewpoint of Eulerian matroids. First, we explore the circuits of an Eulerian matroid from the perspective of coverings. Second, this type of covering-based approximation is represented by the circuits of Eulerian matroids. Moreover, the conditions under which the covering-based upper approximation operator is the closure operator of a matroid are presented. Finally, a matroidal structure of covering-based rough sets is constructed. These results show many potential connections between covering-based rough sets and matroids.


2012 ◽  
Vol 548 ◽  
pp. 735-739
Author(s):  
Hong Mei Nie ◽  
Jia Qing Zhou

Rough set theory was proposed by Pawlak as a useful tool for dealing with vagueness and granularity in information systems. Classical rough set theory is based on equivalence relations. Covering rough sets are an improvement of Pawlak rough sets for dealing with complex practical problems that the latter cannot handle. This paper studies covering-based generalized rough sets. In this setting, we investigate which common properties of the classical lower and upper approximation operations hold for the covering-based lower and upper approximation operations, and we examine the relationships among some types of covering rough sets.
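
As an illustration of what a covering-based generalization looks like, the sketch below implements one commonly studied type of covering approximation via neighborhoods; the paper studies several types, and this sketch is not claimed to match its exact definitions.

```python
# Neighborhood-based covering approximations: the neighborhood of x is the
# intersection of all covering blocks containing x. Illustrative only.
from functools import reduce

def neighborhood(x, cover):
    blocks = [k for k in cover if x in k]
    return reduce(lambda a, b: a & b, blocks)

def covering_approximations(universe, cover, target):
    lower = {x for x in universe if neighborhood(x, cover) <= target}
    upper = {x for x in universe if neighborhood(x, cover) & target}
    return lower, upper

U = {1, 2, 3, 4}
C = [{1, 2}, {2, 3}, {3, 4}, {1, 4}]          # a covering of U
print(covering_approximations(U, C, {1, 2}))  # ({1, 2}, {1, 2})
```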


Author(s):  
ZHIMING ZHANG ◽  
JINGFENG TIAN

Intuitionistic fuzzy (IF) rough sets are a generalization of traditional rough sets obtained by combining IF set theory with rough set theory. The existing research on IF rough sets mainly concentrates on the establishment of lower and upper approximation operators using constructive and axiomatic approaches; less effort has been put into attribute reduction of databases based on IF rough sets. This paper systematically studies attribute reduction based on IF rough sets. Firstly, attribute reduction with traditional rough sets and some concepts of IF rough sets are reviewed. Then, we introduce some concepts and theorems of attribute reduction with IF rough sets and fully investigate the structure of attribute reduction. Employing the discernibility matrix approach, an algorithm to find all attribute reductions is also presented. Finally, an example is given to illustrate our idea and method. Altogether, these findings lay a solid theoretical foundation for attribute reduction based on IF rough sets.
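
The discernibility-matrix idea the paper adapts can be sketched for the classical (crisp) case it reviews; the intuitionistic fuzzy version generalizes this construction. Names and data below are illustrative only.

```python
# Classical discernibility-matrix sketch: for each pair of objects with different
# decisions, record the condition attributes that distinguish them; a reduct must
# intersect every recorded entry.
from itertools import combinations

def discernibility_matrix(table, cond_attrs, decision):
    entries = []
    for (i, ri), (j, rj) in combinations(table.items(), 2):
        if ri[decision] != rj[decision]:
            diff = {a for a in cond_attrs if ri[a] != rj[a]}
            if diff:
                entries.append(diff)
    return entries

def preserves_discernibility(attrs, entries):
    """True if the attribute set intersects every discernibility entry."""
    return all(attrs & e for e in entries)

table = {
    1: {"a": 0, "b": 1, "d": "yes"},
    2: {"a": 0, "b": 0, "d": "no"},
    3: {"a": 1, "b": 1, "d": "no"},
}
m = discernibility_matrix(table, ["a", "b"], "d")
print(preserves_discernibility({"a"}, m), preserves_discernibility({"a", "b"}, m))  # False True
```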

