term weights
Recently Published Documents


TOTAL DOCUMENTS: 32 (FIVE YEARS: 4)

H-INDEX: 6 (FIVE YEARS: 0)

2020 ◽ Vol 34 (4) ◽ pp. 445-456
Author(s): Radha Mothukuri, Bobba Basaveswararao, Suneetha Bulla

The world has undergone a dramatic transformation since the advent of information technology: it is hard to find people who are not cyber-connected, and nearly every activity is guided and regulated by connected networks. As the world comes to depend on information technology, a corresponding amount of research is being devoted to monitoring cyber activities around the world. It is now vital to classify and predict cybercrimes in this connected era. The objective of this paper is to classify cybercrime judgment precedents in order to provide knowledgeable and relevant information to cybercrime legal stakeholders. Extracting information from precedents is a crucial research problem because so many judgments have become available in digital form with the remarkable evolution of the internet and big data analytics. It is necessary to classify the precedents and to provide a bird's-eye view of the relevant legal topics. In this study, 2,500 cybercrime-related judgments are considered for evaluation of the Feed Forward Neural - Shuffled Frog Leaping (FNN-SFL) model. To achieve this objective, a feed-forward neural model is built and its term weights are tuned by adapting a bio-inspired tuning model, Shuffled Frog Leaping. Experiments are conducted with the newly proposed FNN-SFL algorithm, and the results and discussion are presented. Conclusions and future scope are given at the end of the paper.
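The abstract does not reproduce the FNN-SFL specifics (network architecture, fitness function, population sizes). As a rough illustration only, here is a minimal Shuffled Frog Leaping sketch that tunes a small weight vector against a stand-in quadratic objective; in the paper the objective would be the feed-forward network's classification error, and every size and constant below is a hypothetical choice:

```python
import random

random.seed(0)

DIM = 8          # number of term weights being tuned (toy size)
FROGS = 20       # population size (hypothetical)
MEMEPLEXES = 4
LOCAL_ITERS = 5
SHUFFLES = 30

def fitness(w):
    # Stand-in convex objective; in FNN-SFL this would be the
    # feed-forward network's classification error on the judgments.
    return sum((wi - 0.5) ** 2 for wi in w)

def new_frog():
    return [random.random() for _ in range(DIM)]

frogs = [new_frog() for _ in range(FROGS)]

for _ in range(SHUFFLES):
    frogs.sort(key=fitness)                    # shuffle step: rank all frogs
    global_best = frogs[0]
    # Partition the ranked frogs into memeplexes round-robin.
    memeplexes = [frogs[i::MEMEPLEXES] for i in range(MEMEPLEXES)]
    for mem in memeplexes:
        for _ in range(LOCAL_ITERS):
            mem.sort(key=fitness)
            best, worst = mem[0], mem[-1]
            # Move the worst frog toward the memeplex best.
            cand = [w + random.random() * (b - w) for b, w in zip(best, worst)]
            if fitness(cand) >= fitness(worst):
                # Otherwise try jumping toward the global best.
                cand = [w + random.random() * (b - w)
                        for b, w in zip(global_best, worst)]
            if fitness(cand) >= fitness(worst):
                cand = new_frog()              # censor: random replacement
            mem[-1] = cand
    frogs = [f for mem in memeplexes for f in mem]

best = min(frogs, key=fitness)
print(f"best fitness: {fitness(best):.6f}")
```

The memeplex partition plus periodic global shuffle is what distinguishes SFL from plain particle-style search: local search happens inside each memeplex, and re-ranking mixes the memes across the whole population.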


Author(s): Edward Kai Fung Dang, Robert Wing Pong Luk, James Allan

Feature engineering is one aspect of knowledge engineering. Besides feature selection, the appropriate assignment of feature values is also crucial to the performance of many software applications, such as text categorization (TC) and speech recognition. In this work, we develop a general method to enhance TC performance through context-dependent feature values (a.k.a. term weights), obtained by a novel adaptation of a context-dependent adjustment procedure previously shown to be effective in information retrieval. The motivation for our approach is that the general method can be used with different text representations and in combination with other TC techniques. Experiments on several test collections show that our context-dependent feature values can improve TC over traditional context-independent unigram feature values, even with a strong classifier such as a Support Vector Machine (SVM), which past work has found hard to improve. We also show that the relative performance improvement of our method over the context-independent baseline is comparable to the levels attained by recent word embedding methods in the literature, while an advantage of our approach is that it does not require the substantial training needed to learn word embedding representations.
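The abstract does not spell out the adjustment procedure itself, so the sketch below is only an illustrative stand-in for the general idea of context-dependent term weights: each occurrence of a term starts from its tf-idf value and receives a bonus derived from the idf of the words in a small window around it, so the same term can be weighted differently in different contexts. The corpus, window size, and `alpha` mixing constant are all hypothetical, not taken from the paper:

```python
import math
from collections import Counter

# Toy corpus (hypothetical, for illustration only).
docs = [
    "machine learning improves text categorization",
    "deep learning models text",
    "support vector machine classifier",
]

N = len(docs)
df = Counter(t for d in docs for t in set(d.split()))
idf = {t: math.log(N / n) for t, n in df.items()}

def context_weights(doc, window=2, alpha=0.5):
    """Per-occurrence weights: base tf-idf plus a context bonus from
    the average idf of the terms inside a small window around it."""
    tokens = doc.split()
    tf = Counter(tokens)
    weights = {}
    for i, t in enumerate(tokens):
        neighbours = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        bonus = (sum(idf[u] for u in neighbours) / len(neighbours)
                 if neighbours else 0.0)
        # Context-independent base weight, adjusted by the local context.
        weights[(i, t)] = tf[t] * idf[t] + alpha * bonus
    return weights

w = context_weights(docs[0])
```

The key property, mirroring the abstract's claim, is that the weight is a function of the occurrence (position `i`), not just of the term, so a unigram surrounded by discriminative neighbours counts for more than the same unigram in a bland context.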


IEEE Access ◽ 2020 ◽ Vol 8 ◽ pp. 200063-200072
Author(s): Songchun Yang, Xiangwen Zheng, Xiangfei Yin, Huajian Mao, Dongsheng Zhao

2014 ◽ Vol 701-702 ◽ pp. 110-113
Author(s): Qi Rui Zhang, He Xian Wang, Jiang Wei Qin

This paper reports a comparative study of feature selection algorithms on a hyperlipemia data set. Three feature selection methods were evaluated: document frequency (DF), information gain (IG), and the χ² statistic (CHI). The classification systems represent each document as a vector and use tfidfie (term frequency, inverse document frequency, and inverse entropy) to compute term weights. To compare the effectiveness of feature selection, we used three classification methods: Naïve Bayes (NB), k-Nearest Neighbor (kNN), and Support Vector Machines (SVM). The experimental results show that IG and CHI significantly outperform DF, and that SVM and NB are more effective than kNN when the macro-averaged F1 measure is used. DF is suitable for large-scale text classification tasks.
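The three selection criteria can be computed directly from a labelled corpus. The sketch below uses the standard textbook definitions of DF, IG, and χ² (not code from the paper) on a tiny hypothetical two-class corpus:

```python
import math
from collections import Counter

# Toy labelled corpus (hypothetical data for illustration only).
docs = [
    ("high cholesterol fatty diet", "hyper"),
    ("cholesterol levels elevated lipids", "hyper"),
    ("fatty liver lipids risk", "hyper"),
    ("healthy diet exercise routine", "normal"),
    ("exercise routine sleep healthy", "normal"),
    ("sleep levels normal routine", "normal"),
]

N = len(docs)
vocab = sorted({t for text, _ in docs for t in text.split()})
labels = sorted({c for _, c in docs})

def df(term):
    """Document frequency: number of documents containing the term."""
    return sum(1 for text, _ in docs if term in text.split())

def chi2(term, cls):
    """Chi-square statistic for a term/class pair (2x2 contingency table)."""
    A = sum(1 for text, c in docs if term in text.split() and c == cls)
    B = sum(1 for text, c in docs if term in text.split() and c != cls)
    C = sum(1 for text, c in docs if term not in text.split() and c == cls)
    D = sum(1 for text, c in docs if term not in text.split() and c != cls)
    denom = (A + C) * (B + D) * (A + B) * (C + D)
    return N * (A * D - C * B) ** 2 / denom if denom else 0.0

def entropy(counts):
    total = sum(counts)
    return -sum((n / total) * math.log2(n / total) for n in counts if n)

def info_gain(term):
    """IG(t) = H(C) - P(t) H(C|t) - P(not t) H(C|not t)."""
    with_t = [c for text, c in docs if term in text.split()]
    without = [c for text, c in docs if term not in text.split()]
    h_c = entropy(list(Counter(c for _, c in docs).values()))
    h_cond = sum(len(s) / N * entropy(list(Counter(s).values()))
                 for s in (with_t, without) if s)
    return h_c - h_cond

# Rank terms by the maximum chi-square over classes.
ranked = sorted(vocab, key=lambda t: max(chi2(t, c) for c in labels),
                reverse=True)
for t in ranked[:5]:
    print(f"{t:12s} DF={df(t)}  IG={info_gain(t):.3f}  "
          f"CHI={max(chi2(t, c) for c in labels):.2f}")
```

This also shows why the study's finding is plausible: DF rewards any frequent term, while IG and χ² reward terms whose presence separates the classes, which matters far more to NB and SVM than raw frequency does.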


2014 ◽ Vol 65 (6) ◽ pp. 1134-1148
Author(s): Edward K. F. Dang, Robert W. P. Luk, James Allan
