Design and Implementation of ASCII based Method for Author Attribution

With the spread of computers and the internet, a growing number of criminals use the web to distribute a wide range of illicit material and misinformation globally and anonymously, making it difficult to trace a perpetrator's identity during a cybercrime investigation. The virtual world offers criminals an anonymous environment for malicious activities such as distributing malware, sending unsolicited messages, spamming, stealing intellectual property, and sending ransom e-mails. Most of these activities involve text in some form, so a tool is needed to identify the author of such material by analyzing the text itself. Text-based authorship attribution techniques identify the most likely author of a text from a set of potential suspects. In this paper, a novel ASCII-based processing approach to authorship attribution in English text is presented. This ASCII-based method yields better results in terms of both accuracy and computational efficiency. The evaluation uses a real-world data set of text posted on social media.
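The abstract does not spell out the feature design, so the following is only a minimal sketch of one plausible ASCII-based attribution scheme: each text is reduced to a normalized histogram of ASCII character codes, candidate-author profiles are averaged over known writings, and an unknown text is assigned to the most similar profile by cosine similarity. All names and example snippets are illustrative.

```python
import numpy as np

def ascii_histogram(text: str) -> np.ndarray:
    """Normalized frequency of ASCII codes 0-127 in a text."""
    counts = np.zeros(128)
    for ch in text:
        code = ord(ch)
        if code < 128:               # ignore non-ASCII characters
            counts[code] += 1
    total = counts.sum()
    return counts / total if total else counts

def author_profile(texts: list[str]) -> np.ndarray:
    """Average histogram over an author's known writings."""
    return np.mean([ascii_histogram(t) for t in texts], axis=0)

def attribute(unknown: str, profiles: dict[str, np.ndarray]) -> str:
    """Return the candidate author whose profile is most similar (cosine)."""
    h = ascii_histogram(unknown)
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0
    return max(profiles, key=lambda author: cosine(h, profiles[author]))

# Example usage with made-up snippets:
profiles = {
    "alice": author_profile(["I'll be there soon!!", "omg that's so funny!!"]),
    "bob": author_profile(["Please find the report attached.", "Regards, B."]),
}
print(attribute("see u soon!!", profiles))
```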

Electronics ◽  
2020 ◽  
Vol 9 (9) ◽  
pp. 1525
Author(s):  
Chathurangi Edussuriya ◽  
Kasun Vithanage ◽  
Namila Bandara ◽  
Janaka Alawatugoda ◽  
Manjula Sandirigama ◽  
...  

The Internet of Things (IoT) is the novel paradigm of connectivity and the driving force behind state-of-the-art applications and services. However, the exponential growth in the number of IoT devices and services, their distributed nature, and the scarcity of resources have increased the number of security and privacy concerns, ranging from the risk of unauthorized data alteration to the potential discrimination enabled by data analytics over sensitive information. Thus, a blockchain-based IoT platform is introduced to address these issues. Built upon a tamper-proof architecture, the proposed access management mechanisms ensure the authenticity and integrity of data. Moreover, a novel approach called the Block Analytics Tool (BAT), integrated with the platform, is proposed to analyze and make predictions on data stored on the blockchain. BAT enables data-analysis applications to be developed on the data stored in the platform in an optimized manner, acting as an interface to off-chain processing. A pharmaceutical supply chain is used as the use-case scenario to show the functionality of the proposed platform. Furthermore, a model to forecast the demand for pharmaceutical drugs is investigated using a real-world data set to demonstrate the functionality of BAT. Finally, the performance of BAT integrated with the platform is evaluated.
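As a rough illustration of the kind of off-chain analysis BAT is said to expose, the sketch below aggregates hypothetical dispensing records (the paper's actual record schema and forecasting model are not given here) into a monthly demand series for one drug and produces a simple exponential-smoothing forecast.

```python
import pandas as pd

# Hypothetical records as they might be read from the ledger via the
# platform's query interface: (timestamp, drug_id, quantity)
records = [
    ("2020-01-05", "amoxicillin", 120),
    ("2020-01-19", "amoxicillin", 90),
    ("2020-02-02", "amoxicillin", 150),
    ("2020-02-20", "amoxicillin", 110),
    ("2020-03-08", "amoxicillin", 170),
]

df = pd.DataFrame(records, columns=["timestamp", "drug_id", "quantity"])
df["timestamp"] = pd.to_datetime(df["timestamp"])

# Aggregate into monthly demand per drug
monthly = (df.set_index("timestamp")
             .groupby("drug_id")["quantity"]
             .resample("M").sum())

def ses_forecast(series, alpha=0.5):
    """One-step-ahead simple exponential smoothing forecast."""
    level = series.iloc[0]
    for y in series.iloc[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

print(ses_forecast(monthly.loc["amoxicillin"]))
```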


2019 ◽  
Vol 16 (1) ◽  
pp. 22-36
Author(s):  
Muchou Wang ◽  
Yiming Li ◽  
Sheng Luo ◽  
Zhuxin Hu

With the development of service-oriented architecture, the number of services is expanding rapidly. Important services usually have high quality, and they can be recommended to users even when the users do not supply any keyword. However, how to discover these important services remains an open problem. In this article, the authors propose a novel approach to discovering important services based on service networks. First, their approach uses service networks to abstract services and the relations between them. Second, the authors employ the weighted k-core decomposition approach from the field of complex networks to partition the service network into a layered structure and calculate the weighted coreness value of each service node. Finally, services are ranked by their weighted coreness values in descending order, and the top-ranked services are the important ones the authors' approach recommends. Experimental results on a real-world data set crawled from ProgrammableWeb validate the effectiveness of their approach.
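A minimal sketch of this ranking pipeline on a toy service network follows; note that taking the weighted degree as the sum of incident edge weights is only one of several weighted k-core variants in the literature and may differ from the authors' exact definition.

```python
import networkx as nx

def weighted_core_numbers(G: nx.Graph) -> dict:
    """Assign each node a weighted coreness by iteratively peeling the node
    with the smallest remaining weighted degree."""
    H = G.copy()
    core = {}
    current_core = 0.0
    while H.number_of_nodes() > 0:
        node, wdeg = min(H.degree(weight="weight"), key=lambda nd: nd[1])
        current_core = max(current_core, wdeg)
        core[node] = current_core
        H.remove_node(node)
    return core

# Toy service network: nodes are services, weighted edges are the
# relations (e.g., composition/invocation strength) between them.
G = nx.Graph()
G.add_weighted_edges_from([
    ("maps", "geocoding", 3.0),
    ("maps", "weather", 1.0),
    ("geocoding", "routing", 2.0),
    ("routing", "maps", 2.5),
    ("weather", "alerts", 0.5),
])

coreness = weighted_core_numbers(G)
ranked = sorted(coreness, key=coreness.get, reverse=True)
print(ranked)   # top-ranked services are recommended as "important"
```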


2011 ◽  
Vol 38 (1) ◽  
pp. 3-14 ◽  
Author(s):  
Xiaolin Zheng ◽  
Zhongkai Hu ◽  
Aiwu Xu ◽  
DeRen Chen ◽  
Kuang Liu ◽  
...  

Obtaining answers from community-based question answering (CQA) services is typically a lengthy process. In this light, the authors propose an algorithm that recommends answer providers. A two-step framework is developed, in which a query likelihood language model is constructed to determine the interests of answer providers. The model is then used to identify answer providers who are interested in answering questions related to the identified topics. At the same time, a maximum entropy model is designed to estimate answer quality. Finally, an answer-quality-based algorithm is developed to model the expertise of answer providers so that providers of different capacities can be differentiated. The proposed scheme leverages both answer provider interest and expertise, allowing for more effective differentiation. Experiments on real-world data from Baidu Knows, a well-known Chinese CQA service similar to Yahoo! Answers, reveal significant improvements over the baseline methods and demonstrate the effectiveness of the novel approach.
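The following is a minimal sketch of the interest-matching step only (the maximum entropy answer-quality model and the expertise modeling are omitted): each provider's past answers form a Dirichlet-smoothed unigram language model, and providers are ranked by the likelihood of a new question under their models. Provider names and texts are made up.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def build_models(histories):
    """histories: {provider: [past answer strings]} -> per-provider term counts."""
    models = {p: Counter(tokenize(" ".join(docs))) for p, docs in histories.items()}
    collection = Counter()
    for c in models.values():
        collection.update(c)
    return models, collection

def query_log_likelihood(question, model, collection, mu=2000.0):
    """log P(question | provider) under a Dirichlet-smoothed unigram model."""
    doc_len = sum(model.values())
    coll_len = sum(collection.values())
    score = 0.0
    for term in tokenize(question):
        p_coll = collection[term] / coll_len if coll_len else 1e-12
        p = (model[term] + mu * p_coll) / (doc_len + mu)
        score += math.log(max(p, 1e-12))
    return score

histories = {
    "user_a": ["how to train a neural network in python", "gradient descent tips"],
    "user_b": ["best noodle recipes", "how long to boil dumplings"],
}
models, collection = build_models(histories)
question = "how do I tune gradient descent"
ranking = sorted(models,
                 key=lambda p: query_log_likelihood(question, models[p], collection),
                 reverse=True)
print(ranking)   # providers most interested in the question's topic come first
```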


Author(s):  
Bruno Ohana ◽  
Brendan Tierney

Opinion mining is an emerging field of research concerned with applying computational methods to the treatment of subjectivity in text, with applications in fields such as recommendation systems, contextual advertising, and business intelligence. In this chapter, the authors survey the area of opinion mining and discuss SentiWordNet, a lexicon of sentiment information for terms derived from WordNet. They also present the results of applying this lexicon to sentiment classification of film reviews, along with a novel approach that leverages opinion lexicons to build a data set of features used as input to a supervised learning classifier. The results are in line with other experiments based on manually built opinion lexicons, with further improvements obtained by using the novel approach, and indicate that lexicons built using semi-supervised methods such as SentiWordNet can be an important resource in sentiment classification tasks. Considerations on future improvements are also presented, based on a detailed analysis of the classification results.
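One possible way to turn SentiWordNet scores into document features for a supervised classifier is sketched below; the chapter's exact feature set may differ. The example assumes NLTK's sentiwordnet and wordnet corpora have been downloaded.

```python
import nltk
# nltk.download("sentiwordnet"); nltk.download("wordnet")  # required once
from nltk.corpus import sentiwordnet as swn
from sklearn.linear_model import LogisticRegression

def sentiwordnet_features(text):
    """Aggregate positive/negative SentiWordNet scores of a document's terms."""
    pos = neg = scored = 0.0
    for token in text.lower().split():
        synsets = list(swn.senti_synsets(token))
        if synsets:                          # average over the term's senses
            pos += sum(s.pos_score() for s in synsets) / len(synsets)
            neg += sum(s.neg_score() for s in synsets) / len(synsets)
            scored += 1
    return [pos, neg, pos - neg, scored]

# Tiny illustrative training set (1 = positive review, 0 = negative)
train_texts = ["a wonderful, moving film", "dull plot and terrible acting"]
train_labels = [1, 0]

X = [sentiwordnet_features(t) for t in train_texts]
clf = LogisticRegression().fit(X, train_labels)
print(clf.predict([sentiwordnet_features("a terrible bore")]))
```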


Sci ◽  
2020 ◽  
Vol 2 (2) ◽  
pp. 27
Author(s):  
Jessica Cooper ◽  
Ognjen Arandjelović

In recent years, a range of problems under the broad umbrella of computer vision-based analysis of ancient coins have been attracting an increasing amount of attention. Notwithstanding this research effort, the results achieved by the state of the art in the published literature remain poor and fall far short of what any practical application requires. In the present paper we present a series of contributions which we believe will benefit the interested community. We explain that the approach of visual matching of coins, universally adopted in existing published papers on the topic, is not of practical interest because the number of ancient coin types far exceeds the number of types which have been imaged, whether in digital form (e.g., online) or otherwise (traditional film, in print, etc.). Rather, we argue that the focus should be on understanding the semantic content of coins. Hence, we describe a novel approach: first extract semantic concepts from real-world multimodal input and associate them with their corresponding coin images, and then train a convolutional neural network to learn the appearance of these concepts. On a real-world data set, we demonstrate highly promising results, correctly identifying a range of visual elements on unseen coins with up to 84% accuracy.
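A minimal multi-label training sketch consistent with this idea is shown below; the concept vocabulary, backbone, and hyperparameters are assumptions rather than the authors' configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

CONCEPTS = ["horse", "eagle", "cornucopia", "wreath", "shield"]  # hypothetical vocabulary

# CNN backbone with one output per semantic concept (multi-label setup)
model = models.resnet18(weights=None)                 # or a pretrained backbone
model.fc = nn.Linear(model.fc.in_features, len(CONCEPTS))
criterion = nn.BCEWithLogitsLoss()                    # multi-label objective
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy batch standing in for coin images and their concept annotations
images = torch.randn(8, 3, 224, 224)
targets = torch.randint(0, 2, (8, len(CONCEPTS))).float()

model.train()
optimizer.zero_grad()
logits = model(images)
loss = criterion(logits, targets)
loss.backward()
optimizer.step()

# At inference, thresholded sigmoid outputs give the predicted visual elements
probs = torch.sigmoid(model(images[:1]))
print([c for c, p in zip(CONCEPTS, probs[0]) if p > 0.5])
```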


2018 ◽  
Vol 124 (2) ◽  
pp. 283-290 ◽  
Author(s):  
Alessandro Bellofiore ◽  
Rebecca Vanderpool ◽  
Melanie J. Brewis ◽  
Andrew J. Peacock ◽  
Naomi C. Chesler

Clinical assessment of right ventricular (RV) contractility in diseases such as pulmonary arterial hypertension (PAH) has been hindered by the lack of a robust methodology. Here, a novel, clinically viable, single-beat method was developed to assess end-systolic elastance (Ees), a measure of RV contractility. We hypothesized that this novel approach reduces uncertainty and interobserver variability in the estimation of the maximum isovolumic pressure (Piso), the key step in single-beat methods. The new method was designed to include a larger portion of the RV pressure data and to minimize subjective adjustments by the operator. Data were obtained from right heart catheterization of PAH patients in a multicenter prospective study (data set 1) and a single-center retrospective study (data set 2). To obtain Piso, three independent observers used an established single-beat method (based on the first derivative of the pressure waveform) and the novel method (based on the second derivative). Interobserver variability analysis included paired t-tests, one-way ANOVA, intraclass correlation (ICC) analysis, and a modified Bland-Altman analysis. The Piso values obtained from the two methods were linearly correlated for both data set 1 (R2 = 0.74) and data set 2 (R2 = 0.91). Compared with the established method, the novel method resulted in smaller interobserver variability (P < 0.001), nonsignificant differences between observers, and a narrower confidence interval. By reducing uncertainty and interobserver variability, this novel approach may pave the way for more effective clinical management of PAH. NEW & NOTEWORTHY A novel methodology to assess right ventricular contractility from clinical data is demonstrated. This approach significantly reduces interobserver variability in the analysis of ventricular pressure data, as demonstrated in a relatively large population of subjects with pulmonary hypertension. This study may enable more accurate clinical monitoring of systolic function in subjects with pulmonary hypertension.
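As a rough, simplified illustration of the single-beat idea only (the point-selection rules of both the first-derivative and the second-derivative methods are more involved than this), the sketch below fits a sinusoid to pressure samples around the dP/dt extrema and takes the peak of the fit as Piso.

```python
import numpy as np
from scipy.optimize import curve_fit

def estimate_piso(t, p, window=0.02):
    """t: time (s), p: RV pressure (mmHg) for a single beat."""
    dpdt = np.gradient(p, t)
    t_up, t_down = t[np.argmax(dpdt)], t[np.argmin(dpdt)]
    # keep samples near the isovolumic contraction and relaxation phases
    mask = (np.abs(t - t_up) < window) | (np.abs(t - t_down) < window)

    def sine(tt, a, b, omega, phi):
        return a + b * np.sin(omega * tt + phi)

    p0 = [np.mean(p), np.ptp(p), np.pi / (t_down - t_up), 0.0]
    params, _ = curve_fit(sine, t[mask], p[mask], p0=p0, maxfev=10000)
    a, b, *_ = params
    return a + abs(b)          # peak of the fitted sinusoid as Piso

# Synthetic pressure trace standing in for catheterization data
t = np.linspace(0, 0.8, 400)
p = 25 * np.maximum(np.sin(2 * np.pi * t / 0.8), 0) ** 1.5 + 5
print(estimate_piso(t, p))
```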


Sci ◽  
2020 ◽  
Vol 2 (1) ◽  
pp. 8 ◽  
Author(s):  
Jessica Cooper ◽  
Ognjen Arandjelović

In recent years, a range of problems under the broad umbrella of computer vision-based analysis of ancient coins have been attracting an increasing amount of attention. Notwithstanding this research effort, the results achieved by the state of the art in the published literature remain poor and fall far short of what any practical application requires. In the present paper we present a series of contributions which we believe will benefit the interested community. We explain that the approach of visual matching of coins, universally adopted in existing published papers on the topic, is not of practical interest because the number of ancient coin types far exceeds the number of types which have been imaged, whether in digital form (e.g., online) or otherwise (traditional film, in print, etc.). Rather, we argue that the focus should be on understanding the semantic content of coins. Hence, we describe a novel approach: first extract semantic concepts from real-world multimodal input and associate them with their corresponding coin images, and then train a convolutional neural network to learn the appearance of these concepts. On a real-world data set, we demonstrate highly promising results, correctly identifying a range of visual elements on unseen coins with up to 84% accuracy.


Author(s):  
Guo-Zheng Li

This chapter introduces the major challenges in clinical data processing and the novel machine learning techniques employed to address them. It argues that these techniques, including support vector machines, ensemble learning, feature selection, feature reuse through multi-task learning, and multi-label learning, provide potentially more substantive solutions for decision support and clinical data analysis. The authors demonstrate the generalization performance of these techniques on real-world data sets, including a brain glioma data set, a coronary heart disease data set from Chinese Medicine, and several microarray tumor data sets. More and more machine learning techniques will be developed to improve the precision of clinical data analysis.
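A generic sketch of the kind of pipeline discussed, feature selection followed by an SVM evaluated by cross-validation, is given below; the chapter's actual data sets and parameter choices are not reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for a high-dimensional clinical data set
X, y = make_classification(n_samples=200, n_features=500, n_informative=20,
                           random_state=0)

pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=30),   # simple univariate feature selection
    SVC(kernel="rbf", C=1.0),
)

scores = cross_val_score(pipeline, X, y, cv=5)
print(scores.mean())
```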


2012 ◽  
pp. 875-897
Author(s):  
Guo-Zheng Li

This chapter introduces the major challenges in clinical data processing and the novel machine learning techniques employed to address them. It argues that these techniques, including support vector machines, ensemble learning, feature selection, feature reuse through multi-task learning, and multi-label learning, provide potentially more substantive solutions for decision support and clinical data analysis. The authors demonstrate the generalization performance of these techniques on real-world data sets, including a brain glioma data set, a coronary heart disease data set from Chinese Medicine, and several microarray tumor data sets. More and more machine learning techniques will be developed to improve the precision of clinical data analysis.


2021 ◽  
Author(s):  
Jessica Fagan ◽  
Jon E. Grahe ◽  
Kate Faasse ◽  
Amber Matteson ◽  
Ricky Haneda ◽  
...  

Dr. Jon Grahe, the students of his Fall 2020 Advanced Statistics and Research Methods class at Pacific Lutheran University, three collaborators outside the class, and three collaborators from other universities collected data from online participants living in the United States (Nraw = 1019, Nprocessed = 821). Participants answered questions on various topics related to the Novel Coronavirus-2019 pandemic, including how closely they had been gathering information, their information sources, their level of trust in various authorities, perceived risk, knowledge of COVID-19, avoidance behaviors, demographics, and COVID-19 exposure. The data are available here: https://osf.io/e8rzm/. These data could be used to examine psychological variables during a pandemic as well as to provide a novel, real-world data set for students studying statistics and research methods.

