Review of Improvement of Web Search Based on Web Log File

2012 ◽  
Vol 3 (2) ◽  
pp. 298-300 ◽  
Author(s):  
Soniya P. Chaudhari ◽  
Prof. Hitesh Gupta ◽  
S. J. Patil

In this paper we review various journal papers on improving the efficiency of Web searching. Some important methods are based on sequential pattern mining; others are based on supervised or unsupervised learning. Other techniques, such as fuzzy logic and neural networks, are also used.

Author(s):  
Amina Kemmar ◽  
Yahia Lebbah ◽  
Samir Loudni

Mining web access patterns consists of extracting knowledge from server log files. This problem is represented as a sequential pattern mining (SPM) problem, which extracts patterns that are sequences of accesses occurring frequently in the web log file. The literature offers many efficient algorithms for solving SPM (e.g., GSP, SPADE, PrefixSpan, WAP-tree, LAPIN, PLWAP). Despite the effectiveness of these methods, they cannot express or handle new constraints defined on patterns without requiring new implementations. Recently, many approaches based on constraint programming (CP) have been proposed to solve SPM in a declarative and generic way. Since no CP-based approach had been applied to mining web access patterns, the authors introduce in this paper an efficient CP-based approach for solving the web log mining problem. They reduce the problem of web log mining to SPM within a CP environment, which makes it possible to handle various constraints. Experimental results on non-trivial web log mining problems show the effectiveness of the authors' CP-based mining approach.
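Since the abstract above frames web log mining as sequential pattern mining, the following is a minimal sketch of PrefixSpan-style frequent sequence mining over toy web-access sessions; the session data and minimum support are illustrative assumptions, and this is not the authors' CP-based implementation.

```python
# Minimal PrefixSpan-style sequential pattern mining over web-access sessions.
# Each session is an ordered list of page accesses; patterns are item sequences.

def prefixspan(sequences, min_support):
    """Return frequent sequential patterns as (pattern, support) pairs."""
    results = []

    def project(db, item):
        # Project each sequence on the first occurrence of `item`.
        projected = []
        for seq in db:
            for i, event in enumerate(seq):
                if event == item:
                    projected.append(seq[i + 1:])
                    break
        return projected

    def mine(prefix, db):
        # Count each item at most once per sequence.
        counts = {}
        for seq in db:
            for item in set(seq):
                counts[item] = counts.get(item, 0) + 1
        for item, support in counts.items():
            if support >= min_support:
                pattern = prefix + [item]
                results.append((pattern, support))
                mine(pattern, project(db, item))

    mine([], sequences)
    return results


# Toy web-log sessions (illustrative only).
sessions = [
    ["home", "products", "cart", "checkout"],
    ["home", "search", "products", "cart"],
    ["home", "products", "reviews", "cart"],
]
for pattern, support in prefixspan(sessions, min_support=2):
    print(pattern, support)
```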


Author(s):  
Yan Chen ◽  
Yan-Qing Zhang

For most Web searching applications, queries are commonly ambiguous because words or phrases have different linguistic meanings for different Web users. Conventional keyword-based search engines cannot disambiguate queries to provide relevant results matching Web users’ intents. Traditional Word Sense Disambiguation (WSD) methods use statistical models or ontology-based knowledge systems to measure associations among words. The contexts of queries are used for disambiguation in these methods. However, because numerous combinations of words may appear in queries and documents, it is difficult to extract concept relations for all possible combinations. Moreover, queries are usually short, so contexts in queries do not always provide enough information to disambiguate them. Therefore, the traditional WSD methods are not sufficient to provide accurate search results for ambiguous queries. In this chapter, a new model, the Granular Semantic Tree (GST), is introduced for representing associations among concepts more conveniently than the traditional WSD methods. Additionally, users’ preferences are used to provide personalized search results that better adapt to users’ unique intents. Fuzzy logic is used to determine the most appropriate concepts related to queries based on contexts and users’ preferences. Finally, Web pages are analyzed by the GST model. The concepts of pages for the queries are evaluated, and the pages are re-ranked according to the similarity of concepts between pages and queries.
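As a rough illustration of the fuzzy concept scoring and re-ranking idea described above (the GST structure itself is not reproduced), here is a minimal sketch; the concept names, membership degrees, and weights are illustrative assumptions, not values from the chapter.

```python
# Hypothetical fuzzy concept scoring and page re-ranking for an ambiguous query.

def fuzzy_concept_score(context_membership, preference_membership,
                        w_context=0.6, w_pref=0.4):
    # Weighted aggregation of two fuzzy membership degrees in [0, 1].
    return w_context * context_membership + w_pref * preference_membership

def rerank(pages, query_concept_scores):
    # pages: list of (url, {concept: membership}) pairs.
    # Score each page by fuzzy overlap (min, then sum) with the query concepts.
    def similarity(page_concepts):
        return sum(min(page_concepts.get(c, 0.0), s)
                   for c, s in query_concept_scores.items())
    return sorted(pages, key=lambda p: similarity(p[1]), reverse=True)

# Example: the ambiguous query "java" scored against two candidate concepts.
query_scores = {
    "java_programming": fuzzy_concept_score(0.8, 0.9),
    "java_island": fuzzy_concept_score(0.3, 0.1),
}
pages = [
    ("travel-guide.example/java", {"java_island": 0.9}),
    ("docs.example/java-api", {"java_programming": 0.95}),
]
print(rerank(pages, query_scores))
```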


Healthcare ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 234 ◽  
Author(s):  
Hyun Yoo ◽  
Soyoung Han ◽  
Kyungyong Chung

Recently, a massive amount of bioinformation big data has been collected by sensor-based IoT devices. The collected data are classified into different types of health big data using various techniques. A personalized analysis technique is the basis for judging the risk factors of personal cardiovascular disorders in real time. The objective of this paper is to provide a model for personalized heart condition classification that combines a fast and effective preprocessing technique with a deep neural network in order to process the real-time accumulated biosensor input data. The model can be used to learn the input data and develop an approximation function, and it can help users recognize risk situations. For the analysis of the pulse frequency, a fast Fourier transform is applied in the preprocessing step. Data reduction is performed using the frequency-by-frequency ratio data of the extracted power spectrum. To analyze the meaning of the preprocessed data, a neural network algorithm is applied. In particular, a deep neural network is used to analyze and evaluate linear data. A deep neural network can stack multiple layers and establish an operational model of nodes using gradient descent. The completed model was trained by classifying the ECG signals collected in advance into normal, control, and noise groups. Thereafter, the ECG signal input in real time through the trained deep neural network system was classified as normal, control, or noise. To evaluate the performance of the proposed model, this study used the ratio of data operation cost reduction and the F-measure. As a result, with the use of the fast Fourier transform and the cumulative frequency percentage, the size of the ECG data was reduced by a ratio of 1:32. According to the analysis of the F-measure, the deep neural network model had 83.83% accuracy. Given these results, the modified deep neural network technique can reduce the size of big data in terms of computing work, and it is an effective system for reducing operation time.
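A minimal sketch of the kind of FFT-based spectral reduction described above, assuming NumPy, a synthetic ECG-like window, and an illustrative cumulative-power cutoff; the paper's actual 1:32 reduction ratio and the downstream deep neural network classifier are not reproduced here.

```python
# Keep only the leading power-spectrum components that account for most of the
# signal's energy, shrinking the input handed to a downstream classifier.
import numpy as np

def reduce_ecg(signal, energy_ratio=0.95):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2      # power spectrum
    order = np.argsort(spectrum)[::-1]               # strongest bins first
    cumulative = np.cumsum(spectrum[order]) / spectrum.sum()
    keep = order[:np.searchsorted(cumulative, energy_ratio) + 1]
    keep = np.sort(keep)
    return spectrum[keep], keep                      # reduced feature vector

# Synthetic stand-in for one biosensor ECG window (1,024 samples).
t = np.linspace(0, 4, 1024, endpoint=False)
fake_ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8 * t)
features, bins = reduce_ecg(fake_ecg)
print(len(features), "of", len(np.fft.rfft(fake_ecg)), "spectral bins retained")
```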


Author(s):  
Zheng Zhang ◽  
Jianrong Zheng

Taking the crankshaft-rolling bearing system in a certain type of compressor as the research object, dynamic analysis software is used to conduct a detailed dynamic analysis and optimal design under the rated power of the compressor. Using the Hertz contact formula and the analysis method for the statically indeterminate problem, the relationship between the bearing force and the deformation of the rolling bearing is derived, and the dynamic analysis model of the elastic crankshaft-rolling bearing system is constructed in the simulation software ADAMS. The weighted average amplitude of the journal center between the main bearings is taken as the objective, and the center line of the compressor cylinder is selected as the design variable. Finally, an example analysis shows that by introducing a fuzzy logic neural network algorithm into the design of the compressor crankshaft-rolling bearing system, the optimal solution between the design variables and the objective function can be obtained, which is of great significance for subsequent compressor dynamic design.
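For reference, the Hertz-type load-deflection relation commonly used for rolling-element contacts takes the form Q = K·δ^n (n = 3/2 for ball bearings, 10/9 for roller bearings); the sketch below simply evaluates it with an illustrative stiffness value, not a parameter from the paper.

```python
# Hertz-type load-deflection relation for a single rolling-element contact.

def hertz_contact_force(delta_mm, k_load=4.0e5, exponent=1.5):
    """Contact force (N) for a given elastic deflection (mm); k_load is an
    illustrative load-deflection stiffness, exponent 3/2 for ball bearings."""
    if delta_mm <= 0.0:          # no interference, no contact force
        return 0.0
    return k_load * delta_mm ** exponent

# Force carried by one rolling element at a few deflections.
for delta in (0.0, 0.005, 0.010, 0.020):
    print(f"delta = {delta:.3f} mm -> Q = {hertz_contact_force(delta):8.1f} N")
```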


Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3373 ◽  
Author(s):  
Ludek Cicmanec

The main objective of this paper is to describe the process of building a model that predicts the soil strength of unpaved airport surfaces (unpaved runways, safety areas in runway proximity, runway strips, and runway end safety areas). The reason for building this model is to partially substitute for frequent and meticulous inspections of an airport movement area, which include bearing strength evaluation, and to provide an efficient tool for organizing surface maintenance. Since building such a model is complex for a physical model, it is anticipated that it might be addressed by a statistical model instead. Therefore, fuzzy logic (FL) and artificial neural network (ANN) capabilities are investigated and compared with a linear regression function (LRF). Large data sets comprising bearing strength and meteorological characteristics are used to train the candidate model variations, which are subsequently compared using standard statistical quantitative parameters. All the models prove that including the antecedent soil strength as an additional model input has an immense impact on model accuracy. Although the M7 model from the ANN group displays the best performance, the M3 model is considered for practical implementation, being less complicated and having fewer inputs. In general, both the ANN and FL models outperform the LRF models in all categories. The FL models perform almost as well as the ANN models, but with slightly lower accuracy.
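A minimal sketch, assuming scikit-learn and synthetic data, of comparing a linear regression baseline against a small neural network with the antecedent soil strength included as an extra input, echoing the point above about its impact on accuracy; the feature names, model sizes, and data are illustrative assumptions, not the paper's M3/M7 models.

```python
# Compare LRF and ANN regressors on synthetic soil-strength data that includes
# the previous soil-strength reading as an input feature.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
rainfall = rng.uniform(0, 30, n)            # mm over the last 24 h (synthetic)
temperature = rng.uniform(-5, 30, n)        # deg C (synthetic)
antecedent_cbr = rng.uniform(2, 20, n)      # previous soil-strength reading
cbr = (0.8 * antecedent_cbr - 0.3 * rainfall
       + 0.05 * temperature + rng.normal(0, 1, n))

X = np.column_stack([rainfall, temperature, antecedent_cbr])
X_train, X_test, y_train, y_test = X[:400], X[400:], cbr[:400], cbr[400:]

lrf = LinearRegression().fit(X_train, y_train)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                   random_state=0).fit(X_train, y_train)

print("LRF R^2:", round(r2_score(y_test, lrf.predict(X_test)), 3))
print("ANN R^2:", round(r2_score(y_test, ann.predict(X_test)), 3))
```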

