Rough Web Intelligent Techniques for Page Recommendation

Author(s):  
H. Inbarani ◽  
K. Thangavel

Recommender systems represent a prominent class of personalized Web applications, which focus in particular on the user-dependent filtering and selection of relevant information. Recommender systems have been a subject of extensive research in Artificial Intelligence over the last decade, but with today’s increasing number of e-commerce environments on the Web, the demand for new approaches to intelligent product recommendation is higher than ever. There are more online users, more online channels, more vendors, more products, and, most importantly, increasingly complex products and services. These recent developments in the area of recommender systems have generated new demands, in particular with respect to interactivity, adaptivity, and user preference elicitation. These challenges, however, are also a focus of general Web page recommendation research. The goal of this chapter is to develop robust techniques for modeling noisy data sets containing an unknown number of overlapping categories and to apply them to Web personalization and mining. In this chapter, rough set-based clustering approaches are used to discover Web user access patterns; these techniques determine the number of clusters automatically from the Web log data using statistical techniques. The suitability of rough clustering approaches for Web page recommendation is measured using predictive accuracy metrics.
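The overlapping-cluster idea above can be sketched with a rough k-means variant in the style of Lingras and West, where each cluster keeps a lower approximation (certain members) and an upper approximation (possible members), so a user session may belong to more than one cluster. This is an illustrative sketch, not the chapter's exact algorithm; the closeness threshold `eps` and the weight `w_lower` are assumed parameters.

```python
import random

def rough_kmeans(points, k, eps=0.3, w_lower=0.7, iters=20, seed=0):
    """Rough k-means sketch: sessions whose two nearest centroids are
    within eps of each other go into several upper approximations
    (boundary region) instead of a single hard cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        lower = [[] for _ in range(k)]
        upper = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5 for c in centroids]
            best = min(range(k), key=lambda i: d[i])
            near = [i for i in range(k) if d[i] - d[best] <= eps]
            if len(near) == 1:          # unambiguous: lower approximation
                lower[best].append(p)
                upper[best].append(p)
            else:                        # ambiguous: upper approximations only
                for i in near:
                    upper[i].append(p)
        for i in range(k):
            def mean(pts):
                return [sum(x) / len(pts) for x in zip(*pts)]
            boundary = [p for p in upper[i] if p not in lower[i]]
            if lower[i] and boundary:
                lo, bo = mean(lower[i]), mean(boundary)
                centroids[i] = [w_lower * a + (1 - w_lower) * b
                                for a, b in zip(lo, bo)]
            elif lower[i]:
                centroids[i] = mean(lower[i])
            elif upper[i]:
                centroids[i] = mean(upper[i])
    return centroids, lower, upper
```

On session vectors derived from Web logs, the boundary regions capture users whose access patterns straddle two usage categories, which is exactly what hard k-means cannot express.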

2021 ◽  
Vol 13 (2) ◽  
pp. 50
Author(s):  
Hamed Z. Jahromi ◽  
Declan Delaney ◽  
Andrew Hines

Content is a key influencing factor in Web Quality of Experience (QoE) estimation. A web user’s satisfaction can be influenced by how long it takes to render and visualize the visible parts of the web page in the browser, referred to as the Above-the-Fold (ATF) time. SpeedIndex (SI) has been widely used to estimate the perceived loading speed of ATF content and as a proxy metric for Web QoE estimation. Web application developers have been actively introducing innovative interactive features, such as animated and multimedia content, aiming to capture users’ attention and improve the functionality and utility of web applications. However, the literature shows that, for websites with animated content, the ATF time estimated with state-of-the-art metrics may not accurately match the completed ATF time as perceived by users. This study introduces a new metric, Plausibly Complete Time (PCT), that estimates ATF time as perceived by users for websites with and without animations. PCT can be integrated with SI and web QoE models. The accuracy of the proposed metric is evaluated on two publicly available datasets. The proposed metric holds a high positive Spearman’s correlation (rs = 0.89) with the perceived ATF reported by users for websites with and without animated content. This study demonstrates that using PCT as a KPI in QoE estimation models can improve the robustness of QoE estimation in comparison to the state-of-the-art ATF time metric. Furthermore, experimental results showed that estimating SI using PCT improves the robustness of SI for websites with animated content. PCT estimation allows web application designers to identify where poor design has significantly increased ATF time and to refactor their implementation before it impacts the end-user experience.
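SpeedIndex, which PCT builds on, is defined as the integral over time of visual incompleteness. A minimal sketch of SI computed from sampled visual-completeness values might look like the following; the sampling mechanism, and PCT itself, are outside this sketch.

```python
def speed_index(samples):
    """SpeedIndex from (time_ms, visual_completeness) samples, where
    completeness is in [0, 1] and samples are sorted by time.
    SI = integral of (1 - completeness) dt, approximated as a step
    function between consecutive samples; lower is faster."""
    si = 0.0
    for (t0, c0), (t1, _) in zip(samples, samples[1:]):
        si += (1.0 - c0) * (t1 - t0)
    return si
```

A page whose above-the-fold content keeps repainting (e.g. because of animation) yields completeness values that oscillate rather than converge, which is why a plausibility-based completion time such as PCT is needed on top of SI.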


2021 ◽  
pp. 08-14
Author(s):  
Nafea Ali Majeed ◽  
Khalid Hameed Zaboon ◽  
...  

Recently, technology has become an important part of our lives and is employed in fields such as medicine, space science, agriculture, and industry. Storing information on servers and in the cloud has become a necessity. The Web is a global force that has transformed people's lives, with various web applications serving billions of requests every day. However, many types of attack target the Internet, and there is a need to recognize, classify, and protect against them. Given its important global role, it has become essential to ensure that web applications are secure, accurate, and of high quality. One of the basic problems found on the Web is the DDoS attack. In this work, the review classifies and delineates attack types, test characteristics, evaluation techniques, evaluation methods, and the test data sets used in the proposed methodology. Finally, this work offers guidance and possible directions for building better defenses against one of the most dangerous types of cyber-attack: the DDoS attack.
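As a toy illustration of the volumetric detection features such reviews cover, a sliding-window request-rate check per source IP can be sketched as follows. The window length and threshold are arbitrary assumptions; real detectors surveyed in this literature combine many more signals (entropy of sources, protocol anomalies, traffic symmetry).

```python
from collections import defaultdict, deque

class RateDetector:
    """Toy volumetric-DDoS heuristic: flag a source IP whose request
    count inside a sliding time window exceeds a threshold."""
    def __init__(self, window_s=10.0, max_requests=100):
        self.window_s = window_s
        self.max_requests = max_requests
        self.hits = defaultdict(deque)   # ip -> timestamps in window

    def observe(self, ip, now):
        """Record one request; return True if this source looks suspicious."""
        q = self.hits[ip]
        q.append(now)
        while q and now - q[0] > self.window_s:
            q.popleft()                  # drop timestamps outside the window
        return len(q) > self.max_requests
```

A single-threshold check like this is trivially evaded by distributed sources, which is precisely why the surveyed work classifies attacks and defenses along multiple dimensions rather than relying on one feature.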


Author(s):  
Giuliano Armano ◽  
Alessandro Giuliani ◽  
Eloisa Vargiu

Information Filtering deals with the problem of selecting relevant information for a given user, according to her/his preferences and interests. In this chapter, the authors consider two ways of performing information filtering: recommendation and contextual advertising. In particular, they study and analyze them according to a unified view. In fact, the task of suggesting an advertisement to a Web page can be viewed as the task of recommending an item (the advertisement) to a user (the Web page), and vice versa. Starting from this insight, the authors propose a content-based recommender system based on a generic solution for contextual advertising and a hybrid contextual advertising system based on a generic hybrid recommender system. Relevant case studies have been considered (i.e., a photo recommender and a Web advertiser) with the goal of highlighting how the proposed approach works in practice. In both cases, results confirm the effectiveness of the proposed solutions.
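The symmetry the authors exploit (suggesting an advertisement to a Web page is recommending an item to a "user") can be illustrated with a plain content-based matcher over bag-of-words term vectors. This is a generic sketch of the idea, not the chapter's actual system; the item names and terms below are invented for illustration.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(target_terms, candidates):
    """Rank candidate items by similarity to the target's term vector.
    The target can be a user profile (recommendation) or a Web page
    (contextual advertising) -- the same machinery serves both."""
    tv = Counter(target_terms)
    scored = [(cosine(tv, Counter(terms)), name)
              for name, terms in candidates.items()]
    return [name for _, name in sorted(scored, reverse=True)]
```

Swapping what plays the role of "user" and "item" in `recommend` is exactly the unified view: the photo-recommender and Web-advertiser case studies differ only in which side supplies the target vector.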


Author(s):  
John DiMarco

Web authoring is the process of developing Web pages. The Web development process requires you to use software to create functional pages that will work on the Internet. Adding Web functionality means creating specific components within a Web page that do something. Adding links, rollover graphics, and interactive multimedia items to a Web page are examples of enhanced functionality. This chapter demonstrates Web-based authoring techniques using Macromedia Dreamweaver. The focus is on adding Web functions to pages generated from Macromedia Fireworks and on an overview of creating Web pages from scratch using Dreamweaver. Dreamweaver and Fireworks are professional Web applications, and using professional Web software will benefit you tremendously. There are other ways to create Web pages using applications not specifically made for the purpose, including Microsoft Word and Microsoft PowerPoint. The use of Microsoft applications for Web page development is not covered in this chapter; however, I do provide steps on how to use these applications for Web page authoring in the appendix of this text. If you feel more comfortable using the Microsoft applications, or the Macromedia applications simply aren’t available to you yet, follow the same process for Web page conceptualization and content creation and use the programs available to you. You should try to build Web page development skills using Macromedia Dreamweaver because it helps you expand your software skills beyond basic office applications. The ability to create a Web page using professional Web development software is important to building a high-end computer skill set. The main objectives of this chapter are to get you involved in some of the technical processes you’ll need to create the Web portfolio. The focus will be on guiding you through opening your sliced pages, adding links, using tables, creating pop-up windows for content, and using layers and timelines for dynamic HTML.
The coverage does not attempt a complete tutorial for Macromedia Dreamweaver, but highlights essential techniques. Along the way you will get pieces of hand-coded ActionScript and JavaScript. You can decide which pieces you want to use in your own Web portfolio pages. The techniques provided form a concentrated workflow for creating Web pages. Let us begin to explore Web page authoring.


2018 ◽  
Vol 7 (3.29) ◽  
pp. 275
Author(s):  
P Chandrashaker Reddy ◽  
A Suresh Babu

With the advent of the World Wide Web and the rise of web-based business applications and social networks, organizations on the web generate a large amount of information on a daily basis. Retrieving exactly the information that users expect from the Web is becoming an increasingly complex and critical task. In recent times, the Web has grown in importance to the point of becoming the focal point of our digital lives. A search engine, as a tool for exploring the Web, must return the desired results for any given query. Most search engines cannot completely satisfy users’ requirements, and the results are often inaccurate and irrelevant. Existing techniques make little use of ontologies or user history for personalization. To overcome these issues, data mining techniques must be applied to the Web, and one powerful emerging concept is web-page recommendation. In this paper, the design of a fuzzy logic classifier is defined as a search problem over a solution space in which every node represents a rule set, a membership function, and the corresponding framework behaviour. A hybrid optimization algorithm is then applied to search this solution space for an optimal location, which represents a near-optimal rule set and membership function. We also review various techniques proposed by different researchers for web page personalization and propose a novel approach for finding optimal solutions when searching for relevant information.
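The kind of fuzzy classifier being optimized can be sketched with triangular membership functions and a tiny rule base. This is purely illustrative: the inputs (visit frequency, recency), the rule set, and the membership-function breakpoints are assumptions, and the paper's point is precisely that such rules and functions are tuned by a hybrid optimization algorithm rather than fixed by hand.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def relevance(freq, recency):
    """Tiny Mamdani-style rule base for page relevance, with inputs
    scaled to [0, 1]: min for AND, and a weighted average of rule
    strengths as the crisp output."""
    high_freq = tri(freq, 0.4, 1.0, 1.6)
    recent = tri(recency, 0.4, 1.0, 1.6)
    low_freq = tri(freq, -0.6, 0.0, 0.6)
    rule_hi = min(high_freq, recent)  # IF freq high AND recent THEN relevant
    rule_lo = low_freq                # IF freq low THEN not relevant
    total = rule_hi + rule_lo
    return rule_hi / total if total else 0.5
```

In the paper's framing, each candidate node of the search space is one such (rule set, membership function) pair, and the optimizer moves between nodes to improve recommendation quality.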


GEOMATICA ◽  
2020 ◽  
Author(s):  
Françoise Bahoken ◽  
Grégoire Le Campion ◽  
Marion Maisonobe ◽  
Laurent Jégou ◽  
Étienne Côme

Analysing the dynamics of urban areas or metropolises, delineating their functional areas, and comparing their spatio-temporal patterns is often limited by the lack of open relational data (on links between entities) and, until recently, the absence of dedicated analysis and geo-visualization frameworks. Beyond the questions of opening (geo)digital data, we propose a panorama of the geoweb, the process of creating maps in the context of Web 2.0, specific to flows and localized networks. The insights provided on current mapping practices reveal three main families of web applications, as well as the needs of a small but dynamic community to freely analyze its own data sets.


2013 ◽  
Vol 16 (1) ◽  
Author(s):  
Luis Rivero ◽  
Raimundo Barreto ◽  
Tayana Conte

Usability is one of the most relevant quality aspects of Web applications. A Web application is usable if it provides a friendly, direct, and easy-to-understand interface. Many Usability Inspection Methods (UIMs) have been proposed as a cost-effective way to enhance usability. However, many companies are not aware of these UIMs and consequently are not using them. A secondary study can identify, evaluate, and interpret all data relevant to the current knowledge about UIMs that have been used to evaluate Web applications in the past few decades. Therefore, we have extended a systematic mapping study on Usability Evaluation Methods by analyzing 26 of its research papers, from which we extracted and categorized UIMs. We provide practitioners and researchers with the rationale to understand both the strengths and weaknesses of the emerging UIMs for the Web. Furthermore, we have summarized the relevant information about the UIMs, which suggested new ideas and theoretical bases regarding usability inspection in the Web domain. In addition, we present a new UIM and a tool for Web usability inspection based on the results shown in this paper.


2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Deivamani Mallayya ◽  
Baskaran Ramachandran ◽  
Suganya Viswanathan

Web services have become the technology of choice for service-oriented computing to meet the interoperability demands of web applications. In the Internet era, the exponential growth of web services makes quality of service an essential parameter for discriminating among them. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user’s request cannot be fulfilled by a single atomic service, several existing services must be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies the best-fit services for each task in the user request and, by limiting the number of candidate services for each task, reduces the time needed to generate composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework also allows the user to provide feedback about the composite service, which improves the reputation of the services.
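The preference-weighted ranking step of a UPWSR-style approach can be illustrated with a weighted sum over min-max-normalized QoS attributes. The attribute names, the lower-is-better set, and the weights below are assumptions for illustration, not the paper's actual QoS model.

```python
def rank_services(services, weights):
    """Rank candidate services by a weighted sum of min-max-normalized
    QoS attributes; weights encode user preferences. 'cost' and
    'latency' are treated as lower-is-better and inverted."""
    lower_is_better = {"cost", "latency"}
    attrs = weights.keys()
    norm = {}
    for a in attrs:
        vals = [s[a] for s in services.values()]
        lo, hi = min(vals), max(vals)
        span = hi - lo or 1.0                 # avoid divide-by-zero on ties
        for name, s in services.items():
            x = (s[a] - lo) / span
            norm.setdefault(name, {})[a] = 1.0 - x if a in lower_is_better else x
    scored = {name: sum(weights[a] * norm[name][a] for a in attrs)
              for name in services}
    return sorted(scored, key=scored.get, reverse=True)
```

In a composition setting, running this ranking per task and keeping only the top few candidates per task is what shrinks the space of composition plans the planner must consider.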


2011 ◽  
pp. 232-255
Author(s):  
Roberto Paiano ◽  
Leonardo Mangia ◽  
Vito Perrone

This chapter defines a publishing model for Web applications, starting from an analysis of the most well-known modeling methodologies, such as HDM, OOHDM, WebML, Conallen’s method, and others. The analysis has focused on verifying the state of the art in modeling Web application pages. In particular, the different types of elements that compose the Web page in the above models are taken into consideration. This chapter describes the evolution of the HDM methodology, starting from the first approach based on the definition of the LP concept up to the more structured and complex Conceptual Page, based on the influence of “operations” on the modeling of the dynamics of navigation between pages.


2020 ◽  
Vol 27 (2) ◽  
pp. 1-14
Author(s):  
Ann Ablahd ◽  
Suhair Dawwod

At present, web applications are used for most everyday activities, and they are affected by an attack called SQL Injection Attack (SQLIA) due to vulnerabilities in the web application. These vulnerabilities increase because most application developers do not attend to security during design. SQL injection is a common attack that infects a web application: the attacker adds SQL code to a web page in order to access and change the victim's databases. A vital step in securing the database and detecting such attacks in web apps is building a suitable tool. Many researchers have proposed different ways of detecting and preventing such attacks. In this paper, a tool is proposed using Flask, a powerful micro-framework for web applications in Python 3.7, to detect and prevent such attacks. The proposed system is called SQLIAD. SQLIAD analyzes a web application online.
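A pattern-matching filter in Flask, in the spirit of (but much simpler than) the SQLIAD tool described above, can hook `before_request` and reject query parameters matching naive injection signatures. The signature list here is an illustrative assumption; a real detector needs far richer analysis, and parameterized queries remain the primary defense.

```python
import re
from flask import Flask, request, abort

app = Flask(__name__)

# Naive, illustrative signatures -- not a complete SQLi ruleset.
SQLI_PATTERNS = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),   # UNION ... SELECT
    re.compile(r"(?i)\bor\b\s+\d+\s*=\s*\d+"),  # tautologies like OR 1=1
    re.compile(r"--|;|/\*"),                    # comment / statement chars
]

def looks_injected(value):
    return any(p.search(value) for p in SQLI_PATTERNS)

@app.before_request
def block_sqli():
    """Reject any request whose query-string values match a signature."""
    for v in request.args.values():
        if looks_injected(v):
            abort(403)

@app.route("/user")
def user():
    return "ok"
```

Because the check runs in `before_request`, every route registered on the app is screened without per-view changes, which is what makes the micro-framework a convenient host for an online analyzer.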

