Maintenance Web Based Applications Using Feature Location

2020
Vol 5 (2)
pp. 115
Author(s):
Achmad Arwan
Denny Sagita Rusdianto

Maintaining web applications is a complex set of efforts. FilkomApps is the web application used by the Faculty of Computer Science of Universitas Brawijaya to manage academic administration, student theses, faculty assignments, inventory, attendance, and honoraria. It comprises about 6K files (HTML, PHP, JS, CSS). Feature location can help the maintenance of web applications by locating specific features in those files. The process comprises preprocessing (tokenizing, web-language syntax removal, splitting, stopword removal, and stemming), indexing (VSM with Lucene), and evaluation (precision and recall). The experiments were done by querying keywords originating from previous maintenance modification efforts and from the features of the system. The resulting precision was 86% and recall was 47%. Precision was 374% better than that of the conventional method (using the IDE search feature).
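
As a minimal illustration of the evaluation step mentioned in this abstract (not the authors' code), the sketch below computes precision and recall for the files returned by a feature-location query against a hand-labeled relevant set; the query and file names are hypothetical.

```python
# Minimal sketch of the precision/recall evaluation step (hypothetical file names).
def precision_recall(retrieved, relevant):
    """Return (precision, recall) for one feature-location query."""
    retrieved, relevant = set(retrieved), set(relevant)
    true_positives = len(retrieved & relevant)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Files returned for an illustrative query vs. files actually touched in the
# corresponding maintenance task (both lists are made up).
retrieved = ["honorarium_controller.php", "report_view.php", "menu.js"]
relevant = ["honorarium_controller.php", "report_view.php", "honorarium_model.php"]

p, r = precision_recall(retrieved, relevant)
print(f"precision={p:.2f} recall={r:.2f}")
```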

2018
Vol 1 (2)
pp. 25-35
Author(s):
Aliga Paul Aliga
Adetokunbo MacGregor John-Otumu
Rebecca E Imhanhahimi
Atuegbelo Confidence Akpe

Web-based applications have become very prevalent because the ubiquity of web browsers makes it possible to deliver service-oriented applications on demand to diverse clients over the Internet, and cross-site scripting (XSS) remains a foremost security risk that has continuously plagued web applications over the years. This paper critically examines the concept of XSS and some recent approaches for detecting and preventing XSS attacks in terms of architectural framework, algorithm used, solution location, and so on. The techniques were analysed, and the results showed that most of the available detection and prevention solutions to XSS attacks operate on the client side rather than the server side because of the peculiar nature of web application vulnerabilities, and they also lack self-learning ability for detecting new XSS attacks. A few researchers cited in this paper incorporated self-learning ability to detect and prevent XSS attacks in their design architectures using artificial neural networks and soft computing approaches; as recommended, much improvement is still needed to handle this web application security menace effectively and efficiently.
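
As a generic, minimal sketch of the server-side prevention idea surveyed above (not drawn from any of the cited approaches), the snippet below escapes untrusted input before embedding it in an HTML response; the helper and markup are illustrative only.

```python
# Minimal sketch of output encoding, a common server-side XSS defence
# (illustrative only; real applications should also use auto-escaping
# template engines and a Content-Security-Policy header).
import html

def render_comment(untrusted_comment: str) -> str:
    """Embed user-supplied text in HTML after escaping special characters."""
    safe = html.escape(untrusted_comment, quote=True)
    return f"<div class='comment'>{safe}</div>"

# A classic reflected-XSS payload is neutralised:
print(render_comment('<script>alert("xss")</script>'))
# <div class='comment'>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</div>
```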


Author(s):  
Kimihito Ito
Yuzuru Tanaka

Web applications, which are computer programs ported to the Web, allow end-users to use various remote services and tools through their Web browsers. There are an enormous number of Web applications on the Web, and they are becoming the basic infrastructure of everyday life. In spite of the remarkable development of Web-based infrastructure, it is still difficult for end-users to compose new integrated tools from both existing Web applications and legacy local applications, such as spreadsheets, chart tools, and databases. In this chapter, the authors propose a new framework in which end-users can wrap remote Web applications into visual components, called pads, and functionally combine them through drag-and-drop operations. The authors use as the basis the meme media architecture IntelligentPad, proposed by the second author. In the IntelligentPad architecture, each visual component, called a pad, has slots as data I/O ports. By pasting one pad onto another, users can integrate their functionalities. The framework presented in this chapter allows users to visually create a wrapper pad for any Web application by defining HTML nodes within the Web application to work as slots. Examples of such nodes include input forms and text strings on Web pages. Users can directly manipulate both wrapped Web applications and wrapped local legacy tools on their desktop screen to define application linkages among them. Since no programming expertise is required to wrap Web applications or to functionally combine them, end-users can build new integrated tools from both wrapped Web applications and local legacy applications.


2015
Vol 30 (2)
pp. 220-236
Author(s):
Frances Buchanan
Niccolo Capanni
Horacio González-Vélez

The sources of information on the Web relating to Fine Art, and in particular to Fine Artists, are numerous, heterogeneous and distributed. Data relating to the biographies of an artist, images of their artworks, the location of the artworks and exhibition reviews invariably reside in distinct and seemingly unrelated, or at least unlinked, sources. While communication and exchange exist, there is a great deal of independence between major repositories, such as museums, often owing to their ownership or heritage. This increases the individuality of each repository's own processes and dissemination. It is currently necessary to browse through numerous different websites to obtain information about any one artist, and at this time there is little aggregation of Fine Art Information. This is in contrast to the domain of books and music, where the aggregation and re-grouping of information (usually by author or artist/band name) has become the norm. A Museum API (Application Programming Interface), however, is a tool that can facilitate a similar information service for the domain of Fine Art, by allowing the retrieval and aggregation of Web-based Fine Art Information, whilst at the same time increasing public access to the content of a museum's collection. In this paper, we present the case for a pragmatic solution to the problems of heterogeneity and distribution of Fine Art Data as a first step towards the comprehensive re-presentation of Fine Art Information in a more 'artist-centric' way, via accessible Web applications. The paper examines the domain of Fine Art Information on the Web, putting forward the case for more Web services such as generic Museum APIs, and highlights this via a prototype Web application known as ArtBridge. The generic Museum API is the standardisation mechanism that enables interfacing with specific Museum APIs.
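
A minimal sketch of the artist-centric aggregation that a generic Museum API would enable is shown below; the endpoints, parameters and response fields are hypothetical and are not ArtBridge's actual interface.

```python
# Illustrative sketch of artist-centric aggregation over generic museum APIs.
# Endpoints, parameters and response fields are hypothetical; a real client
# would follow each museum API's own documentation.
from collections import defaultdict
import requests

MUSEUM_ENDPOINTS = [
    "https://museum-a.example.org/api/artworks",
    "https://museum-b.example.org/api/artworks",
]

def artworks_by_artist(artist_name: str) -> dict:
    """Collect artwork records for one artist from several museum APIs."""
    aggregated = defaultdict(list)
    for endpoint in MUSEUM_ENDPOINTS:
        response = requests.get(endpoint, params={"artist": artist_name}, timeout=10)
        response.raise_for_status()
        for record in response.json().get("artworks", []):
            aggregated[artist_name].append(
                {"title": record.get("title"), "source": endpoint}
            )
    return dict(aggregated)

# Example usage (requires real endpoints):
# print(artworks_by_artist("J. M. W. Turner"))
```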


Author(s):  
Priyanka Dixit

This chapter describes how security is an important aspect of today's digital world. Technology grows every day with new advancements in various areas, especially in the development of web-based applications. Almost all web applications are on the internet, hence there is a large probability of attacks and threats against those applications. This makes security necessary while developing any web application. Many techniques have been developed for mitigating and defending against threats to web-based applications over the internet. This chapter overviews the important area of web application security, organising the current strategies into a bigger picture to guide future research and development. First, the chapter explains the major problems and obstacles that make efforts to develop secure web applications unsuccessful. Next, it distinguishes three basic security properties that a web application should possess (validation, integrity, and accuracy) and portrays the corresponding vulnerabilities that damage these properties, along with the attack vectors that exploit them.


Author(s):  
Ganeshkumar S
Elango Govindaraju

End-to-end encryption of connections over the internet has evolved from SSL to TLS 1.3 over the years. Attacks have exposed vulnerabilities in each upgraded version of the cryptographic protocols used to secure connections over the internet. Organisations have to keep updating their web-based applications to use the latest cryptographic protocol to ensure users are protected and feel comfortable using their web applications. The problem is that web applications are not always standalone systems; there is usually a maze of systems integrated to provide services to the end user. The interactions between these systems happen within the controlled internal private network environment of the organisation, while only the front-ending web application is visible to the end user. It is often not feasible to upgrade all internal systems to use the latest cryptographic protocol for internal interfaces/integration, owing to the prohibitive cost of redeveloping and upgrading infrastructure and systems. Here we define an algorithm to set up internal and external firewalls that allow a lower version of the cryptographic protocol (SSL) within the internal network for the integration/interfacing connections of internal systems, while mandating the latest cryptographic protocol (TLS 1.x) for end-user connections to the web application.
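
The paper's firewall-based algorithm is not reproduced here; as a hedged illustration of the split it describes, the sketch below builds two server-side TLS contexts with Python's ssl module: a strict one for externally facing end-user connections and a more permissive one for legacy internal integrations. The protocol floors chosen are assumptions, not the paper's policy.

```python
# Illustrative sketch only: separate TLS policies for external and internal
# listeners, in the spirit of the split described above.
import ssl

def external_context(certfile: str, keyfile: str) -> ssl.SSLContext:
    """Strict context for end-user connections: TLS 1.3 only (assumed floor)."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx

def internal_context(certfile: str, keyfile: str) -> ssl.SSLContext:
    """Relaxed context for legacy internal integrations (assumed floor: TLS 1.0)."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1  # legacy floor, internal network only
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx
```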


2012
Vol 2 (2)
pp. 112-116
Author(s):
Shikha Bhatia
Mr. Harshpreet Singh

With the mounting demand for web applications, a number of issues related to their quality have come into existence. In the field of web applications, it is very difficult to develop high-quality web applications. A design pattern is a general, repeatable solution to a commonly occurring problem in software design. It should be noted that a design pattern is not a finished product that can be directly transformed into source code; rather, it is a description or template for how to solve a problem that can be used in many different situations. Past research has shown that design patterns greatly improve the execution speed of a software application. Design patterns are classified as creational, structural, behavioral, and so on. The MVC design pattern is very productive for architecting interactive software systems and web applications. This design pattern is partition-independent, because it is expressed in terms of an interactive application running in a single address space. We will design and analyze an algorithm using the MVC approach to improve the performance of web-based applications. The objective of our study is to reduce coupling, one of the major object-oriented metrics, between the model and view segments of a web-based application. The implementation will be done using the .NET framework.
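
Although the study's implementation targets the .NET framework, the following language-neutral sketch (written in Python purely for illustration) shows the kind of model-view decoupling via a controller that the MVC pattern provides; all class names are hypothetical.

```python
# Language-neutral MVC sketch: the view depends only on an abstract interface,
# and only the controller is coupled to both model and view.
from abc import ABC, abstractmethod

class ProductModel:
    """Model: owns the data, knows nothing about any view."""
    def __init__(self):
        self._products = {"pen": 1.50, "notebook": 3.20}
    def price_of(self, name: str) -> float:
        return self._products[name]

class View(ABC):
    """Abstract view: the controller talks to this interface only."""
    @abstractmethod
    def show(self, text: str) -> None: ...

class ConsoleView(View):
    def show(self, text: str) -> None:
        print(text)

class ProductController:
    """Controller: mediates between model and view, keeping them decoupled."""
    def __init__(self, model: ProductModel, view: View):
        self._model, self._view = model, view
    def display_price(self, name: str) -> None:
        self._view.show(f"{name}: {self._model.price_of(name):.2f}")

ProductController(ProductModel(), ConsoleView()).display_price("pen")
```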


2018
Vol 48 (3)
pp. 84-90
Author(s):
E. A. Lapchenko
S. P. Isakova
T. N. Bobrova
L. A. Kolpakova

It is shown that the application of Internet technologies is relevant in the selection of crop production technologies and the formation of a rational composition of the machine-and-tractor fleet, taking into account the conditions and production resources of a particular agricultural enterprise. The work gives a short description of the web applications "ExactFarming", "Agrivi" and "AgCommand", which provide a possibility to select technologies and technical means of soil treatment, and of their functions. "ExactFarming" allows users to collect and store information about temperature, precipitation and weather forecasts in certain areas, keep records of information about crops and make technological maps using expert templates. "Agrivi" allows users to store and provide access to weather information for the fields with certain crops. It has algorithms to detect and warn about risks related to diseases and pests, and also provides economic calculations of crop profitability and crop planning. "AgCommand" allows users to track the position of machinery and equipment in the fields and provides data on the weather situation in order to plan the use of agricultural machinery in the fields. The web applications presented above do not show the relation between the technologies applied and the agro-climatic features of the farm location zone. They do not take into account the phytosanitary conditions of previous years, or the relief and contour of the fields, while drawing up technological maps or selecting the machine-and-tractor fleet. The Siberian Physical-Technical Institute of Agrarian Problems of the Siberian Federal Scientific Center of AgroBioTechnologies of the Russian Academy of Sciences developed a software complex, PIKAT, for supporting machine agrotechnologies for the production of spring wheat grain at an agricultural enterprise, on the basis of which there is a plan to develop a web application that will consider all the main factors limiting the yield of cultivated crops.


2021
Vol 13 (2)
pp. 50
Author(s):
Hamed Z. Jahromi
Declan Delaney
Andrew Hines

Content is a key influencing factor in Web Quality of Experience (QoE) estimation. A web user's satisfaction can be influenced by how long it takes to render and visualize the visible parts of the web page in the browser. This is referred to as the Above-the-fold (ATF) time. SpeedIndex (SI) has been widely used to estimate the perceived loading speed of ATF content and as a proxy metric for Web QoE estimation. Web application developers have been actively introducing innovative interactive features, such as animated and multimedia content, aiming to capture users' attention and improve the functionality and utility of web applications. However, the literature shows that, for websites with animated content, the ATF time estimated using state-of-the-art metrics may not accurately match the completed ATF time as perceived by users. This study introduces a new metric, Plausibly Complete Time (PCT), that estimates ATF time as perceived by users for websites with and without animations. PCT can be integrated with SI and web QoE models. The accuracy of the proposed metric is evaluated on two publicly available datasets. The proposed metric holds a high positive Spearman's correlation (rs = 0.89) with the perceived ATF reported by users for websites with and without animated content. This study demonstrates that using PCT as a KPI in QoE estimation models can improve the robustness of QoE estimation in comparison to using the state-of-the-art ATF time metric. Furthermore, experimental results showed that estimating SI using PCT improves the robustness of SI for websites with animated content. The PCT estimation allows web application designers to identify where poor design has significantly increased ATF time and refactor their implementation before it impacts the end-user experience.
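
As an illustration of the kind of evaluation reported above (not the paper's code or data), the sketch below computes a Spearman rank correlation between estimated and user-perceived ATF times; the values are made up.

```python
# Illustrative only: rank agreement between an estimated ATF metric and
# user-perceived ATF times. The numbers below are fabricated examples.
from scipy.stats import spearmanr

estimated_atf_s = [1.2, 2.8, 0.9, 3.5, 2.1]   # metric output per page (seconds)
perceived_atf_s = [1.4, 3.0, 1.0, 3.9, 2.0]   # user-reported ATF per page (seconds)

rs, p_value = spearmanr(estimated_atf_s, perceived_atf_s)
print(f"Spearman rs = {rs:.2f} (p = {p_value:.3f})")
```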


2018
Vol 7 (4.15)
pp. 130
Author(s):
Emil Semastin
Sami Azam
Bharanidharan Shanmugam
Krishnan Kannoorpatti
Mirjam Jonokman
...  

Today's business world has incorporated Web Services and Web Applications at the core of its operating cycle, and security plays a major role in the amalgamation of such services and applications with business needs worldwide. OWASP (the Open Web Application Security Project) states that the effectiveness of the security mechanisms in a Web Application can be estimated by evaluating the degree of vulnerability against any of its nominated top ten vulnerabilities. This paper sheds light on a number of existing tools that can be used to test for the CSRF vulnerability. The main objective of the research is to identify the available solutions to prevent CSRF attacks. By analyzing the techniques employed in each of the solutions, the optimal tool can be identified. Tests against the exploitation of the vulnerabilities were conducted after implementing the solutions into the web application to check the efficacy of each solution. The research also proposes a combined solution that integrates passing an unpredictable token through a hidden field and validating it on the server side with passing the token through the URL.
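
A generic sketch of the token mechanism underlying the proposed combined solution is shown below; it is not the authors' implementation, and the in-memory session store and function names are illustrative.

```python
# Generic sketch of an unpredictable CSRF token carried in a hidden form field
# (and, in the combined solution described above, also in the URL) and checked
# on the server. Names and storage are illustrative, not the paper's code.
import hmac
import secrets

_session_tokens = {}  # session_id -> token (illustrative in-memory store)

def issue_token(session_id: str) -> str:
    """Create and remember an unpredictable token for this session."""
    token = secrets.token_urlsafe(32)
    _session_tokens[session_id] = token
    return token

def hidden_field(session_id: str) -> str:
    """Render the token as a hidden form field for inclusion in a page."""
    return f'<input type="hidden" name="csrf_token" value="{issue_token(session_id)}">'

def is_valid(session_id: str, submitted_token: str) -> bool:
    """Constant-time check of the submitted token against the stored one."""
    expected = _session_tokens.get(session_id, "")
    return hmac.compare_digest(expected.encode(), submitted_token.encode())
```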


ReCALL
1999
Vol 11 (S1)
pp. 31-39
Author(s):
Pierre-Yves Foucou
Natalie Kübler

In this paper, we present the Web-based CALL environment (or WALL) which is currently being experimented with at the University of Paris 13 in the Computer Science Department of the Institut Universitaire de Technologie. Our environment is being developed to teach computer science (CS) English to French-speaking CS students, and will be extended to other languages for specific purposes such as, for example, English or French for banking, law, economics or medicine, where on-line resources are available.

English, and more precisely CS English, is for our students a necessary tool, not an object of study. The learning activities must therefore stimulate the students' interest and reflection about language phenomena. Our pedagogical objective, relying on research on acquisition (Wokusch 1997), consists in linking various texts together with other documents, such as different types of dictionaries or other types of texts, so that knowledge can be acquired using various appropriate contexts.

Language teachers are not supposed to be experts in fields such as computer science or economics. We aim at helping them to make use of the authentic documents that are related to the subject area in which they teach English. As shown in Foucou and Kübler (1998), the wide range of resources available on the Web can be processed to obtain corpora, i.e. teaching material. Our Web-based environment therefore provides teachers with a series of tools which enable them to access information about the selected specialist subject, select appropriate specialised texts, produce various types of learning activities and evaluate students' progress.

Commonly used textbooks for specialised English offer a wide range of learning activities, but they are based on documents that very quickly become obsolete, and that are sometimes widely modified. Moreover, they are not adaptable to the various levels of language of the students. From the students' point of view, working on obsolete texts that are either too easy or too difficult can quickly become demotivating, not to say boring.

In the next section, we present the general architecture of the teaching/learning environment; the method of accessing and using it, for teachers as well as for students, is then described. The following section deals with the actual production of exercises and their limits. We conclude and present some possible research directions.

