A New Open-Source Web Application with Animations to Support Learning of Neuron-to-Neuron Signaling

2021 ◽  
Vol 83 (9) ◽  
pp. 600-602
Author(s):  
Catherine E. LePrevost ◽  
W. Gregory Cope ◽  
Yan Shen ◽  
Donnie Wrights

Pesticides and their associated modes of action serve as real-world examples of chemical toxicity, stimulating student interest and supporting their understanding of nervous system function and cell signaling. An open-source web application called “Neuron-to-Neuron Normal and Toxic Actions” hosts narrated animations of pesticide toxic actions and exists as a resource for instructors of advanced secondary or undergraduate biology courses. This article describes the features of the web application, reports student feedback on the animations, and details a cooperative learning procedure for instructors to use the web application in online learning environments or in-person classroom settings with technology support.

2018 ◽  
Vol 7 (2.7) ◽  
pp. 941 ◽  
Author(s):  
M Surekha ◽  
K Kiran Kumar ◽  
M V.S.Prasanth ◽  
P S.G.Aruna Sri

Web application security has become increasingly important in recent days, as a large number of attacks are now launched at the web application layer. With the dramatic growth of web applications, security has become vulnerable to a variety of threats. The majority of these attacks target the application layer, and a network firewall alone cannot stop them. The main reasons these attacks succeed are the inattention of application developers when writing web applications and vulnerabilities in the underlying technologies. Web application attacks are the latest trend, and attackers attempt to exploit applications using a variety of techniques. Various solutions are available, both open source and commercial, but selecting the solution appropriate for the security of an organization's systems remains a significant challenge. This survey paper compares Web Application Firewall (WAF) solutions with respect to the critical features required for security at the application layer. The critical analysis of WAF solutions presented here helps users select the most suitable solution for their environment.


2021 ◽  
Vol 17 (2) ◽  
pp. 58-65
Author(s):  
Iman Khazal ◽  
Mohammed Hussain

Cross-Site Scripting (XSS) is one of the most common and dangerous attacks. The user is the target of an XSS attack, but the attacker reaches the user by exploiting an XSS vulnerability in a web application as a bridge. There are three types of XSS attacks: Reflected, Stored, and DOM-based. This paper focuses on the Stored-XSS attack, which is the most dangerous of the three. In Stored-XSS, the attacker injects a malicious script into the web application, which saves it in the website repository. This paper proposes a method to detect and prevent Stored-XSS: the Prevent Stored-XSS Server (PSS), a server that tests and sanitizes input to web applications before it is saved in the database. Any user input must be checked to see whether it contains a malicious script; if so, the input must be sanitized, and the sanitized version saved in the database in place of the harmful input. The PSS is tested using a vulnerable open-source web application; it succeeds in detecting the harmful script within the input and prevents the attack by sanitizing the input, with an average processing time of 0.3 seconds.
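The abstract does not give the PSS implementation, but the detect-then-sanitize step it describes can be sketched as follows; the pattern and function names here are illustrative, not the paper's actual code.

```python
import html
import re

# Hypothetical sketch of the PSS check described above: flag input that
# looks like script injection, and store an HTML-escaped version instead.
SCRIPT_PATTERN = re.compile(r"<\s*script\b|\bon\w+\s*=|javascript:", re.IGNORECASE)

def sanitize_input(user_input: str) -> tuple[str, bool]:
    """Return (value_to_store, was_malicious). Malicious input is
    HTML-escaped before it would be written to the database."""
    if SCRIPT_PATTERN.search(user_input):
        return html.escape(user_input), True
    return user_input, False
```

Escaping rather than rejecting preserves the user's text while neutralizing it: `sanitize_input("<script>alert(1)</script>")` stores `&lt;script&gt;alert(1)&lt;/script&gt;`, which a browser renders as literal text.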


2018 ◽  
Vol 7 (2.32) ◽  
pp. 431
Author(s):  
K Siva Prasad ◽  
Dr K. Raja Sekhar ◽  
Dr P. Rajarajeswari

The current digitized world has moved far beyond the mere existence of the internet. Furnishing services through the web has become essential in almost every sector. These ever-changing technologies have also brought devastating evasion techniques that exploit the fragility of web applications. Assessing the existing vulnerabilities of a web application and testing all possible penetrations would be tedious if the tools used carry a cost factor. This paper suggests an integrated approach to assessing the vulnerabilities of any web application using free and open-source tools, where reports are generated covering the vulnerabilities, their categories, and their severity levels. The tools are integrated and their results correlated to produce accurate results comparable to those of commercial tools. The analysis draws on reports released by OWASP, OSSTMM, ISSAF, CVE, Exploit Database and Microsoft Vulnerability Research. The report produced after vulnerability assessment is then used to test different penetrations of a single application: the identified vulnerabilities are exploited to test the penetration of the web application. A final report lists all exploitable vulnerabilities encountered in the web application, which helps developers fix the vulnerable issues.
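The core of such tool integration is correlating and deduplicating findings from several scanners into one report. A minimal sketch of that merge step is below; the field names and severity scale are assumptions for illustration, not the paper's actual schema.

```python
# Hypothetical correlation step: merge findings from multiple open-source
# scanners, deduplicating by (url, category) and keeping the worst severity.
SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def merge_findings(reports):
    """reports: list of scanner outputs, each a list of finding dicts
    with 'url', 'category' and 'severity' keys. Returns one deduplicated
    list, sorted from most to least severe."""
    merged = {}
    for report in reports:
        for finding in report:
            key = (finding["url"], finding["category"])
            if (key not in merged
                    or SEVERITY_RANK[finding["severity"]]
                    > SEVERITY_RANK[merged[key]["severity"]]):
                merged[key] = finding
    return sorted(merged.values(),
                  key=lambda f: -SEVERITY_RANK[f["severity"]])
```

Keeping the highest severity per duplicate mirrors how correlation improves accuracy over any single tool: agreement between scanners collapses to one entry, while the most pessimistic rating survives.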


Author(s):  
Shweta Sondarva ◽  
Dr. Priyanka Sharma ◽  
Prof. Dharti Dholariya

This paper describes OSINT tools and approaches for discovering sensitive information about any organization's web application or network. It covers the steps for gathering information and for securing the web application, organization, or network. Many automated and paid tools are available for vulnerability finding and penetration testing; in this paper we instead perform reconnaissance with the help of OSINT to gather information and propose solutions before an attacker finds and exploits the same vulnerabilities. Nowadays many vulnerabilities exist in web applications. In security programs we have seen many cases where sensitive data leakage occurred on reputed websites. We therefore set out to find web applications in which such information is disclosed; the problem is that if leaked information such as credentials, tokens, or API keys can be found, it can easily grant authorization to admin or user accounts. We found many well-known websites where this sensitive data could easily be used. To perform such an attack, one only needs to perform reconnaissance with the various open-source tools available on the internet.


2018 ◽  
Vol 48 (3) ◽  
pp. 84-90 ◽  
Author(s):  
E. A. Lapchenko ◽  
S. P. Isakova ◽  
T. N. Bobrova ◽  
L. A. Kolpakova

It is shown that the application of Internet technologies is relevant to the selection of crop production technologies and the formation of a rational composition of the machine-and-tractor fleet, taking into account the conditions and production resources of a particular agricultural enterprise. The work gives a short description of three web applications, namely "ExactFarming", "Agrivi" and "AgCommand", that make it possible to select technologies and technical means of soil treatment, and describes their functions. "ExactFarming" allows users to collect and store information about temperature, precipitation and weather forecasts in certain areas, keep records of crops and make technological maps using expert templates. "Agrivi" stores and provides access to weather information for fields with certain crops; it has algorithms to detect and warn about risks related to diseases and pests, and provides economic calculations of crop profitability and crop planning. "AgCommand" tracks the position of machinery and equipment in the fields and provides data on the weather situation in order to plan the use of agricultural machinery in the fields. The web applications presented above do not relate the technologies applied to the agro-climatic features of the farm location zone, nor do they take into account the phytosanitary conditions of previous years, or the relief and contours of the fields, when drawing up technological maps or selecting the machine-and-tractor fleet.
The Siberian Physical-Technical Institute of Agrarian Problems of the Siberian Federal Scientific Center of AgroBioTechnologies of the Russian Academy of Sciences developed the software complex PIKAT for supporting machine agrotechnologies for the production of spring wheat grain at an agricultural enterprise, on the basis of which there is a plan to develop a web application that will consider all the main factors limiting the yield of cultivated crops.


2020 ◽  
Author(s):  
Darshak Mota ◽  
Neel Zadafiya ◽  
Jinan Fiaidhi

Java Spring is an application development framework for enterprise Java. It is an open-source platform used to develop robust Java applications easily. Spring applications can also be built using the MVC architecture, which is based on the Model, View and Controller pattern: the project structure, or code, is divided into three sections, which helps categorize the code files and other files in an organized form. Model, View and Controller code are interrelated and pass and fetch information from each other without all the code being put in a single file, which makes testing the program easier. Testing the application during and after development is an integral part of the Software Development Life Cycle (SDLC). Different techniques have been used to test a web application developed using the Java Spring MVC architecture, and this paper compares the results of the three different techniques used to test the web application.
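The MVC separation the abstract describes is language-independent; to stay with one example language for this document, here is a minimal sketch of the pattern in Python rather than in Java Spring itself. All class and function names are illustrative.

```python
# Minimal illustration of the Model-View-Controller separation described
# above (sketched in Python, not Java Spring; names are hypothetical).

class UserModel:
    """Model: owns the data and the business rules."""
    def __init__(self):
        self._users = {}

    def add(self, user_id, name):
        self._users[user_id] = name

    def get(self, user_id):
        return self._users.get(user_id)

def render_user(name):
    """View: turns model data into a representation for the client."""
    return f"<h1>Profile: {name}</h1>"

class UserController:
    """Controller: handles a request, consults the model, picks the view."""
    def __init__(self, model):
        self.model = model

    def show(self, user_id):
        name = self.model.get(user_id)
        return render_user(name) if name else "404 Not Found"
```

Because the controller is the only piece touching both model and view, each part can be unit-tested in isolation, which is exactly the testability benefit the abstract attributes to the MVC split.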


2021 ◽  
Vol 13 (2) ◽  
pp. 50
Author(s):  
Hamed Z. Jahromi ◽  
Declan Delaney ◽  
Andrew Hines

Content is a key influencing factor in Web Quality of Experience (QoE) estimation. A web user's satisfaction can be influenced by how long it takes to render and visualize the visible parts of the web page in the browser, referred to as the Above-the-fold (ATF) time. SpeedIndex (SI) has been widely used to estimate the perceived loading speed of ATF content and as a proxy metric for Web QoE estimation. Web application developers have been actively introducing innovative interactive features, such as animated and multimedia content, aiming to capture users' attention and improve the functionality and utility of web applications. However, the literature shows that, for websites with animated content, the ATF time estimated using state-of-the-art metrics may not accurately match the completed ATF time as perceived by users. This study introduces a new metric, Plausibly Complete Time (PCT), that estimates ATF time for a user's perception of websites with and without animations. PCT can be integrated with SI and web QoE models. The accuracy of the proposed metric is evaluated on two publicly available datasets. The proposed metric shows a high positive Spearman's correlation (rs = 0.89) with the perceived ATF reported by users for websites with and without animated content. This study demonstrates that using PCT as a KPI in QoE estimation models can improve the robustness of QoE estimation in comparison to using the state-of-the-art ATF time metric. Furthermore, experimental results showed that estimating SI using PCT improves the robustness of SI for websites with animated content. PCT estimation allows web application designers to identify where poor design has significantly increased ATF time and to refactor their implementation before it impacts the end-user experience.
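For context on the SI metric the study builds on: SpeedIndex is conventionally defined as the integral of visual incompleteness over the loading timeline. A discrete sketch of that standard computation is below; the PCT metric itself is not specified in the abstract, so it is not reproduced here.

```python
# Standard SpeedIndex computation: integrate (1 - visual completeness)
# over time, so pages that paint most content early score lower (better).

def speed_index(frames):
    """frames: list of (timestamp_ms, completeness in [0, 1]) pairs,
    sorted by timestamp, ending at full completeness.
    Returns SI in milliseconds (lower is better)."""
    si = 0.0
    for (t0, c0), (t1, _c1) in zip(frames, frames[1:]):
        # Completeness c0 holds over the interval [t0, t1).
        si += (1.0 - c0) * (t1 - t0)
    return si
```

For a page that is 0% complete at 0 ms, 50% at 500 ms and 100% at 1000 ms, this gives 1.0·500 + 0.5·500 = 750 ms. The problem the paper addresses arises because animations keep the pixels changing, so "completeness" never settles even though the user perceives the page as loaded.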


BMJ Open ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. e043328
Author(s):  
Ildikó Gágyor ◽  
Katrin Rentzsch ◽  
Stephanie Strube-Plaschke ◽  
Wolfgang Himmel

Objectives: To validate the urinary tract infection-Symptom and Impairment Questionnaire (UTI-SIQ-8), a questionnaire that consists of four items assessing symptom severity for dysuria, urgency, frequency, and low abdominal pain, and four items assessing the resulting impairment of activity caused by UTIs.
Design: Prospective observational study.
Setting: German primary care practices.
Participants: An unselected population of women with UTI. Women could participate online via a web application for smartphones, smartwatches and tablets, or use a paper-and-pencil version.
Main outcomes: Psychometric properties of the UTI-SIQ-8 regarding reliability, validity and sensitivity to change, assessed using factor analysis, multilevel analysis and network analysis.
Results: Data from 120 women with a total of 769 symptom reports across 7 days of measurement were analysed. The majority of the participating patients (87/120) used the web application via smartphones or other devices. The reliability of the UTI-SIQ-8 was high, with a Cronbach's alpha of 0.86 at intake; convergent and discriminant validity were satisfactory. Intraclass correlation demonstrated high sensitivity to change, with 68% of the total variance being due to time differences. These daily changes in an individual's symptoms moved in parallel with daily changes in the EQ-5D-5L (b=1.68, SE=0.12, p<0.001) and the visual analogue scale (b=0.03, SE=0.003, p<0.001), also highlighting convergent validity with respect to daily changes in symptom severity.
Conclusions: The present findings support the UTI-SIQ-8 questionnaire as an economical, reliable and valid instrument for the assessment of symptom severity and symptom change in women with uncomplicated UTI. The web application helped patients report symptoms on a daily basis. These findings may encourage primary care physicians to use the UTI-SIQ-8 in their daily practice and researchers to apply it in studies involving patients with uncomplicated UTI.
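The reliability figure reported above is Cronbach's alpha, whose formula is standard: alpha = k/(k-1) · (1 - Σ item variances / variance of the total score). A sketch of that computation follows; the example data are made up for illustration, not the study's data.

```python
from statistics import pvariance

# Cronbach's alpha from raw item scores (standard formula; the
# respondent data used in any call are illustrative, not the study's).

def cronbach_alpha(items):
    """items: list of k lists, one per questionnaire item, each holding
    the scores of the same n respondents in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]       # per-respondent sum score
    sum_item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))
```

Perfectly parallel items yield alpha = 1, while uncorrelated items drive it toward 0; the 0.86 reported for the UTI-SIQ-8 at intake indicates high internal consistency for an 8-item scale.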


2021 ◽  
Author(s):  
Jason Hunter ◽  
Mark Thyer ◽  
Dmitri Kavetski ◽  
David McInerney

Probabilistic predictions provide crucial information regarding the uncertainty of hydrological predictions, which is a key input for risk-based decision-making. However, they are often excluded from hydrological modelling applications because suitable probabilistic error models can be both challenging to construct and interpret, and the quality of results is often reliant on the objective function used to calibrate the hydrological model.

We present an open-source R package and an online web application that achieve the following two aims. First, these resources are easy to use and accessible, so that users need not have specialised knowledge of probabilistic modelling to apply them. Second, the probabilistic error model that we describe provides high-quality probabilistic predictions for a wide range of commonly used hydrological objective functions, which it is only able to do by including a new innovation that resolves a long-standing issue relating to model assumptions that previously prevented this broad application.

We demonstrate our methods by comparing our new probabilistic error model with an existing reference error model in an empirical case study that uses 54 perennial Australian catchments, the hydrological model GR4J, 8 common objective functions and 4 performance metrics (reliability, precision, volumetric bias and errors in the flow duration curve). The existing reference error model introduces additional flow dependencies into the residual error structure when it is used with most of the study objective functions, which in turn leads to poor-quality probabilistic predictions. In contrast, the new probabilistic error model achieves high-quality probabilistic predictions for all objective functions used in this case study.

The new probabilistic error model, the open-source software and the web application aim to facilitate the adoption of probabilistic predictions in the hydrological modelling community, and to improve the quality of predictions and of the decisions made using those predictions. In particular, our methods can be used to achieve high-quality probabilistic predictions from hydrological models that are calibrated with a wide range of common objective functions.
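The abstract does not detail the error model itself, but the general residual-error approach it belongs to can be sketched: deterministic model output is augmented with replicates drawn from a fitted residual distribution. The sketch below assumes Gaussian residuals in log space, a common heteroscedastic choice for streamflow; this is an illustration of the family of methods, not the paper's actual model.

```python
import math
import random

# Illustrative residual-error post-processor: wrap each deterministic
# flow prediction in replicates drawn from a log-space Gaussian residual
# model (assumed here for illustration; not the paper's specific model).

def probabilistic_predictions(det_flows, sigma_log, n_reps=1000, seed=1):
    """det_flows: deterministic flow predictions (must be > 0).
    sigma_log: fitted std. dev. of residuals in log space.
    Returns one list of n_reps replicates per prediction."""
    rng = random.Random(seed)
    preds = []
    for q in det_flows:
        reps = [math.exp(math.log(q) + rng.gauss(0.0, sigma_log))
                for _ in range(n_reps)]
        preds.append(reps)
    return preds
```

Quantiles of each replicate set give prediction intervals, and metrics such as reliability and precision (two of the four used in the case study) are then computed by comparing those intervals against observed flows.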


2016 ◽  
Vol 28 (2) ◽  
pp. 241-251 ◽  
Author(s):  
Luciane Lena Pessanha Monteiro ◽  
Mark Douglas de Azevedo Jacyntho

The study addresses the use of the Semantic Web and the Linked Data principles proposed by the World Wide Web Consortium for the development of a web application for the semantic management of scanned documents. The main goal is to record scanned documents, describing them in a way the machine is able to understand and process, filtering content and assisting in the search for such documents when a decision-making process is underway. To this end, machine-understandable metadata, created through the use of reference Linked Data ontologies, are associated with the documents, creating a knowledge base. To further enrich the process, a (semi)automatic mashup of these metadata with data from the Web of Linked Data is carried out, considerably increasing the scope of the knowledge base and making it possible to extract new data related to the content of the stored documents from the Web and combine them, without the user making any effort or perceiving the complexity of the whole process.
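The metadata-as-triples idea underlying such a knowledge base can be sketched compactly. The sketch below uses plain Python tuples in place of a real RDF library; the predicate URIs follow the Dublin Core terms vocabulary, but the document URI, values, and helper names are illustrative.

```python
# Sketch of describing a scanned document with machine-understandable
# metadata as (subject, predicate, object) triples; the vocabulary is
# Dublin Core terms, the document URI and values are hypothetical.
DCT = "http://purl.org/dc/terms/"

def describe_document(doc_uri, title, subject):
    """Return the triples recording a scanned document's metadata."""
    return [
        (doc_uri, DCT + "title", title),
        (doc_uri, DCT + "subject", subject),
    ]

def find_by_subject(triples, subject):
    """Search step: return document URIs whose dcterms:subject matches."""
    return [s for (s, p, o) in triples
            if p == DCT + "subject" and o == subject]
```

Because predicates are shared URIs rather than ad hoc column names, triples from the local knowledge base can later be merged with external Linked Data about the same subjects, which is the mashup step the abstract describes.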

