A Method for Information Grabbing, Bypassing Security and Detecting Web Application Vulnerabilities

2018, Vol 7 (4.36), pp. 762
Author(s):  
B. J. Santhosh Kumar, B. R. Pushpa

A single file on the web can contain text, images, audio, video and formatting instructions enclosed within a script. Website files are hosted on servers, which "serve" those files to individual users upon request. An anonymous user with minimal credentials can issue requests on behalf of a legitimate user and grab sensitive, confidential and personal information without the legitimate user's knowledge [3]. The proposed method takes a URL as input for finding web vulnerabilities. Testing of the proposed method was conducted to evaluate its performance in terms of accuracy, measured through false negative and false positive results. An experiment was also conducted for web vulnerability assessment and penetration testing. The proposed method additionally checks for information grabbing from the web using Google dorks. A Google dork can help an attacker enter a network without permission and/or gain access to unauthorized information; such advanced search strings, called Google dork queries, are used to locate sensitive information. This paper describes a method for detecting web application vulnerabilities using Google dorks, bypassing first-level security in a web application, and recovering usernames and passwords on a social networking site.
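As an illustration of the kind of queries the method relies on, here is a minimal Python sketch that composes a few classic dork operators against a placeholder domain. The target domain and the specific dork strings are illustrative assumptions, not the paper's own query set, and such probing should only ever be run against systems one is authorized to test.

```python
# Hypothetical dork-query composition for an authorized assessment of
# "example.com"; site:, filetype:, intitle: and inurl: are standard
# Google search operators.
TARGET = "example.com"

DORKS = [
    f'site:{TARGET} intitle:"index of" "parent directory"',  # exposed directory listings
    f'site:{TARGET} filetype:log',                           # stray log files left on the server
    f'site:{TARGET} inurl:admin',                            # reachable admin panels
]

for query in DORKS:
    # Each string would be submitted to the search engine (manually or
    # via an approved search API); hits flag candidate URLs for review.
    print(query)
```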

2007, Vol 9 (2)
Author(s):  
P. L. Wessels, L. P. Steenkamp

One of the critical issues in managing information within an organization is to ensure that proper controls exist and are applied in allowing people access to information. Passwords are used extensively as the main control mechanism to identify users wanting access to systems, applications, data files, network servers or personal information. In this article, the issues involved in selecting and using passwords are discussed, and the current practices employed by users in creating and storing passwords to gain access to sensitive information are assessed. The results of this survey indicate that information managers cannot rely only on users to employ proper password control in order to protect sensitive information.
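One concrete "proper control" in the spirit of the article is never storing passwords in plain text. The sketch below is a minimal illustration using only Python's standard library; the iteration count and salt size are assumptions, not figures from the survey.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return a (salt, digest) pair using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # per-user random salt defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```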


Author(s):  
Ademola Philip Abidoye, Boniface Kabaso

Phishing is a cyber-attack that uses a disguised email as a weapon and has been on the rise in recent times. An innocent Internet user who inadvertently clicks on a fraudulent link may fall victim to divulging personal information such as a credit card PIN, login credentials, banking information and other sensitive data. There are many ways in which attackers can trick victims into revealing their personal information. In this article, we select important phishing URL features that an attacker can use to trick Internet users into taking the attacker's desired action. We use two machine learning techniques to accurately classify our data sets, and we compare the performance of other related techniques with our scheme. The results of the experiments show that the approach is highly effective in detecting phishing URLs, attaining an accuracy of 97.8% with a 1.06% false positive rate, a 0.5% false negative rate, and an error rate of 0.3%. The proposed scheme performs better than the other selected related work, which shows that our approach can be used in real-time applications for detecting phishing URLs.
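The abstract does not spell out the exact feature set or classifiers, so the following sketch is only an assumption-laden illustration of the general approach: a handful of lexical URL features fed to a scikit-learn classifier (a random forest stands in for the article's two techniques, and the sample URLs are made up).

```python
from urllib.parse import urlparse
from sklearn.ensemble import RandomForestClassifier

def url_features(url: str) -> list[float]:
    parsed = urlparse(url)
    host = parsed.netloc
    return [
        len(url),                                  # long URLs often hide the real host
        url.count("."),                            # many dots suggest stacked subdomains
        1.0 if "@" in url else 0.0,                # '@' misleads users about the host
        1.0 if "-" in host else 0.0,               # hyphenated look-alike domains
        1.0 if parsed.scheme != "https" else 0.0,  # no TLS
    ]

# Tiny labelled set for illustration: 1 = phishing, 0 = legitimate.
urls = ["http://paypa1-login.example.com/@verify", "https://www.wikipedia.org/"]
labels = [1, 0]
X = [url_features(u) for u in urls]
clf = RandomForestClassifier(n_estimators=100).fit(X, labels)
print(clf.predict([url_features("http://secure-bank-login.example.net")]))
```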


2020, pp. 5-9
Author(s):  
Manasvi Srivastava, Vikas Yadav, Swati Singh, ...

The Internet is the largest source of information created by humanity. It contains a variety of material available in various formats such as text, audio, video and much more. Web scraping is one way to harvest it: a set of strategies for obtaining information from a website automatically instead of copying the data manually. Many web-based data extraction methods are designed to solve specific problems and work on ad-hoc domains, and various tools and technologies have been developed to facilitate web scraping. Unfortunately, the appropriateness and ethics of using these web scraping tools are often overlooked. There are hundreds of web scraping software packages available today, most of them designed for Java, Python and Ruby, spanning both open source and commercial software. Web-based software such as YahooPipes, Google Web Scrapers and the Outwit extension for Firefox are the best tools for beginners in web scraping. Web extraction essentially replaces the manual extraction and editing process, providing an easy and better way to collect data from a web page, convert it into the desired format and save it to a local or archive directory. In this paper, among the kinds of scraping, we focus on those techniques that extract the content of a Web page. In particular, we apply scraping techniques to a variety of diseases with their own symptoms and precautions.
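For a concrete picture of the content-extraction techniques the paper focuses on, here is a minimal scraping sketch using the widely used requests and BeautifulSoup libraries. The URL is a placeholder, the CSS structure is assumed, and real use must respect the target site's robots.txt and terms of service.

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/diseases/influenza"  # hypothetical disease page

response = requests.get(URL, timeout=10)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

# Pull the content elements; selectors depend entirely on the target page.
title = soup.find("h1")
paragraphs = [p.get_text(strip=True) for p in soup.find_all("p")]

print(title.get_text(strip=True) if title else "no <h1> found")
for text in paragraphs[:3]:  # e.g. symptoms and precautions sections
    print("-", text)
```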


2021, Vol 1, pp. 84-90
Author(s):  
Rustam Kh. Khamdamov, Komil F. Kerimov

Web applications are increasingly being used in activities such as reading news, paying bills, and shopping online. As these services grow, so do the number and scale of attacks on them, such as theft of personal information, banking data and other cases of cybercrime. All of the above is a consequence of the openness of information in the database: web application security depends heavily on database security. Data requested by a client is usually retrieved through a set of queries issued on behalf of the application user. If the data entered by the user is not checked very carefully, a whole host of attack types can use the web application to create security threats to the database. Unfortunately, due to time constraints, web application programmers usually focus on the functionality of web applications, and only a few worry about security. This article provides methods for detecting anomalies using a database firewall. The methods of penetration and types of attacks are investigated, and a database firewall is proposed that can block both known and unknown attacks on web applications. This software can work in various modes depending on the configuration; there are almost no false positives, and the performance overhead is relatively small. The developed database firewall is designed to protect against attacks on web application databases. It works as a proxy, which means that SQL requests received from the client are first sent to the firewall rather than to the database server itself. The firewall analyzes each request: queries that are considered anomalous are blocked, and an empty result is returned to the client.
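As a rough illustration of the proxy's screening step, the sketch below applies a few signature rules to incoming SQL before forwarding it. The patterns are illustrative assumptions and stand in for the article's combination of signature-based blocking and anomaly detection.

```python
import re

# Hypothetical signatures for common injection shapes.
SUSPICIOUS = [
    re.compile(r"\bunion\b.+\bselect\b", re.I),        # UNION-based injection
    re.compile(r"\bor\b\s+\d+\s*=\s*\d+", re.I),       # tautologies like OR 1=1
    re.compile(r";\s*(drop|truncate|alter)\b", re.I),  # stacked destructive statements
    re.compile(r"--|/\*"),                             # comments used to cut off clauses
]

def screen_query(sql: str) -> bool:
    """Return True if the query may be forwarded, False to block it."""
    return not any(p.search(sql) for p in SUSPICIOUS)

# The proxy would return an empty result set instead of forwarding this:
print(screen_query("SELECT * FROM users WHERE id = '1' OR 1=1 --'"))  # False
print(screen_query("SELECT name FROM products WHERE id = 42"))        # True
```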


Author(s):  
Samyak Sadanand Shravasti

Abstract: Phishing occurs when people's personal information is stolen via email, phone, or text communications. In smishing, the Short Message Service (SMS) is used for the cyber-attack; smishing is thus a form of theft of sensitive information. People are more likely to give out personal information such as account details and passwords when they receive SMS messages, and this data can be used to steal money or personal information from a person or a company. As a result, smishing is a critical issue to consider. The proposed model uses artificial intelligence to detect smishing: by analysing an SMS message, it can successfully detect a smishing attempt. Finally, we evaluate and analyse our proposed model to show its efficacy. Keywords: Phishing, Smishing, Artificial Intelligence, LSTM, RNN
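Consistent with the listed keywords (LSTM, RNN), a minimal Keras sketch of an LSTM-based SMS classifier is shown below. The vocabulary size, sequence length and layer widths are assumptions, since the paper's exact architecture is not given here.

```python
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 10_000   # assumed tokenizer vocabulary
MAX_LEN = 40          # SMS messages are short

model = tf.keras.Sequential([
    layers.Input(shape=(MAX_LEN,)),
    layers.Embedding(VOCAB_SIZE, 64),       # token ids -> dense vectors
    layers.LSTM(64),                        # recurrent layer reads the sequence
    layers.Dense(1, activation="sigmoid"),  # 1 = smishing, 0 = legitimate
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Training would use tokenized, padded SMS texts with binary labels:
# model.fit(x_train, y_train, validation_split=0.1, epochs=5)
```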


Author(s):  
Marco Antonio Ramírez-Hernández, Randolfo Alberto Santos-Quiroz

In the development of an information platform focused on the daily operations of a physical rehabilitation clinic, two main software components are currently being developed for interaction with users: a web application focused on the operations of clinic personnel (specialists and training professionals) and a native mobile application focused on the actions of patients. Each of them needs to interact with the same information repository. Analyzing the patient side, problems arise from the lack of permanent connectivity to the main data store over the Internet or another data transmission channel, so the requirement emerges to keep interacting with the generated personal information, which can be achieved by synchronizing data between client and server.
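A minimal sketch of the kind of client-server synchronization the authors describe might look as follows: the mobile client works against a local copy and merges records by "last modified" timestamps once connectivity returns. The record shape and the newest-copy-wins conflict rule are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Record:
    id: int
    payload: str
    modified: float  # Unix timestamp of last change

def sync(local: dict[int, Record], remote: dict[int, Record]) -> None:
    """Two-way merge: the newer copy of each record wins on both sides."""
    for rid in set(local) | set(remote):
        l, r = local.get(rid), remote.get(rid)
        if l is None:
            local[rid] = r        # new on the server, pull it
        elif r is None:
            remote[rid] = l       # created offline, push it
        elif l.modified > r.modified:
            remote[rid] = l       # client edit is newer
        else:
            local[rid] = r        # server edit is newer (or tie: server wins)
```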


Author(s):  
Dr. J. Padmavathi, Sirvi Ashok Kumar Mohanlal

Today social media is an integral part of many people's lives. Most of us are users of one or more platforms such as Facebook, Twitter, Instagram and LinkedIn. Social media networks are the most common platforms for communicating with our friends and family and for sharing thoughts, photos, videos and lots of other information in a common area of interest. Privacy has become an important concern in social networking sites: users are often unaware of the privacy risks involved and share their sensitive information on these sites. While these platforms are free and offer unrestricted access to their services, they burden users with many issues such as privacy, security, data harvesting, content censorship and leakage of personal information. This paper analyzes the major users of social media networks, namely college students. It was intended to assess the extent to which consumers are aware of the risks of free usage and how to mitigate these privacy issues.


2018, pp. 703-728
Author(s):  
Pradipta Roy, Debarati Dey, Debashis De, Swati Sinha

In today's world, sensitive information like secret messages, financial transactions, medical reports and personal information is transferred over public communication channels. Since the advancement of communication began, data security has been a massive problem. The increasing rate of eavesdropping over communication channels led to the introduction of cryptographic algorithms for data transmission, and different traditional cryptographic techniques are adopted worldwide for protected data transmission. A recent advancement in this field is DNA-based cryptography. This chapter describes the application of DNA as a computational tool after its capability was discovered by Leonard M. Adleman in 1994. Its random nature also helps a cryptographic algorithm become unbreakable. Conventional cryptography methods are sometimes susceptible to attack by an intruder; therefore the idea of using codon-based DNA as a computational tool is used in this cryptography method as an alternative that brings new hope to communication technology.
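At its core, the DNA representation maps data onto the four nucleotides, two bits per base (four bases per byte, with triplets of bases forming codons). The sketch below shows only this encoding step under a fixed A/C/G/T table, which is an assumption for illustration; real DNA cryptosystems add a keyed substitution on top, omitted here.

```python
# Fixed 2-bit-per-base table; a real scheme would derive this from a key.
TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
FROM_BASE = {b: v for v, b in TO_BASE.items()}

def encode_dna(data: bytes) -> str:
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # high bits first
            bases.append(TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode_dna(strand: str) -> bytes:
    out = bytearray()
    for i in range(0, len(strand), 4):  # four bases rebuild one byte
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | FROM_BASE[base]
        out.append(byte)
    return bytes(out)

strand = encode_dna(b"key")
assert decode_dna(strand) == b"key"
print(strand)  # "CGGTCGCCCTGC"
```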


Author(s):  
Roel During, Marcel Pleijte, Rosalie I. van Dam, Irini E. Salverda

Open data and citizen-led initiatives can be both friends and foes. Where it is available and ‘open', official data not only encourages increased public participation but can also generate the production and scrutiny of new material, potentially of benefit to the original provider and others, official or otherwise. In this way, official open data can be seen to improve democracy or, more accurately, the so-called ‘participative democracy'. On the other hand, the public is not always eager to share their personal information in the most open ways. Private and sometimes sensitive information, however, is required to initiate projects of societal benefit in difficult times. Many citizens appear content to channel personal information exchange via social media instead of putting it on public web sites, because the perceived benefits of sharing and complete openness do not outweigh the disadvantages or fear of regulation. This is caused by various sources of contingency, such as the different appeals made to citizens, construed in the discourses on the participation society and on representative democracy, calling for social openness in the former and privacy protection in the latter. Moreover, the discourse on open data is an economic argument fighting the rules of privacy instead of promoting open data as one of the prerequisites for social action. Civil servants acknowledge that access to open data via all sorts of apps could contribute to the mushrooming of public initiatives, but are reluctant to release person-related sensitive information. The authors describe and discuss this dilemma in the context of some recent case studies from the Netherlands concerning governmental programmes on open data and citizens' initiatives, to highlight both the governance constraints and uncertainties as well as citizens' concerns about data access and data sharing. It will be shown that openness means and is understood differently in the participation society and in representative democracy: there is a tension between sharing private social information and transparency. Looking at openness from both sides reveals a double contingency: understandings of and intentions about this openness invoke mutually reinforcing uncertainties, and this double contingency hampers citizens' eagerness to participate. The paper concludes with a practical recommendation for improving data governance.


Author(s):  
Kannan Balasubramanian

The obvious risks of a security breach are that unauthorized individuals: 1) can gain access to restricted information and 2) may be able to escalate their privileges in order to compromise the application and the entire application environment. The areas that can be compromised include user and system administration accounts. In this chapter we identify the major classes of web application vulnerabilities, give some examples of actual vulnerabilities found in real-life web application audits, and describe some countermeasures for those vulnerabilities. The classes are: 1) authentication, 2) session management, 3) access control, 4) input validation, 5) redirects and forwards, 6) injection flaws, 7) unauthorized view of data, 8) error handling, 9) cross-site scripting, 10) security misconfigurations and 11) denial of service.
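As one concrete countermeasure for the injection-flaw class, the sketch below shows the parameterized-query pattern, which keeps user input out of the SQL text entirely. The table, column names and sample data are illustrative assumptions, and sqlite3 stands in for any DB-API driver.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

user_input = "alice' OR '1'='1"  # classic injection attempt

# Vulnerable pattern: building the SQL by string concatenation would let
# the tautology match every row. Safe pattern: the driver binds the value,
# so the quote characters are treated as plain data.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection string matches no real user
```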

