Web browsers forensic analysis review

2015 ◽  
Vol 12 (2) ◽  
pp. 757 ◽  
Author(s):  
Erkan Baran ◽  
Huseyin Çakır ◽  
Çelebi Uluyol

<p>Nowadays, web browser tools are used intensively alongside web applications. As a result, browsers provide the infrastructure for a large majority of crimes: a perpetrator or suspect can use a browser to collect information, hide a crime, learn new criminal methods, or put what has been learned into practice. This study seeks answers to questions such as what process can be followed to detect the traces left on computers where browsers were used, which data can be examined in which files, and when and which sites were accessed. According to research by the web statistics tool W3Counter, the Chrome web browser, which holds a 43% usage share worldwide, is taken as the reference browser in this study as the one most in demand among users, and traces are followed in its related files. The "private mode" feature found in the vast majority of today's browsers is also examined. This feature of the reference browser is tested and traced, with data sought in the file systems and in RAM. By revealing what kind of data can be obtained when "private mode" is used, the study discusses the effects of "private mode" on evidence-gathering work concerning suspects or perpetrators.</p>
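The abstract does not list the exact profile files examined, but the kind of file-level trace analysis it describes can be sketched against Chrome's well-known `History` SQLite database, whose `urls` table stores visit records with timestamps in microseconds since the WebKit epoch (1601-01-01). As in any forensic workflow, such analysis should run on a copy of the file, never the live profile:

```python
import sqlite3
from datetime import datetime, timedelta

# Chrome stores timestamps as microseconds since 1601-01-01 (WebKit epoch)
WEBKIT_EPOCH = datetime(1601, 1, 1)

def chrome_history(db_path):
    """Yield (url, title, last_visit) rows from a copy of Chrome's History DB."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT url, title, last_visit_time FROM urls "
            "ORDER BY last_visit_time DESC"
        )
        for url, title, ts in rows:
            visited = WEBKIT_EPOCH + timedelta(microseconds=ts)
            yield url, title, visited
    finally:
        con.close()
```

On a real system the file lives in the Chrome profile directory (e.g. `Default/History`); copying it first avoids both locking issues and evidence contamination.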

2019 ◽  
Vol 11 (7) ◽  
pp. 147 ◽  
Author(s):  
Masaki Kohana ◽  
Shinji Sakamoto ◽  
Shusuke Okamoto

Real-time web applications such as virtual worlds require considerable computing resources. However, as the number of servers increases, so do the maintenance and financial costs. To share tasks among web browsers, the browsers must share data; therefore, a network must be constructed among them. In this paper, we propose constructing a web browser network based on the Barabasi–Albert model (BA model). We focus on a web-based multiplayer online game that requires frequent communication and significant computing resources, and we attempt to optimize computing resource utilization across the web browsers. We improve upon the method in our previous study, which constructed a network for a web-based virtual world using only location information. When a new user logged into the world, the web browser connected to two other browsers whose users were located close to that user. The experimental results of that method showed 50% data coverage, which was insufficient to display the game screen, because the web browser must display the characters in the virtual world. In this study, we use the BA model to construct more efficient networks than those in the previous study and thereby increase data coverage. Our new method uses the web browser's number of connections together with location information to calculate the probability of selecting a browser. The experimental results show that the data coverage exceeds 90%, a significant improvement over the previous method.
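The abstract does not give the exact selection formula, so the sketch below is only an assumed illustration of the general idea: a joining browser picks peers with probability proportional to a weight combining a node's degree (the BA model's preferential-attachment term) with closeness to the new user's location. Both the function name `select_peers` and the weight `degree / (1 + distance)` are illustrative choices, not the authors' method:

```python
import math
import random

def select_peers(browsers, new_pos, m=2, rng=random):
    """Pick m distinct peers for a joining browser, BA-model style.

    browsers: list of (degree, (x, y)) tuples for existing browsers.
    new_pos:  (x, y) location of the joining user.
    Weight = degree / (1 + distance): an assumed blend of the BA
    degree term with the location term described in the abstract.
    """
    weights = []
    for degree, pos in browsers:
        dist = math.dist(pos, new_pos)          # Python 3.8+
        weights.append(max(degree, 1) / (1.0 + dist))
    total = sum(weights)

    chosen = set()
    while len(chosen) < min(m, len(browsers)):
        r = rng.uniform(0, total)               # roulette-wheel selection
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                chosen.add(i)
                break
    return sorted(chosen)
```

With this weighting, a nearby high-degree browser is picked far more often than a distant low-degree one, which is the hub-forming behaviour the BA model relies on.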


Author(s):  
Shashank Gupta ◽  
B. B. Gupta

Cross-Site Scripting (XSS) is a client-side browser vulnerability caused by improper sanitization of user input embedded in Web pages. Researchers have proposed various defensive strategies, vulnerability scanners, etc., but XSS flaws still remain in Web applications due to inadequate understanding and implementation of these defensive tools and strategies. Therefore, in this chapter, the authors propose a security model called Browser-Dependent XSS Sanitizer (BDS), deployed on the client-side Web browser, to eliminate the effects of XSS vulnerabilities. Earlier client-side solutions degrade the Web browser's performance; here, the authors use a three-step approach that blocks XSS attacks without significantly degrading the user's Web browsing experience. In the reported experiments, the approach is capable of preventing XSS attacks on various modern Web browsers.
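The chapter's three-step BDS model is not detailed in this abstract, so the sketch below only illustrates the underlying principle it builds on: sanitizing untrusted input so that it is rendered as text rather than interpreted as markup or script. The helper name `sanitize` is illustrative, not part of BDS:

```python
import html

def sanitize(user_input: str) -> str:
    """Escape HTML metacharacters (&, <, >, quotes) so that untrusted
    input cannot break out into markup or inject script."""
    return html.escape(user_input, quote=True)

payload = '<script>alert("xss")</script>'
safe = sanitize(payload)  # angle brackets and quotes become inert entities
```

Output-encoding like this is only one layer; real defenses combine it with context-aware encoding and policies such as CSP.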




Computers ◽  
2021 ◽  
Vol 10 (12) ◽  
pp. 165
Author(s):  
Kris Hughes ◽  
Pavlos Papadopoulos ◽  
Nikolaos Pitropakis ◽  
Adrian Smales ◽  
Jawad Ahmad ◽  
...  

Web browsers are among the most used applications on virtually every computational device today. Hence, they play a pivotal role in any forensic investigation and help determine whether nefarious or suspicious activity has occurred on a device. Our study investigates the use of private mode and the browsing artefacts it leaves within four prevalent web browsers, analyzing both the hard disk and random access memory. Forensic analysis of the target device showed that private mode matched each web browser vendor's claims: browsing activity, search history, cookies, and temporary files are not saved to the device's hard disk. In volatile memory analysis, however, the majority of artefacts in the test cases were retrieved. Hence, a malicious actor taking a similar approach could potentially retrieve sensitive information left behind on the device without the user's consent.
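A real volatile-memory investigation of this kind would use a framework such as Volatility, but the core idea of recovering browsing artefacts from RAM can be sketched crudely as a pattern scan over a raw memory image. The regex below is a rough assumption of what a URL artefact looks like, not a forensic-grade signature:

```python
import re

# printable-ASCII run after an http(s) scheme; a deliberately crude pattern
URL_RE = re.compile(rb'https?://[\x21-\x7e]{4,200}')

def carve_urls(dump_path):
    """Scan a raw memory image for URL-like byte strings, a simplified
    stand-in for the volatile-memory artefact search described above."""
    with open(dump_path, 'rb') as f:
        data = f.read()
    return [m.group().decode('ascii', 'replace') for m in URL_RE.finditer(data)]
```

Even this naive carve illustrates the study's point: strings that never touch the hard disk in private mode can still linger in RAM until the memory is reused.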


Author(s):  
DHANASHREE TAWARE ◽  
NAMRATA ATTKARE ◽  
DIVYA SINGH

The Internet has become a very important part of our day-to-day lives. It is a broad medium for communication and the exchange of ideas among people in every corner of the world. We propose a system that provides speech-interactive web application services. Our main aim is to offer these services to users with special needs who are unable to use the current system efficiently. The proposed work focuses mainly on web applications: people with disabilities who are often unable to access the Internet can use this system to download news or access their e-mail through speech. The proposed system handles web applications along with control of the operating system, mouse, and keyboard through speech, so that a person can operate the computer without the use of their hands, forming an interface between the computer and the user. Our proposal uses SAPI, which provides commands to the central application handled by the GUI. We thus look forward to developing web applications driven by speech interaction.


Author(s):  
Firmansyah Adiputra ◽  
Khabib Mustofa

A desktop application runs locally in a desktop environment and can be accessed only by desktop users. This differs from a web application, which can be accessed from anywhere over a network. Unlike a desktop application, however, a web application running in a web browser cannot integrate with the desktop applications on the client from which it is accessed.
This research develops a prototype framework named HAF (Hybrid Application Framework). HAF is used to develop and execute a new type of desktop application, named HyApp (Hybrid Application). Through HAF, a HyApp is built using web technologies and can be accessed either locally or over a network. When accessed locally, even though it is built with web technologies, a HyApp can still communicate with other desktop applications. In addition, using the APIs provided by HAF, a HyApp can behave differently depending on whether it is accessed locally or remotely.
Keywords—framework, desktop applications, web applications


2021 ◽  
Author(s):  
Robert Forrest Roddy

More than one hundred years after it sank, the Titanic is still probably the most remembered ship in the world. This paper briefly traces the history of the Titanic, from the White Star Line's decision to build the Olympic-class ships through the recently signed treaty protecting the wreck. It shows that many of the ship's design features were far ahead of the rest of the industry, but that some compromises went against the naval architect's wishes. A number of myths concerning the ship are dispelled. The circumstances leading up to the collision with the iceberg and the sinking of the ship are examined, followed by an analysis of the sinking, the discovery of the wreck, and, finally, after almost thirty-five years, a treaty to protect the ship.


i-com ◽  
2004 ◽  
Vol 3 (1/2004) ◽  
pp. 4-12 ◽  
Author(s):  
Harald Weinreich ◽  
Hartmut Obendorf ◽  
Winfried Lamersdorf

Using a web browser is easy to learn, yet navigating the Web still poses challenges even for experienced users. One reason lies in the variety of link types and link targets, which are often not transparent to the user and can therefore produce surprises after a link is clicked. The HyperScout project explores ways to simplify Web navigation by making information about the type of a link and the referenced object visible to the user. This paper presents the concepts that were developed and the results of an evaluation of the prototype derived from them. The results shed light on which information is helpful to users before they select a link and how it could be presented.


Author(s):  
Punam Bedi ◽  
Neha Gupta ◽  
Vinita Jindal

The World Wide Web is the part of the Internet that provides a data-dissemination facility to people. The contents of the Web are crawled and indexed by search engines so that they can be retrieved, ranked, and displayed in response to users' search queries. The contents that can be easily retrieved using Web browsers and search engines make up the Surface Web; all information that cannot be crawled by search engines' crawlers falls under the Deep Web. Deep Web content never appears in the results displayed by search engines, yet although this part of the Web remains hidden, it can be reached using targeted searches in a normal Web browser. Unlike the Deep Web, there exists a portion of the World Wide Web that cannot be accessed without special software: the Dark Web. This chapter describes how the Dark Web differs from the Deep Web and elaborates on the software commonly used to enter it. It highlights the illegitimate and legitimate sides of the Dark Web and specifies the role played by cryptocurrencies in expanding the Dark Web's user base.


Author(s):  
Filippo Ricca ◽  
Paolo Tonella

The World Wide Web has become an interesting opportunity for companies to deliver services and products at a distance. Correspondingly, the quality of the Web applications responsible for the related transactions has become a crucial factor. Quality can be improved by properly modeling the application during its design, but when the whole life cycle is considered, the availability of a consistent model of the application is also fundamental during maintenance and testing. This chapter addresses the problem of recovering a model of a Web application from its implementation. Algorithms are provided to obtain such a model even in the presence of a highly dynamic structure. Based upon this model, several static analysis techniques, including reaching definitions and slicing, are considered, as well as some restructuring techniques. White-box testing exploits the model in that the related coverage levels are based on it, while statistical testing assumes that transitions in the model are labeled with the conditional probabilities of being traversed.
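The chapter's recovery algorithms are not reproduced in this abstract; the sketch below only illustrates the very first step such a recovery typically takes: extracting the link targets of each page, which become the edges of the navigation model. The names `LinkExtractor` and `page_edges` are illustrative, not the authors' API:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags on one page: the raw
    outgoing edges of a navigation model like the one recovered here."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            for name, value in attrs:
                if name == 'href' and value:
                    self.links.append(value)

def page_edges(page_name, html_source):
    """Return (page, target) edges for one page of the application."""
    parser = LinkExtractor()
    parser.feed(html_source)
    return [(page_name, href) for href in parser.links]
```

Crawling every page and taking the union of these edges yields a static navigation graph; handling the "highly dynamic structure" the chapter mentions (form submissions, server-generated links) is where the real algorithms go beyond this sketch.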

