The Internet, the World Wide Web, Library Web Browsers, and Library Web Servers

2017 ◽  
Vol 19 (1) ◽  
pp. 50-52 ◽  
Author(s):  
Jian-Zhong (Joe) Zhou

This article first examines the difference between two very familiar and sometimes synonymous terms, the Internet and the Web. The article then explains the relationship between the Web's protocol, HTTP, and other high-level Internet protocols such as Telnet and FTP, and provides a brief history of Web development. Next, the article analyzes the mechanism by which a Web browser (client) "talks" to a Web server on the Internet. Finally, the article studies the market growth for Web browsers and Web servers between 1993 and 1999. Two statistical sources were used in the Web market analysis: a survey conducted by the University of Delaware Libraries for the 122 members of the Association of Research Libraries, and data for the entire Web industry from different Web survey agencies.
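The browser-to-server "talk" the article describes is an HTTP exchange: the client sends a request line plus headers, and the server answers with a status line, headers, and a body. A minimal sketch of building an HTTP/1.0 request and parsing a response status line (the hostname is illustrative; in practice the request would be sent over a TCP socket to port 80):

```python
def build_get_request(host: str, path: str = "/") -> str:
    # HTTP/1.0 keeps the example simple: one request, one response, then close.
    # Lines are separated by CRLF; a blank line ends the header section.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

def parse_status_line(response: bytes) -> tuple[str, int]:
    # The first line of an HTTP response is "<version> <code> <reason-phrase>".
    version, code, _reason = response.split(b"\r\n", 1)[0].split(b" ", 2)
    return version.decode("ascii"), int(code)

# Build the request a browser would send for the server's root page.
request = build_get_request("example.com")
```

`request` here is exactly the byte sequence a minimal client would write to the socket; the server's reply can then be fed to `parse_status_line` to recover the result code.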

JOURNAL ASRO ◽  
2019 ◽  
Vol 10 (2) ◽  
pp. 105
Author(s):  
Khairul Huda ◽  
Zaenal Syahlan ◽  
M Syaifi ◽  
Edy Widodo

The development of information technology has proceeded in line with the development of human civilization, and one of its most helpful products is the internet. The internet has become an appropriate means of conveying information quickly, effectively, and accurately to all soldiers and the general public through websites. At present, the histories of the Indonesia Warship Raden Eddy Martadinata - 331 and the Indonesia Warship I Gusti Ngurah Rai - 332 are still stored as documents on a computer and printed on sheets of paper, so the delivery of this historical information must be developed further for the current era. Because managing this historical data in printed form and information books is ineffective, an alternative was created: a design for a web-based Indonesia Warship historical information system within the Indonesian Armada fleet, which aims to facilitate searching the ships' history and to let members access the information easily and efficiently. The system uses PHP as the programming language and MySQL as the database.

Keywords: Website-Based Indonesia Warship History Information System, PHP, MySQL.


Author(s):  
Ibrahim Mahmood Ibrahim ◽  
Siddeeq Y. Ameen ◽  
Hajar Maseeh Yasin ◽  
Naaman Omar ◽  
Shakir Fattah Kak ◽  
...  

Today, web services have increased rapidly and are accessed by many users, leading to massive traffic on the Internet. The web server suffers from this problem: as the number of users grows, it becomes challenging to manage the total traffic, the server becomes overloaded, response times rise, and bottlenecks appear, so this massive traffic must be shared among several servers. Load balancing technologies and server clusters are therefore potent methods for dealing with server bottlenecks. Load balancing techniques distribute the load among the servers in a cluster so that all web servers are balanced. The motivation of this paper is to give an overview of the several load balancing techniques used to enhance the efficiency of web servers in terms of response time, throughput, and resource utilization. Researchers have addressed different algorithms with good results; for example, the pending-job and IP hash algorithms achieve better performance.
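Two of the techniques the survey names, round robin and IP hash, can be sketched in a few lines; the backend addresses below are hypothetical, and real balancers (e.g. nginx, HAProxy) implement the same ideas at the connection level:

```python
import hashlib
from itertools import cycle

servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]  # hypothetical backend pool

# Round robin: hand each new request to the next server in turn,
# spreading load evenly regardless of who the client is.
_rr = cycle(servers)
def round_robin_select() -> str:
    return next(_rr)

# IP hash: hash the client address so the same client always lands
# on the same backend (useful for session stickiness).
def ip_hash_select(client_ip: str) -> str:
    digest = hashlib.md5(client_ip.encode("ascii")).digest()
    return servers[int.from_bytes(digest[:4], "big") % len(servers)]
```

Round robin optimizes for even distribution; IP hash trades some evenness for the guarantee that a client's session state stays on one server.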


Compiler ◽  
2013 ◽  
Vol 2 (2) ◽  
Author(s):  
Saryanto Saryanto ◽  
Sumarsono Sumarsono ◽  
Nurcahyani Dewi Retnowati

Data communication on the internet today is complex; speed, for example, has become a very important factor, since everyone wants data communication services to be as fast as possible. When an application acts as a communication liaison between client and server, the web service uses a data serialization format to transmit the data. Before data is sent, either from the client to the server or vice versa, it must first be converted into a specific format according to the web service used; the serialization formats used in web services include XML and JSON. The testing methods used include a data serialization method, a data measurement method, and a data parsing method. The data serialization method is used to measure the time taken to serialize data from the database into XML and JSON in applications on the PHP platform. The data measurement method is used to measure the size of the XML and JSON data based on the number of fields in the serialization process. The data parsing method is used to measure the time taken to parse the XML and JSON data. From the comparative analysis of XML and JSON in PHP applications using the REST architecture, it can be concluded that the difference in serialization and parsing times between XML and JSON is influenced by the number of records: the greater the number of records, the greater the difference in serialization and parsing times, and JSON serialization and parsing is faster than XML. Testing also shows that the JSON data size is smaller than the XML size. Data exchange using the XML format has a size limit of up to 31456.31 KB, while JSON exceeds the XML size limit. In testing on the internet, when the number of records reached 50,000, the serialization and parsing times could no longer be detected in the database.
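The size effect the paper measures can be reproduced in miniature. The sketch below uses Python rather than the paper's PHP, with hypothetical records, but shows the same point: XML repeats tag names around every field, so the same data serializes larger than JSON:

```python
import json
import time
from xml.etree import ElementTree as ET

# Hypothetical records standing in for rows fetched from a database.
records = [{"id": i, "name": f"user{i}"} for i in range(1000)]

# JSON serialization: keys appear once per record, values inline.
t0 = time.perf_counter()
json_text = json.dumps(records)
json_time = time.perf_counter() - t0

# XML serialization of the same records: each field costs an
# opening and a closing tag, e.g. <id>0</id>.
t0 = time.perf_counter()
root = ET.Element("records")
for rec in records:
    item = ET.SubElement(root, "record")
    for key, value in rec.items():
        ET.SubElement(item, key).text = str(value)
xml_text = ET.tostring(root, encoding="unicode")
xml_time = time.perf_counter() - t0

print(f"JSON: {len(json_text)} chars, XML: {len(xml_text)} chars")
```

Exact timings vary by machine and library, but the size gap is structural: the per-field tag overhead grows linearly with the number of records.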


2008 ◽  
pp. 1434-1442
Author(s):  
Calin Gurau

The development of the World Wide Web has created new opportunities for interpersonal interaction. The Internet allows one-to-one (e-mail), one-to-many (Web sites, e-mail lists) or many-to-many (online discussion forums) interaction, which represents a unique feature in comparison with traditional communication channels (Armstrong & Hagel, 1996). On the other hand, the Internet has specific characteristics, such as:
• Interactivity: The Internet offers multiple possibilities of interactive communication, acting not only as an interface but also as a communication agent (allowing direct interaction between individuals and software applications)
• Transparency: The information published online can be accessed and viewed by any Internet user, unless this information is specifically protected
• Memory: The Web is a channel not only for transmitting information but also for storing it; in other words, the information published on the Web remains in the memory of the network until it is erased.
These characteristics permit the development of online or virtual communities: groups of people with similar interests who communicate on the Web in a regular manner (Armstrong & Hagel, 1996; Goldsborough, 1999a, 1999b; Gordon, 2000). Many studies deal with the ethics of research in cyberspace and virtual communities (Bakardjieva, Feenberg, & Goldie, 2004), but very few publications relate to the codes of ethics used in public discussion forums (Belilos, 1998; Johnson, 1997). Other specialists have analyzed specific categories or uses of online discussion forums, such as online learning (Blignaut & Trollip, 2003; DeSanctis, Fayard, Roach, & Jiang, 2003) or the creation of professional communities of practice (Bickart & Schindler, 2001; Kling, McKim, & King, 2003; Millen, Fontaine, & Muller, 2002), and in this context have also briefly discussed the importance of netiquette and forum monitoring (Fauske & Wade, 2003, 2004).
The difference between these online communities and public discussion forums is the degree of control exercised on the functioning and purpose of the forum by a specific individual or organization. This article attempts to investigate, analyze and present the main patterns of the codes/rules of ethics used in the public discussion forums, otherwise known as Newsgroups, and their influence on the profile and functioning of the community.


Author(s):  
Ted Koppel

Electronic resource management (ERM), as a tool for library management, grows in importance every day. The ERM industry has matured greatly over the past decade. Just ten years ago, the first journals began to be published on the Web in significant volume; by 2007, many smaller colleges and some large research libraries have moved to complete or nearly complete electronic-only access (Ives, 2006). The Association of Research Libraries reports that the average ARL research library now spends over 31% of its materials budget on electronic resources, with a large proportion of these libraries spending more than 50% of their materials budget on electronic resources (Kyrillidou & Young, 2006).


Author(s):  
Suely Fragoso

This chapter proposes that search engines apply a verticalizing pressure on the WWW's many-to-many information distribution model, forcing it to revert to a distributive model similar to that of the mass media. The argument starts with a critical descriptive examination of the history of search mechanisms for the Internet, in parallel with a discussion of the increasing ties between search engines and the advertising market. The chapter then raises questions concerning the concentration of Web traffic around a small number of search engines, which are in the hands of an equally limited number of enterprises. This reality is accentuated by the confidence that users place in search engines and by the ongoing acquisition of collaborative systems and smaller players by the large search engines. This scenario demonstrates the verticalizing pressure that search engines apply to the majority of WWW users, bringing the Web back toward the mass-distribution model.


Author(s):  
José-Fernando Diez-Higuera ◽  
Francisco-Javier Diaz-Pernas

In the last few years, because of the increasing growth of the Internet, general-purpose clients have achieved a high level of popularity for static consultation of text and pictures. This is the case of the World Wide Web (i.e., the Web browsers). Using a hypertext system, Web users can select and read on their computers information from all around the world, with no requirement other than an Internet connection and a navigation program. For a long time, the information available on the Internet consisted of written texts and 2D pictures (i.e., static information). This sort of information suited many publications, but it was highly unsatisfactory for others, like those related to objects of art, where real volume and interactivity with the user are of great importance. Here, the possibility of including 3D information in Web pages makes real sense.


2021 ◽  
Vol 36 ◽  
Author(s):  
Ekaterina Kulinicheva

This paper considers sneakerheads, or sneaker collectors and enthusiasts, as fans. It explores both them and their participatory culture, developing a new approach to researching sneakerheads: I here conceptualize sneaker collecting as an object-inspired fandom to highlight the difference between sneaker fandom and other object-oriented fandoms. This paper demonstrates that sneaker collecting is about both collecting knowledge about the subject of sneakers and collecting sneakers themselves. The materiality of sneakers, the story behind a design, and the cultural history of sneakers attract sneakerheads to sneakers. As such, I here explore the following characteristics of sneaker collecting: the importance of knowledge and its acquisition, the high value of the community's practices and activities, the high level of emotional involvement, fan art (sneaker art), and anticommercial ideologies and beliefs. The approach demonstrated in this paper could also be useful in research of other communities organized around collecting wearable goods, such as clothes or accessories, including football T-shirts, vintage denim, and bags.


Geophysics ◽  
1982 ◽  
Vol 47 (2) ◽  
pp. 269-270 ◽  
Author(s):  
Kevin M. Barry
The fact that geophysics is alive and well today is quite obvious, and this fact has generated a high level of optimism within all of us. The exploration geophysicist must, by nature, be an optimist when he probes the unknown subsurface with various sensing devices and techniques to locate the earth's hidden resources. The pessimist is not inclined to take the risk of predicting anything he cannot put his finger on. There is an oft-quoted rhyme that fits the situation well:

Between the optimist and the pessimist,
The difference is droll:
The optimist sees the doughnut,
But the pessimist sees the hole.

In reviewing present activities and technology in exploration, we can appreciate why the geophysicist is not concerned with the hole (dry, empty, or otherwise) but is more optimistic than ever. In the last several years, our industry and SEG have seen explosive growth in practically all areas of exploration geophysics, and in many respects 1981 is unique in the history of milestones in our profession.


Organizational web servers reflect the public image of an organization and serve web pages/information to organizational clients via web browsers using the HTTP protocol. Some web server software may contain web applications that enable users to perform high-level tasks, such as querying a database and delivering the output through the web server to the client browser as an HTML file. Hackers always try to exploit the different vulnerabilities or flaws existing in web servers and web applications, which can pose a big threat to an organization. This chapter explains the importance of protecting web servers and applications, along with the different tools used for analyzing the security of web servers and web applications. The chapter also introduces different web attacks that are carried out by an attacker either to gain illegal access to web server data or to reduce the availability of web services. The web server attacks include denial-of-service (DoS) attacks, buffer overflow exploits, website defacement with SQL injection (SQLi) attacks, cross-site scripting (XSS) attacks, remote file inclusion (RFI) attacks, directory traversal attacks, phishing attacks, brute force attacks, source code disclosure attacks, session hijacking, parameter form tampering, man-in-the-middle (MITM) attacks, HTTP response splitting attacks, cross-site request forgery (XSRF), lightweight directory access protocol (LDAP) attacks, and hidden field manipulation attacks. The chapter explains different web server and web application testing tools and vulnerability scanners, including Nikto, Burp Suite, Paros, IBM AppScan, Fortify, Acunetix, and ZAP. Finally, the chapter also discusses countermeasures to be implemented while designing any web application for any organization in order to reduce the risk.
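One attack from the list, SQL injection (SQLi), and its standard countermeasure, parameterized queries, can be sketched in a few lines. The sqlite3 in-memory database and the users table below are illustrative stand-ins for any SQL backend behind a web application:

```python
import sqlite3

# Hypothetical users table behind a web application's login form.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name: str, password: str) -> bool:
    # Vulnerable: user input is spliced into the SQL text, so a
    # payload like "' OR '1'='1" rewrites the query's logic.
    query = (f"SELECT 1 FROM users WHERE name = '{name}' "
             f"AND password = '{password}'")
    return conn.execute(query).fetchone() is not None

def login_safe(name: str, password: str) -> bool:
    # Countermeasure: parameterized query. The driver binds the
    # input as data, so it can never be interpreted as SQL.
    query = "SELECT 1 FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone() is not None

payload = "' OR '1'='1"
```

Against `login_unsafe` the payload authenticates without knowing any password; against `login_safe` it is just a strange-looking password string that fails to match.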

