Best Practices in Search User Interface Design

Author(s):  
Marc L. Resnick ◽  
Jennifer Bandos

The Internet has become a powerful tool for information search and e-commerce. Millions of people use the World Wide Web on a regular basis, and the number is increasing rapidly. For many common tasks, users first need to locate the Web site or sites containing the needed information from among the estimated 4 trillion existing web pages. The most common method used to search for information is the search engine. However, even sophisticated users often have difficulty navigating the complexity of search engine interfaces. Designing more effective and efficient search engines is contingent upon a significant improvement in the search user interface.

2019 ◽  
Vol 12 (2) ◽  
pp. 110-119 ◽  
Author(s):  
Jayaraman Sethuraman ◽  
Jafar A. Alzubi ◽  
Ramachandran Manikandan ◽  
Mehdi Gheisari ◽  
Ambeshwar Kumar

Background: The World Wide Web houses an abundance of information that is used every day by billions of users across the world to find relevant data. Website owners employ webmasters to ensure their pages are ranked at the top of search engine result pages. However, understanding how a search engine ranks a website, which comprises numerous web pages, among the top ten or twenty results is a major challenge. Although systems have been developed to understand the ranking process, a specialized tool-based approach has not been tried. Objective: This paper develops a new framework and system that process website content to determine search engine optimization factors. Methods: To analyze web pages dynamically by assessing website content against specific keywords, an elimination method was used in an attempt to reveal various search engine optimization techniques. Conclusion: Our results lead us to conclude that the developed system is able to perform a deeper analysis and find the factors that play a role in bringing a site to the top of the list.
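The abstract does not detail the elimination method; as a minimal sketch of the kind of keyword-based, on-page analysis such a system might perform, the snippet below checks a few common SEO factors (title, meta description, headings, body frequency) for a given keyword. The factor list, the scoring, and the seo_factors helper are illustrative assumptions, not the authors' implementation.

```python
# Illustrative on-page SEO factor check (not the paper's elimination method).
# Assumes BeautifulSoup (pip install beautifulsoup4) and an already-fetched HTML string.
from bs4 import BeautifulSoup


def seo_factors(html: str, keyword: str) -> dict:
    """Report which common on-page factors mention the keyword (illustrative only)."""
    soup = BeautifulSoup(html, "html.parser")
    kw = keyword.lower()

    title = (soup.title.string or "") if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "") if meta else ""
    headings = " ".join(h.get_text(" ") for h in soup.find_all(["h1", "h2", "h3"]))
    text = soup.get_text(" ").lower()
    word_count = max(len(text.split()), 1)

    return {
        "keyword_in_title": kw in title.lower(),
        "keyword_in_description": kw in description.lower(),
        "keyword_in_headings": kw in headings.lower(),
        "keyword_frequency": text.count(kw),
        "keyword_density": round(text.count(kw) / word_count, 4),
    }


if __name__ == "__main__":
    sample = ("<html><head><title>Search engine basics</title></head>"
              "<body><h1>Search engine</h1><p>A search engine indexes pages.</p></body></html>")
    print(seo_factors(sample, "search engine"))
```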


2018 ◽  
pp. 742-748
Author(s):  
Viveka Vardhan Jumpala

The Internet, an information superhighway, has practically compressed the world into a cyber colony through its various interconnected networks. The development of the Internet has seen the emergence of the World Wide Web (WWW) as a common vehicle for communication and for instantaneous access to search engines and databases. A search engine is designed to facilitate the search for information on the WWW. Search engines are essentially tools that help in finding required information on the web quickly and in an organized manner. Different search engines do the same job in different ways, thus giving different results for the same query. Search strategies are the new trend on the Web.


ReCALL ◽  
1999 ◽  
Vol 11 (2) ◽  
pp. 12-19 ◽  
Author(s):  
Stewart Arneil ◽  
Martin Holmes

This discussion paper outlines some of the decisions and issues involved in creating and using authoring tools for language learning through the World Wide Web. In it, we outline the development of Hot Potatoes, our suite of authoring tools, and attempt to draw conclusions from our experience that will be valuable not only to other developers but also to evaluators and users of authoring software. Areas addressed include exercise design, ability to customise and control the output, support for different browser versions, user-interface design, ancillary technology and technical support.


Author(s):  
Leslie S. Hiraoka

Development of the search engine as a major information and marketing channel resulted from innovative technologies that made it capable of presenting rapid, relevant responses to queries. To do this, the search engine compiles an index of web pages of information stored on the World Wide Web, ranks each page according to its incoming links, matches keywords in the query to those in its index, and returns what it determines are the most relevant pages to the searcher. Innovative and cost-effective ad placement algorithms have attracted advertisers to search engine websites and intensified the competitive dynamics among industry leaders. Their interacting software also continues to draw advertisers from traditional, mass marketing channels like television and newspapers to the online medium to cater to customers who have expressed an interest in their products and services.
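The pipeline described above (compile an index of pages, score each page by its incoming links, match query keywords against the index) can be illustrated with a toy sketch. The corpus, link graph, and raw link-count scoring below are illustrative assumptions; real engines use far more elaborate link-analysis algorithms such as PageRank.

```python
# Toy illustration of the described pipeline: build an inverted index,
# score pages by incoming links, and return keyword matches ordered by score.
from collections import defaultdict

pages = {
    "a.html": "search engine ranking and indexing",
    "b.html": "online advertising and ad placement",
    "c.html": "search advertising on the world wide web",
}
links = [("a.html", "c.html"), ("b.html", "c.html"), ("c.html", "a.html")]

# 1. Compile an inverted index: keyword -> pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# 2. Rank each page by its count of incoming links (a crude stand-in for
#    real link-analysis algorithms such as PageRank).
incoming = defaultdict(int)
for _source, target in links:
    incoming[target] += 1

# 3. Match query keywords against the index and return pages ordered by score.
def search(query: str) -> list[str]:
    hits = set()
    for word in query.lower().split():
        hits |= index.get(word, set())
    return sorted(hits, key=lambda url: incoming[url], reverse=True)

print(search("search advertising"))  # c.html ranks first: two incoming links
```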


Author(s):  
Vijay Kasi ◽  
Radhika Jain

In the context of the Internet, a search engine can be defined as a software program designed to help one access information, documents, and other content on the World Wide Web. The adoption and growth of the Internet in the last decade have been unprecedented. The World Wide Web has always been applauded for its simplicity and ease of use, which is evident in how little knowledge one requires to build a Web page. The flexible nature of the Internet has enabled its rapid growth and adoption, but it has also made it hard to search for relevant information on the Web. The number of Web pages has been increasing at an astronomical pace, from around 2 million registered domains in 1995 to 233 million registered domains in 2004 (Consortium, 2004). The Internet, considered a distributed database of information, has the CRUD (create, retrieve, update, and delete) rule applied to it. While the Internet has been effective at creating, updating, and deleting content, it has considerably lacked in enabling the retrieval of relevant information. After all, there is no point in having a Web page that has little or no visibility on the Web. Since the 1990s, when the first search program was released, we have come a long way in terms of searching for information. Although we are currently witnessing tremendous growth in search engine technology, the growth of the Internet has overtaken it, leaving the existing search engine technology falling short. When we apply the metrics of relevance, rigor, efficiency, and effectiveness to the search domain, it becomes clear that we have progressed on the rigor and efficiency metrics by utilizing abundant computing power to produce faster searches over a great deal of information. Rigor and efficiency are evident in the large number of pages indexed by the leading search engines (Barroso, Dean, & Holzle, 2003). However, more research needs to be done to address the relevance and effectiveness metrics. Users typically type in two to three keywords when searching, only to end up with a search result containing thousands of Web pages! This has made it increasingly hard to effectively find any useful, relevant information. Search engines face a number of challenges today requiring them to perform rigorous searches with relevant results efficiently so that they are effective. These challenges include the following ("Search Engines," 2004):

1. The Web is growing at a much faster rate than any present search engine technology can index.
2. Web pages are updated frequently, forcing search engines to revisit them periodically.
3. Dynamically generated Web sites may be slow or difficult to index, or may produce an excessive number of results from a single Web site.
4. Many dynamically generated Web sites cannot be indexed by search engines at all.
5. The commercial interests of a search engine can interfere with the order of relevant results it shows.
6. Content that is behind a firewall or is password protected (such as content in several digital libraries) is not accessible to search engines.
7. Some Web sites have started using tricks such as spamdexing and cloaking to manipulate search engines into displaying them as the top results for a set of keywords. This pollutes the search results, with more relevant links being pushed down the result list, and is a consequence of the popularity of Web searches and the business potential search engines can generate today.
8. Search engines index all the content of the Web without any bounds on the sensitivity of information, which has raised security and privacy flags.

With the above background and challenges in mind, we lay out the article as follows. In the next section, we begin with a discussion of search engine evolution. To facilitate the examination and discussion of search engine development's progress, we break this discussion down into three generations of search engines. Figure 1 depicts this evolution pictorially and highlights the need for better search engine technologies. Next, we present a brief discussion on the contemporary state of search engine technology and the various types of content searches available today. With this background, the following section documents various concerns about existing search engines, setting the stage for better search engine technology. These concerns include information overload, relevance, representation, and categorization. Finally, we briefly address the research efforts under way to alleviate these concerns and then present our conclusion.


2018 ◽  
Vol 8 (3) ◽  
pp. 52-70
Author(s):  
Edwin Mwosa Kivuti

This article describes how search engine optimization (SEO) is becoming an increasingly useful technique in online marketing as more people turn to the Internet to search for information. SEO enables web developers to build web pages that achieve high rankings in search engine result pages (SERPs). A key technique in SEO is improving keyword prominence. In this research, keywords within web pages are extracted, and the correlation between the frequency of these words and their keyword prominence is evaluated. The findings will provide a guideline to SEO practitioners, giving them a better understanding of the ratio of keywords they need to add to web pages in relation to the rest of the content.
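As a rough sketch of the frequency and prominence measurements discussed above, the snippet below counts each keyword's occurrences in a page's text and assigns a simple positional prominence score (an earlier first occurrence means higher prominence). The prominence formula and the keyword_stats helper are assumptions made for illustration, not the metric evaluated in the article.

```python
# Illustrative keyword frequency and prominence calculation.
# The prominence formula (earlier position -> higher score) is an assumption
# for demonstration, not the exact metric evaluated in the article.
import re
from collections import Counter


def keyword_stats(text: str, keywords: list[str]) -> dict[str, dict[str, float]]:
    words = re.findall(r"[a-z0-9]+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    stats = {}
    for kw in keywords:
        positions = [i for i, w in enumerate(words) if w == kw]
        # Prominence: 1.0 if the keyword appears first, approaching 0.0 near the end.
        prominence = 1.0 - (positions[0] / total) if positions else 0.0
        stats[kw] = {"frequency": counts[kw], "prominence": round(prominence, 3)}
    return stats


page_text = "Search engine optimization guide. This guide explains optimization of pages for search."
print(keyword_stats(page_text, ["optimization", "guide", "ranking"]))
```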


Author(s):  
Jing Chen ◽  
Qing Li ◽  
Ling Feng

The abundance of knowledge-rich information on the World Wide Web makes compiling an online e-textbook both possible and necessary. In our previous work, we proposed an approach to automatically generate an e-textbook by mining the ranked result lists of a search engine. However, the performance of the approach was degraded by Web pages that were relevant but did not actually discuss the desired concept. In this article, we extend the previous work by applying a clustering approach before the mining process. The clustering approach serves as a post-processing stage on the original results retrieved by the search engine, and aims to reach an optimum state in which all Web pages assigned to a concept are discussing that exact concept.
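The abstract does not specify the clustering algorithm; as a rough sketch of the idea, the snippet below clusters retrieved page texts with TF-IDF features and k-means (scikit-learn), then keeps only the cluster whose centroid is most similar to the concept's query terms. The feature choice, the number of clusters, and the selection rule are illustrative assumptions, not the authors' method.

```python
# Sketch of post-processing search results with clustering: group retrieved
# pages, then keep the cluster that best matches the target concept.
# TF-IDF + k-means and the "closest cluster wins" rule are assumptions made
# for illustration; they are not the authors' exact method.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def filter_results(concept: str, page_texts: list[str], n_clusters: int = 2) -> list[str]:
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(page_texts)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)

    # Keep the cluster whose centroid is most similar to the concept string.
    concept_vec = vectorizer.transform([concept])
    best_cluster = cosine_similarity(concept_vec, km.cluster_centers_)[0].argmax()
    return [t for t, label in zip(page_texts, km.labels_) if label == best_cluster]


pages = [
    "A binary search tree stores keys in sorted order for fast lookup.",
    "Binary search trees support insertion and deletion in logarithmic time.",
    "Visit our store for discounts on garden trees and shrubs this spring.",
]
print(filter_results("binary search tree", pages))
```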


2002 ◽  
Vol 7 (1) ◽  
pp. 9-25 ◽  
Author(s):  
Moses Boudourides ◽  
Gerasimos Antypas

In this paper we present a simple simulation of the World-Wide Web, in which one observes the appearance of web pages belonging to different web sites, covering a number of different thematic topics, and possessing links to other web pages. The goal of our simulation is to reproduce the form of the observed World-Wide Web and of its growth using a small number of simple assumptions. In our simulation, existing web pages may generate new ones as follows: first, each web page is equipped with a topic concerning its contents; second, links between web pages are established according to common topics; next, new web pages may be randomly generated and subsequently equipped with a topic and assigned to web sites. By repeated iteration of these rules, our simulation appears to exhibit the observed structure of the World-Wide Web and, in particular, a power-law type of growth. In order to visualise the network of web pages, we have followed N. Gilbert's (1997) methodology of scientometric simulation, assuming that web pages can be represented by points in the plane. Furthermore, the simulated graph is found to possess the small-world property, as is the case with a large number of other complex networks.
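The growth rules summarized above (pages carry topics, links form between pages sharing a topic, and new pages arrive at random) can be sketched as a small simulation. The topic count, number of steps, and link probability below are illustrative assumptions rather than the parameters used in the paper, and this toy version does not verify the power-law or small-world results.

```python
# Minimal sketch of topic-driven web growth: new pages arrive at random,
# receive a topic, and link to existing pages that share that topic.
# All parameters are assumptions chosen for illustration, not the paper's settings.
import random
from collections import defaultdict

random.seed(0)
N_TOPICS = 5     # number of thematic topics (assumption)
STEPS = 200      # number of new pages to generate (assumption)
LINK_PROB = 0.3  # chance of linking to each same-topic page (assumption)

topics = []                   # topics[i] is the topic of page i
links = []                    # (source, target) pairs
by_topic = defaultdict(list)  # topic -> ids of pages carrying it

for new_page in range(STEPS):
    topic = random.randrange(N_TOPICS)   # equip the new page with a topic
    for other in by_topic[topic]:        # link only to pages sharing the topic
        if random.random() < LINK_PROB:
            links.append((new_page, other))
    topics.append(topic)
    by_topic[topic].append(new_page)

# Simple statistics only; the paper's full model is reported to exhibit
# power-law growth and small-world structure, which this sketch does not test.
in_degree = defaultdict(int)
for _source, target in links:
    in_degree[target] += 1
print("pages:", STEPS, "links:", len(links))
print("max in-degree:", max(in_degree.values(), default=0))
```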

