Improving Search Engine Efficiency through Contextual Factor Selection

AI Magazine ◽  
2021 ◽  
Vol 42 (2) ◽  
pp. 50-58
Author(s):  
Anxiang Zeng ◽  
Han Yu ◽  
Qing Da ◽  
Yusen Zhan ◽  
Yang Yu ◽  
...  

Learning to rank (LTR) is an important artificial intelligence (AI) approach supporting the operation of many search engines. In large-scale search systems, ranking results are continually improved by introducing more factors for LTR to consider. However, the more factors considered, the more computation resources required, which in turn increases system response latency. Removing redundant factors can therefore significantly improve search engine efficiency. In this paper, we report on our experience incorporating our Contextual Factor Selection (CFS) deep reinforcement learning approach into the Taobao e-commerce platform to optimize the selection of factors based on the context of each search query, maintaining search result quality while significantly reducing latency. Online deployment on Taobao.com demonstrated that CFS is able to reduce average search latency under everyday use scenarios by more than 40% compared to the previous approach, with comparable search result quality. Under peak usage during the Singles' Day Shopping Festival (November 11th) in 2017, CFS reduced the average search latency by 20% compared to the previous approach.

2020 ◽  
Vol 34 (08) ◽  
pp. 13212-13219
Author(s):  
Anxiang Zeng ◽  
Han Yu ◽  
Qing Da ◽  
Yusen Zhan ◽  
Chunyan Miao

In large-scale search systems, the quality of ranking results is continually improved by introducing more factors from complex procedures. Meanwhile, the increase in factors demands more computation resources and increases system response latency. It has been observed that, in certain contexts, a search instance may require only a small set of useful factors, rather than all factors, to return high-quality results. Removing ineffective factors accordingly can therefore significantly improve system efficiency. In this paper, we report our experience incorporating our Contextual Factor Selection (CFS) approach into the Taobao e-commerce platform to optimize the selection of factors based on the context of each search query, achieving high-quality search results while significantly reducing latency. This problem is treated as a combinatorial optimization problem tackled through a sequential decision-making procedure. CFS solves it efficiently with a deep reinforcement learning method that uses reward shaping to address the scarcity and wide distribution of reward signals in real-world search engines. Through extensive offline experiments based on data from the Taobao.com platform, CFS is shown to significantly outperform state-of-the-art approaches. Online deployment on Taobao.com demonstrated that CFS reduces average search latency by more than 40% compared to the previous approach, with negligible reduction in search result quality. Under peak usage during the Singles' Day Shopping Festival (November 11th) in 2017, CFS reduced peak-load search latency by 33% compared to the previous approach, helping Taobao.com achieve 40% higher revenue than the same period in 2016.
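The abstract frames factor selection as a sequential decision problem that trades result quality against computation cost. As a minimal sketch of that trade-off (not Taobao's actual CFS implementation, which learns a policy via deep reinforcement learning), the following uses a simple greedy stand-in: keep the factors with the best quality gain per unit cost until a latency budget is exhausted. All factor names and numbers are hypothetical.

```python
# Hypothetical sketch: choose a subset of ranking factors under a
# computation (latency) budget. CFS learns this choice per query with
# deep RL; here a greedy value/cost heuristic stands in for the policy.

def select_factors(factors, quality_gain, cost, cost_budget):
    """Greedily keep factors with the highest quality gain per unit cost
    until the cost budget is exhausted."""
    ranked = sorted(factors, key=lambda f: quality_gain[f] / cost[f],
                    reverse=True)
    selected, spent = [], 0.0
    for f in ranked:
        if spent + cost[f] <= cost_budget:
            selected.append(f)
            spent += cost[f]
    return selected, spent

# Invented factors: quality_gain is the factor's contribution to ranking
# quality, cost its computation expense per query.
factors = ["text_match", "click_model", "deep_semantic", "personalization"]
quality_gain = {"text_match": 0.40, "click_model": 0.25,
                "deep_semantic": 0.20, "personalization": 0.05}
cost = {"text_match": 1.0, "click_model": 2.0,
        "deep_semantic": 8.0, "personalization": 4.0}

selected, spent = select_factors(factors, quality_gain, cost, cost_budget=4.0)
```

With this toy data, the expensive `deep_semantic` factor is dropped for this query, mirroring the paper's observation that many queries need only a small subset of factors.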


Author(s):  
Chandran M ◽  
Ramani A. V

This research work tests the quality of a website and improves it by analyzing hit counts, impressions, clicks, click-through rates (CTR), and average positions, using WRPA and SEO techniques. The quality of a website lies mainly in its keywords: users type search queries into search engines, and websites are displayed in the search results based on these keywords. This work concentrates on bringing a particular website to the top of the search results; the website chosen for the research is SRKV. The research is carried out by creating an index array of meta tags, which holds all the meta tags, while all the users' search keywords for the website are stored in another array. The index array is matched against the search-keyword array, and from this comparison the hit count is calculated for analysis. The calculated hit count and the searched keywords are then analyzed to improve the performance of the website, and the special keywords matched in the comparison are added to the meta tags. Next, all the meta tags and the newly specified keywords in the index array are matched against the SEO keywords; matched keywords are stored for improving the quality of the website. Metrics such as impressions, clicks, CTR, and average positions are measured along with the hit counts. The research covers different browsers and platforms, and queries about the website from different countries are also measured. In conclusion, if the number of clicks for the website exceeds the average number of clicks, the quality of the website is good. This research helps improve the keywords using WRPA and SEO and thereby improves the quality of the website.
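The meta-tag/keyword matching described above can be sketched as comparing an index array of meta-tag keywords against user search queries and counting matches. This is an illustrative reconstruction only; the function name, keywords, and queries are invented, not taken from the paper.

```python
# Illustrative sketch of hit-count calculation: each meta-tag keyword is
# matched against the terms of every recorded user search query.

def hit_counts(meta_keywords, search_queries):
    """Count how many queries contain each meta-tag keyword as a term."""
    counts = {kw: 0 for kw in meta_keywords}
    for query in search_queries:
        terms = query.lower().split()
        for kw in meta_keywords:
            if kw.lower() in terms:
                counts[kw] += 1
    return counts

# Hypothetical meta tags and logged search queries.
meta = ["education", "vidyalaya", "admission"]
queries = ["srkv admission 2021",
           "vidyalaya education trust",
           "admission form"]
counts = hit_counts(meta, queries)
```

Keywords with high hit counts would then be the candidates to promote into the meta tags, as the abstract describes.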


1996 ◽  
Vol 76 (06) ◽  
pp. 0939-0943 ◽  
Author(s):  
B Boneu ◽  
G Destelle ◽  

Summary: The anti-aggregating activity of five rising doses of clopidogrel has been compared to that of ticlopidine in atherosclerotic patients. The aim of this study was to determine the dose of clopidogrel to be tested in a large-scale clinical trial of secondary prevention of ischemic events in patients suffering from vascular manifestations of atherosclerosis [the CAPRIE (Clopidogrel vs Aspirin in Patients at Risk of Ischemic Events) trial]. A multicenter study involving 9 haematological laboratories and 29 clinical centers was set up. One hundred and fifty ambulatory patients were randomized into one of the seven following groups: clopidogrel at doses of 10, 25, 50, 75 or 100 mg OD, ticlopidine 250 mg BID, or placebo. ADP- and collagen-induced platelet aggregation tests were performed before starting treatment and after 7 and 28 days. Bleeding time was measured on days 0 and 28. Patients were seen on days 0, 7 and 28 to check the clinical and biological tolerability of the treatment. Clopidogrel exerted a dose-related inhibition of ADP-induced platelet aggregation and prolongation of bleeding time. In the presence of ADP (5 µM), this inhibition ranged between 29% and 44% relative to pretreatment values. Bleeding times were prolonged by 1.5 to 1.7 times. These effects were not significantly different from those produced by ticlopidine. Clinical tolerability was good or fair in 97.5% of the patients. No haematological adverse events were recorded. These results allowed the selection of 75 mg once a day to evaluate and compare the antithrombotic activity of clopidogrel to that of aspirin in the CAPRIE trial.


2021 ◽  
Vol 13 (6) ◽  
pp. 3571
Author(s):  
Bogusz Wiśnicki ◽  
Dorota Dybkowska-Stefek ◽  
Justyna Relisko-Rybak ◽  
Łukasz Kolanda

The paper responds to research problems related to the implementation of large-scale investment projects in waterways in Europe. As part of design and construction works, it is necessary to indicate river ports that play a major role within the European transport network as intermodal nodes. This entails a number of challenges, the cardinal one being the optimal selection of port locations, taking into account the new transport, economic, and geopolitical situation that will be brought about by modernized waterways. The aim of the paper was to present an original methodology for determining port locations for modernized waterways based on non-cost criteria, as an extended multicriteria decision-making method (MCDM) and employing GIS (Geographic Information System)-based tools for spatial analysis. The methodology was designed to be applicable to the varying conditions of a river’s hydroengineering structures (free-flowing river, canalized river, and canals) and adjustable to the requirements posed by intermodal supply chains. The method was applied to study the Odra River Waterway, which allowed the formulation of recommendations regarding the application of the method in the case of different river sections at every stage of the research process.
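The multicriteria decision-making step described above can be illustrated with the simplest MCDM aggregation, a weighted sum of normalized criterion scores. The criteria names, weights, and candidate scores below are invented for illustration and are not taken from the paper, which uses an extended MCDM method with GIS-based spatial analysis.

```python
# Minimal weighted-sum MCDM sketch for ranking candidate port locations.
# Criteria, weights, and scores are hypothetical placeholders.

def weighted_score(scores, weights):
    """Aggregate normalized criterion scores (0..1) by weighted sum."""
    return sum(scores[c] * weights[c] for c in weights)

weights = {"transport_access": 0.40, "intermodal_links": 0.35,
           "geopolitics": 0.25}
candidates = {
    "Port A": {"transport_access": 0.9, "intermodal_links": 0.6,
               "geopolitics": 0.7},
    "Port B": {"transport_access": 0.5, "intermodal_links": 0.9,
               "geopolitics": 0.8},
}
ranking = sorted(candidates,
                 key=lambda p: weighted_score(candidates[p], weights),
                 reverse=True)
```

Real applications, including the one in the paper, replace the invented scores with GIS-derived measurements for each river section.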


2021 ◽  
Vol 104 (1) ◽  
pp. 003685042098705
Author(s):  
Xinran Wang ◽  
Yangli Zhu ◽  
Wen Li ◽  
Dongxu Hu ◽  
Xuehui Zhang ◽  
...  

This paper focuses on the effects of off-design operation of CAES on the dynamic characteristics of the triple-gear-rotor system. A finite element model of the system is set up with unbalanced excitations, torque load excitations, and backlash, which lead to variations in tooth contact status. An experiment is carried out to verify the accuracy of the mathematical model. The results show that when the system is subjected to large-scale torque load lifting at a high rotating speed, it exhibits two stages: relatively strong periodicity when the torque load is light, and chaotic behavior when the torque load is heavy, with the transition between the two states being relatively quick and violent. Analysis of the three-dimensional acceleration spectrum and the meshing force shows that the variation in the meshing state and the fluctuation of the meshing force are the basic reasons for the variation in the system response with the torque load. In addition, the three rotors in the triple-gear-rotor system show a strong similarity in meshing states and meshing-force fluctuations, which results in similar dynamic responses of the three rotors.


2021 ◽  
Vol 22 (15) ◽  
pp. 7773
Author(s):  
Neann Mathai ◽  
Conrad Stork ◽  
Johannes Kirchmair

Experimental screening of large sets of compounds against macromolecular targets is a key strategy to identify novel bioactivities. However, large-scale screening requires substantial experimental resources and is time-consuming and challenging. Therefore, small to medium-sized compound libraries with a high chance of producing genuine hits on an arbitrary protein of interest would be of great value to fields related to early drug discovery, in particular biochemical and cell research. Here, we present a computational approach that incorporates drug-likeness, predicted bioactivities, biological space coverage, and target novelty, to generate optimized compound libraries with maximized chances of producing genuine hits for a wide range of proteins. The computational approach evaluates drug-likeness with a set of established rules, predicts bioactivities with a validated, similarity-based approach, and optimizes the composition of small sets of compounds towards maximum target coverage and novelty. We found that, in comparison to the random selection of compounds for a library, our approach generates substantially improved compound sets. Quantified as the “fitness” of compound libraries, the calculated improvements ranged from +60% (for a library of 15,000 compounds) to +184% (for a library of 1000 compounds). The best of the optimized compound libraries prepared in this work are available for download as a dataset bundle (“BonMOLière”).
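Optimizing a compound set toward maximum target coverage, as described above, is structurally a set-cover-style problem. The sketch below uses a greedy heuristic over predicted compound-target activities as one plausible illustration; the compounds, targets, and selection rule are invented and may differ from the paper's actual optimization, which also weighs drug-likeness and target novelty.

```python
# Hedged sketch: greedily assemble a small compound library so that each
# pick maximizes the number of newly covered protein targets.
# Compound and target identifiers are fabricated placeholders.

def greedy_library(predicted_targets, library_size):
    """Pick compounds one at a time, each maximizing new target coverage."""
    covered, library = set(), []
    remaining = dict(predicted_targets)
    for _ in range(min(library_size, len(remaining))):
        best = max(remaining, key=lambda c: len(remaining[c] - covered))
        library.append(best)
        covered |= remaining.pop(best)
    return library, covered

predicted = {
    "cmpd1": {"EGFR", "HER2"},
    "cmpd2": {"EGFR"},
    "cmpd3": {"DRD2", "HTR2A"},
    "cmpd4": {"HER2", "DRD2"},
}
library, covered = greedy_library(predicted, library_size=2)
```

With the toy data, two compounds suffice to cover all four targets, whereas a random pick of two often would not; this is the intuition behind the reported fitness gains over random selection.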


2021 ◽  
pp. 089443932110068
Author(s):  
Aleksandra Urman ◽  
Mykola Makhortykh ◽  
Roberto Ulloa

We examine how six search engines filter and rank information in relation to queries on the U.S. 2020 presidential primary elections under default, that is, nonpersonalized, conditions. For that, we utilize an algorithmic auditing methodology that uses virtual agents to conduct large-scale analysis of algorithmic information curation in a controlled environment. Specifically, we look at the text search results for the queries “us elections,” “donald trump,” “joe biden,” and “bernie sanders” on Google, Baidu, Bing, DuckDuckGo, Yahoo, and Yandex during the 2020 primaries. Our findings indicate substantial differences in the search results between search engines and multiple discrepancies within the results generated for different agents using the same search engine. This highlights that whether users see certain information is decided by chance due to the inherent randomization of search results. We also find that some search engines prioritize different categories of information sources with respect to specific candidates. These observations demonstrate that algorithmic curation of political information can create information inequalities between search engine users even under nonpersonalized conditions. Such inequalities are particularly troubling considering that search results are highly trusted by the public and can shift the opinions of undecided voters, as demonstrated by previous research.
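Audits like the one described compare result lists collected by different agents or engines for the same query. One common, simple overlap measure is the Jaccard index over the sets of returned results; the sketch below shows that computation with fabricated placeholder URLs (the paper's exact metrics may differ).

```python
# Illustrative overlap measurement between two result lists for the same
# query: Jaccard similarity of the sets of returned URLs (1.0 = identical
# sets, 0.0 = disjoint). URLs are invented placeholders.

def jaccard(a, b):
    """Jaccard similarity of two result sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

engine_x = ["nytimes.com/a", "cnn.com/b", "foxnews.com/c"]
engine_y = ["cnn.com/b", "bbc.com/d", "foxnews.com/c"]
overlap = jaccard(engine_x, engine_y)
```

Low overlap between agents using the same engine would indicate the randomization effect the abstract reports.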


Land ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 295
Author(s):  
Yuan Gao ◽  
Anyu Zhang ◽  
Yaojie Yue ◽  
Jing’ai Wang ◽  
Peng Su

Suitable land is an important prerequisite for crop cultivation and, given the prospect of climate change, it is essential to assess such suitability to minimize crop production risks and to ensure food security. Although a variety of methods to assess the suitability are available, a comprehensive, objective, and large-scale screening of environmental variables that influence the results—and therefore their accuracy—of these methods has rarely been explored. An approach to the selection of such variables is proposed and the criteria established for large-scale assessment of land, based on big data, for its suitability to maize (Zea mays L.) cultivation as a case study. The predicted suitability matched the past distribution of maize with an overall accuracy of 79% and a Kappa coefficient of 0.72. The land suitability for maize is likely to decrease markedly at low latitudes and even at mid latitudes. The total area suitable for maize globally and in most major maize-producing countries will decrease, the decrease being particularly steep in those regions optimally suited for maize at present. Compared with earlier research, the method proposed in the present paper is simple yet objective, comprehensive, and reliable for large-scale assessment. The findings of the study highlight the necessity of adopting relevant strategies to cope with the adverse impacts of climate change.
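The abstract reports validation via overall accuracy (79%) and a Kappa coefficient (0.72). For readers unfamiliar with these metrics, the sketch below shows how both are computed from a binary confusion matrix (suitable vs. unsuitable); the counts are invented, not the paper's data.

```python
# How overall accuracy and Cohen's kappa follow from a 2x2 confusion
# matrix. tp/fp/fn/tn counts below are hypothetical.

def accuracy_and_kappa(tp, fp, fn, tn):
    """Return (overall accuracy, Cohen's kappa) for a binary confusion matrix."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n  # observed agreement = overall accuracy
    # Expected chance agreement: sum over classes of
    # (predicted marginal * actual marginal) / n^2.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return po, kappa

po, kappa = accuracy_and_kappa(tp=40, fp=10, fn=5, tn=45)
```

Kappa discounts agreement expected by chance, which is why it is reported alongside raw accuracy in suitability assessments like this one.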

