Review on Real-Time Search Result Minimization using NLP

Author(s):  
Prof. (Dr) Pawan Bhaladhare

At any given time, thousands of people are searching for a particular thing, and only a fraction of them get the answer they wanted. A quick search has a good chance of surfacing the right answer, but the time required to reach it is not always short: every single search query returns hundreds of results, which is useful but also confusing for the user, who may have to try many links before reaching the desired answer. In our proposed system, the user simply enters the search query and selects the website from which the answer should come, and within a few seconds the answer is displayed. When the user submits the query and a particular website, the data is scraped from that website and fed to an NLP system that is responsible for minimizing the size of the answer while taking care not to change or lose any valuable data.
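The abstract does not specify which NLP technique performs the minimization. A minimal sketch of one common approach, frequency-based extractive summarization (the function name and scoring scheme here are illustrative assumptions, not taken from the paper), could look like:

```python
import re
from collections import Counter

def summarize(text: str, max_sentences: int = 2) -> str:
    """Keep the highest-scoring sentences (scored by word frequency)
    in their original order, so no information is invented."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scored = [(sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())), i, s)
              for i, s in enumerate(sentences)]
    top = sorted(scored, reverse=True)[:max_sentences]
    # restore document order among the selected sentences
    return " ".join(s for _, i, s in sorted(top, key=lambda t: t[1]))
```

Because the output is a subset of the original sentences, this style of summarizer naturally satisfies the paper's constraint of not changing or losing valuable data.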

Author(s):  
Chandran M ◽  
Ramani A. V

The research work tests the quality of a website and improves it by analyzing hit counts, impressions, clicks, click-through rates (CTR), and average positions. This is accomplished using WRPA and SEO techniques. The quality of a website mainly lies in the keywords present in it. These keywords come from the search queries typed by users into search engines, and based on them the websites are displayed in the search results. This research work concentrates on bringing a particular website to the top of the search results; the website chosen for the study is SRKV. The work is carried out by creating an index array that holds all the Meta tags. All the users' search keywords for the website are stored in another array. The index array is matched and compared against the search-keywords array, and from this the hit count is calculated for analysis. The calculated hit count and the searched keywords are then analyzed to improve the performance of the website: the matched special keywords from the comparison are added to the Meta tag. Next, all the Meta tags and the newly added keywords in the index array are matched against the SEO keywords; any matching keyword is stored for improving the quality of the website. Metrics such as impressions, clicks, CTR, and average position are measured along with the hit counts. The research is carried out under different browsers and platforms, and queries about the website from different countries are also measured. In conclusion, if the number of clicks for the website is higher than the average number of clicks, the quality of the website is good. This research helps in improving the keywords using WRPA and SEO and thereby improves the quality of the website.
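The comparison of the Meta-tag index array against the array of user search keywords reduces to a simple matching loop. A hedged sketch (the function name and data shapes are assumptions, not taken from the paper):

```python
def hit_counts(meta_tags, search_queries):
    """For each Meta-tag keyword, count how many user search queries
    contain it (case-insensitive); these are the per-keyword hit counts."""
    counts = {tag: 0 for tag in meta_tags}
    for query in search_queries:
        terms = set(query.lower().split())
        for tag in meta_tags:
            if tag.lower() in terms:
                counts[tag] += 1
    return counts
```

Keywords whose counts are high would then be added back into the Meta tag, as the abstract describes, before the next round of matching against the SEO keywords.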


Author(s):  
Rotimi-Williams Bello ◽  
Firstman Noah Otobo

Search Engine Optimization (SEO) is a technique which helps search engines find and rank one site over another in response to a search query. SEO thus helps site owners get traffic from search engines. Although the basic principle of operation of all search engines is the same, the minor differences between them lead to major changes in result relevancy. Choosing the right keywords to optimize for is thus the first and most crucial step in a successful SEO campaign. In the context of SEO, keyword density can be used as a factor in determining whether a webpage is relevant to a specified keyword or keyword phrase. SEO is known for its contribution as a process that affects the online visibility of a website or a webpage in a web search engine's results. In general, the earlier (or higher ranked on the search results page) and the more frequently a website appears in the search results list, the more visitors it will receive from the search engine's users; these visitors can then be converted into customers. The objective of this paper is to re-present the black hat SEO technique as an unprofessional but profitable method of converting website users into customers. Having studied white hat SEO, black hat SEO, gray hat SEO, and the crawling, indexing, processing, and retrieving methods that search engines use to search documents and files for keywords over the internet and return a list of results containing those keywords, it can be seen that proper application of SEO gives a website a better user experience, helps build brand awareness through high rankings, helps circumvent competition, and gives room for a high return on investment.
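Keyword density, as commonly defined in SEO practice, is the number of occurrences of a keyword divided by the total word count of the page, expressed as a percentage. A minimal sketch (single-word keywords only; phrases would need n-gram counting):

```python
import re

def keyword_density(page_text: str, keyword: str) -> float:
    """Occurrences of the keyword divided by the total number of
    words on the page, as a percentage."""
    words = re.findall(r"\w+", page_text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)
```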


2019 ◽  
Vol 2019 ◽  
Author(s):  
Jan Rensinghoff ◽  
Florian Marius Farke ◽  
Markus Dürmuth ◽  
Tobias Gostomzyk

The new European right to be forgotten (Art. 17 of the European General Data Protection Regulation, GDPR) grants EU citizens the right to demand the erasure of their personal data from anyone who processes it. Enforcing this right to erasure may be a problem for many of these data processors. On the one hand, they need to examine every claim to remove search results. On the other hand, they have to balance conflicting rights in order to prevent over-blocking and the accusation of censorship. The paper examines the criteria that are potentially involved in the decision-making process of search engines when it comes to the right to erasure. We present an approach that helps search engine operators and individuals assess and decide whether search results may have to be deleted. Our goal is to make this process more transparent and consistent, providing more legal certainty both for the search engine operator and for the person concerned by the search result in question. As a result, we develop a model to estimate the chances of successfully deleting a particular search result for a given person. This is a work in progress.
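The abstract does not publish the model itself; one plausible shape for such an estimate is a weighted sum over legal criteria. In this sketch, every criterion name and weight is hypothetical and chosen purely for illustration:

```python
# Hypothetical criteria and weights -- the paper does not publish its model.
WEIGHTS = {
    "data_is_sensitive": 0.4,
    "person_is_private_figure": 0.3,
    "information_is_outdated": 0.2,
    "no_public_interest": 0.1,
}

def erasure_score(criteria: dict) -> float:
    """Weighted sum over boolean criteria; a higher score suggests a
    better chance that the search result has to be delisted."""
    return sum(w for name, w in WEIGHTS.items() if criteria.get(name))
```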


Stream processing systems need to be elastically scalable to process and respond to unpredictable massive load spikes in real time with high throughput and low latency. Though modern cloud technologies can help in elastically provisioning the required computing resources on the fly, finding the right point in time varies among systems based on their expected QoS characteristics. The latency sensitivity of stream processing applications varies based on their nature and pre-set requirements: for some applications, even a little latency in the response has a huge impact, whereas for others it does not. For the former, the processing systems are expected to be highly available, elastically scalable, and fast enough to perform whenever there is a spike. The time required to elastically provision systems under FaaS is very low compared to provisioning virtual machines and containers. However, current FaaS systems have some limitations that must be overcome to handle unexpected spikes in real time. This paper proposes a new algorithm called Elastic-FaaS on top of existing FaaS to overcome this QoS latency issue. Whenever there is demand, our proposed algorithm provisions more FaaS container instances than a typical FaaS would normally provision, to avoid the latency issue. We evaluated the algorithm on an event stream processing system, and the results show that our proposed Elastic-FaaS algorithm performs better than typical FaaS, improving throughput while meeting the high-accuracy and low-latency requirements.
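The paper's algorithm is not reproduced in the abstract; the core idea, provisioning more instances than the baseline so a spike is absorbed without latency violations, can be sketched as follows (the function name, the headroom factor, and the rate/capacity parameters are all assumptions for illustration):

```python
import math

def instances_to_provision(arrival_rate: float,
                           capacity_per_instance: float,
                           headroom: float = 1.5) -> int:
    """Estimate the number of FaaS container instances to keep ready.

    A plain FaaS would provision roughly ceil(rate / capacity);
    over-provisioning by a headroom factor (the core idea sketched
    here) absorbs a sudden spike without violating the latency target."""
    baseline = math.ceil(arrival_rate / capacity_per_instance)
    return max(1, math.ceil(baseline * headroom))
```

For example, at 1,000 events/s and 100 events/s per instance, the baseline of 10 instances is raised to 15.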


2019 ◽  
Vol 118 (6) ◽  
pp. 97-99
Author(s):  
Arockia Jeyasheela A ◽  
Dr.S. Chandramohan

This study discusses viral marketing, one of the keys to marketing success, and presents its techniques. Viral marketing can be delivered by word of mouth and can be created both by the representatives of a company and by consumers (individuals or communities). The goal is to get the right viral message to the right consumer at the right time. Viral marketing makes it easy to attract consumers and is an important form of advertising to them. It involves consumer perception, organizational contribution, blogs, SMO (Social Media Optimization), and SEO (Search Engine Optimization). The principles of viral marketing are social profile gathering, proximity marketing, and real-time keyword density.


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4237
Author(s):  
Hoon Ko ◽  
Kwangcheol Rim ◽  
Isabel Praça

The biggest problem with conventional feature-based anomaly signal detection is that it is difficult to use in real time, as it requires processing of network signals. Furthermore, analyzing network signals in real time requires a vast amount of processing for each signal, since each protocol contains various pieces of information. This paper suggests anomaly detection that analyzes the relationship of each feature to the anomaly detection model. The model analyzes the anomaly of network signals based on anomaly feature detection. The selected features for anomaly detection do not require constant network signal updates or real-time processing of these signals. When the selected features are found in a received signal, the signal is registered as a potential anomaly signal and is then steadily monitored until it is determined to be either an anomaly or a normal signal. In terms of results, the model determined the anomaly with 99.7% (0.997) accuracy for f(4)(S0), and for f(4)(REJ) it received 11,233 signals (171 of them anomalies) with a normal-or-anomaly judgment accuracy of 98.7% (0.987).
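The register-and-monitor step the abstract describes can be sketched as a small state machine; the feature set, signal fields, and confirmation threshold below are illustrative assumptions, not the paper's actual values:

```python
SELECTED_FEATURES = {"S0", "REJ"}  # illustrative connection-state flags

def classify(signal: dict, watchlist: dict, threshold: int = 3) -> str:
    """Register a signal as a potential anomaly when it carries one of
    the pre-selected features, then keep monitoring it across repeated
    observations until a verdict can be made."""
    sid = signal["id"]
    if signal.get("flag") in SELECTED_FEATURES:
        watchlist[sid] = watchlist.get(sid, 0) + 1
        return "anomaly" if watchlist[sid] >= threshold else "monitoring"
    watchlist.pop(sid, None)  # benign observation clears the watch entry
    return "normal"
```

Because only signals carrying the selected features enter the watchlist, the bulk of the traffic needs no per-signal deep processing, which is the real-time advantage the paper claims.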


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
K.L Hong ◽  
O Amirana ◽  
T Ransbury ◽  
B Glover

Abstract Background It has been established in previous animal and human studies that it is possible to assess lesion formation in real-time using optical means during the application of radiofrequency (RF) energy in cardiac ablation procedures. The optical interrogation was accomplished using a novel catheter and instrument system whereby the catheter has embedded optical fibers that transmit and receive light from the instrument. Purpose The aim of this study was to see if there are similar indications of lesion formation, detected by the same optical means, during the application of pulsed field ablation (PFA) energy to cause lesions through electroporation. Methods A series of 3 anesthetized pigs underwent PFA in the right atrium. An 8-electrode circular catheter was placed high in the right atrium, near the superior vena cava, to simulate pulmonary vein isolation as part of an AF ablation procedure. The optical catheter was placed adjacent to the circular catheter between stimulation electrode pairs. A bolus of adenosine was administered to create a window of asystole to avoid stimulation on the T-wave. Bipolar PFA was delivered immediately post drug infusion and the optical signature from the catheter was recorded and displayed in real-time. Electrograms were recorded and the mapping of the lesion was performed with the optical catheter at the following time intervals post PFA delivery: 0 min, 15 min, 1 hour, and 3 hours. Necropsy and histology followed the procedure. Results The optical signal is distinctly higher in intensity during the PFA pulse train. The optical signal showed an immediate significant decrease and a slow but steady decay over the mapping interval. Electrogram reduction accompanied PFA application and also showed a marked reduction over the mapping interval. The optical signal amplitudes were markedly lower when on the lesion compared to healthy non-ablated myocardium as predicted. 
Conclusions Preliminary results indicate that optical mapping detects immediate tissue changes during PFA at these energy levels and hence could be a viable method of evaluating lesion formation during and after PFA energy application. The optical signal indicates that cell damage occurs immediately at these energy levels and continues to progress slowly in lesions made by PFA energy compared to those made by RF energy. The findings also suggest that optical mapping can identify acute lesions made with PFA energy in real time, implying that optical mapping could evolve into a PFA gap detector. Funding Acknowledgement Type of funding source: None


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4141
Author(s):  
Wouter Houtman ◽  
Gosse Bijlenga ◽  
Elena Torta ◽  
René van de Molengraft

For robots to execute their navigation tasks both quickly and safely in the presence of humans, it is necessary to predict the routes those humans intend to follow. Within this work, a model-based method is proposed that relates human motion behavior perceived from RGBD input to the constraints imposed by the environment by considering typical human routing alternatives. Multiple hypotheses about a human's routing options towards local semantic goal locations are created and validated, including explicit collision-avoidance routes. It is demonstrated, with real-time, real-life experiments, that a coarse discretization based on the semantics of the environment suffices to make a proper distinction between a person going, for example, to the left or to the right at an intersection. As such, a scalable and explainable solution is presented, which is suitable for incorporation within navigation algorithms.
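Validating routing hypotheses against observed motion can be sketched at its simplest as comparing the person's heading with the bearing to each candidate semantic goal; the function and goal representation below are illustrative assumptions, not the paper's method:

```python
import math

def most_likely_goal(position, heading, goals):
    """Rank candidate semantic goal locations by the angular difference
    between the person's heading and the bearing to each goal; the
    smallest angle identifies the most plausible routing hypothesis."""
    px, py = position
    def angle_to(goal):
        _, (gx, gy) = goal
        bearing = math.atan2(gy - py, gx - px)
        # wrap the difference into [-pi, pi) before taking its magnitude
        return abs((bearing - heading + math.pi) % (2 * math.pi) - math.pi)
    return min(goals, key=angle_to)[0]
```

A coarse discretization in the spirit of the paper would use only a handful of such semantic goals (e.g. the exits of an intersection) rather than a dense grid.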

