Shadow detection and correction using a combined 3D GIS and image processing approach

2019 ◽  
Vol 29 (3-4) ◽  
pp. 241-253
Author(s):  
Safa Ridene ◽  
Reda Yaagoubi ◽  
Imane Sebari ◽  
Audrey Alajouanine

While shadow can provide useful information about the size and shape of objects, it poses problems for feature detection and object detection; it is thus one of the major disturbing phenomena frequently occurring in images and is, unfortunately, inevitable. “Shadows may lead to the failure of image analysis processes and also cause a poor quality of information which in turn leads to problems in implementation of algorithms.” (Mahajan and Bajpayee, 2015). Shadow also affects multiple image analysis applications, whereby shadows cast by buildings deteriorate the spectral values of the underlying surfaces. Its presence therefore degrades the visual quality of an image and limits the information the image can provide. Ignoring the existence of shadows in images may cause serious problems in various visual processing applications, such as false object detection. In this context, much research has been conducted over the years. However, finding a fully automated and efficient method for shadow removal from images remains a challenge for analysts worldwide.
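The 3D GIS component of the paper's method is geometric, but the image-processing intuition behind shadow detection and correction can be sketched in a few lines. The sketch below is illustrative only, not the paper's method: it assumes a simple heuristic (pixels well below the global mean intensity are candidate shadow, then gain-compensated), and the threshold factor `k` and the correction rule are assumptions.

```python
def shadow_mask(gray, k=0.6):
    """Return a boolean mask of candidate shadow pixels.

    gray: 2-D list of intensities in [0, 255].
    k: fraction of the global mean below which a pixel is called shadow
       (an assumed heuristic, not a parameter from the paper).
    """
    flat = [v for row in gray for v in row]
    mean = sum(flat) / len(flat)
    return [[v < k * mean for v in row] for row in gray]

def correct_shadows(gray, mask):
    """Brighten masked pixels so their mean matches the unshadowed mean."""
    lit = [v for row, mrow in zip(gray, mask) for v, m in zip(row, mrow) if not m]
    shd = [v for row, mrow in zip(gray, mask) for v, m in zip(row, mrow) if m]
    if not shd:
        return [row[:] for row in gray]
    gain = (sum(lit) / len(lit)) / (sum(shd) / len(shd))
    return [[min(255, round(v * gain)) if m else v
             for v, m in zip(row, mrow)]
            for row, mrow in zip(gray, mask)]

img = [[200, 210, 60],
       [190, 50, 55],
       [205, 200, 198]]
mask = shadow_mask(img)
fixed = correct_shadows(img, mask)
```

Real pipelines replace the global threshold with spectral ratios or, as in this paper, 3D building geometry and sun position; the gain correction likewise stands in for more careful radiometric compensation.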

2008 ◽  
Vol 65 (7) ◽  
pp. 1334-1345 ◽  
Author(s):  
H. Dobby ◽  
L. Allan ◽  
M. Harding ◽  
C. H. Laurenson ◽  
H. A. McLay

Abstract Dobby, H., Allan, L., Harding, M., Laurenson, C. H., and McLay, H. A. 2008. Improving the quality of information on Scottish anglerfish fisheries: making use of fishers’ data. – ICES Journal of Marine Science, 65: 1334–1345. In recent years, the International Council for the Exploration of the Sea (ICES) Working Group on the Assessment of Northern Shelf Demersal Stocks has been unable to provide an analytical assessment for anglerfish. One of the reasons for this has been the poor quality of the commercial catch-and-effort data, with ICES and the European Commission’s Scientific, Technical, and Economic Committee for Fisheries (STECF) stressing the need for reliable information on which to base estimates of stock status. In response, and following consultation with the fishing industry, an anglerfish tallybook project was implemented in Scotland as part of a long-term approach to providing better data. Tallybooks are completed on a haul-by-haul basis. Skippers record catches of anglerfish (by size category) and other species where possible, together with information on haul location, duration, and depth. Individual vessel catch rates are calculated and used to provide insights into temporal trends in the stock and the spatial distribution of the fishery. The history of the fishery and management advice are summarized, and an overview of the tallybook project is provided. Catch rates are analysed using a generalized additive modelling approach which incorporates seasonal, annual, spatial, and vessel-dependent effects. The results show increased catch rates between 2006 and 2007.
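Before any modelling, tallybook records yield haul-by-haul catch rates. A minimal sketch of that first step in Python, with assumed record field names (the study's GAM analysis of seasonal, annual, spatial, and vessel-dependent effects is well beyond this):

```python
# Hypothetical haul-by-haul CPUE sketch: catch per unit effort in
# kg per trawl-hour from tallybook-style records, averaged by vessel.
# Field names and values are invented for illustration.
from collections import defaultdict

hauls = [
    {"vessel": "A", "catch_kg": 120.0, "duration_h": 4.0},
    {"vessel": "A", "catch_kg": 90.0,  "duration_h": 3.0},
    {"vessel": "B", "catch_kg": 200.0, "duration_h": 5.0},
]

def vessel_cpue(records):
    """Mean CPUE (kg per hour) per vessel, averaged over hauls."""
    rates = defaultdict(list)
    for h in records:
        rates[h["vessel"]].append(h["catch_kg"] / h["duration_h"])
    return {v: sum(r) / len(r) for v, r in rates.items()}
```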


2001 ◽  
Vol 16 (1) ◽  
pp. 170-172 ◽  
Author(s):  
J.W. Allen ◽  
R.J. Finch ◽  
M.G. Coleman ◽  
L.K. Nathanson ◽  
N.A. O'Rourke ◽  
...  

2021 ◽  
Vol 3 (3) ◽  
pp. 994-1056
Author(s):  
Rodolfo Paolucci ◽  
André Pereira Neto

The Internet is a major source of health information, but the poor quality of that information has been criticized for decades. We reviewed methods for assessing the quality of health information, updating the findings of the first systematic review from 2002. We searched 9 Health Sciences, Information Sciences, and multidisciplinary databases for studies. We identified 7,718 studies and included 299. Annual publications increased from 9 (2001) to 53 (2013), with 89% from developed countries. We identified 20 areas of knowledge. Six tools have been used worldwide, but 43% of the studies did not use any of them. The methodological framework of criteria from the first review has remained the same. The authors themselves were the evaluators in 80% of the studies. This field of evaluation is expanding. No instrument simultaneously covers all the evaluation criteria. There is still a need for a methodology involving experts and users, and for evidence-based indicators of accuracy.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Mohammed Sultan Al-Ak’hali ◽  
Hytham N. Fageeh ◽  
Esam Halboub ◽  
Mohammed Nasser Alhajj ◽  
Zaihan Ariffin

Abstract Background Currently, the Internet seems to be a helpful tool for obtaining information about everything that we think about, including diseases, their prevention and treatment approaches. However, doubts exist regarding the quality and readability of such information. This study sought to assess the quality and readability of web-based Arabic information on periodontal disease. Methods In this infodemiological study, the Google, Yahoo!, and Bing search engines were searched using specific Arabic terms on periodontal disease. The first 100 consecutive websites from each engine were obtained. The eligible websites were categorized as commercial, health/professional, journalism, and other. The following tools were applied to assess the quality of the information on the included websites: the Health on the Net Foundation Code of Conduct (HONcode), the Journal of the American Medical Association (JAMA) benchmarks, and the DISCERN tool. The readability was assessed using an online readability tool. Results Of the 300 websites, 89 were eligible for quality and readability analyses. Only two websites (2.3%) were HONcode certified. Based on the DISCERN tool, 43 (48.3%) websites had low scores. The mean score of the JAMA benchmarks was 1.6 ± 1.0, but only 3 (3.4%) websites achieved “yes” responses for all four JAMA criteria. Based on the DISCERN tool, health/professional websites revealed the highest quality of information compared to other website categories. Most of the health/professional websites revealed moderate-quality information, while 55% of the commercial websites, 66% of journalism websites, and 43% of other websites showed poor quality information. Regarding readability, most of the analyzed websites presented simple and readable written content. Conclusions Aside from readable content, Arabic health information on the analyzed websites on periodontal disease is below the required level of quality.
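The four JAMA benchmarks (authorship, attribution, disclosure, currency) reduce to a 0-4 count of criteria met, which is how the study's mean score of 1.6 arises. A trivial tally sketch, with the yes/no judgements assumed to come from human raters as in the study:

```python
# Scoring a website against the four JAMA benchmarks. The yes/no
# judgements themselves are made by human raters; this just tallies
# them into the 0-4 score reported in the study.
JAMA_CRITERIA = ("authorship", "attribution", "disclosure", "currency")

def jama_score(ratings):
    """ratings: dict mapping each criterion to True/False; missing = False."""
    return sum(1 for c in JAMA_CRITERIA if ratings.get(c, False))

site = {"authorship": True, "attribution": False,
        "disclosure": True, "currency": False}
```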


2019 ◽  
Vol 49 (6) ◽  
pp. 1142-1152 ◽  
Author(s):  
Olivia Genevieve El Jassar ◽  
Isobel Nadia El Jassar ◽  
Evangelos I. Kritsotakis

Purpose This paper aims to assess the quality of health information available to patients seeking online advice about the vegan diet. Design/methodology/approach A cross-sectional sample of patient-oriented websites was selected by searching for “Vegan diet” in the three most popular search engines. The first 50 websites from each search were examined. Quality of information was assessed using the DISCERN instrument, a questionnaire tool designed to judge the quality of written information on treatment choices. Readability was determined with the Flesch Reading Ease score (FRES) and Flesch–Kincaid Grade Level (FKGL). Relevance to health and disease was assessed by counting the appearances of ten related keywords, generated by searching the query term “Vegan diet” into PubMed and recording the top ten health-related words. Findings Of 150 websites retrieved, 67 (44.7 per cent) met inclusion criteria. Of these, 42 (62.7 per cent) were non-pharmaceutical commercial, 7 (10.4 per cent) institutional, 6 (9.0 per cent) magazines or newspapers, 4 (6.0 per cent) support websites, 4 (6.0 per cent) charitable websites, 2 (3.0 per cent) encyclopedias and 2 (3.0 per cent) personal blogs. The overall DISCERN rating of the websites was fair (mean 41.6 ± 15.4 on an 80-point scale), but nearly half (31/67) of the websites were assessed as having “poor” or “very poor” quality of information. FRES and FKGL readability indices met the recommended standards on average (means 63.3 ± 9.6 and 6.6 ± 1.7, respectively), but did not correlate with high DISCERN ratings. Analysis of variance on DISCERN scores (F(6,60) = 6.536, p < 0.001) and FRES (F(6,60) = 2.733, p = 0.021) yielded significant variation according to website source type. Originality/value Quality standards of health information available on the internet about the vegan diet vary greatly. 
Patients are at risk of exposure to low quality and potentially misleading information over the internet and should be consulting dietitians or physicians to avoid being misled.
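The two readability indices used in this study have standard published formulas, sketched below with word, sentence, and syllable counts supplied by the caller (syllable counting itself is the hard part and is omitted here):

```python
# Standard Flesch Reading Ease and Flesch-Kincaid Grade Level formulas.
# FRES: higher = easier (60-70 is "plain English"); FKGL approximates
# the US school grade needed to understand the text.

def flesch_reading_ease(words, sentences, syllables):
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
```

A 100-word passage with 7 sentences and 140 syllables scores roughly FRES 73.9 and FKGL 6.5, which is close to the study's reported means of 63.3 and 6.6.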


2020 ◽  
Vol 65 (4) ◽  
pp. 32-45
Author(s):  
Józef Oleński

The aim of the paper is to examine the influence of the information environment of a society or economy on public statistics, as well as demonstrating how official statistics can affect the quality of information environments of modern societies and economies in the context of global technologies and IT systems. In modern information societies and knowledge-based economies, the quality of information environments in which citizens, economic entities, public administration institutions and international organisations are functioning has a decisive influence on the political, social and economic order. These environments are shaped by interest groups which control information systems and processes at the local, national, international and global levels. The above-mentioned groups take advantage of the fundamental law of information, i.e. that poor quality information overrides good quality information, to eliminate any information that could make it more difficult for them to control the behavior of people, including social groups, and entities created by people, which participate in the political and economic processes. The paper examines the effects the contamination of the social information environment has on the political and social life and the economy. Attention has been drawn to the influence of the quality of information environment on official statistics and the perception of statistical data, as well as to using reliable statistical data to disinform and contaminate social and economic information environments by manipulating these data. The paper also shows how public statistics can influence information environments and its significance for the safety of the general public.


2011 ◽  
Vol 42 (8) ◽  
pp. 1753-1762 ◽  
Author(s):  
N. J. Reavley ◽  
A. J. Mackinnon ◽  
A. J. Morgan ◽  
M. Alvarez-Jimenez ◽  
S. E. Hetrick ◽  
...  

Background Although mental health information on the internet is often of poor quality, relatively little is known about the quality of websites, such as Wikipedia, that involve participatory information sharing. The aim of this paper was to explore the quality of user-contributed mental health-related information on Wikipedia and compare this with centrally controlled information sources. Method Content on 10 mental health-related topics was extracted from 14 frequently accessed websites (including Wikipedia) providing information about depression and schizophrenia, Encyclopaedia Britannica, and a psychiatry textbook. The content was rated by experts according to the following criteria: accuracy, up-to-dateness, breadth of coverage, referencing and readability. Results Ratings varied significantly between resources according to topic. Across all topics, Wikipedia was the most highly rated in all domains except readability. Conclusions The quality of information on depression and schizophrenia on Wikipedia is generally as good as, or better than, that provided by centrally controlled websites, Encyclopaedia Britannica and a psychiatry textbook.


Author(s):  
Alessandra Perra ◽  
Antonio Preti ◽  
Valerio De Lorenzo ◽  
Antonio Egidio Nardi ◽  
Mauro G. Carta

Abstract Background The Internet is increasingly used as a source of information. This study investigates, with a multidimensional methodology, the quality of information on websites dedicated to obesity treatment and weight-loss interventions. We compared websites in English, the language used for international scientific dissemination, and in Italian, a popular local language. Methods Level of Evidence: Level I, systematic review search on four widely used search engines. Duplicated and unrelated websites were excluded. We checked: popularity with PageRank; technological quality with Nibbler; readability with the Flesch Reading Ease test or the Gulpease readability index; quality of information with the DISCERN scale, the JAMA benchmark criteria, and adherence to the Health on the Net Code. Results 63 Italian websites and 41 English websites were evaluated. English websites invested more in technological quality, especially in marketing, user experience, and mobile accessibility. Both the Italian and the English websites were of poor quality and readability. Conclusions These results can inform guidelines for the improvement of health information and help Internet users achieve a better level of information. Users must be able to find the benefits of treatment, support for shared decision-making, the sources used, the medical editor's supervision, and the risks of postponing treatment.
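The Gulpease index used for the Italian websites has a standard formula calibrated on letters rather than syllables (higher scores mean easier text, on a roughly 0-100 scale). A minimal sketch with counts supplied by the caller:

```python
# Gulpease readability index for Italian text:
# 89 + (300 * sentences - 10 * letters) / words

def gulpease(letters, words, sentences):
    return 89 + (300 * sentences - 10 * letters) / words
```

For example, a passage of 100 words, 450 letters, and 5 sentences scores 59, which by common Gulpease interpretation is difficult for readers with only primary education.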


Comprehensive field data were collated, analysed, and processed to validate the open-source GEBCO 30 arc-second bathymetry data at selected locations in the Red Sea. Different software packages and techniques were used to verify the quality of the GEBCO data under field conditions. Image analysis using different software demonstrates the poor quality and resolution of the GEBCO data in nearshore areas. The analyses also brought out the complex topography of the Red Sea and the precautions required when using open-source bathymetric data in its nearshore areas, where high-quality, fine-resolution data are required. The statistical analyses verified the results of the image analysis, likewise showing the poor quality and coarse resolution of the GEBCO 30 arc-second data in the Red Sea, especially in nearshore areas. The study recommends hydrographic survey data for nearshore areas where high-quality, high-resolution data are needed.
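A typical way to quantify the mismatch between gridded and surveyed depths is the bias and RMSE of differences at co-located points. The sketch below is illustrative only, not the study's actual workflow, and the sample depths are invented:

```python
# Bias and RMSE between co-located gridded (e.g. GEBCO) and
# hydrographic-survey depth values, in metres.
import math

def bias_rmse(gridded, survey):
    """gridded, survey: equal-length lists of co-located depths (m)."""
    diffs = [g - s for g, s in zip(gridded, survey)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

gebco_m = [12.0, 30.0, 48.0]    # invented sample values
survey_m = [10.0, 31.0, 45.0]
b, r = bias_rmse(gebco_m, survey_m)
```

A real validation would first interpolate the grid to sounding positions and account for tide and datum corrections before differencing.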


In this paper, a new binarization technique is proposed for images that are not uniformly illuminated, owing to degradation caused by noise, smears, etc., which leads to poor image quality. To overcome this, preprocessing operations are performed on the degraded image using a region-based approach that applies global and local thresholding for image analysis. The Niblack method is used to enhance the image. The enhanced Niblack method can be applied to documents and images affected by uncertain transitions in pixel intensity values. The experimental results, reported in terms of accuracy and processing speed, show better performance than the previous method.
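The classical Niblack method that the paper enhances computes a local threshold as mean + k * std over a window centred on each pixel. A minimal pure-Python sketch of that baseline (window size and k are conventional choices, not the paper's settings):

```python
# Classical Niblack local thresholding: a pixel is foreground (1) if it
# exceeds the mean + k * std of its surrounding window; k is commonly
# around -0.2 so that dark strokes on a bright page fall below threshold.
import math

def niblack_binarize(gray, win=3, k=-0.2):
    """Return a 0/1 image the same size as gray (2-D list, [0, 255])."""
    h, w = len(gray), len(gray[0])
    r = win // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            vals = [gray[j][i]
                    for j in range(max(0, y - r), min(h, y + r + 1))
                    for i in range(max(0, x - r), min(w, x + r + 1))]
            m = sum(vals) / len(vals)
            s = math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))
            row.append(1 if gray[y][x] > m + k * s else 0)
        out.append(row)
    return out

img = [[200, 200, 200],
       [200, 20, 200],
       [200, 200, 200]]   # bright page with one dark pixel
binary = niblack_binarize(img)
```

Production code would use an integral image to make the window statistics O(1) per pixel; the nested scan here keeps the idea visible.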

