Accuracy of French census population estimates

2021 ◽  
pp. 1-20
Author(s):  
Lionel Espinasse ◽  
Vincent Le Palud ◽  
Julie Prévot ◽  
Gwennaël Solard ◽  
Lucile Vanotti

The French population census method evolved in the early 2000s. Few studies have been published on the quality of the population estimates produced by this new method, apart from a few that examine the sampling variance introduced by the use of a survey in large municipalities. However, the French census is subject to numerous quality controls throughout the process: development of a housing register, preparation of the collection, the collection itself, and the post-collection adjustment and estimation operations. The extensive involvement of stakeholders (municipalities and INSEE) in the preparation and conduct of the census leads to a very good understanding of the process. The many checks carried out throughout the process guarantee that the estimates produced are of high quality. In addition, the census benefits from a very low non-response rate (3.9% in 2019). However, some features are still not well understood. Although many instructions are included in the questionnaires, the answers given by enumerated persons are imperfect, owing to misunderstandings, an inability to adapt the questions to real-life situations, or deliberately incorrect answers.

Author(s):  
Cristina Tassorelli ◽  
Vincenzo Silani ◽  
Alessandro Padovani ◽  
Paolo Barone ◽  
Paolo Calabresi ◽  
...  

Abstract Background The coronavirus disease 2019 (COVID-19) pandemic has severely impacted the Italian healthcare system, underscoring a dramatic shortage of specialized doctors in many disciplines. The situation affected the activity of the residents in neurology, who were also offered the possibility of being formally hired before their training completion. Aims (1) To showcase examples of clinical and research activity of residents in neurology during the COVID-19 pandemic in Italy and (2) to illustrate the point of view of Italian residents in neurology about the possibility of being hired before the completion of their residency program. Results Real-life reports from several areas in Lombardia—one of the Italian regions most affected by COVID-19—show that residents in neurology gave an outstanding demonstration of generosity, collaboration, reliability, and adaptation to the changing environment, while continuing their clinical training and research activities. A very small minority of the residents participated in the dedicated selection procedures for early hiring before completion of their training program. The large majority of them prioritized their training over the option of earlier employment. Conclusions Italian residents in neurology generously contributed to the healthcare management of the COVID-19 pandemic in many ways, while remaining determined to pursue their training. Neurology is a rapidly evolving clinical field due to continuous diagnostic and therapeutic progress. Stakeholders need to listen to the strong message conveyed by our residents in neurology and endeavor to provide them with the most adequate training, to ensure high quality of care and excellence in research in the future.


Author(s):  
Tor E. Berg ◽  
Edvard Ringen

This paper describes the need for improved methods for validating numerical models used in shiphandling simulators. Such models vary in complexity, from rather simplistic models used for initial shiphandling training at maritime training centers to high-quality models used in the study of advanced marine operations. High-quality simulation models are also used in investigations of maritime accidents such as collisions and groundings. The SIMMAN 2008 conference presented the results of benchmarking studies of simulation tools currently used by research institutes, universities and training centers around the world. Many of these tools employ models based on numerical calculations using methods based on potential or viscous fluid flow, experiments using scale ship models (free running or captive), or semi-empirical expressions based on regression analysis of previous model tests. The organizers of SIMMAN 2008 made the hull characteristics of certain ship types available for a comparative study of simulation maneuvering models. The outcome of the benchmark study (using IMO standard maneuvers as case studies) showed that simulated results varied significantly. In the opinion of the authors, there is an urgent need for new validation studies. The first part of this paper discusses the concepts of simulation model fidelity, verification and validation, and the present guidelines issued by ITTC for validation of maneuvering simulation models. The second part looks at the outcomes of the SIMMAN 2008 conference and describes MARINTEK’s contribution to the benchmark study. The use of real-world measurements in model validation is briefly discussed. The need for registration of actual test conditions, as well as the types of tests that should be included in a test scheme, are presented.
Finally, the authors discuss validation requirements with respect to the actual application of the selected simulation model as an engineering tool that can be transferred to training simulators used by maritime training centers. It is assumed that simplified simulation models may reduce the quality of simulator-based training for ship officers. It is believed that increased simulator model quality will improve the transfer of training from simulators to real-life operations and remove some of the uncertainties related to the investigation of maritime accidents.


2019 ◽  
pp. 1639-1648
Author(s):  
Abeer Sufyan Khalil ◽  
Rawaa Dawoud Al-Dabbagh

The continuous increase in the size of current telecommunication infrastructures has led to many challenges for the existing algorithms that underlie their optimization. The unrealistic assumptions and low efficiency of traditional algorithms make them unable to solve large real-life problems in reasonable time. The use of approximate optimization techniques, such as adaptive metaheuristic algorithms, has become more prevalent across diverse research areas. In this paper, we propose the use of a self-adaptive differential evolution (jDE) algorithm to solve the radio network planning (RNP) problem in the context of the upcoming fifth generation (5G) of mobile networks. The experimental results show that jDE with best-vector mutation surpasses other metaheuristic variants, such as DE/rand/1 and the classical GA, in terms of deployment cost, coverage rate, and quality of service (QoS).
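The self-adaptive mechanism that distinguishes jDE from classical DE can be sketched in a few lines: each individual carries its own control parameters F and CR, which are occasionally re-sampled and survive only when the trial vector they produce wins the selection. This is a generic illustrative sketch on a toy objective, using DE/rand/1 mutation rather than the best-vector variant the abstract reports, and it does not model the RNP problem itself; the function name and parameters are assumptions.

```python
import random

def jde_minimize(f, bounds, pop_size=20, generations=200, seed=0):
    """Minimize f over box `bounds` with a jDE-style scheme:
    each individual carries its own F and CR, re-sampled with
    probability 0.1 before producing its trial vector."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    F = [0.5] * pop_size   # per-individual differential weight
    CR = [0.9] * pop_size  # per-individual crossover rate
    for _ in range(generations):
        for i in range(pop_size):
            # Self-adaptation: occasionally re-sample the control parameters.
            Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]
            # DE/rand/1 mutation with binomial crossover.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < CRi or j == jrand:
                    v = pop[a][j] + Fi * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))  # clamp to the box
                else:
                    trial.append(pop[i][j])
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection; F and CR survive with the winner
                pop[i], fit[i], F[i], CR[i] = trial, ft, Fi, CRi
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Example: minimize the 3-D sphere function.
x, fx = jde_minimize(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3, seed=1)
```

The key design point is that the new Fi and CRi are generated *before* the mutation step and are kept only if the resulting trial vector replaces its parent, so good parameter settings propagate along with good solutions.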


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4486 ◽  
Author(s):  
Mohan Li ◽  
Yanbin Sun ◽  
Yu Jiang ◽  
Zhihong Tian

In sensor-based systems, the data for an object is often provided by multiple sources. Since the data quality of these sources may differ, when querying the observations it is necessary to carefully select the sources to make sure that high-quality data is accessed. One solution is to perform a quality evaluation in the cloud and select a set of high-quality, low-cost data sources (i.e., sensors or small sensor networks) that can answer the queries. This paper studies the problem of the min-cost quality-aware query, which aims to find high-quality results from multiple sources at minimized cost. A measurement of the query results is provided, and two methods for answering min-cost quality-aware queries are proposed. How to obtain a reasonable parameter setting is also discussed. Experiments on real-life data verify that the proposed techniques are efficient and effective.
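The flavor of cost-versus-quality source selection can be illustrated with a toy greedy sketch. The abstract does not specify the paper's two methods, so everything below is an assumption for illustration only: quality is treated as additive, sources are picked in order of cost per unit of quality until a threshold is met, and the function and data names are invented.

```python
def select_sources(sources, quality_threshold):
    """Greedy sketch: pick sources in order of cost per unit quality
    until the accumulated quality meets the threshold.
    Each source is a (name, cost, quality) tuple; additive quality
    is assumed purely for illustration."""
    chosen, total_cost, total_quality = [], 0.0, 0.0
    for name, cost, quality in sorted(sources, key=lambda s: s[1] / s[2]):
        if total_quality >= quality_threshold:
            break  # cheap, informative sources already suffice
        chosen.append(name)
        total_cost += cost
        total_quality += quality
    if total_quality < quality_threshold:
        raise ValueError("threshold unreachable with the given sources")
    return chosen, total_cost

# Example: three hypothetical sensors with (cost, quality) scores.
sensors = [("a", 1.0, 0.5), ("b", 3.0, 0.9), ("c", 2.0, 0.4)]
picked, cost = select_sources(sensors, quality_threshold=0.8)
```

A greedy ratio heuristic like this is not guaranteed optimal (the underlying problem resembles a covering problem), which is one reason dedicated methods such as those the paper proposes are needed.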


PEDIATRICS ◽  
1965 ◽  
Vol 36 (4) ◽  
pp. 648-657
Author(s):  
Charles U. Lowe ◽  
David B. Coursin ◽  
George N. Donnell ◽  
Felix P. Heald ◽  
Robert Kaye ◽  
...  

Only a very small percentage of American and Canadian families produce all their own food. Under the circumstances, over 200 million people and close to 5 million infants are dependent, to a greater or lesser extent, upon codes, ordinances, and laws enacted to ensure that produce in the marketplace is sound, wholesome, and free of noxious or toxic contaminants. In most instances, neither the consuming public nor the medical profession is aware of the complex and comprehensive nature of the regulatory network which controls the quality of the food supply. Various governmental agencies (federal, state, provincial in Canada, and municipal) provide minimum standards of quality for food and nutritional products and promulgate codes governing manufacturing procedures. Some codes have the force of law, whereas others are prepared as guides, particularly for those industries manufacturing food which does not enter interstate commerce, and compliance may be voluntary unless local legislation exists to enforce these codes. Compliance with these standards is determined both by governmental agencies and by manufacturers using selective sampling and assay of products at various stages prior to marketing. However, governmental agencies are able to spot-check only a small number of the many lots of products and foods under their jurisdiction. The high quality of today's commodity foods and nutritional products has resulted in large part from widespread compliance, self-policing, and the desire by industry to surpass the minimum requirements set by law and to provide the consumer with superior products. Through the full co-operation of industry with governmental agencies, a great number and variety of food and nutrition products of high quality and uniformity is available to the consumer, and especially to the infant.


2021 ◽  
Vol 19 (1) ◽  
pp. 2284
Author(s):  
Fernando Fernandez-Llimos

Scholarly publishing is in a crisis, with the many stakeholders complaining about different aspects of the system. Authors want fast publication times, high visibility and publications in high-impact journals. Readers want freely accessible, high-quality articles. Peer reviewers want recognition for the work they perform to ensure the quality of the published articles. However, authors, peer reviewers, and readers are three different roles played by the same group of individuals, the users of the scholarly publishing system—and this system could work based on a collaborative publishing principle where “nobody pays, and nobody gets paid”.


Author(s):  
Evan W. Duggan ◽  
Han Reichgelt

Business organizations are still struggling to improve the quality of information systems (IS) after many research efforts and years of accumulated experience in delivering them. The IS community is not short on prescriptions for improving quality; however, the utterances are somewhat cacophonous, as proponents of quality-enhancing approaches hyperbolize claims of their efficacy and/or denigrate older approaches, often ignoring the importance of context. In this chapter we undertake an extensive review of the IS quality literature to balance the many perspectives of stakeholders in this heterogeneous community with the necessarily varied prescriptions for producing high-quality systems. We develop an IS quality model, which distills determinants of IS product quality into effects attributable to people, processes, and practices, and observe that IS success results from the combination of discernible IS quality and stakeholders’ perceptions of IS quality. This chapter serves as a general introduction to the detailed analyses of topics that follow in subsequent chapters but also provides insights that are not covered elsewhere in the book.


2010 ◽  
Vol 49 (06) ◽  
pp. 550-570 ◽  
Author(s):  
H. Tange ◽  
H. J. van den Herik ◽  
A. Hasman ◽  
A. Latoszek-Berendsen

Summary Background: Guidelines have been with us for over 30 years. Initially they were used as algorithmic protocols by nurses and other ancillary personnel. Many physicians regarded the use of guidelines as cookbook medicine. However, quality and patient-safety issues have changed attitudes towards guidelines. Implementing formalized guidelines in a decision support system with an interface to an electronic patient record (EPR) makes the application of guidelines more personal and therefore acceptable at the moment of care. Objective: To obtain, via a literature review, an insight into factors that influence the design and implementation of guidelines. Methods: An extensive search of the scientific literature in PubMed was carried out with a focus on guideline characteristics, guideline development and implementation, and guideline dissemination. Results: We present studies that enable us to explain the characteristics of high-quality guidelines, and new advanced methods for guideline formalization, computerization, and implementation. We show how guidelines affect processes of care and patient outcomes. We discuss the reasons for low guideline adherence as presented in the literature and comment upon them. Conclusions: Developing high-quality guidelines requires a skilled team of people and a sufficient budget. The guidelines should give personalized advice. Computer-interpretable guidelines (CIGs) that have access to the patient’s EPR are able to give personal advice. Because of the costs, sharing of CIGs is a critical requirement for guideline development, dissemination, and implementation. Until now, this has hardly been possible because of the many models in use. However, some solutions have been proposed. For instance, a standardized terminology should be imposed so that the terms in guidelines can be matched with terms in an EPR. Also, a dissemination model for easy updating of guidelines should be established.
The recommendations should be based on evidence instead of on consensus. To test the quality of a guideline, appraisal instruments should be used to assess the guideline as a whole, as well as to check the quality of the individual recommendations. Only in this way can optimal guideline advice be given on an individual basis at reasonable cost.

