The transfer of infiltration measurement results to other sewers by means of discriminant analysis

2008 ◽  
Vol 57 (9) ◽  
pp. 1429-1435
Author(s):  
Torsten Franz ◽  
Peter Krebs

The cost-oriented and sustainable operation of sewer systems requires comprehensive knowledge of the infiltration situation in the catchment. Owing to the high cost of infiltration measurements, a reliable transfer of measurement results to other sewer sections would be highly beneficial. Assuming a functional relationship between sewer characteristics and infiltration rates can be identified, such a transfer can be realised by means of classification techniques. In this paper a method based on discriminant analysis is introduced that allows measurement results to be transferred to similar sub-catchments. The method was applied to two data sets with measured or virtual infiltration rates. It yields acceptable results, as 50% to 75% of the investigated sub-catchments were assigned correctly, and it provides additional information for assessing the results. The quality of the transfer depends strongly on the homogeneity of the considered sub-catchments, which limits the method's practical applicability. Nevertheless, it may serve as a screening procedure for planning effective detailed infiltration investigations.
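The transfer idea described above can be sketched with a standard linear discriminant analysis classifier: fit on sub-catchments with measured infiltration classes, then predict the class of unmeasured ones. The feature names and values below are illustrative assumptions, not data from the paper, and `scikit-learn`'s LDA stands in for the authors' specific discriminant procedure.

```python
# Sketch: classify sub-catchments into infiltration classes from sewer
# characteristics using linear discriminant analysis (hypothetical data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical sewer characteristics per sub-catchment:
# [pipe age (years), groundwater depth (m), network length (km)]
X_measured = rng.normal(loc=[[40, 2.0, 1.5]], scale=[10, 0.5, 0.4], size=(30, 3))
# Infiltration class from measurement: 0 = low, 1 = high (toy rule for the demo)
y_measured = (X_measured[:, 0] > 40).astype(int)

lda = LinearDiscriminantAnalysis()
lda.fit(X_measured, y_measured)

# Transfer: assign a class to an unmeasured but similar sub-catchment
X_new = np.array([[55, 1.8, 1.2]])
predicted = lda.predict(X_new)
# Posterior probabilities give the "additional information" for assessing
# how reliable the assignment is
posterior = lda.predict_proba(X_new)
```

The posterior probabilities play the role of the assessment information mentioned in the abstract: an assignment with a posterior near 0.5 would flag a sub-catchment for direct measurement instead.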

1980 ◽  
Vol 3 (4) ◽  
pp. 421-421
Author(s):  
Jerome Joffe

Numerous problem areas were encountered in the evaluation of a voluntary second surgical opinion program. Some problems could be handled only on a conceptual level, while others required integrating existing data sets with new ones. (1) Reliance on program participants to provide medical data was sensitivity tested and found to have minimal impact. (2) Program impact on hospital bed reduction was considered in the context of the total market and political environment in the region, and the anticipated duration of the program. A reasonable judgment was that almost the entire fixed component in the per diem cost of anticipated patient days foregone could be written off as part of the cost savings of the program. (3) As hospital reporting systems identify only average per diem costs, a methodology was developed to separate surgical from nonsurgical case costs. (4) Quality of care evaluation will incorporate a substratum of cases for which there exists a control group within the program user population. Outcome measures obtainable through survey interviews were identified.


2017 ◽  
Vol 33 (2) ◽  
pp. 455-475
Author(s):  
Mary H. Mulry ◽  
Andrew D. Keller

Abstract The U.S. Census Bureau is currently conducting research on ways to use administrative records to reduce the cost and improve the quality of the 2020 Census Nonresponse Followup (NRFU) at addresses that do not self-respond electronically or by mail. Previously, when a NRFU enumerator was unable to contact residents at an address, he/she found a knowledgeable person, such as a neighbor or apartment manager, who could provide the census information for the residents. This was called a proxy response. The Census Bureau's recent advances in merging federal and third-party databases raise the question: Are proxy responses for NRFU addresses more accurate than the administrative records available for the housing unit? Our study attempts to answer this question by comparing the quality of proxy responses and administrative records for the same housing units in the same timeframe, using the results of the 2010 Census Coverage Measurement (CCM) Program. The assessment of the quality of the proxy responses and the administrative records in the CCM sample of block clusters takes advantage of the extensive fieldwork, processing, and clerical matching conducted for the CCM.


Author(s):  
Abhilasha Rangra ◽  
Vivek Kumar Sehgal ◽  
Shailendra Shukla

Cloud computing makes it possible to deliver high-quality service with fewer resources across many premises. For infrastructure-as-a-service (IaaS) resources in particular, cost is an important factor for the service provider. Cost reduction is therefore a major challenge, but reducing cost tends to increase execution time, which degrades the quality of service. Balancing time against cost is thus a complex decision problem, which motivates the use of learning approaches. In this article, a multi-tasking convolutional neural network (M-CNN) is proposed that learns task deadlines and costs and provides decisions for task scheduling. The experimental analysis uses two data sets, one of tweets and one of Genome workflows, and compares the proposed method with approaches such as PSO and PSO-GA. Simulation results show significant improvement on both data sets.


2014 ◽  
Vol 11 (2) ◽  
Author(s):  
Pavol Král’ ◽  
Lukáš Sobíšek ◽  
Mária Stachová

Data quality can be seen as a very important factor for the validity of information extracted from data sets using statistical or data mining procedures. In the paper we propose a description of data quality that characterizes the quality of the whole data set as well as of particular variables and individual cases. On the basis of the proposed description, we define a distance-based measure of data quality for individual cases as the distance of each case from the ideal one. Such a measure can be used as additional information for preparing a training data set, fitting models, decision making based on the results of analyses, and so on. It can be utilized in different ways, ranging from a simple weighting function to belief functions.


2013 ◽  
Vol 23 (2) ◽  
pp. 463-471 ◽  
Author(s):  
Tomasz Górecki ◽  
Maciej Łuczak

The Linear Discriminant Analysis (LDA) technique is an important and well-developed area of classification, and to date many linear (and also nonlinear) discrimination methods have been put forward. A complication in applying LDA to real data occurs when the number of features exceeds the number of observations. In this case, the covariance estimates do not have full rank and thus cannot be inverted. There are a number of ways to deal with this problem. In this paper, we propose improving LDA in this area, and we present a new approach which uses a generalization of the Moore-Penrose pseudoinverse to remove this weakness. Our new approach, in addition to managing the problem of inverting the covariance matrix, significantly improves the quality of classification, even on data sets where the covariance matrix is invertible. Experimental results on various data sets demonstrate that our improvements to LDA are efficient and our approach outperforms standard LDA.


2008 ◽  
Vol 84 (3) ◽  
pp. 375-377
Author(s):  
Paul M. Woodard

Provincial forest management agencies across Canada are attempting to recover suppression costs, plus losses to real property, from human-caused fires when negligence is involved. These agencies are responsible for investigating such fires and commonly restrict all access to the fire-origin area. They typically employ well-trained fire investigators who are familiar with the standards for documenting wildland fires. In many cases, however, the quality of the investigations is poor, and the cost of obtaining the missing information is high. In this paper, I identify the minimum information required before an investigation file should be considered complete and charges can be laid. Key words: wildland fire, investigation, reports, litigation, standards


Author(s):  
G. Lehmpfuhl

Introduction In electron microscopic investigations of crystalline specimens, direct observation of the electron diffraction pattern gives additional information about the specimen. The quality of this information depends on the quality of the crystals, or of the crystal area contributing to the diffraction pattern. By selected-area diffraction in a conventional electron microscope, specimen areas as small as 1 µm in diameter can be investigated. It is well known that crystal areas of that size, which must be thin enough (on the order of 1000 Å) for electron microscopic investigation, are normally somewhat distorted by bending, or are not homogeneous. Furthermore, the crystal surface is not well defined over such a large area. These factors reduce the information content of the diffraction pattern. The intensity of a diffraction spot, for example, depends on the crystal thickness. If the thickness is not uniform over the investigated area, one observes an averaged intensity, so that the intensity distribution in the diffraction pattern cannot be used for an analysis unless additional information is available.


2012 ◽  
pp. 24-47
Author(s):  
V. Gimpelson ◽  
G. Monusova

Using different cross-country data sets and simple econometric techniques, we study public attitudes towards the police. More positive attitudes are more likely to emerge in countries that have better-functioning democratic institutions, are less prone to corruption, and enjoy more transparent and accountable police activity. These factors have a stronger impact on public opinion (trust and attitudes) than objective crime rates or the density of policemen. Citizens tend to place more trust in policemen with whom they share common values and over whom they have some control, the latter being a function of democracy. In authoritarian countries — "police states" — this tendency may not work directly. When we move from semi-authoritarian countries to openly authoritarian ones, survey-measured trust in the police can also rise. As a result, trust appears to be U-shaped along the quality-of-government axis. This phenomenon can be explained by two simple facts: first, publicly available information about police activity in authoritarian countries is strongly controlled; second, the police itself is better controlled by authoritarian regimes, which fear a (for them) dangerous erosion of this institution.


Author(s):  
Nur Maimun ◽  
Jihan Natassa ◽  
Wen Via Trisna ◽  
Yeye Supriatin

Accuracy in assigning diagnosis codes is an important matter for medical recorders, and data quality is central to the health information management they perform. This study aims to assess coder competency in the accuracy and precision of ICD-10 coding at Hospital X in Pekanbaru. A qualitative case-study method was used with five informants. The results show that medical personnel (doctors) have never received training in coding; doctors' handwriting is hard to read; diagnosis or procedure codes are sometimes assigned incorrectly; doctors use non-standard abbreviations; some officers do not understand nomenclature or have not mastered anatomical pathology; and the available facilities and infrastructure do support accurate and precise coding. Coding errors occur because of human error. Accuracy and precision in coding strongly influence INA-CBGs costs; the medical committee did most of the work on severity level III cases, while medical records played a role in monitoring and evaluating coding implementation. If a medical resume is unclear, the case-mix team checks the medical record file to verify the diagnosis or code. Keywords: coder competency, accuracy and precision of coding, ICD 10

