A hybrid approach to auto-insurance claim processing system

Author(s): Jang-Hee Yoo, Byoung-Ho Kang, Jong-Uk Choi

2001, Vol. 28 (5), pp. 804-812
Author(s): Paul de Leur, Tarek Sayed

Road safety analysis is typically undertaken using traffic collision data. However, collision data often suffer from quality and reliability problems, which can inhibit the ability of road safety engineers to evaluate and analyze road safety performance. An alternate source of data characterizing the events of a traffic collision is the records that become available from an auto insurance claim. In settling a claim, a claim adjuster must assess and determine the circumstances of the event, recording the important contributing factors that led to the crash. There is therefore an opportunity to access and use claims data in road safety engineering analysis. This paper presents the results of an initial attempt to use auto insurance claims records in road safety evaluation by developing and applying a claim prediction model. The model provides an estimate of the number of auto insurance claims that can be expected at signalized intersections in the Vancouver area of British Columbia, Canada. The usefulness and application of the claim prediction model are discussed, together with a recommendation on how the claims data could be utilized in the future.

Key words: road safety improvement programs, auto insurance claims, road safety analysis, prediction models.
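The abstract does not specify the model form, but a common choice for intersection claim/collision prediction models is a negative binomial GLM on the logarithms of traffic volumes. The following is a minimal sketch of that idea; the covariates, coefficients, and data are illustrative assumptions, not the authors' fitted model.

```python
# Hedged sketch: a negative binomial GLM relating expected claim counts
# at signalized intersections to major/minor approach volumes (AADT).
# All numbers below are synthetic placeholders, not the paper's data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200                                    # hypothetical intersections
major_aadt = rng.uniform(5_000, 40_000, n)
minor_aadt = rng.uniform(500, 8_000, n)
mu = np.exp(-7.0 + 0.7 * np.log(major_aadt) + 0.3 * np.log(minor_aadt))
claims = rng.poisson(mu)                   # stand-in for observed claim counts

X = sm.add_constant(np.column_stack([np.log(major_aadt), np.log(minor_aadt)]))
model = sm.GLM(claims, X, family=sm.families.NegativeBinomial()).fit()
print(model.params)  # expected claims = exp(a) * AADT_maj^b1 * AADT_min^b2
```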


2018, Vol. 2018, pp. 1-12
Author(s): Hui-Juan Zhu, Zheng-Wei Zhu, Tong-Hai Jiang, Li Cheng, Wei-Lei Shi, ...

In data integration, entity resolution is an important technique for improving data quality. Existing research typically assumes that the target dataset contains only string-type data and relies on a single similarity metric. For large, high-dimensional datasets, redundant information must be verified using traditional blocking or windowing techniques. In this work, we propose a novel entity resolution method using a hybrid approach that combines type-based multiblocks, a varying window size, and more flexible similarity metrics. In our new ER workflow, we reduce the search space for entity pairs through the constraint of redundant attributes and matching likelihood. We develop a reference implementation of the proposed approach and validate its performance on a real-life dataset from an Internet of Things project. We evaluate the data processing system using five standard metrics: effectiveness, efficiency, accuracy, recall, and precision. Experimental results indicate that the proposed approach is a promising alternative for entity resolution and could feasibly be applied to real-world data cleaning for large datasets.
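The paper's multiblock construction and adaptive window policy are not reproduced here; as a minimal sketch of the underlying sorted-neighborhood idea with a type-aware similarity metric, consider the following. The attribute names, the fixed window, and the threshold are assumptions for illustration only.

```python
# Hedged sketch: sorted-neighborhood entity resolution with a per-type
# similarity metric (numeric vs. string fields compared differently).
from difflib import SequenceMatcher

def similarity(a, b):
    """Numeric fields: relative difference; strings: edit-based ratio."""
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        denom = max(abs(a), abs(b)) or 1.0
        return 1.0 - min(abs(a - b) / denom, 1.0)
    return SequenceMatcher(None, str(a).lower(), str(b).lower()).ratio()

def resolve(records, key, window=4, threshold=0.85):
    """Sort on a blocking key, then compare each record only with its
    window - 1 successors, shrinking the quadratic search space."""
    ordered = sorted(records, key=lambda r: str(r[key]).lower())
    matches = []
    for i, rec in enumerate(ordered):
        for other in ordered[i + 1 : i + window]:
            fields = [f for f in rec if f in other and f != "id"]
            score = sum(similarity(rec[f], other[f]) for f in fields) / len(fields)
            if score >= threshold:
                matches.append((rec["id"], other["id"], round(score, 3)))
    return matches

records = [
    {"id": 1, "name": "Acme Sensor Co.", "reading": 21.4},
    {"id": 2, "name": "ACME Sensor Co", "reading": 21.5},
    {"id": 3, "name": "Beta Devices", "reading": 7.0},
]
print(resolve(records, key="name"))  # ids 1 and 2 resolve to one entity
```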


Author(s): J. Hefter

Semiconductor-metal composites, formed by the eutectic solidification of silicon and a metal silicide, have been under investigation for some time for a number of electronic device applications. This composite system comprises a silicon matrix containing extended metal-silicide rod-shaped structures aligned in parallel throughout the material. The average diameter of such a rod in a typical system is about 1 μm, so characterization of the rod morphology requires electron microscope methods.

The types of morphometric information that may be obtained from such microscopic studies coupled with image processing are (i) the area fraction of rods in the matrix, (ii) the average rod diameter, (iii) an average circularity (roundness), and (iv) the number density (Nd; rods/cm²). To acquire electron images of these materials, a digital image processing system (Tracor Northern 5500/5600) attached to a JEOL JXA-840 analytical SEM has been used.
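As a minimal sketch of how the four quantities (i)-(iv) can be computed from a binary SEM image, the following uses scikit-image rather than the Tracor Northern system; the pixel size and synthetic test image are assumptions.

```python
# Hedged sketch: rod morphometry from a binary cross-section image.
import numpy as np
from skimage.measure import label, regionprops

def rod_morphometry(binary_img, um_per_px):
    """binary_img: 2D bool array, True where a rod intersects the
    section plane. Returns metrics (i)-(iv) from the text."""
    regions = regionprops(label(binary_img))
    field_area_um2 = binary_img.size * um_per_px**2
    diams = [np.sqrt(4 * r.area / np.pi) * um_per_px for r in regions]
    circs = [4 * np.pi * r.area / r.perimeter**2
             for r in regions if r.perimeter > 0]
    return {
        "area_fraction": float(binary_img.mean()),          # (i)
        "mean_diameter_um": float(np.mean(diams)),          # (ii)
        "mean_circularity": float(np.mean(circs)),          # (iii), 1.0 = circle
        "number_density_per_cm2":
            len(regions) / field_area_um2 * 1e8,            # (iv), 1 cm² = 1e8 µm²
    }

# Synthetic test: one ~1 µm disc in a 10 µm x 10 µm field (0.05 µm/px).
yy, xx = np.mgrid[:200, :200]
img = (yy - 100) ** 2 + (xx - 100) ** 2 < 10 ** 2
print(rod_morphometry(img, um_per_px=0.05))
```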


Author(s): A. V. Crewe, M. Ohtsuki

We have assembled an image processing system for use with our high-resolution STEM for the particular purpose of working with low-dose images of biological specimens. The system is quite flexible, however, and can be used for a wide variety of images.

The original images are stored on magnetic tape at the microscope using the digitized signals from the detectors. For low-dose imaging, these are "first scan" exposures using an automatic montage system. One Nova minicomputer and one tape drive are dedicated to this task.

The principal component of the image analysis system is a Lexidata 3400 frame store memory, arranged in a 640 x 512 x 16 bit configuration. Images are displayed simultaneously on two high-resolution monitors, one color and one black and white. Interaction with the memory is obtained using a Nova 4 (32K) computer and a trackball and switch unit provided by Lexidata. The language used is BASIC with a variety of assembly language calls, some provided by Lexidata, but the majority written by students (D. Kopf and N. Townes).


Author(s): G.Y. Fan, J.M. Cowley

In recent developments, the ASU HB5 has been modified so that the timing, positioning, and scanning of the finely focused electron probe can be entirely controlled by a host computer. This makes possible an asynchronous handshake between the HB5 STEM and the image processing system, which consists of a host computer (PDP 11/34), a DeAnza image processor (IP 5000) interfaced with a low-light-level TV camera, an array processor (AP 400), and various peripheral devices. This greatly facilitates the pattern recognition technique initiated by Monosmith and Cowley. Software called NANHB5 is under development which, instead of employing a set of photodiodes to detect strong spots on a TV screen, uses various software techniques, including on-line fast Fourier transform (FFT), to recognize patterns of greater complexity, taking advantage of the sophistication of our image processing system and the flexibility of computer software.
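The NANHB5 code itself is not public; as a minimal sketch of the kind of FFT-based spot detection described, the following locates the strongest periodicities of a pattern in its power spectrum. The peak count and the synthetic test pattern are assumptions.

```python
# Hedged sketch: find the dominant diffraction-like spots of an image
# by peak-picking its power spectrum (DC term suppressed).
import numpy as np

def strong_spots(image, keep=5):
    """Return the `keep` strongest non-DC peaks in the power spectrum
    as (ky, kx, power) relative to the spectrum centre."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    cy, cx = np.array(spectrum.shape) // 2
    spectrum[cy, cx] = 0.0                       # suppress the DC term
    flat = np.argsort(spectrum, axis=None)[::-1][:keep]
    ys, xs = np.unravel_index(flat, spectrum.shape)
    return [(int(y - cy), int(x - cx), float(spectrum[y, x]))
            for y, x in zip(ys, xs)]

# A pure cosine grating should yield two symmetric first-order spots.
y, x = np.mgrid[:128, :128]
grating = np.cos(2 * np.pi * 8 * x / 128)
print(strong_spots(grating, keep=2))             # spots at kx = +/-8
```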


Author(s): Rudolf Oldenbourg

The recent renaissance of the light microscope is fueled in part by technological advances in components on the periphery of the microscope, such as the laser as illumination source, electronic image recording (video), computer-assisted image analysis, and the biochemistry of fluorescent dyes for labeling specimens. After great progress in these peripheral parts, it seems timely to examine the optics itself and ask how progress in the periphery facilitates the use of new optical components and new optical designs inside the microscope. Some results of this fruitful reflection are presented in this symposium.

We have considered the polarized light microscope and developed a design that replaces the traditional compensator, typically a birefringent crystal plate, with a precision universal compensator made of two liquid crystal variable retarders. A video camera and digital image processing system provide fast measurements of specimen anisotropy (retardance magnitude and azimuth) at all points of the image forming the field of view. The images document fine structural and molecular organization within a thin optical section of the specimen.
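The measurement arithmetic of the compensator is not given in this abstract; a minimal sketch of one conventional way to display such per-pixel anisotropy maps is to encode azimuth as hue and retardance as brightness. The synthetic retardance and azimuth fields below are assumptions.

```python
# Hedged sketch: color-coded display of per-pixel retardance/azimuth maps.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import hsv_to_rgb

y, x = np.mgrid[-64:64, -64:64]
azimuth = np.arctan2(y, x) % np.pi        # slow-axis orientation, 0..pi
retardance = np.hypot(x, y) / 90.0        # synthetic magnitude, ~0..1

hsv = np.stack([azimuth / np.pi,                  # hue encodes azimuth
                np.ones_like(azimuth),            # full saturation
                np.clip(retardance, 0, 1)], -1)   # brightness encodes retardance
plt.imshow(hsv_to_rgb(hsv))
plt.title("azimuth -> hue, retardance -> brightness")
plt.show()
```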


Author(s): P. Pradère, J.F. Revol, R. St. John Manley

Although radiation damage is the limiting factor in HREM of polymers, new techniques based on low-dose imaging at low magnification have permitted lattice images to be obtained from very radiation-sensitive polymers such as polyethylene (PE). This paper describes the computer averaging of P4MP1 lattice images. P4MP1 is even more sensitive than PE (total end point dose of 27 C m⁻², compared to 100 C m⁻² for PE at 120 kV). It does, however, have the advantage of forming flat crystals from dilute solution, and no change in d-spacings is observed during irradiation.

Crystals of P4MP1 were grown at 60°C in xylene (polymer concentration 0.05%). Electron microscopy was performed with a Philips EM 400 T microscope equipped with a Low Dose Unit and operated at 120 kV. Imaging conditions were the same as described elsewhere. Enlarged micrographs were digitized and processed with the Spider image processing system.
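The actual Spider procedures are not reproduced here; as a minimal sketch of translational image averaging in the spirit described, the following registers noisy copies of a lattice motif by FFT cross-correlation before summing. The synthetic frames and noise level are assumptions. For a periodic lattice, any shift congruent modulo the lattice period aligns the frames correctly, which is exactly what averaging needs.

```python
# Hedged sketch: align noisy lattice frames by cross-correlation, then
# average to raise the signal-to-noise ratio ~ sqrt(frame count).
import numpy as np

def align_and_average(frames):
    """Register each frame to the first via FFT cross-correlation;
    shifts are whole pixels applied with np.roll."""
    ref = np.fft.fft2(frames[0])
    acc = frames[0].astype(float).copy()
    for frame in frames[1:]:
        corr = np.fft.ifft2(ref * np.conj(np.fft.fft2(frame))).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        acc += np.roll(frame, (dy, dx), axis=(0, 1))
    return acc / len(frames)

rng = np.random.default_rng(0)
y, x = np.mgrid[:64, :64]
motif = np.cos(2 * np.pi * y / 8) + np.cos(2 * np.pi * x / 8)
frames = [motif + rng.normal(0, 2.0, motif.shape)]           # reference frame
frames += [np.roll(motif, (int(rng.integers(8)), int(rng.integers(8))),
                   axis=(0, 1)) + rng.normal(0, 2.0, motif.shape)
           for _ in range(31)]
avg = align_and_average(frames)
# Correlation of the average with the clean motif should be high (~0.9+).
print(round(float(np.corrcoef(avg.ravel(), motif.ravel())[0, 1]), 3))
```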


Author(s): S. Lehner, H.E. Bauer, R. Wurster, H. Seiler

In order to compare different microanalytical techniques, the commercially available cation exchange membrane SC-1 (Stantech Inc., Palo Alto) was loaded with biologically relevant elements such as Na, Mg, K, and Ca, each to its highest possible concentration, given by the number concentration of exchangeable binding sites (4 wt% for Ca). Washing in distilled water, dehydration through a graded series of ethanol, and infiltration and embedding in Spurr's low-viscosity epoxy resin were followed by thin sectioning. The thin sections (about 50 nm thick) were prepared on carbon foils and mounted on electron microscopical finder grids.

The samples were analyzed with an electron microprobe JXA 50A with a transmitted-electron device, an EDX system TN 5400 with the on-line image processing system SEM-IPS, an energy-filtering electron microscope CEM 902 with EELS/ESI, and a Perkin Elmer 545 Auger spectrometer. With EDX, a beam current of some 10⁻¹⁰ A, and a beam diameter of about 10 nm, a minimum-detectable mass of 10⁻²⁰ g Ca seems within reach.
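As a quick plausibility check of that detection limit (a back-of-envelope calculation, not from the paper), 10⁻²⁰ g of Ca corresponds to only about 150 atoms:

```python
# Hedged check: convert the minimum-detectable mass of Ca into atoms.
AVOGADRO = 6.022e23      # atoms per mole
CA_MOLAR_MASS = 40.08    # g per mole

atoms = 1e-20 / CA_MOLAR_MASS * AVOGADRO
print(f"{atoms:.0f} Ca atoms")   # about 150 atoms
```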

