Novel technique to improve optical fibre sensor's response for industrial applications

1995
Author(s):  
M.N. Taib
Author(s):
Yi Deng
Jiacun Wang
Xudong He
Jeffrey J. P. Tsai

System assembly is one of the major issues in engineering complex component-based systems. This is especially true when heterogeneous, COTS and GOTS distributed systems, typical in industrial applications, are involved. The goal of system assembly is not only to make the constituent components work together, but also to ensure that the components as a whole behave consistently and guarantee certain end-to-end properties. Despite recent advances, there is a lack of understanding of software composability, as well as of theory and techniques for checking and verifying component-based systems. A theory of software system constraints, covering components, their environment and the system as a whole, is the necessary foundation for a solid understanding of the composability of component-based systems. In this paper, we present a systematic approach to constraint specification and constraint propagation in concert with design refinement, together with a novel technique to ensure consistency between system-wide and component constraints in the design composition process of component-based systems. Consistent constraint propagation is used in our approach to drive progressive verification of the design. It allows us to verify the overall design composition without interference from the internal details of component designs. Verification is done separately at the architectural and component levels, without having to compose the results of component analyses. A component can be safely replaced with an alternative design without re-verifying the overall system composition, so long as the replacement conforms to the corresponding interface and component constraint(s).
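The substitution condition in the abstract's last sentence can be sketched in a few lines. This is an illustrative Python sketch, not the authors' formalism: the `Component` class, the representation of constraints as predicates over an observable state, and the sampling-based check are all assumptions made for the example. It shows only the shape of the idea that a replacement is admissible, without re-verifying the whole composition, when it offers at least the original interface and satisfies the original component's constraints.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Iterable

@dataclass
class Component:
    # Operations the component provides at its interface.
    interface: frozenset
    # Named constraints: predicates over an observable state dict.
    constraints: Dict[str, Callable[[dict], bool]] = field(default_factory=dict)

def conforms(replacement: Component, original: Component,
             observed_states: Iterable[dict]) -> bool:
    """A replacement is admissible without re-verifying the overall
    composition if (1) it provides at least the original interface and
    (2) every constraint of the original component holds on the
    replacement's observed states (checked here by sampling, not proof)."""
    if not original.interface <= replacement.interface:
        return False
    return all(check(state)
               for check in original.constraints.values()
               for state in observed_states)

# Example: a bounded buffer may be swapped for a richer one with the
# same capacity constraint, but not for one that violates it.
old = Component(frozenset({"put", "get"}),
                {"bounded": lambda s: s["len"] <= 8})
new = Component(frozenset({"put", "get", "peek"}),
                {"bounded": lambda s: s["len"] <= 8})
print(conforms(new, old, [{"len": 3}, {"len": 8}]))   # conforming replacement
print(conforms(new, old, [{"len": 9}]))               # constraint violated
```

In the paper's setting the check is done by verification against the component constraint rather than by sampling observed states; the sketch only mirrors the interface-plus-constraint shape of the conformance condition.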


Author(s):  
C. F. Oster

Although ultra-thin sectioning techniques are widely used in the biological sciences, they are somewhat less common, but very useful, in industrial applications. This presentation will review several specific applications where ultra-thin sectioning techniques have proven invaluable.

The preparation of samples for sectioning usually involves embedding in an epoxy resin. Araldite 6005 Resin and Hardener are mixed so that the hardness of the embedding medium matches that of the sample, to reduce any distortion of the sample during the sectioning process. No dehydration series is needed to prepare our usual samples for embedding, but some types require hardening and staining steps. The embedded samples are sectioned with either a prototype of a Porter-Blum Microtome or an LKB Ultrotome III. Both instruments are equipped with diamond knives.

In the study of photographic film, the distribution of the developed silver particles through the layer is important to the image tone and/or scattering power. The morphology of the developed silver is also an important factor, and cross sections will show this structure.


Author(s):  
W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting in London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on "metallography and other industrial applications". I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as being contrast due to "large strains", an error which they corrected with admirable rapidity as the theory developed. At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of "the image" to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an "atomic resolution image" can faithfully represent the variations in atomic spacing at a localised defect.
Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to get numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast. Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the blind faith of the users in the image produced as being a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image which is required.


Author(s):  
C J R Sheppard

The confocal microscope is now widely used in both biomedical and industrial applications for imaging, in three dimensions, objects with appreciable depth. There is now a range of different microscopes on the market, which have adopted a variety of different designs. The aim of this paper is to explore the effects on imaging performance of design parameters, including the method of scanning, the type of detector, and the size and shape of the confocal aperture.

It is becoming apparent that there is no such thing as an ideal confocal microscope: all systems have limitations, and the best compromise depends on what the microscope is used for and how it is used. The most important compromise at present is between image quality and speed of scanning, which is particularly apparent when imaging with very weak signals. If great speed is not of importance, then the fundamental limitation for fluorescence imaging is the detection of sufficient numbers of photons before the fluorochrome bleaches.
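The photon-budget limitation in the closing sentence can be made concrete with a back-of-the-envelope sketch. This is illustrative only: the function names and the sampled numbers below are assumptions, not figures from the paper. For a shot-noise-limited detector the per-pixel signal-to-noise ratio scales as the square root of the detected photon count, so spreading a fluorochrome's finite photon budget over more pixels or more frames directly lowers image quality.

```python
import math

def shot_noise_snr(photons: float) -> float:
    """For shot-noise-limited detection, per-pixel SNR = sqrt(N),
    where N is the number of detected photons."""
    return math.sqrt(photons)

def photons_per_pixel(total_budget: float, pixels: int, frames: int) -> float:
    """Divide a fluorochrome's finite photon budget (photons emitted
    before it bleaches) across all pixels and repeated frames."""
    return total_budget / (pixels * frames)

# Illustrative numbers: a 1e10-photon budget imaged as a 512 x 512
# frame, scanned 10 times, gives roughly 3.8e3 photons per pixel and
# an SNR of around 62; doubling the frame count halves the per-pixel
# count and cuts the SNR by a factor of sqrt(2).
n = photons_per_pixel(1e10, 512 * 512, 10)
print(n, shot_noise_snr(n))
```

The sketch captures why averaging more frames for speed or smoothness trades directly against the bleaching-limited photon count available to each pixel.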

