Basic Principles of Surfactant Chemistry for Industrial Applications

2003 ◽ Vol 76 (1) ◽ pp. 9-14 ◽ Author(s): Yasuo ISHII
1999 ◽ Vol 26 (4) ◽ pp. 402-424 ◽ Author(s): Hasnaa Jorio, Michèle Heitz

For several decades, numerous studies have sought a more efficient and less expensive process for treating air contaminated with volatile organic compounds. Alongside the traditional air-treatment technologies, biological processes have emerged in recent years. Biofiltration appears to be a particularly attractive route because of its efficiency, its environmental benefits, and its lower costs. In this paper, biofiltration technology is positioned relative to conventional techniques and to other biological air treatments. After a short historical account of biofiltration, the focus turns to the main objective of this literature review: presenting current knowledge of the basic principles of the process, its applicability, the operational conditions that influence its performance and reliability, and recent developments in mathematical biofilter modeling. Finally, industrial applications and biofiltration processing costs are briefly discussed.

Key words: biofilter, VOC, biodegradation, modeling, kinetics, humidity, temperature, pH, nutrients, oxygen. [Journal translation]
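As a minimal illustration of the kind of biofilter modeling such reviews survey, the sketch below computes removal efficiency for a packed bed under a first-order, plug-flow simplification (in the spirit of Ottengraf-style models). The rate constant, bed height, and gas velocity are assumed values for illustration only, not figures from the review.

```python
import math

def removal_efficiency(k, height, velocity):
    """First-order plug-flow biofilter model (a common simplification).

    C_out / C_in = exp(-k * H / U), where
      k        -- lumped first-order degradation constant (1/s), assumed
      height   -- bed height H (m)
      velocity -- superficial gas velocity U (m/s)
    Returns the fractional removal efficiency, 1 - C_out/C_in.
    """
    return 1.0 - math.exp(-k * height / velocity)

# Illustrative (assumed) operating point: 1 m bed, 0.02 m/s gas velocity,
# lumped rate constant 0.05 1/s -> roughly 92% removal.
print(f"Removal efficiency: {removal_efficiency(0.05, 1.0, 0.02):.1%}")
```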


2015 ◽ Vol 59 ◽ pp. 1-41 ◽ Author(s): Peter K. Robinson

Enzymes are biological catalysts (also known as biocatalysts) that speed up biochemical reactions in living organisms and that can be extracted from cells and then used to catalyse a wide range of commercially important processes. This chapter covers the basic principles of enzymology, such as classification, structure, kinetics and inhibition, and also provides an overview of industrial applications. In addition, techniques for the purification of enzymes are discussed.
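To make the kinetics the chapter covers concrete, here is a minimal sketch of the Michaelis-Menten rate law, the standard starting point for enzyme kinetics; the Vmax and Km values are assumed for illustration, not taken from the chapter.

```python
def michaelis_menten_rate(s, vmax, km):
    """Michaelis-Menten initial reaction rate: v = Vmax * [S] / (Km + [S]).

    s    -- substrate concentration [S] (mM)
    vmax -- maximum rate Vmax (umol/min), assumed value
    km   -- Michaelis constant Km (mM), assumed value
    """
    return vmax * s / (km + s)

# At [S] = Km the rate is exactly half of Vmax, a defining property of the model.
vmax, km = 100.0, 2.5  # illustrative values
for s in (0.5, 2.5, 25.0):
    print(f"[S] = {s:5.1f} mM  ->  v = {michaelis_menten_rate(s, vmax, km):6.2f} umol/min")
```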


2009 ◽ Vol 67 (2-3) ◽ pp. 457-461 ◽ Author(s): Pedro Llovera, Philippe Molinié, Anabel Soria, Alfredo Quijano

2011 ◽ Vol 04 (01) ◽ pp. 183-212 ◽ Author(s): Ragnar Hellborg, Harry J. Whitlow

Direct current accelerators form the basis of many front-line industrial processes. They have many advantages that have kept them at the forefront of technology for decades, such as a small and easily managed environmental footprint. In this article, the basic principles of the different subsystems (ion and electron sources, high-voltage generation, control, etc.) are surveyed. Some well-known applications (ion implantation and polymer processing) and some lesser-known ones (electron beam lithography and particle-induced X-ray aerosol mapping) are reviewed.
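As a hedged aside on the high-voltage-generation subsystem mentioned above, the sketch below evaluates the textbook Cockcroft-Walton multiplier estimates for no-load output voltage, load-induced droop, and ripple. The stage count, drive voltage, frequency, capacitance, and load current are assumed values chosen for illustration, not figures from the article.

```python
def cockcroft_walton(n, v_peak, i_load, freq, cap):
    """Textbook Cockcroft-Walton multiplier estimates (equal stage capacitors).

    n      -- number of stages, assumed
    v_peak -- peak drive voltage (V), assumed
    i_load -- DC load current (A), assumed
    freq   -- drive frequency (Hz), assumed
    cap    -- per-stage capacitance (F), assumed
    Returns (ideal no-load voltage, loaded voltage, peak-to-peak ripple).
    """
    v_ideal = 2 * n * v_peak                                     # no-load output
    droop = i_load / (freq * cap) * (2*n**3/3 + n**2/2 - n/6)    # load-induced drop
    ripple = i_load / (freq * cap) * n * (n + 1) / 2             # peak-to-peak ripple
    return v_ideal, v_ideal - droop, ripple

v0, v_loaded, dv = cockcroft_walton(n=10, v_peak=50e3, i_load=1e-3,
                                    freq=1e3, cap=10e-9)
print(f"No-load: {v0/1e3:.0f} kV, loaded: {v_loaded/1e3:.0f} kV, "
      f"ripple: {dv/1e3:.1f} kV")
```

The cubic dependence of the droop term on stage count is the classic reason practical DC machines keep the number of stages modest and raise the drive frequency or capacitance instead.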


Author(s): C. F. Oster

Although ultra-thin sectioning techniques are widely used in the biological sciences, they are applied less often, though to great effect, in industrial work. This presentation will review several specific applications where ultra-thin sectioning techniques have proven invaluable.

The preparation of samples for sectioning usually involves embedding in an epoxy resin. Araldite 6005 Resin and Hardener are mixed so that the hardness of the embedding medium matches that of the sample, reducing distortion of the sample during sectioning. No dehydration series is needed to prepare our usual samples for embedding, but some types require hardening and staining steps. The embedded samples are sectioned with either a prototype of a Porter-Blum microtome or an LKB Ultrotome III. Both instruments are equipped with diamond knives.

In the study of photographic film, the distribution of the developed silver particles through the layer is important to the image tone and/or scattering power. The morphology of the developed silver is also an important factor, and cross sections reveal this structure.


Author(s): W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on "metallography and other industrial applications". I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as contrast due to "large strains", an error which they corrected with admirable rapidity as the theory developed.

At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of "the image" to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an "atomic resolution image" faithfully represent the variations in atomic spacing at a localised defect. Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems rather rarely to be appreciated. Hirsch et al., on the other hand, made a point of using their images to obtain numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast.

Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the blind faith of users in the image as a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image which is required.


Author(s): C J R Sheppard

The confocal microscope is now widely used in both biomedical and industrial applications for imaging, in three dimensions, objects with appreciable depth. There is now a range of different microscopes on the market, which have adopted a variety of designs. The aim of this paper is to explore the effects on imaging performance of design parameters including the method of scanning, the type of detector, and the size and shape of the confocal aperture.

It is becoming apparent that there is no such thing as an ideal confocal microscope: all systems have limitations, and the best compromise depends on what the microscope is used for and how it is used. The most important compromise at present is between image quality and speed of scanning, which is particularly apparent when imaging with very weak signals. If great speed is not of importance, then the fundamental limitation for fluorescence imaging is the detection of sufficient numbers of photons before the fluorochrome bleaches.
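As a hedged illustration of how aperture-limited confocal performance is usually quantified, the sketch below evaluates the standard paraxial axial response of an ideal point-pinhole confocal microscope for a point object, I(u) = [sin(u/4)/(u/4)]^4. This is the textbook point-detector limiting case, not a result from this paper; a finite pinhole broadens the response.

```python
import math

def confocal_axial_response(u):
    """Paraxial on-axis intensity for a point object in an ideal
    (point-pinhole) confocal microscope: I(u) = [sin(u/4) / (u/4)]**4.

    u is the normalised axial coordinate, u = (8*pi/lambda) * z * sin^2(alpha/2).
    Textbook limiting case only; real pinholes and detectors broaden this.
    """
    if u == 0.0:
        return 1.0
    x = u / 4.0
    return (math.sin(x) / x) ** 4

# Sample the response along the optical axis; intensity halves near u ~ 4.
for u in (0.0, 2.0, 4.0, 6.0, 8.0):
    print(f"u = {u:4.1f}  ->  I = {confocal_axial_response(u):.3f}")
```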

