Pharmaceutical Biotechnology and Industrial Applications: Learning Lessons from Molecular Biology

2012
pp. 1-13
Author(s): Oliver Kayser, Heribert Warzecha

1995
Vol 22 (4)
pp. 647
Author(s): MK Morell, S Rahman, SL Abrahams, R Appels

Starch is a key constituent of plant products, finding utility both as a major component of a wide range of staple and processed foods and as a feedstock for industrial processes. While the focus has traditionally been on the quantity of starch produced, starch quality is of increasing importance to the end-user as consumer demands become more sophisticated and as the range of industrial applications of starch broadens. Determinants of starch quality include the amylose to amylopectin ratio, the distribution of molecular structures within these fractions, and the packaging of the starch in granules. The biochemical processes involved in the transformation of the sucrose delivered to the endosperm cytosol into starch in the amyloplast are understood in broad outline. The importance of particular isoenzymes or processes to the production of starches of specific structures is, however, not well understood. This paper reviews aspects of the physiology, biochemistry and molecular biology of starch in plants, with an emphasis on the synthesis of starch in the cereal endosperm. Progress in understanding the links between the molecular events in starch synthesis and in developing strategies for the manipulation of starch quantity and quality in cereals is discussed.
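
For reference, the amylose to amylopectin ratio mentioned above is usually reported as an amylose percentage of total starch. The minimal Python sketch below shows the calculation and attaches the conventional descriptive classes; the class boundaries, function names and sample values are illustrative assumptions, not figures from the review.

```python
# Minimal sketch: amylose-to-amylopectin ratio expressed as an amylose
# percentage, with approximate, conventional class boundaries (assumed
# here, not taken from the review).

def amylose_percent(amylose_mg, amylopectin_mg):
    """Amylose as a percentage of total starch."""
    return 100.0 * amylose_mg / (amylose_mg + amylopectin_mg)

def classify(pct):
    if pct < 5:
        return "waxy"
    if pct <= 35:
        return "normal"
    return "high-amylose"

# Hypothetical fraction masses (mg) for three starches.
for amylose, amylopectin in [(2, 98), (25, 75), (55, 45)]:
    pct = amylose_percent(amylose, amylopectin)
    print(f"amylose {pct:4.1f}% -> {classify(pct)}")
```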


The completion of the first draft of the human genome has led to an explosion of interest in genetics and molecular biology. The view of the genome as a network of interacting computational components is well-established, but researchers are now trying to reverse the analogy, by using living organisms to construct logic circuits. The potential applications for such technologies is huge, ranging from bio-sensors, through industrial applications to drug delivery and diagnostics. This book would be the first to deal with the implementation of this technology, describing several working experimental demonstrations using cells as components of logic circuits, building toward computers incorporating biological components in their functioning.
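
To make "cells as components of logic circuits" concrete, here is a minimal Python sketch of a two-input transcriptional AND gate modelled with Hill functions, a standard abstraction for inducible promoters. It is not taken from the book; the parameter values and the saturating input concentrations are assumptions chosen only to reproduce AND-like behaviour.

```python
# Minimal sketch (illustrative parameters): a two-input transcriptional
# AND gate in which a reporter is expressed only when both inducers are
# present at saturating levels.

def hill_activation(inducer, k=1.0, n=2.0):
    """Fractional promoter activation for an inducer concentration (a.u.)."""
    return inducer**n / (k**n + inducer**n)

def and_gate_output(inducer_a, inducer_b, max_expression=100.0):
    """Reporter expression: high only when both inducers are present."""
    return max_expression * hill_activation(inducer_a) * hill_activation(inducer_b)

# Truth-table-style check: 0 = absent, 10 = saturating (arbitrary units).
for a in (0.0, 10.0):
    for b in (0.0, 10.0):
        print(f"A={a:4.1f}  B={b:4.1f}  output={and_gate_output(a, b):6.1f}")
```

With neither or only one inducer present the output stays near zero; only the (10, 10) case approaches full expression, which is the AND behaviour such demonstrations implement biologically.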


Viruses
2018
Vol 10 (12)
pp. 722
Author(s): Modesto Redrejo-Rodríguez, Pilar García

The Spanish Network of Bacteriophages and Transducer Elements (FAGOMA) was created to answer the need of Spanish scientists working on phages to exchange knowledge and find synergies. Seven years and five meetings later, the network has become a fruitful forum where groups working on distinct aspects of phage research (structural and molecular biology, diversity, gene transfer and evolution, virus–host interactions, clinical, biotechnological and industrial applications) present their work and find new avenues for collaboration. The network has recently increased its visibility and activity by getting in touch with the French Phage Network (Phages.fr) and with different national and international scientific institutions. Here, we present a summary of the fifth meeting of the FAGOMA network, held in October 2018 in Alcalá de Henares (Madrid), in which the participants shared some of their latest results and discussed future challenges of phage research.


Author(s): Cecil E. Hall

The visualization of organic macromolecules such as proteins, nucleic acids, viruses and virus components has reached a high degree of effectiveness owing to refinements and reliability of instruments and to the invention of methods for enhancing the structure of these materials within the electron image. The latter techniques have been most important because what can be seen depends upon the molecular and atomic character of the object as modified by preparation, which is rarely evident in the pristine material. Structure may thus be displayed by the arts of positive and negative staining, shadow casting, replication and other techniques. Enhancement of contrast, which delineates the bounds of isolated macromolecules, has been effected progressively over the years by these methods, as illustrated in Figs. 1, 2, 3 and 4. We now look to the future wondering what other visions are waiting to be seen. The instrument designers will need to exact from the arts of fabrication the performance that theory has prescribed, as well as methods for phase and interference contrast, with explorations of the potentialities of very high and very low voltages. Chemistry must play an increasingly important part in future progress by providing specific stain molecules of high visibility, substrates of vanishing “noise” level, and means for preservation of molecular structures that usually exist in a solvated condition.


Author(s): C. F. Oster

Although ultra-thin sectioning techniques are widely used in the biological sciences, they are somewhat less common, though very useful, in industrial applications. This presentation will review several specific applications where ultra-thin sectioning techniques have proven invaluable.

The preparation of samples for sectioning usually involves embedding in an epoxy resin. Araldite 6005 Resin and Hardener are mixed so that the hardness of the embedding medium matches that of the sample, reducing distortion of the sample during the sectioning process. No dehydration series is needed to prepare our usual samples for embedding, but some types require hardening and staining steps. The embedded samples are sectioned with either a prototype of a Porter-Blum Microtome or an LKB Ultrotome III. Both instruments are equipped with diamond knives.

In the study of photographic film, the distribution of the developed silver particles through the layer is important to the image tone and/or scattering power. The morphology of the developed silver is also an important factor, and cross sections will show this structure.
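
To illustrate the kind of numbers a cross section yields, the sketch below bins particle-depth measurements into a through-layer distribution. The depths, layer thickness and bin count are hypothetical values for illustration, not data from the presentation.

```python
# Minimal sketch (hypothetical data): distribution of developed silver
# particles through an emulsion layer, from depths measured on
# cross-section micrographs.

from collections import Counter

layer_thickness_um = 5.0   # assumed emulsion thickness
particle_depths_um = [0.4, 0.9, 1.1, 1.3, 2.2, 2.4, 2.6, 3.8, 4.1, 4.7]

n_bins = 5
bin_width = layer_thickness_um / n_bins
counts = Counter(min(int(d / bin_width), n_bins - 1) for d in particle_depths_um)

for b in range(n_bins):
    low, high = b * bin_width, (b + 1) * bin_width
    fraction = counts.get(b, 0) / len(particle_depths_um)
    print(f"{low:.1f}-{high:.1f} um: {fraction:5.1%} {'#' * counts.get(b, 0)}")
```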


Author(s): W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on “metallography and other industrial applications”. I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten foil of Al as contrast due to “large strains”, an error which they corrected with admirable rapidity as the theory developed.

At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort; however, such is the power of “the image” to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an “atomic resolution image” can faithfully represent the variations in atomic spacing at a localised defect. Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to obtain numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast.

Nonetheless the trends were set, with microscope manufacturers producing microscopes of ever higher resolution, while users' faith in the image as a near-directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image that is required.
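
The emphasis Hirsch et al. placed on extracting numbers from images can be illustrated with the standard line-intercept estimate of dislocation density, rho = 2N/(Lt), where N intersections are counted along test lines of total length L on a foil of thickness t. The sketch below is a generic illustration with assumed inputs, not figures from their paper.

```python
# Minimal sketch (assumed inputs): dislocation density from intercept
# counts on a TEM micrograph, rho = 2 * N / (L * t).

def dislocation_density(intersections, test_line_length_m, foil_thickness_m):
    """Dislocations per square metre from line-intercept counts."""
    return 2.0 * intersections / (test_line_length_m * foil_thickness_m)

# Hypothetical measurement: 85 intersections along 12 um of test lines
# (true length on the specimen), foil thickness 150 nm.
rho = dislocation_density(85, 12e-6, 150e-9)
print(f"Estimated dislocation density: {rho:.2e} m^-2")  # ~9.4e13 m^-2
```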


Author(s): C J R Sheppard

The confocal microscope is now widely used in both biomedical and industrial applications for imaging, in three dimensions, objects with appreciable depth. A range of microscopes adopting a variety of designs is now on the market. The aim of this paper is to explore the effects on imaging performance of design parameters including the method of scanning, the type of detector, and the size and shape of the confocal aperture.

It is becoming apparent that there is no such thing as an ideal confocal microscope: all systems have limitations, and the best compromise depends on what the microscope is used for and how it is used. The most important compromise at present is between image quality and speed of scanning, which is particularly apparent when imaging with very weak signals. If great speed is not of importance, then the fundamental limitation for fluorescence imaging is the detection of sufficient numbers of photons before the fluorochrome bleaches.
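
The bleaching limit can be made concrete with a simple photon budget: fluorescence detection is shot-noise limited, so the signal-to-noise ratio per pixel scales as the square root of the detected photons, and a fluorochrome emits only a finite number of photons before bleaching. The numbers in this sketch are assumptions for illustration, not values from the paper.

```python
# Minimal sketch (illustrative numbers): dividing a fluorochrome's finite
# photon budget across repeated confocal frames under Poisson statistics.

import math

photons_before_bleach = 1e5   # assumed photons emitted before bleaching
detection_efficiency = 0.03   # assumed fraction reaching the detector
frames = 50                   # images wanted of the same spot

photons_per_frame = photons_before_bleach * detection_efficiency / frames
snr_per_frame = math.sqrt(photons_per_frame)  # shot-noise-limited SNR

print(f"Detected photons per frame: {photons_per_frame:.0f}")
print(f"Shot-noise-limited SNR per frame: {snr_per_frame:.1f}")
# Doubling the frame count halves the photons per frame and cuts SNR by
# sqrt(2): the speed-versus-quality compromise described above.
```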

