Determination of Foam Stability in Lager Beers Using Digital Image Analysis of Images Obtained Using RGB and 3D Cameras

Fermentation ◽  
2021 ◽  
Vol 7 (2) ◽  
pp. 46
Author(s):  
Emmanuel Karlo Nyarko ◽  
Hrvoje Glavaš ◽  
Kristina Habschied ◽  
Krešimir Mastanjević

Foam stability and retention are important indicators of beer quality and freshness. A full, white head of foam with evenly distributed small CO2 bubbles appeals to consumers and crowns the production process. However, raw materials, the production process, packaging, transportation, and storage all strongly affect foam stability, which makes monitoring it during all of these stages, from production to consumer, very important. Beer foam stability is expressed as the change in foam height over a certain period. This research aimed to monitor the foam stability of lager beers using image analysis methods on two different types of recordings: RGB and depth videos. Sixteen different commercially available lager beers were subjected to analysis. The automated image analysis method based only on RGB video images proved inapplicable in real conditions due to problems such as light reflection through the glass, autofocus, and beer lacing/clinging, which make it impossible to accurately detect the actual height of the foam. A solution to this problem, representing a unique contribution, was found by introducing a 3D camera for estimating foam stability. According to the results, automated analysis of depth images obtained from a 3D camera proved to be an objective, repeatable, reliable, and sufficiently sensitive method for measuring the foam stability of lager beers. The applied model proved suitable for predicting changes in the foam retention of lager beers.
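The depth-based height estimate described in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the function, the percentile trick, and the synthetic camera frame are all illustrative assumptions about how an overhead 3D camera could report foam height.

```python
import numpy as np

def foam_height_mm(depth_frame_mm, liquid_level_mm):
    """Estimate foam height from an overhead depth (3D) camera frame.
    The foam is the surface closest to the camera; a low percentile is
    used instead of min() to suppress depth-sensor noise."""
    foam_surface = np.percentile(depth_frame_mm, 5)
    return max(0.0, liquid_level_mm - foam_surface)

# Synthetic frame: camera 300 mm above the liquid, ~25 mm of foam.
rng = np.random.default_rng(0)
frame = rng.normal(loc=275.0, scale=1.0, size=(120, 160))
height = foam_height_mm(frame, liquid_level_mm=300.0)
```

Tracking this estimate over successive video frames would yield the foam-height-versus-time curve from which stability is expressed.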

2021 ◽  
Vol 3 ◽  
Author(s):  
Christopher Schmied ◽  
Tolga Soykan ◽  
Svenja Bolz ◽  
Volker Haucke ◽  
Martin Lehmann

Neuronal synapses are highly dynamic communication hubs that mediate chemical neurotransmission via the exocytic fusion and subsequent endocytic recycling of neurotransmitter-containing synaptic vesicles (SVs). Functional imaging tools allow for the direct visualization of synaptic activity by detecting action potentials, pre- or postsynaptic calcium influx, SV exo- and endocytosis, and glutamate release. Fluorescent organic dyes or synapse-targeted genetic molecular reporters, such as calcium, voltage or neurotransmitter sensors and synapto-pHluorins, reveal synaptic activity by undergoing rapid changes in their fluorescence intensity upon neuronal activity on timescales of milliseconds to seconds, typically recorded by fast and sensitive widefield live-cell microscopy. The analysis of the resulting time-lapse movies has so far been performed either by manually picking individual structures, by custom scripts that have not been made widely available to the scientific community, or by advanced software toolboxes that are complicated to use. For the precise, unbiased and reproducible measurement of synaptic activity, it is key that the research community has access to bio-image analysis tools that are easy to apply and allow the automated detection of fluorescence intensity changes in active synapses. Here we present SynActJ (Synaptic Activity in ImageJ), an easy-to-use, fully open-source workflow that enables automated image and data analysis of synaptic activity. The workflow consists of a Fiji plugin performing the automated image analysis of active synapses in time-lapse movies via an interactive seeded watershed segmentation that can be easily adjusted and applied to a dataset in batch mode. The extracted intensity traces of each synaptic bouton are automatically processed, analyzed, and plotted using an R Shiny workflow.
We validate the workflow on time-lapse images of stimulated synapses expressing the SV exo-/endocytosis reporter Synaptophysin-pHluorin or a synapse-targeted calcium sensor, Synaptophysin-RGECO. We compare the automatic workflow to manual analysis and compute calcium-influx and SV exo-/endocytosis kinetics and other parameters for synaptic vesicle recycling under different conditions. We predict SynActJ to become an important tool for the analysis of synaptic activity and synapse properties.
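The core idea of seeded watershed segmentation used by the SynActJ Fiji plugin can be sketched outside ImageJ, e.g. with scipy. The toy image, the hand-placed seeds, and the readout below are illustrative assumptions; SynActJ itself derives seeds from detected structures and lets the user adjust them interactively.

```python
import numpy as np
from scipy import ndimage as ndi

# Two bright synthetic "boutons" on a dark background.
yy, xx = np.mgrid[0:40, 0:40]
img = np.exp(-((yy - 12) ** 2 + (xx - 12) ** 2) / 20.0)
img += np.exp(-((yy - 28) ** 2 + (xx - 28) ** 2) / 20.0)

# One seed per bouton plus a background seed.
markers = np.zeros(img.shape, dtype=np.int16)
markers[12, 12] = 1
markers[28, 28] = 2
markers[0, 0] = 3  # background basin

# Flood from the seeds over an 8-bit "elevation" map (inverted
# intensity, so bright boutons form basins).
elevation = ((1.0 - img / img.max()) * 255).astype(np.uint8)
labels = ndi.watershed_ift(elevation, markers)

# Per-bouton intensity readout; repeated per movie frame, this gives
# the intensity trace that SynActJ passes to its R Shiny analysis.
trace_1 = img[labels == 1].sum()
```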


Fermentation ◽  
2021 ◽  
Vol 7 (3) ◽  
pp. 113
Author(s):  
Kristina Habschied ◽  
Hrvoje Glavaš ◽  
Emmanuel Karlo Nyarko ◽  
Krešimir Mastanjević

The aim of this research is to investigate the possibility of applying a laser distance meter (LDM) as a complementary measurement method to image analysis during beer foam stability monitoring. The basic optical property of foam, i.e., its high reflectivity, is the main reason for using an LDM. LDM measurements provide relatively precise information on foam height, even in the presence of lacing, and indicate when foam is no longer visible on the surface of the beer. Sixteen different commercially available lager beers were subjected to analysis. A camera recorded both the foam behavior and the LDM display, which was placed close to the monitored beer glass. Measurements obtained by image analysis of the videos from the visual camera were comparable to those obtained independently by the LDM. However, due to lacing, image analysis could not accurately detect foam disappearance. On the other hand, LDM measurements accurately detected the moment of foam disappearance, since the measured values become significantly higher once the beam undergoes multiple reflections in the glass.
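The foam-disappearance criterion described above, a sudden jump in the LDM reading once the beam reflects inside the glass, lends itself to a simple change-point check. The function and the synthetic distance trace below are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def foam_collapse_index(ldm_mm, jump_mm=50.0):
    """Return the index of the first LDM reading that jumps well above
    the early foam-surface baseline, signalling foam disappearance
    (readings rise sharply due to multiple reflections in the glass)."""
    baseline = np.median(ldm_mm[:5])  # early readings: foam still present
    above = np.nonzero(ldm_mm > baseline + jump_mm)[0]
    return int(above[0]) if above.size else None

# Synthetic trace: stable foam surface ~150 mm, then a jump once
# the foam is gone and the beam passes into the glass.
trace = np.r_[np.full(20, 150.0), np.full(5, 320.0)]
idx = foam_collapse_index(trace)
```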


Author(s):  
V. Y. Tarasov ◽  
S. S. Korobko

Today, great attention is paid to developing advanced technologies for producing ecologically safe, non-polluting and biodegradable products, including cosmetic and hygiene detergents and household products. Surfactants are one of the main ingredients in the formulation of such products. To widen this assortment, it is essential to create new types of biodegradable surfactants derived from renewable, as a rule plant-based, raw materials. The aim of this paper is to develop a technology for producing a non-ionic surfactant, alkyl polyglycoside (APG), with improved characteristics from an alternative plant raw material: sunflower husks, a waste by-product of sunflower processing and the most commonly available raw material in our country. The output of sunflower processing for oil production grows year by year and leads the oil-and-fat industry, so processing the husk waste is now of particular interest. Existing APG production technologies were studied and their shortcomings identified. In those technologies, alkyl polyglycoside is produced by combining glucose, or an aqueous syrupy glucose solution, with a C10–C16 alcohol. Rice, corn, potatoes or wheat serve as the starch sources from which the glucose is produced; these are not waste products and carry a rather high production cost. The fatty alcohols are produced from imported palm or coconut oil. The new technology we propose is based on available and cheap raw materials: glucose syrup is made by hydrolysis of sunflower-husk cellulose, and C16–C18 fatty acids are obtained from the sunflower processing cycle at the alkali-refining stage of sunflower oil production.
Organoleptic and physicochemical characteristics of the resulting alkyl polyglycoside were analyzed and its consumer properties evaluated. It was established that the suggested method can produce a non-ionic surfactant with improved detergency (critical micelle concentration, CMC) and foaming power (foam height, foam stability), as well as a mild dermatological action. The alkyl polyglycoside created and produced with this technology can serve as a substitute for expensive imported non-ionic surfactants and can help extend the assortment of biodegradable foam detergents that are non-polluting and safe for the environment.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Narendra Narisetti ◽  
Michael Henke ◽  
Christiane Seiler ◽  
Rongli Shi ◽  
Astrid Junker ◽  
...  

Quantitative characterization of root system architecture and its development is important for the assessment of a complete plant phenotype. To enable high-throughput phenotyping of plant roots, efficient solutions for automated image analysis are required. Since plants naturally grow in an opaque soil environment, automated analysis of optically heterogeneous and noisy soil-root images represents a challenging task. Here, we present a user-friendly GUI-based tool for semi-automated analysis of soil-root images which allows the user to perform an efficient image segmentation using a combination of adaptive thresholding and morphological filtering, and to derive various quantitative descriptors of the root system architecture, including total length, local width, projection area, volume, spatial distribution and orientation. The results of our semi-automated root image segmentation are in good conformity with the reference ground-truth data (mean Dice coefficient = 0.82), compared to IJ_Rhizo and GiAroots. Root biomass values calculated with our tool within a few seconds show a high correlation (Pearson coefficient = 0.8) with the results obtained using conventional, purely manual segmentation approaches. Equipped with a number of adjustable parameters and optional correction tools, our software is capable of significantly accelerating the quantitative analysis and phenotyping of soil-, agar- and washed-root images.
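The segmentation recipe named in the abstract, adaptive thresholding plus morphological filtering, and the Dice coefficient used for validation can both be sketched with scipy. The window size, offset, and synthetic "root" image below are illustrative assumptions, not the tool's actual parameters.

```python
import numpy as np
from scipy import ndimage as ndi

def segment_roots(img, window=15, offset=0.05):
    """Adaptive (local-mean) threshold followed by morphological
    opening to drop speckle noise in a soil-root image."""
    local_mean = ndi.uniform_filter(img, size=window)
    mask = img > local_mean + offset  # brighter than local background
    return ndi.binary_opening(mask, iterations=1)

def dice(a, b):
    """Dice coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Synthetic test: a bright vertical "root" on a noisy soil background.
rng = np.random.default_rng(1)
img = rng.normal(0.2, 0.03, size=(64, 64))
img[:, 30:34] += 0.4
truth = np.zeros((64, 64), dtype=bool)
truth[:, 30:34] = True
score = dice(segment_roots(img), truth)
```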


2010 ◽  
Vol 15 (7) ◽  
pp. 726-734 ◽  
Author(s):  
Aabid Shariff ◽  
Joshua Kangas ◽  
Luis Pedro Coelho ◽  
Shannon Quinn ◽  
Robert F. Murphy

The field of high-content screening and analysis consists of a set of methodologies for automated discovery in cell biology and drug development using large amounts of image data. In most cases, imaging is carried out by automated microscopes, often assisted by automated liquid handling and cell culture. Image processing, computer vision, and machine learning are used to automatically process high-dimensional image data into meaningful cell biological results. The key is creating automated analysis pipelines typically consisting of 4 basic steps: (1) image processing (normalization, segmentation, tracing, tracking), (2) spatial transformation to bring images to a common reference frame (registration), (3) computation of image features, and (4) machine learning for modeling and interpretation of data. An overview of these image analysis tools is presented here, along with brief descriptions of a few applications.
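The four-step pipeline outlined above can be condensed into a toy example. The sketch below covers steps 1 and 3 on a single synthetic frame (so step 2, registration, is skipped) and hands the resulting feature matrix to whatever machine-learning model step 4 would use; every name and value is illustrative.

```python
import numpy as np
from scipy import ndimage as ndi

def analyze_well(img):
    """Toy 4-step pipeline: (1) normalize and segment, (2) registration
    (skipped here: single frame), (3) per-cell features, (4) the returned
    feature matrix would feed a machine-learning model downstream."""
    norm = (img - img.min()) / (np.ptp(img) + 1e-9)  # step 1: normalize
    labels, n_cells = ndi.label(norm > 0.5)          # step 1: segment
    idx = np.arange(1, n_cells + 1)
    areas = ndi.sum_labels(np.ones_like(img), labels, index=idx)  # step 3
    means = ndi.mean(norm, labels, index=idx)                     # step 3
    return np.column_stack([areas, means])           # input to step 4

# Two synthetic "cells" in one imaged well.
img = np.zeros((30, 30))
img[5:10, 5:10] = 1.0
img[20:24, 20:24] = 0.8
features = analyze_well(img)
```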


Author(s):  
S.F. Stinson ◽  
J.C. Lilga ◽  
M.B. Sporn

Increased nuclear size, resulting in an increase in the relative proportion of nuclear to cytoplasmic sizes, is an important morphologic criterion for the evaluation of neoplastic and pre-neoplastic cells. This paper describes investigations into the suitability of automated image analysis for quantitating changes in nuclear and cytoplasmic cross-sectional areas in exfoliated cells from tracheas treated with carcinogen.

Neoplastic and pre-neoplastic lesions were induced in the tracheas of Syrian hamsters with the carcinogen N-methyl-N-nitrosourea. Cytology samples were collected intra-tracheally with a specially designed catheter (1) and stained by a modified Papanicolaou technique. Three cytology specimens were selected from animals with normal tracheas, 3 from animals with dysplastic changes, and 3 from animals with epidermoid carcinoma. One hundred randomly selected cells on each slide were analyzed with a Bausch and Lomb Pattern Analysis System automated image analyzer.
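The nuclear-to-cytoplasmic proportion that the image analyzer quantifies reduces to simple arithmetic on the measured cross-sectional areas; the areas in the sketch below are hypothetical values, not data from the study.

```python
def nc_ratio(nuclear_area_um2, cell_area_um2):
    """Nuclear-to-cytoplasmic ratio from cross-sectional areas (um^2)."""
    cytoplasmic_area = cell_area_um2 - nuclear_area_um2
    return nuclear_area_um2 / cytoplasmic_area

# Hypothetical areas: a normal cell vs. one with an enlarged nucleus.
normal = nc_ratio(50.0, 450.0)    # 50 / 400 = 0.125
suspect = nc_ratio(120.0, 450.0)  # 120 / 330, roughly 0.364
```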


Author(s):  
F. A. Heckman ◽  
E. Redman ◽  
J.E. Connolly

In our initial publication on this subject (1) we reported results demonstrating that contrast is the most important factor in producing the high image quality required for reliable image analysis. We also listed the factors which enhance contrast in order of the experimentally determined magnitude of their effect. The two most powerful factors affecting the image contrast attainable with sheet film are beam intensity and kV. At that time we had only qualitative evidence for the ranking of enhancing factors. Later we carried out the densitometric measurements which led to the results outlined below.

Meaningful evaluation of the cause-effect relationships among the considerable number of variables in preparing EM negatives depends on doing things in a systematic way, varying only one parameter at a time. Unless otherwise noted, we adhered to the following procedure, evolved during our comprehensive study: Philips EM-300; 30 μm objective aperture; magnification 7000–12,000×; exposure time 1 s; anti-contamination device operating.


Author(s):  
P. Hagemann

The use of computers in analytical electron microscopy today shows three different trends: (1) automated image analysis with dedicated computer systems, (2) instrument control by microprocessors, and (3) data acquisition and processing, e.g., X-ray or EEL spectroscopy.

While image analysis in the TEM usually needs a television chain to obtain a sequential transmission suitable as computer input, the STEM system already has this facility. For the EM400T-STEM system an interface was therefore developed that allows external control of the beam deflection in TEM as well as control of the STEM probe and of the video signal/beam brightness on the STEM screen.

The interface sends and receives analogue signals, so the transmission rate is determined by the converters in the actual computer periphery.


Científica ◽  
2016 ◽  
Vol 44 (3) ◽  
pp. 412 ◽  
Author(s):  
Rafael Marani Barbosa ◽  
Bruno Guilherme Torres Licursi Vieira ◽  
Francisco Guilhien Gomes-Junior ◽  
Roberval Daiton Vieira

2020 ◽  
Vol 11 (2) ◽  
pp. 73-82
Author(s):  
А. Trubnikova ◽  
О. Chabanova ◽  
S. Bondar ◽  
Т. Sharakhmatova ◽  
Т. Nedobijchuk

Optimizing the formulation of a low-lactose synbiotic yogurt ice cream using a lactose-free buttermilk protein concentrate and a low-lactose yogurt serves the goal of expanding the range of low-lactose dairy products and improving the functional and health properties of ice cream. The low-lactose ice cream formulation was optimized with a gradient numerical method, namely the conjugate gradient method; the optimization algorithm was implemented in Mathcad. An array of data with a set of indicators for choosing a rational ratio of lactose-free buttermilk protein concentrate to yogurt base, and the inulin content, for ice cream mixtures is presented. The influence of the ratio of the main components of the mixtures on the foaming ability, which determines the quality of the finished product, was studied. An important indicator is taken into account: the concentration factor of the buttermilk, which is additionally purified from lactose by diafiltration. The graphic material presented in the work clearly demonstrates that the rational ratio of yogurt base to lactose-free buttermilk protein concentrate, obtained by ultrafiltration with diafiltration purification at a concentration factor of FC = 5, is 40.6:59.4. The contents of the additional components in the recipe of the new type of ice cream were also optimized; their mass fractions were: inulin, 3.69 %; lactulose, 1 %; ginger, 0.3 %; citric acid, 0.15 %; stabilization system, 0.2 %. The chemical composition and quality indicators of the mixture for the low-lactose synbiotic yogurt ice cream, composed of raw materials in the optimal ratio, were determined. The lactose content in the test sample of the ice cream mixture was 0.99 %, and the antioxidant activity was 3.1 times higher than in the mixture for traditional yogurt ice cream.
The most probable number of lactic acid microorganisms is (2.8 ± 0.9)·10⁸ CFU/cm³, and the number of bifidobacteria is (2.5 ± 0.2)·10⁹ CFU/cm³. The results of the research will be implemented at dairy companies in the production of ice cream.
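The conjugate gradient optimization named in the abstract (which the authors implemented in Mathcad) can be sketched in Python. The quadratic quality score below is purely illustrative, built around the optimum reported in the abstract, and is not the authors' actual objective function.

```python
from scipy.optimize import minimize

# Hypothetical smooth score penalizing deviation from an assumed optimum
# (concentrate fraction 0.594, inulin 3.69 %); minimizing it with the
# conjugate gradient method mirrors the approach, not the real model.
def neg_quality(x):
    conc_frac, inulin = x
    return (conc_frac - 0.594) ** 2 + 0.1 * (inulin - 3.69) ** 2

res = minimize(neg_quality, x0=[0.5, 3.0], method="CG")
conc_frac, inulin = res.x
```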

