High-Quality and Easy-to-Regenerate Personal Filter

Author(s):  
Max Fraenkl ◽  
Milos Krbal ◽  
Jakub Houdek ◽  
Zuzana Olmrova Zmrhalova ◽  
Borivoj Prokes ◽  
...  

Proper respiratory tract protection is the key factor in limiting the rate of COVID-19 spread and providing a safe environment for health care workers. Traditional N95 (FFP2) respirators are not easy to regenerate and thus create certain financial and ecological burdens; moreover, their quality may vary significantly. A solution that would overcome these disadvantages is desirable. In this study, a commercially available knit polyester fleece fabric was selected as the filter material, and a total of 25 filters of different areas and thicknesses were prepared. Then, the size-resolved filtration efficiency (40-400 nm) and pressure drop were evaluated at a volumetric flow rate of 95 L/min. We showed the excellent synergistic effect of expanding the filtration area and increasing the number of filtering layers on the filtration efficiency; a filter cartridge with 8 layers of knit polyester fabric with a surface area of 900 cm² and sized 25 × 14 × 8 cm achieved filtration efficiencies of 98% at 95 L/min and 99.5% at 30 L/min. The assembled filter kit consists of a filter cartridge (14 Pa) carried in a small backpack connected to a half mask, with a total pressure drop of 84 Pa at 95 L/min. In addition, it is reusable, and the filter material can be regenerated at least ten times by simple methods, such as boiling. We have demonstrated a novel approach to creating high-quality and easy-to-breathe-through respiratory protective equipment that reduces operating costs and is a green solution because it is easy to regenerate.
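The benefit of stacking layers can be sketched with the standard independent-layer penetration model (an assumption for illustration; the abstract reports measured, not modelled, efficiencies, and the 40% single-layer figure below is hypothetical):

```python
def multilayer_efficiency(single_layer_efficiency: float, layers: int) -> float:
    """Combined efficiency of n identical layers, assuming each layer
    captures particles independently of the others."""
    penetration = (1.0 - single_layer_efficiency) ** layers
    return 1.0 - penetration

# e.g. a hypothetical single layer capturing 40% of particles, stacked 8 deep:
eight_layer = multilayer_efficiency(0.40, 8)
```

Under this assumption, eight modest layers already exceed 98% capture, consistent in spirit with the 8-layer cartridge result above.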

2020 ◽  
pp. 152808372097508
Author(s):  
Manish Joshi ◽  
Arshad Khan ◽  
BK Sapra

The recent COVID-19 crisis has made the wearing of masks mandatory for patients, health care workers and members of the public worldwide. This has caused a sudden shift of focus to the availability, effectiveness, re-use and development of face masks/respirators. In the current pandemic situation, the shortage of masks has also prompted a rethinking of strategies for reusing masks after due sterilization. This work discusses a quick laboratory methodology to test and determine the particle filtration efficiency of face masks/respirators. The testing parameters include the particle capture efficiency of the mask material/full mask, the pressure drop and the fit factor. Two different, simple, make-shift set-ups have been adopted for the present context. The first is used to measure the intrinsic particle capture efficiency and pressure drop of the filter material, and the second serves as a ‘full mask sampler’ to assess the leakages through seams and joints of the mask. Experiments conducted with atomized NaCl test particles on three types of masks, viz. a commercial N-95 respirator, a surgical mask and a cloth mask, have been used to evolve the methodology. The differences in aerosol particle capture efficiency between the filter material and the full mask in a sealed face fixture have been linked to improvement of the mask design in the development phase. This paper hopes to provide a crucial laboratory link between mask developers and certification agencies in times of urgency. Needless to say, commercialization of any such mask is subject to certification by authorized agencies, following standard procedures.
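The two quantities named above have standard definitions (assumed here; the abstract does not give formulas): capture efficiency compares particle concentrations upstream and downstream of the filter material, and fit factor is the ratio of ambient to in-mask concentration.

```python
def capture_efficiency(c_upstream: float, c_downstream: float) -> float:
    """Fractional particle capture efficiency of the filter material."""
    return 1.0 - c_downstream / c_upstream

def fit_factor(c_ambient: float, c_inside: float) -> float:
    """Fit factor: ratio of ambient to in-mask particle concentration;
    higher values indicate a better face seal."""
    return c_ambient / c_inside

# e.g. 10,000 particles/cm^3 upstream and 500 downstream gives 95% capture
material_eff = capture_efficiency(10000, 500)
```

A full-mask measurement lower than the material-only efficiency points to leakage through seams and joints, which is exactly the gap the ‘full mask sampler’ set-up is meant to expose.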


2019 ◽  
Vol 9 (12) ◽  
pp. 2560 ◽  
Author(s):  
Yunkon Kim ◽  
Eui-Nam Huh

This paper explores data caching as a key factor of edge computing. State-of-the-art research on data caching on edge nodes mainly considers reactive caching, proactive caching, and machine-learning-based caching, which can be a heavy task for edge nodes. However, edge nodes usually have far fewer computing resources than cloud datacenters, as they are geo-distributed away from the administrator. Therefore, a caching algorithm should be lightweight to save computing resources on edge nodes. In addition, data caching should be agile because it has to support high-quality services on edge nodes. Accordingly, this paper proposes a lightweight, agile caching algorithm, EDCrammer (Efficient Data Crammer), which performs agile operations to control the caching rate for streaming data by using an enhanced PID (Proportional-Integral-Derivative) controller. Experimental results using this lightweight, agile caching algorithm show its significant value in each scenario. In four common scenarios, the desired cache utilization was reached in 1.1 s on average and then maintained within a 4–7% deviation. The cache hit ratio is about 96%, and the optimal cache capacity is around 1.5 MB. Thus, EDCrammer can help distribute streaming data traffic to the edge nodes, mitigate the uplink load on the central cloud, and ultimately provide users with high-quality video services. We also hope that EDCrammer can improve overall service quality in 5G environments, Augmented Reality/Virtual Reality (AR/VR), Intelligent Transportation Systems (ITS), the Internet of Things (IoT), etc.
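The abstract does not spell out EDCrammer's enhanced PID internals, but the underlying mechanism is a textbook discrete PID loop steering cache utilization toward a setpoint. A minimal sketch (class name, gains, and setpoint are illustrative, not the authors' values):

```python
class PIDController:
    """Textbook discrete PID controller; the output would be fed back
    as an adjustment to the streaming-data caching rate."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measured: float, dt: float = 1.0) -> float:
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# drive cache utilization toward 80%
pid = PIDController(kp=0.5, ki=0.1, kd=0.05, setpoint=0.8)
correction_far = pid.update(0.2)   # large positive correction: raise caching rate
correction_near = pid.update(0.7)  # smaller correction as utilization nears target
```

The proportional term reacts to the current utilization gap, the integral removes steady-state offset, and the derivative damps overshoot, which matches the reported behavior of reaching the desired utilization quickly and holding it within a small deviation.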


1996 ◽  
Vol 118 (1) ◽  
pp. 29-35 ◽  
Author(s):  
K. Minemura ◽  
K. Egashira ◽  
K. Ihara ◽  
H. Furuta ◽  
K. Yamamoto

A turbine flowmeter is employed in this study, in connection with offshore oil field development, to simultaneously measure both volumetric flow rates of an air-water two-phase mixture. Though a conventional turbine flowmeter is generally used to measure a single-phase volumetric flow rate from the rotational speed of the rotor, the proposed method additionally reads the pressure drop across the meter. After the measured pressure drop and rotor speed are correlated as functions of the volumetric flow ratio of the air to the whole fluid and the total volumetric flow rate, both flow rates are iteratively evaluated from these functions on the premise that the liquid density is known. The evaluated flow rates are confirmed to have adequate accuracy, which demonstrates the applicability of the method to oil fields.
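The paper's actual correlation functions are not given in the abstract, but the iterative scheme can be sketched with hypothetical correlation forms: rotor speed N = c1·Q·(1 + 0.5·β) and pressure drop ΔP = c2·(1 − β)·Q², where Q is the total volumetric flow rate and β the air volume fraction. Both the forms and the constants below are invented for illustration:

```python
def solve_two_phase(rotor_speed: float, pressure_drop: float,
                    c1: float = 10.0, c2: float = 0.04,
                    iterations: int = 100):
    """Fixed-point iteration recovering total flow rate Q and air
    fraction beta from two measurements, given invertible correlations
    N = c1*Q*(1 + 0.5*beta) and dP = c2*(1 - beta)*Q**2 (hypothetical)."""
    beta = 0.0
    for _ in range(iterations):
        q = rotor_speed / (c1 * (1.0 + 0.5 * beta))      # invert the speed correlation
        candidate = 1.0 - pressure_drop / (c2 * q * q)   # invert the pressure correlation
        beta = 0.5 * (beta + candidate)                  # damping keeps the iteration stable
    return q, beta

# synthetic measurements generated from Q = 50, beta = 0.2:
q, beta = solve_two_phase(rotor_speed=550.0, pressure_drop=80.0)
```

With two independent measurements (rotor speed and pressure drop) and two unknowns (Q and β), the air and water flow rates follow as β·Q and (1 − β)·Q.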


2015 ◽  
Vol 24 (02) ◽  
pp. 1540010 ◽  
Author(s):  
Patrick Arnold ◽  
Erhard Rahm

We introduce a novel approach to extract semantic relations (e.g., is-a and part-of relations) from Wikipedia articles. These relations are used to build up a large and up-to-date thesaurus providing background knowledge for tasks such as determining semantic ontology mappings. Our automatic approach uses a comprehensive set of semantic patterns, finite state machines and NLP techniques to extract millions of relations between concepts. An evaluation for different domains shows the high quality and effectiveness of the proposed approach. We also illustrate the value of the newly found relations for improving existing ontology mappings.
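The flavor of pattern-based relation extraction can be shown with a tiny regex sketch (two illustrative Hearst-style patterns; the paper's actual system uses a comprehensive pattern set, finite state machines and NLP techniques, none of which are reproduced here):

```python
import re

# Two illustrative lexical patterns, not the authors' pattern set.
PATTERNS = [
    (re.compile(r"(\w+) is a (\w+)"), "is-a"),
    (re.compile(r"(\w+) is part of (?:a |the )?(\w+)"), "part-of"),
]

def extract_relations(sentence: str):
    """Return (concept, relation, concept) triples found in one sentence."""
    relations = []
    for pattern, rel in PATTERNS:
        for a, b in pattern.findall(sentence.lower()):
            relations.append((a, rel, b))
    return relations

triples = extract_relations("The violin is a chordophone. The neck is part of the violin.")
```

Run over a full Wikipedia dump, even simple patterns like these yield large numbers of candidate relations; the quality then hinges on the richer linguistic machinery the paper describes.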


2021 ◽  
Author(s):  
Marina Polevaya ◽  
Igor' Belogrud ◽  
Irina Ivanova ◽  
Elena Kamneva ◽  
Valentina Maslova ◽  
...  

In the modern economy, high-quality personnel are a key factor in the success of an organization, and that success directly depends on the degree of qualification of the staff. The textbook presents technologies, methods and types of personnel training and development; legal and organizational aspects of professional training; socio-psychological features of personnel training and development; the basics of forming and managing the personnel reserve in an organization; and methods for evaluating the effectiveness of personnel training and development in an organization. It is intended for students studying in the field of "Personnel Management", students of institutes and advanced training courses, employees of personnel management services, and managers of enterprises and organizations.


2011 ◽  
Vol 11 (1) ◽  
pp. 31-38
Author(s):  
Angayar K. Pavanasam ◽  
Ali Abbas ◽  
Vicki Chen

In water treatment, virus removal using ultrafiltration is a major step towards better water quality. In this paper, we study virus filtration efficiency using surrogate virus particles and a statistical response-surface approach. We focus on the effect of particle size (20–100 nm range) as a key factor, along with the effects of transmembrane pressure (20–60 kPa range) and feed flowrate (0.3–1.0 L/min range), on the virus removal efficiency of filtration, expressed as the log reduction value (LRV). The particle size is shown to impart a great deal of influence on surrogate particle removal. The effect of the particle-to-pore-size ratio is reported for comparison of membrane molecular weight cut-off (MWCO) performance. It was shown experimentally, and through the developed empirical regression model, that transmembrane pressure plays a major role in controlling the filtration efficiency, along with flowrate. In the studied experimental range, higher LRV values are obtained at lower transmembrane pressure (20 kPa) and at higher feed flowrate (1 L/min). Further, the effect on LRV of the interaction between transmembrane pressure and particle size appears to be more significant than that of the interaction of flowrate with particle size.
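The LRV metric used above has a standard definition, the base-10 logarithm of the feed-to-permeate concentration ratio (the concentrations in the example are illustrative, not the paper's data):

```python
import math

def log_reduction_value(feed_titer: float, permeate_titer: float) -> float:
    """LRV = log10(C_feed / C_permeate); an LRV of 4.0 means 99.99% removal."""
    return math.log10(feed_titer / permeate_titer)

# e.g. 1e6 surrogate particles/mL in the feed and 1e2 in the permeate:
lrv = log_reduction_value(1e6, 1e2)  # → 4.0
```

Because the scale is logarithmic, each unit of LRV corresponds to a further tenfold reduction, which is why small shifts in LRV between operating conditions represent large differences in actual virus passage.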


Author(s):  
A. S. M. Yudin ◽  
A. N. Oumer ◽  
N. F. M. Roslan ◽  
M. A. Zulkarnain

Fluidised bed combustion (FBC) has been recognised as a suitable technology for converting a wide variety of fuels into energy. In a fluidised bed, air is passed through a bed of granular solids resting on a distributor plate. The distributor plate plays an essential role, as it determines the gas-solid movement and mixing pattern in the fluidised bed. It is believed that distributor configurations, such as variations in the free area ratio and the air inclination angle through the distributor, will affect the operational pressure drop of the fluidised bed. This paper presents an investigation of the pressure drop in a fluidised bed without inert materials, using different air distributor designs: a conventional perforated plate, a multi-nozzle distributor, and two newly proposed slotted distributors (45° and 90° inclined). A 3-dimensional Computational Fluid Dynamics (CFD) model is developed and compared with the experimental results. The flow model is based on the incompressible isothermal RNG k-epsilon turbulence model. In the present study, systematic grid refinement is conducted to make sure that the simulation results are independent of the computational grid size. The non-dimensional wall distance, y⁺, is examined as a key factor to verify grid independence by comparing results obtained at different grid resolutions. The multi-nozzle distributor yields the highest distributor pressure drop, with an averaged maximum value of 749 Pa, followed by the perforated, 45° and 90° inclined distributors, for which the maximum pressure drop is recorded to be about one-fourth of the multi-nozzle value. The maximum pressure drop is associated with the higher kinetic head of the inlet air due to the restricted, minimal number of distributor openings and the low free area ratio. The results suggest that low-pressure-drop operation of a fluidised bed can be achieved by increasing the open area ratio of the distributor.
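The kinetic-head argument above can be sketched with the usual orifice relation ΔP = K·ρ·v²/2, where the orifice velocity v scales inversely with the open area ratio (the loss coefficient and velocities below are illustrative, not the paper's fitted values):

```python
def distributor_pressure_drop(superficial_velocity: float,
                              open_area_ratio: float,
                              rho_air: float = 1.2,
                              loss_coefficient: float = 1.5) -> float:
    """Pressure drop (Pa) from the kinetic head of air accelerated through
    the distributor openings; loss_coefficient is an assumed orifice loss
    factor, not a value from the paper."""
    orifice_velocity = superficial_velocity / open_area_ratio
    return loss_coefficient * 0.5 * rho_air * orifice_velocity ** 2

# halving the open area ratio quadruples the pressure drop:
wide_open = distributor_pressure_drop(1.0, 0.10)
restricted = distributor_pressure_drop(1.0, 0.05)
```

The quadratic dependence on orifice velocity explains why the multi-nozzle distributor, with its few restricted openings, dominates the pressure drop comparison, and why enlarging the open area ratio lowers it so effectively.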


2001 ◽  
Vol 5 (1) ◽  
pp. 92-103
Author(s):  
Hua Li ◽  
Wenyu Liu ◽  
Guangxi Zhu ◽  
Yaoting Zhu

A metamorphosis, or morphing, is the process of continuously transforming one object into another; it is popular in computer animation, industrial design, and growth simulation. In this paper, a novel approach is presented for computing continuous shape transformations between polyhedral objects. Metamorphosis is achieved by decomposing the two objects into sets of individual convex sub-objects and constructing a mapping between the two sets; this method can solve the metamorphosis problem for two non-homotopic objects (including concave objects and objects with holes). The results of object metamorphosis are discussed in this paper. The experiments show that this method can generate natural, high-quality metamorphosis results with simple computation. The method can also be used for font composition and for automatic interpolation between two keyframes in 2D and 3D computer animation.
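The convex decomposition and sub-object mapping are the hard part and are not reproduced here; but once a vertex correspondence is established, each in-between shape reduces to linear interpolation of mapped vertices, sketched below in 2D with hypothetical shapes:

```python
def interpolate_shapes(src, dst, t):
    """Linear in-between of two vertex-correspondent shapes at time t in [0, 1]."""
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(src, dst)]

# a unit square morphing into a rotated diamond (illustrative shapes)
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
diamond = [(0.5, -0.5), (1.5, 0.5), (0.5, 1.5), (-0.5, 0.5)]
halfway = interpolate_shapes(square, diamond, 0.5)
```

Sampling t over [0, 1] yields the continuous transformation; the same interpolation applied per convex sub-object pair is what keeps concave and holey objects well-behaved in the paper's scheme.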


2020 ◽  
Vol 34 (06) ◽  
pp. 9892-9899
Author(s):  
Michael Katz ◽  
Shirin Sohrabi

The need for multiple plans has been established by various planning applications. In some, solution quality plays the predominant role, while in others diversity is the key factor. Most recent work takes both plan quality and solution diversity into account under the generic umbrella of diverse planning. There is no common agreement, however, on a collection of computational problems that fall under that generic umbrella. This in particular might lead to comparisons between planners that have different solution guarantees or optimization criteria in mind. In this work we revisit the diverse planning literature in search of such a collection of computational problems, classifying the existing planners by these problems. We formally define a taxonomy of computational problems with respect to both plan quality and solution diversity, extending the existing work. We propose a novel approach to diverse planning that exploits existing classical planners via planning task reformulation and chooses a subset of plans of the required size in post-processing. Based on that, we present planners for the two computational problems that most existing planners solve. Our experiments show that the proposed approach significantly improves over the best-performing existing planners in terms of coverage, overall solution quality, and overall diversity according to various diversity metrics.
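The post-processing step of choosing a diverse subset can be sketched with one common diversity metric, action-set dissimilarity, and a greedy selection (both are illustrative choices; the paper evaluates several diversity metrics and its own selection procedure):

```python
def plan_diversity(plan_a, plan_b):
    """Jaccard-style dissimilarity over the sets of actions in two plans
    (one common metric; others compare visited states or causal links)."""
    a, b = set(plan_a), set(plan_b)
    return 1.0 - len(a & b) / len(a | b)

def select_diverse_subset(plans, k):
    """Greedily pick k plans, each maximizing its minimum pairwise
    diversity to the plans already chosen (illustrative post-processing)."""
    chosen = [plans[0]]
    while len(chosen) < k:
        best = max((p for p in plans if p not in chosen),
                   key=lambda p: min(plan_diversity(p, c) for c in chosen))
        chosen.append(best)
    return chosen

# plans as action sequences (hypothetical labels)
plans = [["a", "b", "c"], ["a", "b", "d"], ["x", "y", "z"], ["a", "c", "d"]]
picked = select_diverse_subset(plans, 2)
```

Generating many candidate plans via task reformulation and then pruning to a required-size subset like this decouples plan generation from diversity optimization, which is the structure the proposed approach exploits.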

