simple filter
Recently Published Documents

TOTAL DOCUMENTS: 61 (five years: 9)
H-INDEX: 10 (five years: 1)

PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e12364
Author(s):  
Hongdou Liu ◽  
Liqiang Zhang ◽  
Yu Sun ◽  
Guangbo Xu ◽  
Weidong Wang ◽  
...  

In composting, the degradation of lignocellulose in straw is problematic because of complex structures such as lignin. A common solution to this problem is the addition of exogenous inoculants. AC-1, a stable thermophilic microbial composite that can decompose lignocellulose at 50–70 °C, was isolated from high-temperature compost samples. AC-1 degraded rice straw most efficiently at 60 °C (78.92%); the degradation of hemicellulose, cellulose and lignin was 82.49%, 97.20% and 20.12%, respectively. It showed degradability on both simple (filter paper, absorbent cotton) and complex (rice straw) cellulose materials. It produced acetic and formic acid during the decomposition process, and the pH first decreased and then increased. High-throughput sequencing revealed that the main bacterial components of AC-1 were Tepidimicrobium, Haloplasma, norank-f-Limnochordaceae, Ruminiclostridium and Rhodothermus, which provides a theoretical basis for further application of AC-1.


PLoS ONE ◽  
2021 ◽  
Vol 16 (2) ◽  
pp. e0246728
Author(s):  
Breanne Hobden ◽  
Mariko Carey ◽  
Rob Sanson-Fisher ◽  
Andrew Searles ◽  
Christopher Oldmeadow ◽  
...  

Background This study aimed to illustrate the potential utility of a simple filter model for understanding the patient-outcome and cost-effectiveness implications of depression interventions in primary care. Methods Hypothetical intervention scenarios at different stages of the treatment pathway were modelled. Results Three depression scenarios were developed, relating to increased detection, treatment response and treatment uptake. The incremental costs, the incremental number of successes (i.e., depression remission) and the incremental cost-effectiveness ratio (ICER) were calculated. In the modelled scenarios, increasing provider treatment response yielded the greatest number of incremental successes above baseline; however, it was also associated with the greatest ICER. Increasing detection rates yielded the second-greatest increase in incremental successes above baseline and had the lowest ICER. Conclusions The authors recommend using the filter model to identify areas where policy stakeholders and/or researchers should invest their efforts in depression management.
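The ICER arithmetic behind these comparisons is simple enough to sketch directly; the cost and remission figures below are purely illustrative and are not taken from the study.

```python
def icer(cost_new, cost_base, successes_new, successes_base):
    """Incremental cost-effectiveness ratio: extra cost per extra success
    (here, per additional depression remission) relative to baseline."""
    return (cost_new - cost_base) / (successes_new - successes_base)

# Hypothetical baseline vs. intervention scenario (illustrative numbers only)
baseline = {"cost": 100_000, "remissions": 40}
scenario = {"cost": 160_000, "remissions": 55}

result = icer(scenario["cost"], baseline["cost"],
              scenario["remissions"], baseline["remissions"])
print(result)  # 4000.0: each extra remission costs 4000 more than baseline
```

A scenario can thus add many successes yet still rank poorly if its extra cost grows faster than its extra remissions, which is the trade-off the abstract reports.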


RSC Advances ◽  
2021 ◽  
Vol 11 (63) ◽  
pp. 39838-39847
Author(s):  
My Uyen Dao ◽  
Hien Y Hoang ◽  
Anh Khoa Tran ◽  
Hong Hanh Cong

In this study, a simple filter system based on silver nanoparticles coated onto activated carbon derived from rice husk (AgNPs@AC) has been proposed for treating floodwater from the Hau Giang River.


Entropy ◽  
2020 ◽  
Vol 22 (10) ◽  
pp. 1149
Author(s):  
G. J. Baxter ◽  
R. A. da Costa ◽  
S. N. Dorogovtsev ◽  
J. F. F. Mendes

Compression, filtering, and cryptography, as well as the sampling of complex systems, can be seen as processing information. A large initial configuration or input space is nontrivially mapped to a smaller set of output or final states. We explored the statistics of filtering of simple patterns on a number of deterministic and random graphs as a tractable example of such information processing in complex systems. In this problem, multiple inputs map to the same output, and the statistics of filtering are represented by the distribution of this degeneracy. For a few simple filter patterns on a ring, we obtained an exact solution of the problem, and we described more difficult filter setups numerically. For each of the filter patterns and networks, we found three key numbers that essentially describe the statistics of filtering, and compared them across networks. Our results for networks with diverse architectures are essentially determined by two factors: whether the graph's structure is deterministic or random, and the vertex degree. We find that filtering in random graphs produces much richer statistics than in deterministic graphs, reflecting the greater complexity of such graphs. Increasing the graph's degree reduces this statistical richness, which is greatest at the smallest degree not equal to two. A filter pattern with a strong dependence on the neighbourhood of a node is much more sensitive to these effects.
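As a rough illustration of this setup (not the authors' exact patterns), the sketch below applies a local filter to every binary configuration on a small ring and tallies the degeneracy, i.e., how many distinct inputs map to each output; the specific rule is an assumption chosen for the example.

```python
from collections import Counter
from itertools import product

def apply_filter(config, rule):
    """Apply a local filter on a ring: each output bit depends on the
    (left, self, right) neighbourhood of the corresponding node."""
    n = len(config)
    return tuple(rule(config[(i - 1) % n], config[i], config[(i + 1) % n])
                 for i in range(n))

# A simple illustrative pattern: output 1 iff the node and both neighbours agree
rule = lambda left, centre, right: int(left == centre == right)

n = 8
outputs = Counter(apply_filter(cfg, rule) for cfg in product((0, 1), repeat=n))
# Degeneracy distribution: for each degeneracy value, the number of outputs
# reached by exactly that many distinct inputs
degeneracy = Counter(outputs.values())
print(sorted(degeneracy.items()))
```

The spread of this distribution is what distinguishes rich filtering statistics from trivial ones; on other graphs, only the neighbourhood lookup changes.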


2020 ◽  
Vol 8 (1) ◽  
pp. 1-11
Author(s):  
Kurniawan M. Nur ◽  
Halil Halil ◽  
Driyanto Wahyu Wicaksono

The undergravel airlift pump is a simple air-driven filter system. The purpose of this study was to analyze the efficiency of using an undergravel airlift pump in the aquaculture of Oreochromis sp. An experimental approach with two treatments and one control was used, and the data were analyzed with analysis of variance (ANOVA). The results showed that the application of the undergravel airlift pump affected water quality. The middle airlift position was more effective than the bottom airlift and the control. The temperature and pH parameters were not significantly different between groups; dissolved oxygen and total dissolved solids were significantly different.
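The one-way ANOVA used here reduces to a single F statistic, sketched below; the three groups of readings are illustrative stand-ins, not the study's water-quality data.

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square."""
    all_vals = [v for g in groups for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Illustrative dissolved-oxygen readings for control, bottom and middle airlift
control = [6.1, 6.0, 6.2]
bottom  = [6.4, 6.3, 6.5]
middle  = [7.0, 7.1, 6.9]
print(one_way_anova_f([control, bottom, middle]))
```

A large F relative to the critical value for (k−1, n−k) degrees of freedom is what "significantly different" means for the dissolved-oxygen and total-dissolved-solids results.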


2020 ◽  
Vol 169 ◽  
pp. 107305
Author(s):  
K. Karthikeyan ◽  
Ravi Saranya ◽  
Raja Bharath ◽  
R. Vidya ◽  
Toshiaki Itami ◽  
...  

Author(s):  
Mykola Sysyn ◽  
Dimitri Gruen ◽  
Ulf Gerber ◽  
Olga Nabochenko ◽  
Vitalii Kovalchuk

A machine learning approach for the recent detection of crossing faults is presented in the paper. The research is based on axle-box inertial measurements collected on operational trains with the ESAH-F system. The machine learning approach combines signal-processing methods with data-reduction and classification methods. Wavelet analysis is applied to detect spectral features in the measured signals. A simple filter approach and sequential feature selection are used to find the most significant features and train the classification model. Validation and error estimates are presented, and their relation to the number of selected features is analysed as well.
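The "simple filter" stage of such a pipeline can be illustrated with a correlation-based feature ranking; the synthetic data and the choice of |Pearson correlation| as the filter score are assumptions made for this sketch, not the paper's exact method.

```python
import math
import random

def corr(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

random.seed(0)
labels = [random.randint(0, 1) for _ in range(200)]
# Feature 0 tracks the label with noise; features 1-3 are pure noise
features = [[y + random.gauss(0, 0.5) for y in labels]] + \
           [[random.gauss(0, 1) for _ in labels] for _ in range(3)]

# Simple filter: rank features by |correlation| with the label
scores = [abs(corr(f, labels)) for f in features]
ranking = sorted(range(len(features)), key=lambda i: -scores[i])
print(ranking)  # the informative feature (index 0) should rank first
```

Sequential feature selection would then grow a feature set greedily from this shortlist, retraining the classifier at each step, which is far cheaper than searching all feature subsets.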


2018 ◽  
Vol 27 (07) ◽  
pp. 1860014
Author(s):  
Ke Xu ◽  
Crystal Maung ◽  
Hiromasa Arai ◽  
Haim Schweitzer

Feature selection is a common dimensionality reduction technique of fundamental importance in big data. A common approach for reducing the running time of feature selection is to perform it in two stages. In the first stage a fast and simple filter is applied to select good candidates. The number of candidates is further reduced in the second stage by an accurate algorithm that may run significantly slower. There are two main variants of feature selection: unsupervised and supervised. In the supervised variant features are selected for predicting labels, while the unsupervised variant does not use labels at all. We describe a general framework that can use an arbitrary off-the-shelf unsupervised algorithm for the second stage. The algorithm is applied to the selection obtained in the first stage weighted appropriately. Our main technical result is a method for calculating weights for the columns that need to be selected in the second stage. We show that these weights can be computed as the solution to a constrained quadratic optimization problem. The solution is deterministic, and improves on previously published studies that use probabilistic ideas to compute similar weights. To the best of our knowledge our approach is the first technique for converting a supervised feature selection problem into an unsupervised problem. Complexity analysis shows that the proposed technique is very fast, can be implemented in a single pass over the data, and can take advantage of data sparsity. Experimental results show that the accuracy of the proposed method is comparable to that of much slower techniques.
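A minimal sketch of the two-stage idea follows; it uses a variance filter for stage one and a greedy redundancy-based pass for stage two, which are illustrative stand-ins rather than the authors' weighted quadratic-optimization method.

```python
import math

def abs_corr(x, y):
    """Absolute Pearson correlation, used here as a redundancy measure."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return abs(cov / (sx * sy)) if sx and sy else 0.0

def variance(col):
    m = sum(col) / len(col)
    return sum((v - m) ** 2 for v in col) / len(col)

def two_stage_select(columns, k_stage1, k_final):
    """Stage 1: a fast variance filter keeps the k_stage1 best candidates.
    Stage 2: a slower greedy pass picks k_final columns, at each step taking
    the candidate least redundant with the columns already selected."""
    candidates = sorted(range(len(columns)),
                        key=lambda i: -variance(columns[i]))[:k_stage1]
    selected = [candidates.pop(0)]
    while len(selected) < k_final and candidates:
        best = min(candidates,
                   key=lambda i: max(abs_corr(columns[i], columns[j])
                                     for j in selected))
        candidates.remove(best)
        selected.append(best)
    return selected

columns = [
    [1, 2, 3, 4],     # varies
    [1, 2, 3, 4.1],   # varies, nearly duplicates column 0
    [4, 1, 3, 2],     # varies with a different pattern
    [1, 1, 1, 1],     # constant, dropped by the stage-1 filter
]
# Column 0 is rejected in stage 2 as redundant with column 1;
# column 3 never survives the stage-1 filter
print(two_stage_select(columns, k_stage1=3, k_final=2))
```

The stage-1 filter is a single cheap pass over the data, so the expensive stage-2 logic only ever sees a small candidate set, which is the running-time argument the abstract makes.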

