PIIKA 2.5: Enhanced quality control of peptide microarrays for kinome analysis

PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0257232
Author(s):  
Connor Denomy ◽  
Conor Lazarou ◽  
Daniel Hogan ◽  
Antonio Facciuolo ◽  
Erin Scruten ◽  
...  

Peptide microarrays consisting of defined phosphorylation target sites are an effective approach for high-throughput analysis of cellular kinase (kinome) activity. Kinome peptide arrays are highly customizable and do not require species-specific reagents to measure kinase activity, making them amenable to kinome analysis in any species. Our group developed software, the Platform for Integrated, Intelligent Kinome Analysis (PIIKA), to enable more effective extraction of meaningful biological information from kinome peptide array data. A subsequent version, PIIKA2, introduced new statistical tools and data visualization options. Here we introduce PIIKA 2.5, which provides two essential quality control metrics and a new background correction technique to increase the accuracy and consistency of kinome results. The first metric alerts users to improper spot size and informs them of the need to perform manual resizing to enhance the quality of the raw intensity data. The second metric uses inter-array comparisons to identify outlier arrays that sometimes emerge as a consequence of technical issues. In addition, a new background correction method, background scaling, can sharply reduce spatial biases within a single array in comparison to background subtraction alone. Collectively, the modifications in PIIKA 2.5 enable identification and correction of technical issues inherent to the technology and better facilitate the extraction of meaningful biological information. We show that these metrics demonstrably enhance kinome analysis by identifying low-quality data and reducing batch effects, ultimately improving clustering of treatment groups and enhancing reproducibility. The web-based and stand-alone versions of PIIKA 2.5 are freely accessible via http://saphire.usask.ca.
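The abstract does not spell out the background-scaling formula, so the sketch below is a minimal, hypothetical Python illustration of the idea only: where background subtraction removes the local background additively, a scaling correction rescales each spot by the ratio of the array-wide median background to its local background, which cancels multiplicative spatial bias. The toy array, the scaling formula, and the spread metric are assumptions for illustration, not PIIKA 2.5's implementation.

```python
import numpy as np

def background_subtract(fg, bg):
    """Classic correction: subtract each spot's local background."""
    return fg - bg

def background_scale(fg, bg):
    """Hypothetical background scaling: rescale each spot by the ratio of
    the array-wide median background to its local background. (Assumed
    formula; PIIKA 2.5 defines its own variant.)"""
    return fg * (np.median(bg) / bg)

# Toy 8 x 12 array whose intensities carry a multiplicative left-to-right
# spatial bias, mimicking uneven staining or scanning.
rng = np.random.default_rng(0)
signal = rng.lognormal(6.0, 0.3, size=(8, 12))
bias = np.linspace(1.0, 2.5, 12)[None, :]            # spatial bias field
fg = signal * bias                                   # biased foreground
bg = 300.0 * bias + rng.normal(0, 5, size=(8, 12))   # biased background

# Column means should be flat if the spatial bias has been removed;
# scaling cancels the multiplicative bias, subtraction does not.
for name, corr in [("subtracted", background_subtract(fg, bg)),
                   ("scaled", background_scale(fg, bg))]:
    print(name, np.std(corr.mean(axis=0)) / np.mean(corr))
```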

2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Luca Alessandri ◽  
Francesca Cordero ◽  
Marco Beccuti ◽  
Nicola Licheri ◽  
Maddalena Arigoni ◽  
...  

Single-cell RNA sequencing (scRNAseq) is an essential tool for investigating cellular heterogeneity. It is therefore of great interest to be able to disclose biological information belonging to cell subpopulations, which can be defined by clustering analysis of scRNAseq data. In this manuscript, we report a tool that we developed for the functional mining of single-cell clusters based on a Sparsely-Connected Autoencoder (SCA). This tool allows uncovering hidden features associated with scRNAseq data. We implemented two new metrics, QCC (Quality Control of Cluster) and QCM (Quality Control of Model), which quantify the ability of the SCA to reconstruct valuable cell clusters and evaluate the quality of the neural network's achievements, respectively. Our data indicate that the SCA encoded space, derived from different experimentally validated data (TF targets, miRNA targets, kinase targets, and cancer-related immune signatures), can be used to grasp single-cell cluster-specific functional features. In our implementation, the SCA's efficacy comes from its ability to reconstruct only specific clusters, thus indicating only those clusters where the SCA encoding space is a key element for cell aggregation. SCA analysis is implemented as a module in the rCASC framework and is supported by a GUI to simplify its usage for biologists and medical personnel.
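The defining feature of a sparsely-connected autoencoder is that prior biological knowledge (e.g., which genes each transcription factor targets) enters as a fixed binary mask on the latent layer's connections. The PyTorch sketch below illustrates that masking idea only; the layer sizes, toy gene sets, and activation are illustrative assumptions, not the rCASC implementation.

```python
import torch
import torch.nn as nn

class SparselyConnectedAutoencoder(nn.Module):
    """Minimal SCA sketch: each latent unit represents one gene set
    (e.g., the targets of one TF), and a fixed binary mask zeroes out
    every encoder weight outside that gene set."""
    def __init__(self, mask: torch.Tensor):
        super().__init__()
        n_genes, n_sets = mask.shape
        self.mask = mask                       # (genes x latent) 0/1 prior
        self.encoder = nn.Linear(n_genes, n_sets)
        self.decoder = nn.Linear(n_sets, n_genes)

    def forward(self, x):
        # Mask the encoder weights so each latent unit only "sees" its targets.
        w = self.encoder.weight * self.mask.T  # (latent x genes)
        z = torch.relu(nn.functional.linear(x, w, self.encoder.bias))
        return self.decoder(z), z

# Toy prior: 100 genes partitioned into 5 TF target sets of 20 genes each.
mask = torch.zeros(100, 5)
for k in range(5):
    mask[k * 20:(k + 1) * 20, k] = 1.0

model = SparselyConnectedAutoencoder(mask)
recon, latent = model(torch.randn(8, 100))     # batch of 8 cells
```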


1983 ◽  
Vol 37 (5) ◽  
pp. 419-424 ◽  
Author(s):  
S. B. Smith ◽  
G. M. Hieftje

A new method is described and tested for background correction in atomic absorption spectrometry. Applicable to flame or furnace atomizers, the method is capable of correcting backgrounds caused by molecular absorption, particulate scattering, and atomic-line overlap, even up to an absorbance value of 3. Like the Zeeman approach, the new method applies its correction very near the atomic line of interest, can employ single-beam optics, and requires no auxiliary source. However, no ancillary magnet or other costly peripherals are required and working curves are single-valued. The new technique is based on the broadening which occurs in a hollow-cathode spectral line when the lamp is operated at high currents. Under such conditions, the absorbance measured for a narrow (atomic) line is low, whereas the apparent absorbance caused by a broad-band background contributor remains as high as when the lamp is operated at conventional current levels. Background correction can therefore be effected by taking the difference in absorbances measured with the lamp operated at high and low currents. The new technique is evaluated in its ability to correct several different kinds of background interference and is critically compared with competitive methods.
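As described, the correction reduces to a subtraction of two measured absorbances: the broad-band background absorbs equally at both lamp currents, while the atomic line absorbs strongly only at the conventional (low) current. A small worked example in Python, with illustrative numbers rather than values from the paper:

```python
# Two-current background correction: the broad-band background contributes
# equally at low and high lamp current, while the narrow atomic line
# absorbs strongly only at low current (the high-current line is broadened).
A_atomic_low  = 0.40   # atomic absorbance at conventional (low) current
A_atomic_high = 0.06   # residual atomic absorbance at high (broadened) current
A_background  = 1.50   # broad-band background, same at both currents

A_low  = A_atomic_low  + A_background   # measured at low current:  1.90
A_high = A_atomic_high + A_background   # measured at high current: 1.56

A_corrected = A_low - A_high            # background cancels
print(A_corrected)  # 0.34 -> slightly underestimates 0.40 by the residual 0.06
```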


2021 ◽  
pp. 147807712110121
Author(s):  
Adam Tamas Kovacs ◽  
Andras Micsik

This article discusses a BIM Quality Control Ecosystem based on Requirement Linked Data, intended to create a framework in which automated BIM compliance checking methods can be widely used. The meaning of requirements is analyzed in a building project context as a basis for data flow analysis: what are the main types of requirements, how are they handled, and what sources do they originate from? A literature review was conducted to identify current development directions in quality checking, alongside market research on existing, widely used solutions. From the conclusions of this research and modern data management theory, the principles of a holistic approach to quality checking in the Architecture, Engineering and Construction (AEC) industry are defined. A comparative analysis of current BIM compliance checking solutions was made according to our review principles. Based on current practice and ongoing research, a state-of-the-art BIM quality control ecosystem is proposed that is open, enables automation, promotes interoperability, and leaves the data governing responsibility at the sources of the requirements. In order to facilitate the flow of requirement and quality data, we propose a model for requirements as Linked Data and provide an example of quality checking using the Shapes Constraint Language (SHACL). As a result, an opportunity is created for better-quality and cheaper BIM design methods to be implemented in the industry.
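SHACL checks of the kind the authors describe can be run directly from Python with the pyshacl library. The sketch below validates a toy requirement ("every door must declare a fire rating of at least 30 minutes"); the ex: vocabulary, the threshold, and the data are invented for illustration and are not the paper's requirement model.

```python
from pyshacl import validate
from rdflib import Graph

# Toy requirement expressed as a SHACL shape. The ex: terms are
# hypothetical, not an actual BIM ontology.
shapes_ttl = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/bim#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:DoorShape a sh:NodeShape ;
    sh:targetClass ex:Door ;
    sh:property [
        sh:path ex:fireRatingMinutes ;
        sh:minCount 1 ;
        sh:datatype xsd:integer ;
        sh:minInclusive 30 ;
    ] .
"""

data_ttl = """
@prefix ex: <http://example.org/bim#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

ex:door1 a ex:Door ; ex:fireRatingMinutes "45"^^xsd:integer .
ex:door2 a ex:Door .   # missing rating -> violation
"""

conforms, _, report = validate(
    Graph().parse(data=data_ttl, format="turtle"),
    shacl_graph=Graph().parse(data=shapes_ttl, format="turtle"),
)
print(conforms)   # False: ex:door2 declares no fire rating
print(report)
```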


2010 ◽  
Vol 07 (04) ◽  
pp. 573-594 ◽  
Author(s):  
JUGAL MOHAPATRA ◽  
SRINIVASAN NATESAN

In this article, we consider a defect-correction method based on a finite difference scheme for solving a singularly perturbed delay differential equation. We solve the equation using the upwind finite difference scheme on a piecewise-uniform Shishkin mesh, then apply a defect-correction technique that combines the stability of the upwind scheme with the higher order of the central difference scheme. The method is shown to converge uniformly in the perturbation parameter, and almost second-order convergence is obtained in the discrete maximum norm. Numerical results are presented, and they agree with the theoretical findings.
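The two ingredients named above, a piecewise-uniform Shishkin mesh and the first-order upwind scheme, are sketched below for a plain (non-delay) convection-diffusion model problem -eps*u'' + a*u' = f with u(0) = u(1) = 0. The delay term and the defect-correction step itself are omitted for brevity, and the transition-point constant is a common textbook choice rather than the paper's.

```python
import numpy as np

def shishkin_mesh(n, eps, a=1.0, sigma0=2.0):
    """Piecewise-uniform Shishkin mesh on [0, 1]: half the points are
    packed into a layer of width tau next to the boundary layer at x = 1
    (convection coefficient a > 0)."""
    tau = min(0.5, sigma0 * eps / a * np.log(n))
    left  = np.linspace(0.0, 1.0 - tau, n // 2 + 1)
    right = np.linspace(1.0 - tau, 1.0, n // 2 + 1)
    return np.concatenate([left, right[1:]])

def upwind_solve(n, eps, a=1.0, f=lambda x: np.ones_like(x)):
    """First-order upwind scheme for -eps*u'' + a*u' = f, u(0)=u(1)=0,
    on the Shishkin mesh (a model problem, not the paper's delay equation)."""
    x = shishkin_mesh(n, eps, a)
    h = np.diff(x)
    m = len(x)
    A = np.zeros((m, m))
    b = f(x)
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = 0.0                       # Dirichlet boundary conditions
    for i in range(1, m - 1):
        hm, hp = h[i - 1], h[i]
        # Non-uniform central difference for -eps*u'' plus backward
        # (upwind, since a > 0) difference for a*u'.
        A[i, i - 1] = -2 * eps / (hm * (hm + hp)) - a / hm
        A[i, i]     =  2 * eps / (hm * hp) + a / hm
        A[i, i + 1] = -2 * eps / (hp * (hm + hp))
    return x, np.linalg.solve(A, b)

x, u = upwind_solve(64, eps=1e-3)
```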


2021 ◽  
Vol 13 (20) ◽  
pp. 4081
Author(s):  
Peter Weston ◽  
Patricia de Rosnay

Brightness temperature (Tb) observations from the European Space Agency (ESA) Soil Moisture and Ocean Salinity (SMOS) instrument are passively monitored in the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System (IFS). Several quality control procedures are performed to screen out poor-quality data and/or data that cannot be accurately simulated from the numerical weather prediction (NWP) model output. In this paper, these quality control procedures are reviewed, and enhancements are proposed, tested, and evaluated. The enhancements presented include improved sea ice screening, coastal and ambiguous land-ocean screening, improved radio frequency interference (RFI) screening, and increased usage of observations at the edge of the satellite swath. Each of the screening changes results in improved agreement between the observations and model equivalent values. This is an important step in advance of future experiments to test the direct assimilation of SMOS Tbs into the ECMWF land data assimilation system.
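The screening steps named in the abstract compose naturally as boolean masks over the observation stream. The sketch below is schematic only: the field names and every threshold are placeholders, not ECMWF's operational values.

```python
import numpy as np

def qc_screen(tb, sea_ice_frac, land_frac, rfi_prob, incidence_angle):
    """Schematic QC for SMOS brightness temperatures: keep an observation
    only if it passes every screen. All thresholds are illustrative."""
    keep = np.ones_like(tb, dtype=bool)
    keep &= sea_ice_frac < 0.01                      # sea-ice screening
    keep &= (land_frac > 0.95) | (land_frac < 0.05)  # drop ambiguous coastal mixes
    keep &= rfi_prob < 0.2                           # RFI screening
    keep &= incidence_angle <= 55.0                  # swath-edge usage limit
    keep &= (tb > 50.0) & (tb < 340.0)               # gross physical limits
    return keep

# Five toy observations; only the first passes every screen.
keep = qc_screen(tb=np.array([250.0, 45.0, 260.0, 270.0, 255.0]),
                 sea_ice_frac=np.array([0.0, 0.0, 0.2, 0.0, 0.0]),
                 land_frac=np.array([1.0, 1.0, 1.0, 0.5, 0.98]),
                 rfi_prob=np.array([0.0, 0.1, 0.0, 0.0, 0.05]),
                 incidence_angle=np.array([40.0, 40.0, 40.0, 40.0, 58.0]))
print(keep)  # [ True False False False False ]
```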


2017 ◽  
Vol 38 ◽  
pp. 88-92 ◽  
Author(s):  
Masoumeh Parsi ◽  
Mehdi Sohrabi ◽  
Fereidoun Mianji ◽  
Reza Paydar

2021 ◽  
Author(s):  
Elke Rustemeier ◽  
Udo Schneider ◽  
Markus Ziese ◽  
Peter Finger ◽  
Andreas Becker

Since its founding in 1989, the Global Precipitation Climatology Centre (GPCC) has been producing global precipitation analyses based on land-surface in-situ measurements. Over the now more than 30 years, the underlying database has been continuously expanded and offers a high station density and large temporal coverage. Thanks to the semi-automatic quality control routinely performed on incoming station data, the GPCC database is of very high quality. Today, the GPCC holds data from 123,000 stations, about three quarters of them with long time series.

The core of the analyses is formed by data from the global meteorological and hydrological services, which provided their records to the GPCC, as well as by global and regional data collections. In addition, the GPCC receives SYNOP and CLIMAT reports via the WMO-GTS. These supplement the high-quality precipitation analyses and form the basis for the near-real-time evaluations.

Quality control activities include cross-referencing stations from different sources, flagging data errors, and correcting temporally or spatially offset data. These data then form the basis for the subsequent interpolation and product generation.

In near real time, the 'First Guess Monthly', 'First Guess Daily', 'Monitoring Product', 'Provisional Daily Precipitation Analysis', and the 'GPCC Drought Index' are generated. These are based on WMO-GTS data and monthly data generated by the CPC (NOAA).

With a 2-3 year update cycle, the high-quality data products are generated with intensive quality control, built on the entire GPCC database. These non-real-time products consist of the 'Full Data Monthly', 'Full Data Daily', 'Climatology', and 'HOMPRA-Europe' and are now available in the 2020 version.

All gridded datasets presented in this paper are freely available in netCDF format on the GPCC website https://gpcc.dwd.de and are referenced by a digital object identifier (DOI). The site also provides an overview of all datasets, as well as a detailed description and further references for each dataset.
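Since the gridded products are plain netCDF, they can be opened with standard tools such as xarray. The snippet below is a hypothetical usage example: the file name and the "precip" variable name follow common GPCC conventions but should be checked against the documentation for the downloaded product.

```python
import xarray as xr

# Open a downloaded GPCC product (file name is a placeholder; fetch the
# actual file from https://gpcc.dwd.de first).
ds = xr.open_dataset("full_data_monthly_v2020_025.nc")

# Mean monthly precipitation over a 30-year period, assuming the
# variable is called "precip" as in typical GPCC files.
climatology = ds["precip"].sel(time=slice("1991", "2020")).mean("time")
print(climatology)
```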


Sensors ◽  
2019 ◽  
Vol 19 (5) ◽  
pp. 1206
Author(s):  
Shang-Yuan Chen ◽  
Cheng-Yen Chen

Taiwan has suffered from widespread haze and poor air quality during recent years, and the control of indoor air quality has become an important topic. This study relies on multi-agent theory: collected air quality data are used in calculations, and agents make decisions in accordance with pre-written rules, to construct an indoor air quality control system and conflict resolution mechanism that serve to maintain a healthy and comfortable indoor environment. For the implementation, the simulated system used the Arduino open-source microcontroller system to collect air quality data and turn on building equipment in order to improve indoor air quality. This study also used the graphical control program LabVIEW to write a control program and user interface. The implementation verifies the feasibility of applying multi-agent theory to air quality control systems, and individual intelligent agents have the basic ability to resolve their own conflicts autonomously. However, when multiple factors and user statuses are simultaneously involved in decision-making, it is difficult for the system to exhaust all conflict conditions, and when context control surpasses the restrictions of binary-logic rule-based reasoning, it is necessary to change the algorithm and redesign the system.
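The core loop, agents firing pre-written rules and a separate mechanism resolving their conflicting requests, can be sketched in a few lines of Python. Everything here (thresholds, device names, the priority rule) is an invented illustration of that pattern, not the paper's Arduino/LabVIEW system.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    """One rule-based agent: watches a single air-quality reading and
    requests a device action when its threshold rule fires."""
    sensor: str
    threshold: float
    action: str

    def decide(self, readings):
        return self.action if readings[self.sensor] > self.threshold else None

def resolve_conflicts(requests):
    """Toy conflict resolution: closing the window wins over opening it
    (e.g., outdoor haze outranks indoor CO2 buildup)."""
    if "window_close" in requests:
        requests.discard("window_open")
    return requests

agents = [
    Agent("co2", 1000.0, "window_open"),          # indoor CO2 in ppm
    Agent("pm25_outdoor", 35.0, "window_close"),  # outdoor PM2.5 in ug/m3
]
readings = {"co2": 1200.0, "pm25_outdoor": 80.0}

requests = {a.decide(readings) for a in agents} - {None}
print(resolve_conflicts(requests))  # {'window_close'}
```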

