data variation
Recently Published Documents


TOTAL DOCUMENTS

77
(FIVE YEARS 31)

H-INDEX

7
(FIVE YEARS 1)

2021 ◽  
Vol 26 (5) ◽  
pp. 3002-3007
Author(s):  
MARIANA-GRATIELA SOARE (VLADU) ◽  
MARIA-MONICA PETRESCU ◽  
MIHAELA-CARMEN EREMIA

The aim of this comparative study was to obtain a model for the production of inulinase and invertase by Saccharomyces, Candida and Hansenula strains from the culture collection of INCDCF-ICCF, using submerged fermentation in a medium containing inulin as the carbon source. The model explained the data variation and the actual relationships between the parameters and responses. The dry biomass content, as well as the production of inulinase and invertase in the bioprocess medium, was influenced by the inulin concentration and microelement composition. The main bioprocess parameters were: inoculum size 2% (v/v), pH 6, temperature 28 °C and agitation speed 220 rpm. In the comparative study of extracellular inulinase (exo- and endo-inulinase) and invertase production, the best results were obtained for Candida arborea on M5 medium, with invertase produced at significantly higher levels than inulinase (35.92 U/mL invertase activity vs. 8.01 U/mL inulinase activity). These results could be useful for industrial applications such as the food and pharmaceutical industries.


2021 ◽  
Vol 25 (8) ◽  
pp. 4549-4565
Author(s):  
Michael Stoelzle ◽  
Lina Stein

Abstract. Nowadays, color in scientific visualizations is standard and is extensively used to group, highlight or delineate different parts of the data. The rainbow color map (also known as the jet color map) is famous for its appealing use of the full visual spectrum with impressive changes in chroma and luminance. Besides attracting attention, the rainbow color map has been criticized by scientists for decades for its non-linear and erratic changes of hue and luminance along the data variation. This lack of uniformity causes a misrepresentation of data values and flaws in science communication. The rainbow color map is scientifically incorrect and hardly decodable for a considerable number of people due to color vision deficiency (CVD) or other vision impairments. Here we aim to raise awareness of how widely used the rainbow color map still is in hydrology. To this end, we perform a paper survey scanning for color issues in around 1000 scientific publications in three different journals, including papers published between 2005 and 2020. In this survey, depending on the journal, 16 %–24 % of the publications have a rainbow color map and around the same ratio of papers (18 %–29 %) use red–green elements, often in a way that color is the only means of decoding the visualized groups of data. Given these shares, there is a 99.6 % chance of picking at least one visually problematic publication among 10 randomly chosen papers from our survey. To overcome the use of the rainbow color map in science, we propose tools and techniques focusing on the improvement of typical visualization types in hydrological science. We give guidance on how to avoid, improve and trust color in a proper and scientific way. Finally, we outline an approach for communicating the flaws of the rainbow color map across different status groups in science.
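The 99.6 % figure follows from the complement rule: if a fraction p of surveyed papers has at least one color issue, the chance that none of 10 randomly drawn papers has one is (1 − p)^10. A minimal sketch; the per-paper share p ≈ 0.425 used here is an assumption back-solved to match the reported 99.6 %, not a value stated per journal in the abstract:

```python
# Probability of drawing at least one visually problematic paper
# when sampling n papers, each problematic with probability p.
def prob_at_least_one(p: float, n: int = 10) -> float:
    return 1.0 - (1.0 - p) ** n

# Assumed combined share of papers with rainbow and/or red-green
# issues (back-solved from the reported 99.6 %):
p = 0.425
print(round(prob_at_least_one(p), 3))  # ≈ 0.996
```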


2021 ◽  
Vol 10 (4) ◽  
pp. 2110-2118
Author(s):  
Priati Assiroj ◽  
Harco Leslie Hendric Spits Warnars ◽  
Edi Abdurachman ◽  
Achmad Imam Kistijantoro ◽  
Antoine Doucet

The fingerprint is one kind of biometric. This unique biometric data has to be processed well and securely, and the problem gets more complicated as the data grows. This work processes fingerprint image data with a memetic algorithm, a simple and reliable algorithm. To achieve the best results, we run this algorithm in a parallel environment by utilizing the multi-threading feature of the processor. We propose a high-performance computing memetic algorithm (HPCMA) to process a dataset of 7200 fingerprint images, which is divided into fifteen specimens based on the image specifications in order to capture the details of each image. Combining the specimens generates new data variations. The algorithm was run on two different operating systems, Windows 7 and Windows 10, and the influence of data size on the processing time, speed-up, and efficiency of HPCMA was measured with simple linear regression. The results show that data size explains more than 90% of the variation in processing time, more than 30% of the variation in speed-up, and more than 19% of the variation in efficiency.
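The regression step can be sketched as follows: fit processing time against data size by least squares and report R², the share of variance explained. The (data size, processing time) pairs below are hypothetical illustration values, not the HPCMA measurements from the paper:

```python
import numpy as np

# Hypothetical (data_size, processing_time) pairs -- illustrative only.
size = np.array([480, 960, 1440, 2400, 3600, 4800, 6000, 7200])
time = np.array([12.0, 22.5, 35.1, 55.8, 84.9, 110.2, 141.0, 168.3])

# Fit processing_time = a * size + b by least squares.
a, b = np.polyfit(size, time, 1)
pred = a * size + b

# R^2: share of the variance in processing time explained by data size.
ss_res = np.sum((time - pred) ** 2)
ss_tot = np.sum((time - time.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")
```

An R² above 0.9 corresponds to the paper's "data size explains more than 90% of processing time" reading.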


2021 ◽  
Vol 13 (13) ◽  
pp. 2520
Author(s):  
Dongdong Ma ◽  
Tanzeel U. Rehman ◽  
Libo Zhang ◽  
Hideki Maki ◽  
Mitchell R. Tuinstra ◽  
...  

Aerial imaging technologies have been widely applied in agricultural plant remote sensing. However, an as yet unexplored challenge with field imaging is that environmental conditions, such as sun angle, cloud coverage and temperature, can significantly alter plant appearance and thus affect the imaging sensor's accuracy in extracting plant feature measurements. These image alterations result from the complicated interaction between the real-time environment and the plants. Analyzing these impacts requires continuous monitoring of the changes across various environmental conditions, which has been difficult with current aerial remote sensing systems. This paper proposes a modeling method to comprehensively understand and model environmental influences on hyperspectral imaging data. In 2019, a fixed hyperspectral imaging gantry was constructed at Purdue University's research farm, and over 8000 repetitive images of the same corn field were taken at 2.5 min intervals for 31 days. Time-tagged local environment data, including solar zenith angle, solar irradiation, temperature and wind speed, were also recorded during the imaging time. The images were processed for phenotyping data, and a time series decomposition method was applied to extract the phenotyping data variation caused by the changing environments. An artificial neural network (ANN) was then built to model the relationship between the phenotyping data variation and the environmental changes. The ANN model was able to accurately predict the environmental effects in the remote sensing results and thus could be used to effectively eliminate the environment-induced variation in the phenotyping features. A test on the normalized difference vegetation index (NDVI) calculated from the hyperspectral images showed that the variance in NDVI was reduced by 79%. Similar performance was confirmed with the relative water content (RWC) predictions. This modeling method therefore shows great potential for aerial remote sensing applications in agriculture, significantly improving imaging quality by effectively eliminating the effects of changing environmental conditions.
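The correction idea, modeling the environment-induced component of a phenotyping feature and subtracting it, can be sketched with a simple linear model standing in for the paper's ANN. All numbers below are synthetic assumptions (an NDVI-like feature driven by one environmental variable), not the Purdue gantry data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in data: an NDVI-like feature whose variation is
# partly driven by an environmental variable (e.g. solar zenith angle).
n = 500
env = rng.uniform(20, 70, n)                        # environmental driver
ndvi = 0.8 - 0.004 * env + rng.normal(0, 0.02, n)   # feature + sensor noise

# Simplified linear stand-in for the paper's ANN: predict the
# environment-induced component and subtract it.
a, b = np.polyfit(env, ndvi, 1)
corrected = ndvi - (a * env + b) + ndvi.mean()

reduction = 1.0 - corrected.var() / ndvi.var()
print(f"variance reduced by {reduction:.0%}")
```

With the synthetic signal above most of the variance is environment-induced, so the correction removes the bulk of it, mirroring the 79% NDVI variance reduction reported in the abstract.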


2021 ◽  
Vol 11 (13) ◽  
pp. 5950
Author(s):  
Alexandre F. Santos ◽  
Pedro D. Gaspar ◽  
Heraldo J. L. de Souza

As global data traffic grows, the need for computer room air conditioning (CRAC) equipment grows proportionally. Air conditioning equipment is responsible for approximately 38% of the energy consumption of data centers. The energy efficiency of this equipment is compared according to the Energy Standard ASHRAE 90.1-2019, using the Net Sensible Coefficient of Performance (NetSCOP) index. This method favors fixed-speed compressor equipment with a constant air-cooled condenser inlet temperature (35 °C). A new method, COP WEUED (COP–world energy usage effectiveness design), is proposed based on the IPLV (integrated part load value) methodology. The IPLV is an index focused on partial thermal loads and on the variation of outdoor temperature data for air intake at the condenser, based on the average temperatures of the 29 major cities of the USA. The new method is instead based on the 29 largest cities worldwide and on data-center-specific indoor temperature conditions. For the same inverter compressor, efficiencies of 4.03 and 4.92 kW/kW were obtained using ASHRAE 90.1-2019 and the proposed method, respectively. This difference of almost 20% between the methods is explained by the fact that, during less than 5% of the annual hours, the condenser inlet air temperature is close to the NetSCOP indication.
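For context, the part-load weighting that the IPLV methodology is built on can be sketched as below. The weights are the AHRI 550/590 coefficients at 100/75/50/25 % load; the COP values are hypothetical, and the paper's WEUED method additionally re-weights by worldwide city temperatures, which is not reproduced here:

```python
# IPLV-style part-load weighting (AHRI 550/590):
# IPLV = 0.01*A + 0.42*B + 0.45*C + 0.12*D, where A..D are the
# efficiencies at 100/75/50/25 % load. COP inputs are hypothetical.
def iplv(cop_100: float, cop_75: float, cop_50: float, cop_25: float) -> float:
    return 0.01 * cop_100 + 0.42 * cop_75 + 0.45 * cop_50 + 0.12 * cop_25

print(round(iplv(4.0, 4.6, 5.1, 4.4), 3))  # 4.795
```

The weighting makes clear why a method anchored at full load and 35 °C condenser inlet (NetSCOP) can diverge by ~20% from one anchored at part loads: 87% of the weight sits at 75% load or below.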


2021 ◽  
Author(s):  
Michael Stoelzle ◽  
Lina Stein

Abstract. Nowadays, color in scientific visualizations is standard and is extensively used to group, highlight or delineate different parts of the data. The rainbow color map (also known as the jet color map) is famous for its appealing use of the full visual spectrum with impressive changes in chroma and luminance. Besides attracting attention, the rainbow color map has been criticized by scientists for decades for its non-linear and erratic changes of hue and luminance along the data variation. This lack of uniformity causes a misrepresentation of data values and flaws in science communication. The rainbow color map is scientifically incorrect and hardly decodable for a considerable number of people due to color-vision deficiency (CVD) or other vision impairments. Here we aim to raise awareness of how widely used the rainbow color map still is in hydrology. To this end, we perform a paper survey scanning for color issues in around 1000 scientific publications in three different journals, including papers published between 2005 and 2020. In this survey, depending on the journal, 16–24 % of the publications have a rainbow color map and around the same ratio of papers (18–29 %) use red–green elements, often in a way that color is the only means of decoding the visualized groups of data. Given these shares, there is a 99.6 % chance of picking at least one visually problematic publication among 10 randomly chosen papers from our survey. To overcome the use of the rainbow color map in science, we propose tools and techniques focusing on the improvement of typical visualization types in hydrological science. Consequently, color should be used with more care, to highlight the most important aspects of a visualization, and identifying the correct data type, such as categorical or sequential data, is essential for picking an appropriate color map. We give guidance on how to avoid, improve and trust color in a proper and scientific way. Finally, we sketch a way to improve the communication of rainbow color map flaws between different status groups in science, publishers, and the media.


2021 ◽  
Author(s):  
Morteza Khodagholi ◽  
Razieh Saboohi ◽  
Ehasan Zandi Esfahani

Abstract. The relationship between plant species and climatic factors has always been a fundamental issue in plant ecology, and multivariate statistical methods can be effective in revealing the relationship between climatic factors and plant species distribution. In this study, the climatic factors affecting the distribution of Artemisia sieberi and Artemisia aucheri, two species widely distributed in Iran, were investigated. For this purpose, 117 climatic factors were used and, to reduce their number and determine the most important ones, a factor analysis by principal component analysis was applied. The results showed that six factors, namely heating temperature, spring and summer precipitation, wind, autumn–winter precipitation, dusty days and cloudiness days, explained 37.32%, 22.54%, 7.18%, 6.6%, 4.22%, and 4.15% of the data variation, respectively. Together, these six factors account for 82% of the data variation. Autumn–winter precipitation and heating temperature had the greatest impact on the presence of Artemisia sieberi and Artemisia aucheri, respectively: the autumn–winter precipitation factor was negative in areas where A. sieberi is observed, while the heating temperature factor was negative in areas where A. aucheri is present and positive in areas lacking A. aucheri. Studying the effect of environmental factors on the distribution of Artemisia species is very important for the planning and management of natural resources, and Artemisia is one of the most important plants in the country's rangelands; therefore, the results of this research can be used for the practical planning, management and reclamation of these rangelands.


Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 279
Author(s):  
Vincent Vigneron ◽  
Hichem Maaref ◽  
Tahir Q. Syed

The pooling layer is at the heart of every convolutional neural network (CNN), contributing to its invariance to data variation. This paper proposes a pooling method based on Zeckendorf's number series. The maximum pooling layers are replaced with Z pooling layers, which capture texels from input images, convolution layers, etc. It is shown that the properties of Z pooling are better adapted to segmentation tasks than those of other pooling functions. The method was evaluated on a traditional image segmentation task and on a dense labeling task, carried out with a series of deep learning architectures in which the usual maximum pooling layers were altered to use the proposed pooling mechanism. Not only does it arbitrarily increase the receptive field in a parameterless fashion, but it can also better tolerate rotations, since the pooling layers are independent of the geometric arrangement or sizes of the image regions. Different combinations of pooling operations produce images capable of emphasizing low/high frequencies, extracting ultrametric contours, etc.
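Zeckendorf's theorem states that every positive integer has a unique representation as a sum of non-consecutive Fibonacci numbers, obtainable greedily. A minimal sketch of the decomposition underlying the Z pooling layer (the pooling mechanics themselves are not reproduced here):

```python
# Greedy Zeckendorf decomposition: every positive integer is a unique
# sum of non-consecutive Fibonacci numbers -- the number series the
# Z pooling layer is built on.
def zeckendorf(n: int) -> list[int]:
    fibs = [1, 2]
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    parts = []
    for f in reversed(fibs):  # take the largest Fibonacci number that fits
        if f <= n:
            parts.append(f)
            n -= f
    return parts

print(zeckendorf(100))  # [89, 8, 3]
```

Taking the largest fitting Fibonacci number at each step automatically yields non-consecutive terms, since the remainder is always smaller than the preceding Fibonacci number.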


Econometrica ◽  
2021 ◽  
Vol 89 (2) ◽  
pp. 591-614
Author(s):  
Alexei Onatski ◽  
Chen Wang

This paper draws parallels between the principal components analysis of factorless high-dimensional nonstationary data and the classical spurious regression. We show that a few of the principal components of such data absorb nearly all the data variation. The corresponding scree plot suggests that the data contain a few factors, which is corroborated by the standard panel information criteria. Furthermore, the Dickey–Fuller tests of the unit root hypothesis applied to the estimated “idiosyncratic terms” often reject, creating an impression that a few factors are responsible for most of the nonstationarity in the data. We warn empirical researchers of these peculiar effects and suggest always comparing the analysis in levels with that in differences.
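The spurious-factor effect is easy to reproduce: principal components of independent random walks absorb a large share of the variance in levels, but not in first differences. A minimal simulation sketch (dimensions and seed are arbitrary choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 50

# Factorless data: N independent random walks (levels) and their
# stationary first differences.
walks = rng.standard_normal((T, N)).cumsum(axis=0)
diffs = np.diff(walks, axis=0)

def top_share(x: np.ndarray, k: int = 3) -> float:
    """Share of total variance absorbed by the top-k principal components."""
    x = x - x.mean(axis=0)
    ev = np.linalg.svd(x, compute_uv=False) ** 2
    return ev[:k].sum() / ev.sum()

# In levels, a few principal components absorb much of the variation
# even though no common factors exist; in differences they do not.
print(top_share(walks), top_share(diffs))
```

This is exactly the diagnostic the authors recommend: run the analysis in levels and in differences and compare the scree plots.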


Author(s):  
M. Martinsen ◽  
K. O. Hed ◽  
J. S. Diget ◽  
H. L. Lein

Abstract. Atmospheric icing on structures and equipment represents a challenge for operation and safety. Passive ice removal by ice-phobic coatings has received much attention over the last decades. The current state-of-the-art methods for quantifying the ice-release properties of such coatings suffer from a range of drawbacks, including poor reproducibility and highly complex test setups. Here, a facile rotational tribometer approach for measuring the static friction between polymeric coatings and ice is presented. The torque necessary to initiate motion at the coating–ice interface was used as a measure of ice release. For a polydimethylsiloxane-based coating (Sylgard 184), the effects of ice temperature, normal force, coating thickness, and dwell time (the contact time between coating and ice at rest, with the normal force fully applied, prior to applying torque) were established, along with the conditions resulting in the least data variation. Under these conditions, tribology-based friction measurements were carried out on two additional coatings: a two-component polyurethane and a commercial foul-release coating. The outcome of the method, i.e., the grading of the coatings in terms of anti-icing effect, matched that obtained with a widely used test method based on ice shear adhesion, and the same trends are revealed by both. However, the proposed tribology-based method yields consistently lower variation in outcomes and offers more detail on the ice adhesion and friction mechanisms.

