test elements: Recently Published Documents

TOTAL DOCUMENTS: 101 (FIVE YEARS: 21)
H-INDEX: 10 (FIVE YEARS: 2)

2021 ◽  
Vol 1203 (2) ◽  
pp. 022052
Author(s):  
Łukasz Drobiec ◽  
Radosław Jasiński ◽  
Wojciech Mazur ◽  
Remigiusz Jokiel

Abstract This paper compares shear strength test results for autoclaved aerated concrete (AAC) walls with superficial strengthening against results for walls made with various types of joints and mortar. The initial shear strength, the characteristic shear strength, and the angle of internal friction were compared. The test elements were made using two types of mortar, three types of joints, and two methods of reinforcement. The models were built from masonry units of the SOLBET OPTIMAL system, using SOLBET 0.1 thin-joint mortar and SOLBET SMART polyurethane adhesive. Three joint types were made: typical joints with a width equal to the thickness of the wall, shell-bedded joints, and joints without mortar. Models with typical joints were also tested after being reinforced on one or both sides with the FRCM system, using the mineral cement matrix PBO-MX GOLD MASONRY and the PBO-MESH GOLD 22/22 mesh. A total of 56 models were tested in accordance with the requirements of PN-EN 1052-3:2004. The superficial strengthening, the type of mortar, and the construction of the joints were all shown to have a significant influence on the individual shear strength parameters.
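
In PN-EN 1052-3 shear tests, the initial shear strength and the angle of internal friction are typically obtained from a linear (Coulomb-type) fit of measured shear stress against the applied precompression stress. The sketch below is a minimal, hypothetical illustration of that regression step; the stress values are placeholders, not results from the paper.

```python
import numpy as np

# Hypothetical triplet-test results: precompression stress and measured
# shear stress at failure, in MPa (placeholder values, not the paper's data).
precompression = np.array([0.1, 0.1, 0.1, 0.3, 0.3, 0.3, 0.5, 0.5, 0.5])
shear_stress   = np.array([0.32, 0.30, 0.35, 0.45, 0.48, 0.44, 0.60, 0.58, 0.63])

# Coulomb-type relation assumed in EN 1052-3: f_v = f_v0 + tan(alpha) * sigma_p
slope, intercept = np.polyfit(precompression, shear_stress, 1)

f_v0 = intercept                      # initial shear strength (MPa)
alpha = np.degrees(np.arctan(slope))  # angle of internal friction (deg)
f_vk0 = 0.8 * f_v0                    # characteristic value per EN 1052-3

print(f"f_v0 = {f_v0:.3f} MPa, alpha = {alpha:.1f} deg, f_vk0 = {f_vk0:.3f} MPa")
```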


2021 ◽  
pp. 1-7
Author(s):  
Yuan Gong ◽  
Lu Mao ◽  
Changliang Li

Abstract Currently, as a basic task of military document information extraction, Named Entity Recognition (NER) for military documents has received great attention. In 2020, the China Conference on Knowledge Graph and Semantic Computing (CCKS) and the System Engineering Research Institute of the Academy of Military Sciences (AMS) issued an NER task for test evaluation, which requires the recognition of four types of entities: Test Elements (TE), Performance Indicators (PI), System Components (SC), and Task Scenarios (TS). Due to the particularity and confidentiality of the military field, only 400 items of annotated data were provided by the organizer. In this paper, the task is treated as a few-shot learning problem for NER, and a method based on BERT and two-level model fusion is proposed. First, several basic models are fine-tuned from BERT on the training data. Then, a two-level fusion strategy applied to the prediction results of the multiple basic models is proposed to alleviate over-fitting. Finally, labeling errors are eliminated by post-processing. The method achieves an F1 score of 0.7203 on the test set of the evaluation task.
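
The abstract does not spell out the fusion strategy in detail; the snippet below is only a minimal sketch of one common way to combine the token-level predictions of several fine-tuned NER models by majority voting. The label set and function names are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
from typing import List

# Hypothetical BIO label set for the four CCKS entity types.
LABELS = ["O",
          "B-TE", "I-TE",   # Test Elements
          "B-PI", "I-PI",   # Performance Indicators
          "B-SC", "I-SC",   # System Components
          "B-TS", "I-TS"]   # Task Scenarios

def vote_token_labels(model_predictions: List[List[str]]) -> List[str]:
    """Illustrative fusion step: per-token majority vote across the label
    sequences produced by several fine-tuned base models."""
    fused = []
    for token_labels in zip(*model_predictions):
        most_common, _ = Counter(token_labels).most_common(1)[0]
        fused.append(most_common)
    return fused

# Example: three base models disagree on the middle token.
preds = [
    ["B-TE", "I-TE", "O"],
    ["B-TE", "O",    "O"],
    ["B-TE", "I-TE", "O"],
]
print(vote_token_labels(preds))  # ['B-TE', 'I-TE', 'O']
```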


Biology ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 345
Author(s):  
Kinga Proc ◽  
Piotr Bulak ◽  
Monika Kaczor ◽  
Andrzej Bieganowski

Bioaccumulation, expressed as the bioaccumulation factor (BAF), is a phenomenon widely investigated in the natural environment and at laboratory scale. However, the BAF is more suitable for ecological studies, while in small-scale experiments it has limitations, which are discussed in this article. We propose a new indicator, the bioaccumulation index (BAI). The BAI takes into account the initial load of test elements that is added to the experimental system together with the biomass of the organism. This offers the opportunity to explore phenomena related to bioaccumulation and, contrary to the BAF, can also reveal the dilution of element concentration in the organism. The BAF can overestimate bioaccumulation, and in an extreme situation, when the element concentration is diluted during organism growth, the BAF may produce results completely opposite to the BAI. In one of the examples presented in this work (Tschirner and Simon, 2015), the concentration of phosphorus in fly larvae was lower after the experiment than in the younger larvae before the experiment. Because the phosphorus concentration in the feed was low, the BAF indicated a high bioaccumulation of this element (BAF = 14.85). In contrast, the BAI showed element dilution, which is the more realistic interpretation (BAI = −0.32). By taking more data into account, the BAI seems to be more valid in determining bioaccumulation, especially in the context of entomoremediation research.
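
The abstract does not reproduce the formula for the BAI, so the sketch below only illustrates the conventional BAF calculation (organism concentration divided by medium concentration) alongside a simple before/after concentration ratio that exposes the dilution effect the BAI is designed to capture. The numbers are placeholders loosely mimicking the phosphorus example, not data from the paper.

```python
def bioaccumulation_factor(c_organism: float, c_medium: float) -> float:
    """Conventional BAF: element concentration in the organism divided by
    the concentration in the surrounding medium (here, the feed)."""
    return c_organism / c_medium

# Placeholder concentrations (mg/kg): the feed is poor in the element, and the
# larvae end up with a *lower* concentration than they started with.
c_feed = 0.5
c_larvae_initial = 10.0
c_larvae_final = 7.0

baf = bioaccumulation_factor(c_larvae_final, c_feed)
dilution_ratio = c_larvae_final / c_larvae_initial

print(f"BAF = {baf:.2f}")                       # >> 1, suggests strong bioaccumulation
print(f"final/initial = {dilution_ratio:.2f}")  # < 1, concentration actually dropped
```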


Materials ◽  
2021 ◽  
Vol 14 (4) ◽  
pp. 747
Author(s):  
Mariusz Jaśniok ◽  
Jacek Kołodziej ◽  
Krzysztof Gromysz

This article describes a comparative analysis of tests on the bond strength of hot-dip galvanized and black steel to concrete with and without chlorides. The bond was evaluated with six research methods: strength tests, electrochemical measurements (potential, EIS, and LPR), optical tests, and 3D scanning. The tests were conducted over a long period of 18 months on 48 test elements reinforced with smooth ϕ8 mm rebars of steel grade S235JR+AR and ribbed ϕ8 mm and ϕ16 mm rebars of steel grade B500SP. The main strength tests on the reinforcement bond to concrete compared the forces needed to pull galvanized and black steel rebars out of the concrete. This comparative analysis was performed at 28, 180, and 540 days after the preparation of the elements. The electrochemical tests were performed to evaluate corrosion of the steel rebars in concrete, particularly in chloride-contaminated concrete. During the optical tests, the behaviour of the concrete elements while the rebar was being pulled out was observed using a system of digital cameras. The 3D scanning of the ribbed ϕ8 mm and ϕ16 mm rebars allowed detailed identification of their complex geometry, in particular for determining the polarization area used to evaluate the corrosion rate of the reinforcement in concrete. The test results indicated that the presence of a zinc coating on the rebars affected the anchorage parameters. In the case of ribbed rebars 16 mm in diameter, the maximum values of adhesive stress and bond stiffness were reduced over time when compared to black steel rebars. Moreover, it was noticed that the stiffness of rebar anchorage in chloride-contaminated concrete was considerably higher than in concrete without chlorides.
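
In pull-out tests of this kind, the bond (adhesive) stress is commonly reported as the pull-out force averaged over the lateral surface of the embedded bar. The snippet below is a minimal sketch of that conversion under the usual uniform-stress assumption; the force and embedment length are placeholder values, not results from the article.

```python
import math

def mean_bond_stress(pullout_force_N: float, bar_diameter_mm: float,
                     embedment_length_mm: float) -> float:
    """Average bond stress (MPa), assuming the force is distributed uniformly
    over the lateral surface of the embedded bar: tau = F / (pi * d * l)."""
    return pullout_force_N / (math.pi * bar_diameter_mm * embedment_length_mm)

# Placeholder example for a phi 8 mm bar with a 100 mm embedment length.
print(f"tau = {mean_bond_stress(12_000.0, 8.0, 100.0):.2f} MPa")
```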


Materials ◽  
2020 ◽  
Vol 13 (24) ◽  
pp. 5671
Author(s):  
Jacek Szpetulski ◽  
Bohdan Stawiski

During compaction of a concrete mix, when thin slabs are formed in a horizontal position, the components of the mix become segregated: heavy components fall to the bottom, and light components (air and water) move to the top. This process suggests that the upper layers of concrete elements formed in a horizontal position may have lower compressive strength than the remaining part of the element. The problem is recognized and documented in many publications, but one publication reported test results indicating no variability in the compressive strength of concrete across the thickness of the tested elements. These discrepancies in the evaluation of concrete homogeneity were the reason for conducting destructive tests of the compressive strength of concrete across the thickness of horizontally concreted test elements that imitate thin slabs. The obtained destructive compressive strength results confirmed previous findings regarding the heterogeneity of concrete. They clearly indicate a differentiation of compressive strength across the thickness of a thin element that remained in a liquefied state for some time during its formation: the longer this state persisted across the entire thickness of the formed element, the greater the difference in compressive strength between the top and bottom layers.


Author(s):  
Magdolna Pál ◽  
◽  
Bojan Banjanin ◽  
Sandra Dedijer ◽  
Gojko Vladić ◽  
...  

The embossing process in the graphic industry uses custom-made dies to create raised or lowered permanent relief patterns according to the design. It can be combined with other print finishing embellishments, such as foil stamping or pearlescent coating, but the simplest version of the process, blind embossing, is also a very effective technique for creating a distinguished and sophisticated look. For a long time, quality control of embossed features was performed only visually, but in recent years it has become an important target for graphic instrument manufacturers, with embossing depth regarded as one of the most important parameters of high-quality processing. This analysis aimed to investigate the applicability of a simple flatbed scanner and a developed image processing algorithm for embossing quality evaluation. The results of a detailed visual assessment of all scanned paper samples, together with the obtained values of average greyscale difference (shadow-based contrast), showed that single-level embossed samples can be realistically digitalized using a simple flatbed scanner as the image acquisition equipment. Additionally, the proposed image feature, the shadow-based contrast, generally increased with the applied compression force, suggesting that it has potential as an objective measure of the deformation rate in embossing quality evaluation. The results gathered for different combinations of the observed parameters (paper grade, basis weight, type of test elements on the embossing dies) at the optimal compression forces implied that the shadow-based contrast could be used as a reference parameter to ensure adequate visibility of embossed elements by defining a minimum required contrast value.
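
The abstract describes the shadow-based contrast only as an average greyscale difference derived from the scanned image; the snippet below is a minimal, hypothetical sketch of such a measure, comparing the mean grey level inside a marked shadow region with that of the surrounding flat paper. The region coordinates and file name are placeholder assumptions, not the authors' algorithm.

```python
import numpy as np
from PIL import Image

def shadow_based_contrast(image_path: str,
                          shadow_box: tuple, background_box: tuple) -> float:
    """Average greyscale difference between a shadow region cast by the
    embossed element and a flat background region (illustrative measure)."""
    grey = np.asarray(Image.open(image_path).convert("L"), dtype=float)
    sx0, sy0, sx1, sy1 = shadow_box
    bx0, by0, bx1, by1 = background_box
    # Positive values mean the shadow region is darker than the background.
    return grey[by0:by1, bx0:bx1].mean() - grey[sy0:sy1, sx0:sx1].mean()

# Placeholder usage: regions would be located around the embossed test element.
# contrast = shadow_based_contrast("scan.png", (120, 80, 180, 140), (10, 10, 70, 70))
```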


2020 ◽  
Vol 1 (1) ◽  
pp. 53-58
Author(s):  
V.V. Korchynskyi ◽  
◽  
V.I. Kildishev ◽  
A.M. Berdnikov ◽  
K.O. Smazhenko

Recently, much attention has been paid to research into the properties and formation methods of complex noise-like signals, which are used to increase the noise immunity of radio communication systems operating under conditions of electronic conflict. Such signals help to improve both noise immunity and the main indicators of transmission stealth (energy, structural, and information stealth). For the synthesis of noise-like signals, the use of timer signal constructions in combination with spectrum spreading based on linear frequency modulation is proposed. The advisability of using timer signals is justified by their properties, which increase noise immunity and transmission stealth. In contrast to positional codes, timer signals have a more complex signal construction. The initial parameters for constructing timer signals allow various sets of signal constructions to be formed, and these variational possibilities substantially increase the potential structural stealth of the transmission. In addition, noise-immune coding based on timer signals is implemented without additional test elements. The article proposes a method for synthesizing noise-like signals based on linear frequency modulation and timer signal constructions.
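
The abstract does not describe the timer signal constructions themselves, so the snippet below only sketches the linear frequency modulation (chirp) component used for spectrum spreading: a signal whose instantaneous frequency sweeps linearly from f0 to f1 over the symbol duration. The parameter values are illustrative.

```python
import numpy as np

def lfm_chirp(f0: float, f1: float, duration: float, fs: float) -> np.ndarray:
    """Linear frequency modulation (chirp): the instantaneous frequency
    sweeps linearly from f0 to f1 over `duration` seconds."""
    t = np.arange(0.0, duration, 1.0 / fs)
    k = (f1 - f0) / duration                      # sweep rate, Hz/s
    phase = 2.0 * np.pi * (f0 * t + 0.5 * k * t**2)
    return np.cos(phase)

# Illustrative parameters: a 1 ms chirp swept from 1 kHz to 5 kHz at fs = 48 kHz.
signal = lfm_chirp(1_000.0, 5_000.0, 1e-3, 48_000.0)
```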


Materials ◽  
2020 ◽  
Vol 13 (15) ◽  
pp. 3315
Author(s):  
Mariusz Jaśniok ◽  
Maria Sozańska ◽  
Jacek Kołodziej ◽  
Bartosz Chmiela

Corrosion-induced damage to concrete reinforced with bars is a serious problem in both technical and economic terms and strongly depends on the materials used, the corrosion environment, and the service life. The tests described in this paper concern a two-year evaluation of the effectiveness of the protection against chloride corrosion provided by zinc-coated low-carbon reinforcing steel of grade B500SP in concrete. The tests were comparative and included measurements conducted on four groups of concrete test elements with dimensions of 40 mm × 40 mm × 140 mm, each reinforced with a ϕ8 mm bar. The groups were combinations of concrete with or without chloride additives and galvanized or black steel. Chlorides in the form of CaCl2 were added to the concrete mix in the amount of 3% of the cement weight. The reinforced concrete specimens were periodically monitored over two years using the following techniques: linear polarization resistance (LPR) and electrochemical impedance spectroscopy (EIS). Polarization measurements were conducted in a three-electrode arrangement, in which the rebar in concrete served as the working electrode, a stainless steel sheet as the auxiliary electrode, and a Cl−/AgCl,Ag electrode as the reference. Comparative tests of changes in corrosion current density in concrete specimens without chloride additives demonstrated essentially no development of corrosion, and passivation was expected in the case of black steel. Higher corrosion current densities were observed for galvanized steel during the first days of testing, caused by the dissolution of zinc in contact with the initially high pH of the concrete pore solution. Measurements at six months demonstrated a higher corrosion current density in concrete specimens with a high chloride concentration, which unambiguously indicated corrosion of both galvanized and black steel reinforcement. Corrosion current densities determined for selected specimens decreased dramatically after an 18-month interval in the measurements; corrosion of black steel was even inhibited as an insulating barrier of corrosion products formed. These observations were confirmed by structural studies using scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS). The results obtained from the corrosion (LPR, EIS) and structural (SEM, EDS) tests on specimens of concrete reinforced with B500SP steel demonstrated a very favorable impact of the zinc coating, which provided two years of protection against corrosion in an environment with a very high chloride content.
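
In LPR measurements, corrosion current density is conventionally obtained from the measured polarization resistance via the Stern-Geary relation, i_corr = B / (R_p · A), where B is the Stern-Geary constant and A the polarized steel surface area. The snippet below is a minimal sketch of that standard conversion with placeholder values; the constant and areas used in this study are not given in the abstract.

```python
def corrosion_current_density(polarization_resistance_ohm: float,
                              polarized_area_cm2: float,
                              stern_geary_constant_V: float = 0.026) -> float:
    """Stern-Geary relation: i_corr = B / (R_p * A), returned in A/cm^2.
    B = 26 mV is a value commonly assumed for actively corroding steel."""
    return stern_geary_constant_V / (polarization_resistance_ohm * polarized_area_cm2)

# Placeholder example: R_p = 50 kOhm measured on a 20 cm^2 polarized bar surface.
i_corr = corrosion_current_density(50_000.0, 20.0)
print(f"i_corr = {i_corr * 1e6:.3f} uA/cm^2")
```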


2020 ◽  
Vol 5 ◽  
pp. 174
Author(s):  
Sarah H. Needs ◽  
Stephanie P. Bull ◽  
Josefina Bravo ◽  
Sue Walker ◽  
Gemma Little ◽  
...  

Both home sample collection and home testing using rapid point-of-care diagnostic devices can offer benefits over attending a clinic or hospital to be tested by a healthcare professional. Usability is critical to ensure that in-home sampling or testing by untrained users does not compromise analytical performance. Usability studies can be laborious and rely on participants attending a research location or a researcher visiting homes; neither has been appropriate during COVID-19 outbreak control restrictions. We therefore developed a remote research usability methodology using videolink observation of home users. This avoids infection risks from home visits and ensures the participant follows the test protocol in their home environment. In this feasibility study, volunteers were provided with models of home blood testing and home blood sampling kits, including a model lancet, sampling devices for dried blood spot collection, and a model lateral flow device. After refining the study protocol through an initial pilot (n = 7), we compared written instructions alone (n = 5) vs the addition of video instructions (n = 5) vs written and video instructions plus videolink supervision by the researcher (n = 5). All users were observed via video call to define which test elements could be assessed remotely. All 22 participants in the study accessed the video call and configured their videolink, allowing the researcher to clearly observe all testing tasks. The video call allowed the researcher to assess distinct errors during use, including quantitative errors (blood volume) and qualitative errors (inaccurate interpretation of results), many of which could compromise test accuracy. All participants completed the tasks and returned images of their completed tests (22/22), and most returned completed questionnaires (20/22). We suggest that this remote observation via videolink is a simple, rapid and powerful methodology to assess and optimise the usability of point-of-care testing methods in the home setting.


Author(s):  
Rod Downey ◽  
Noam Greenberg

This chapter examines presentations of left-c.e. reals, proving Theorem 1.4. One of the main ideas of this book is unifying the combinatorics of constructions in various subareas of computability theory. The chapter looks at one such subarea: algorithmic randomness. It provides a brief account of the basics of algorithmic randomness, and includes the basic definitions required in the chapter. While algorithmic randomness has a history going back to the early work of Borel on normal numbers, von Mises, and even Turing, the key concept in the modern incarnation of algorithmic information theory is Martin-Löf randomness. A notion of randomness is determined by a countable collection of null sets, with each null set considered a statistical test. Elements of the null sets are those which have failed the test; they are atypical, in the sense of measure. One of the reasons the notion of ML-randomness is central is that it is robust.
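
For reference, the standard formulation behind this description can be stated as follows (a well-known definition, not quoted from the chapter):

```latex
A \emph{Martin-L\"of test} is a uniformly c.e. sequence $(U_n)_{n \in \mathbb{N}}$
of open subsets of Cantor space $2^{\omega}$ with $\mu(U_n) \le 2^{-n}$ for all $n$,
so that $\bigcap_n U_n$ is an effectively presented null set.
A real $X \in 2^{\omega}$ \emph{fails} the test if $X \in \bigcap_n U_n$,
and $X$ is \emph{Martin-L\"of random} if it passes every such test.
```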

