suggested procedure — Recently Published Documents

Total documents: 90 (five years: 12) ◽  H-index: 11 (five years: 1)

Physics ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 998-1014
Author(s):  
Mikhail Tamm ◽  
Dmitry Koval ◽  
Vladimir Stadnichuk

Experimentally observed complex networks are often scale-free and small-world, and have an unexpectedly large number of small cycles. The Apollonian network is one notable example of a model network that has all three of these properties simultaneously. It is constructed by a deterministic procedure of consecutively splitting a triangle into smaller and smaller triangles. In this paper, a similar construction based on the consecutive splitting of tetragons and other polygons with an even number of edges is presented. The suggested procedure is stochastic and results in an ensemble of planar scale-free graphs. In the limit of a large number of splittings, the degree distribution of the graph converges to a true power law with an exponent that is smaller than three in the case of tetragons and larger than three for polygons with more edges. It is shown that tetragon-based and hexagon-based constructions can be stochastically mixed to obtain an ensemble of graphs with a tunable degree-distribution exponent. Other possible planar generalizations of the Apollonian procedure are also briefly discussed.
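A recursive splitting of this kind can be sketched in a few lines. The rule below — insert one new vertex inside a randomly chosen tetragonal face and join it to two opposite corners, replacing the face with two smaller tetragons — is a plausible reading of such a stochastic construction, not necessarily the paper's exact rule:

```python
import random
from collections import Counter

def split_tetragons(n_splits, seed=0):
    """Grow a planar graph by repeatedly splitting tetragonal faces.

    Hypothetical splitting rule: pick a random face, pick a random pair of
    opposite corners, insert a new vertex joined to both. Each split turns
    one tetragon into two, keeping the graph planar.
    """
    rng = random.Random(seed)
    degree = Counter({0: 2, 1: 2, 2: 2, 3: 2})  # initial 4-cycle
    faces = [(0, 1, 2, 3)]                      # faces as ordered 4-tuples
    next_vertex = 4
    for _ in range(n_splits):
        a, b, c, d = faces.pop(rng.randrange(len(faces)))
        if rng.random() < 0.5:                  # choose which opposite pair
            a, b, c, d = b, c, d, a
        e = next_vertex
        next_vertex += 1
        degree[e] = 2                           # e joins corners a and c
        degree[a] += 1
        degree[c] += 1
        faces.append((a, b, c, e))              # the two new tetragons
        faces.append((c, d, a, e))
    return degree, faces

degrees, faces = split_tetragons(2000)
```

Each split adds one vertex, two edges, and one face; the degree sequence that accumulates in `degrees` can then be binned to inspect the tail of the distribution.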


2021 ◽  
Vol 2063 (1) ◽  
pp. 012008
Author(s):  
S A Zakaria ◽  
R A Zakaria ◽  
N S Othman

Abstract: A selective and sensitive spectrophotometric method is suggested for the quantitative assay of atenolol (ATNL), both as a pure substance and in its manufactured formulation (tablets). The suggested procedure involves oxidation of ATNL with an excess of the oxidant N-bromosuccinimide (NBS); the excess NBS is then consumed in bleaching the color of methyl red dye (MRD), and the absorbance of the remaining MRD is measured at 518 nm. The absorbance of the unbleached MRD corresponds to the ATNL concentration in the sample solution. Beer's law was obeyed in the range 0.1–2.0 μg·ml⁻¹, with a molar absorptivity of 8.8864×10⁴ l·mol⁻¹·cm⁻¹. The suggested method was applied to the assay of ATNL in commercial tablets, with satisfactory results.
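The Beer's-law arithmetic behind such a calibration is simple to sketch. Note the assay above is indirect (the measured absorbance of remaining MRD is related to ATNL via the NBS bleaching step); the sketch below only illustrates the direct conversion from absorbance to concentration, and the atenolol molar mass (~266.34 g/mol) is an assumed literature value, not stated in the abstract:

```python
def atenolol_conc_ug_per_ml(absorbance, path_cm=1.0,
                            molar_absorptivity=8.8864e4,
                            molar_mass=266.34):
    """Beer's law: A = eps * c * l, so c (mol/L) = A / (eps * l).

    molar_absorptivity in l.mol^-1.cm^-1 (value from the abstract);
    molar_mass in g/mol (assumed value for atenolol).
    """
    c_mol_per_l = absorbance / (molar_absorptivity * path_cm)
    # mol/L -> g/L, and 1 g/L = 1000 ug/mL
    return c_mol_per_l * molar_mass * 1000.0
```

With the quoted molar absorptivity, an absorbance near 0.33 corresponds to roughly 1 μg/mL, consistent with the reported linear range.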


2021 ◽  
pp. 001316442110289
Author(s):  
Sooyong Lee ◽  
Suhwa Han ◽  
Seung W. Choi

Response data containing an excessive number of zeros are referred to as zero-inflated data. When differential item functioning (DIF) detection is of interest, zero-inflation can attenuate DIF effects in the total sample and lead to underdetection of DIF items. The current study presents a DIF detection procedure for response data with excess zeros due to the existence of unobserved heterogeneous subgroups. The suggested procedure utilizes factor mixture modeling (FMM) with a MIMIC (multiple-indicator, multiple-cause) structure to address the compromised DIF detection power via the estimation of latent classes. A Monte Carlo simulation was conducted to evaluate the suggested procedure against the well-known likelihood ratio (LR) DIF test. The simulation results indicated the superiority of FMM over the LR DIF test in terms of detection power and illustrated the importance of accounting for latent heterogeneity in zero-inflated data. The empirical data analysis further supported the use of FMM by flagging additional DIF items over and above the LR test.
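The attenuation mechanism can be made concrete with a toy calculation. Assuming a latent "structural-zero" class that scores 0 regardless of ability, the marginal group difference on an item shrinks in proportion to that class's share — the numbers below are hypothetical, chosen only to illustrate the effect:

```python
def observed_item_mean(p_correct, zero_class_prop):
    """Marginal expected item score when a latent structural-zero class
    (proportion zero_class_prop) responds 0 regardless of ability."""
    return (1.0 - zero_class_prop) * p_correct

# Hypothetical numbers: among responders, the reference group succeeds
# with probability 0.6 and the focal group with 0.4 (a true DIF of 0.2);
# half of each group belongs to the structural-zero class.
true_dif = 0.6 - 0.4
observed_dif = (observed_item_mean(0.6, 0.5)
                - observed_item_mean(0.4, 0.5))
# The marginal difference is halved, which is what erodes the power of a
# DIF test applied to the total sample without modeling the latent classes.
```

This is why estimating the latent classes first, as in the FMM-MIMIC approach, restores the effect size that the total-sample analysis dilutes.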


Author(s):  
E. N. Guseva

Innovative management today means promoting and supporting an organization's strategic goals through the rational use of material, labor, and financial resources. The main goal of innovative management in libraries is to build a management system that enables a focused search for options and the development and implementation of novelties, increasing the library's competitiveness and sustainability and, in turn, the sustainability of its environment. The specific features of innovations in the library and information sphere are as follows: radical innovations are comparatively rare while modernizing innovations are frequent; they are consistent and systemic (a change in any element inevitably changes the library system as a whole); and they are irreversible. To assess the value of library innovations, the author suggests applying the following criteria (characteristics), correlated with world experience: innovative relevance, financial effectiveness, cultural efficiency, social significance, and global sustainability. Each criterion has its own weight, and the expert review is carried out within the suggested procedure. This methodology enables a well-grounded assessment of a library. It was tested three times by expert teams participating in the All-Russia Contest of Library Innovations, initiated by the Russian State Library in 2013, 2015, and 2019.
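A weighted multi-criteria score of this kind reduces to a small calculation. The five criteria are those named in the abstract; the weight values below are purely illustrative, since the article states that each criterion has a weight but does not publish the values:

```python
def innovation_score(scores, weights):
    """Weighted average of expert scores over the assessment criteria.

    scores and weights are dicts keyed by criterion name; weights are
    normalized by their sum, so they need not add up to 1.
    """
    assert set(scores) == set(weights), "every criterion needs a weight"
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Illustrative weights (not from the article):
weights = {
    "innovative relevance": 0.25,
    "financial effectiveness": 0.20,
    "cultural efficiency": 0.20,
    "social significance": 0.20,
    "global sustainability": 0.15,
}
scores = {k: 4 for k in weights}  # an expert rating every criterion 4 of 5
overall = innovation_score(scores, weights)
```

Normalizing by the weight sum keeps the overall score on the same scale as the individual expert ratings.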


2021 ◽  
pp. 17-32
Author(s):  
V.F. Obesnyuk ◽  

The present work describes a procedure for assessing intensive and cumulative parameters of specific risk when observing cohorts under combined exposure to several external or internal factors. The research goal was to show how the well-known heuristic-descriptive parameters accepted in the epidemiology of long-term effects can be used to analyze the dynamics of countable events in a cohort, with the analysis performed on fairly strict statistical-probabilistic grounds based on a Bayesian treatment of the conditional probabilities that such countable events occur. The work does not introduce any new or previously unknown epidemiologic concepts or parameters; nevertheless, it is not a simple literature review. It is the suggested procedure itself that is comparatively new, as it combines techniques used to process conventional epidemiologic information with a correct metrological approach based on process description. The basic result is the understanding that all basic descriptive epidemiologic parameters within the cohort-description framework turn out to be quantitatively interlinked when they are considered as conditional group processes. This allows simultaneous, mutually consistent assessment of annual risk parameters, the Kaplan-Meier (Fleming-Harrington) and Nelson-Aalen cumulative parameters, and other conditional risk parameters or their analogues. It is shown that when a basic descriptive characteristic of the cumulative parameters is chosen as a measure of measurable long-term external exposure, it is natural to apply the concept of a dose of this risk factor, which is surrogate in its essence. The operability of the procedure was confirmed with an example. The suggested procedure was shown to differ from its prototype, which previously yielded only substantially shifted estimates, up to ~100% even under normal operating conditions. Applying the procedure requires specific, but readily available, PC software.
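The two cumulative estimators named above are standard and easy to compute side by side from right-censored cohort data. This sketch implements the textbook Kaplan-Meier and Nelson-Aalen formulas, not the article's combined Bayesian procedure:

```python
def km_and_na(times, events):
    """Kaplan-Meier survival and Nelson-Aalen cumulative hazard.

    times:  follow-up times for each cohort member.
    events: 1 if the countable event occurred at that time, 0 if the
            member was censored (left observation event-free).
    Returns a list of (time, KM survival, NA cumulative hazard) at each
    distinct event time.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival, cum_hazard = 1.0, 0.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]   # events at time t
            removed += 1            # events plus censorings at time t
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk   # KM product step
            cum_hazard += deaths / n_at_risk       # NA sum step
            curve.append((t, survival, cum_hazard))
        n_at_risk -= removed
    return curve

# Five members: events at t=1, 2, 3; censorings at t=2 and t=4.
curve = km_and_na([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

For small risk increments the two estimators nearly agree, since exp(-h) ≈ 1 - h; their interlinking is exactly the kind of quantitative relationship the procedure exploits.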




Nanomaterials ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 543
Author(s):  
Tom Lindström

This short investigation reviews the tensile strength properties of six different types of nanocellulose films (carboxymethylated, carboxymethylcellulose-grafted, enzymatically pretreated, phosphorylated, sulfoethylated, and alkoxylated) manufactured using identical protocols, together with the apparent nanocellulose yield of the same nanocelluloses and their tensile strength properties at different extents of delamination (microfluidization). The purpose was to test a previously suggested procedure for estimating the maximum tensile strength on these different materials. A second goal was to investigate the impact of the nanocellulose yield on the tensile strength properties. The investigation was limited to the nanocellulose research activities at RISE in Stockholm, because those studies were carried out with identical experimental laboratory protocols; the importance of such protocols is also stressed. This review shows that the suggested procedure for estimating the maximum tensile strength is a viable proposition, albeit not scientifically proven. Secondly, there is a relationship between the nanocellulose yield and the tensile strength properties, although it may not be linear.


2020 ◽  
Author(s):  
Ghislain B D Aihounton ◽  
Arne Henningsen

Summary: The inverse hyperbolic sine (IHS) transformation is frequently applied in econometric studies to transform right-skewed variables that include zero or negative values. We show that regression results can depend heavily on the units of measurement of IHS-transformed variables. Hence, arbitrary choices regarding the units of measurement for these variables can have a considerable effect on recommendations for policies or business decisions. To address this problem, we suggest a procedure for choosing the units of measurement for IHS-transformed variables. A Monte Carlo simulation assesses this procedure under various scenarios, and an empirical illustration shows the relevance and applicability of the suggested procedure.
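The unit dependence is easy to see from the transformation itself. Far from zero, rescaling a variable only shifts its IHS value by a constant (asinh(kx) ≈ asinh(x) + ln k), so slopes are unaffected; near zero this approximation fails, so the estimated slope changes with the unit chosen. A minimal numeric check:

```python
import math

def ihs(x):
    """Inverse hyperbolic sine: log(x + sqrt(x^2 + 1)).

    Unlike log(x), it is defined at zero (ihs(0) = 0) and for negative
    values, which is why it is used for zero-inflated skewed variables.
    """
    return math.log(x + math.sqrt(x * x + 1.0))

# Rescale a value by a factor of 1000 (e.g. dollars vs. thousands of
# dollars). For a large value the transform shifts by nearly ln(1000),
# so slopes are preserved; for a value near zero it does not.
shift_large = ihs(50_000.0) - ihs(50.0)   # close to ln(1000) ~ 6.9078
shift_small = ihs(10.0) - ihs(0.01)       # nowhere near ln(1000)
```

Because observations near zero break the constant-shift property, coefficients on IHS-transformed regressors are not invariant to the unit of measurement, which is the problem the authors' unit-selection procedure targets.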

