fixed sample
Recently Published Documents

Total documents: 176 (last five years: 32)
H-index: 21 (last five years: 2)

2021 · Vol 11 (1) · pp. 1
Author(s): Oluwole A Nuga, Abba Zakirai Abdulhamid, Shobanke Emmanuel Omobola Kayode

This study examines design preference in Completely Randomized (CR) split-plot experiments involving a random whole-plot factor effect and a fixed sub-plot factor effect. Most previous work on optimally designing split-plot experiments assumed only factors with fixed levels; the case where interest lies in random factors has received little attention. Such problems resemble the optimal design of experiments for the fixed parameters of non-linear models, because the solution relies on the unknown parameters. A Design Space (DS) containing an exhaustive list of balanced designs for a fixed sample size was searched for optimality using the product of the determinants of the derived information matrices of the Maximum Likelihood (ML) estimators corresponding to the random and fixed effects in the model. Different magnitudes of variance-component configurations, in which the variances of the factor effects are larger than the variance of the error term, were used empirically for the comparisons. The results reveal that the D-optimal designs are those with more whole-plot factor levels than replicates within each level of the whole plot.
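To make the criterion concrete, here is a minimal sketch under a simplified stand-in model: a balanced one-way random-effects layout with a whole-plot levels and n replicates per level, total sample size fixed at a*n. The Fisher information matrix for the two variance components in this model is standard; the design list, the variance values, and the use of a single determinant (rather than the authors' exact product of determinants for random and fixed effects) are illustrative assumptions.

```python
import numpy as np

def info_det(a, n, var_a, var_e):
    """Determinant of the Fisher information matrix for the ML estimators
    of (sigma_a^2, sigma_e^2) in a balanced one-way random-effects model
    with a groups (whole-plot levels) and n replicates per group."""
    lam1 = var_e + n * var_a   # eigenvalue of V on the group-mean space
    lam2 = var_e               # eigenvalue of V on the within-group space
    I = 0.5 * np.array([
        [a * n**2 / lam1**2, a * n / lam1**2],
        [a * n / lam1**2,    a / lam1**2 + a * (n - 1) / lam2**2],
    ])
    return np.linalg.det(I)

N = 24  # fixed total sample size
designs = [(a, N // a) for a in range(2, N) if N % a == 0 and N // a >= 2]
for var_a in (2.0, 4.0, 8.0):  # whole-plot variance larger than error variance
    best = max(designs, key=lambda d: info_det(*d, var_a, 1.0))
    print(f"sigma_a^2 = {var_a}: D-optimal (levels, replicates) = {best}")
```

Under these assumptions the determinant works out to a^2 n^2 (n-1) / (4 lam1^2 lam2^2), so for a fixed total sample size the criterion favors few replicates and many whole-plot levels once the whole-plot variance dominates, in line with the finding reported above.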


Author(s): Umberto Amato, Anestis Antoniadis, Italia De Feis, Irène Gijbels

Nonparametric univariate regression via wavelets is usually implemented under the assumptions of a dyadic sample size, equally spaced fixed sample points, and i.i.d. normal errors. In this work, we propose, study, and compare some wavelet-based nonparametric estimation methods designed to recover a one-dimensional regression function for data that do not necessarily satisfy the above requirements. These methods use appropriate regularizations, penalizing the decomposition of the unknown regression function on a wavelet basis of functions evaluated on the sampling design. Exploiting the sparsity of wavelet decompositions for signals belonging to homogeneous Besov spaces, we use efficient proximal gradient descent algorithms from the recent literature to compute the estimates with fast computation times. Our wavelet-based procedures, in both the standard and the robust regression case, have favorable theoretical properties, thanks in large part to the separable nature of the (nonconvex) regularization they are based on. We establish asymptotic global optimal rates of convergence under weak conditions; such rates are, in general, unattainable by smoothing splines or other linear nonparametric smoothers. Lastly, we present several experiments to examine the empirical performance of our procedures and compare them with other proposals in the literature. A regression analysis of several real data applications using these procedures demonstrates their effectiveness.
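As a point of reference for the classical case this paper generalizes (dyadic sample size, equispaced design, i.i.d. normal errors), here is a minimal sketch using PyWavelets: for an orthonormal wavelet transform, the l1-penalized least-squares problem is solved exactly by soft-thresholding the empirical coefficients, which is also the proximal step that more general penalized estimators iterate. The wavelet choice, threshold rule, and test signal are assumptions, not the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_l1_estimate(y, wavelet="db4", lam=None):
    """l1-penalized wavelet regression on an equispaced dyadic grid.

    For an orthonormal transform W, min_c ||y - W^T c||^2 + 2*lam*||c||_1
    is solved exactly by soft-thresholding the empirical coefficients."""
    coeffs = pywt.wavedec(y, wavelet)
    if lam is None:
        # universal threshold with a robust noise estimate
        # (median absolute deviation of the finest-scale details)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        lam = sigma * np.sqrt(2 * np.log(len(y)))
    thresh = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(thresh, wavelet)

# demo on a dyadic, equispaced design with i.i.d. normal errors
rng = np.random.default_rng(0)
n = 512
x = np.linspace(0, 1, n)
f = np.sin(4 * np.pi * x) * (x > 0.3)   # a non-smooth test signal
y = f + 0.3 * rng.standard_normal(n)
fhat = wavelet_l1_estimate(y)
print("RMSE:", np.sqrt(np.mean((fhat - f) ** 2)))
```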


Author(s): Junyu Zhang, Lin Xiao, Shuzhong Zhang

The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because of its capability of finding an approximate local solution with a second-order guarantee and its low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. It has been shown that the total Hessian sample complexity can be reduced to be sublinear in N per iteration by leveraging stochastic variance-reduction techniques. We present an adaptive variance-reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. The techniques of our analysis differ from previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and utilize new bounds on the third- and fourth-order moments of the average of random matrices, which are of independent interest.
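The following sketch illustrates the basic loop under discussion: exact gradients with subsampled Hessians of a fixed sample size, and a cubic-regularized model minimized approximately at each step. All function names, the gradient-descent subproblem solver, and the constants are illustrative assumptions, not the authors' algorithm or its adaptive variance-reduction scheme.

```python
import numpy as np

def cubic_newton_step(g, H, M, iters=200, lr=0.01):
    """Approximately minimize the cubic model
        m(s) = g^T s + 0.5 s^T H s + (M/6) ||s||^3
    by gradient descent; its gradient is g + H s + (M/2) ||s|| s."""
    s = np.zeros_like(g)
    for _ in range(iters):
        s -= lr * (g + H @ s + 0.5 * M * np.linalg.norm(s) * s)
    return s

def subsampled_cubic_newton(grad_fns, hess_fns, x0,
                            M=10.0, batch=32, steps=50, seed=0):
    """Minimize (1/N) sum f_i with exact gradients but Hessians
    subsampled at a fixed sample size per iteration."""
    rng = np.random.default_rng(seed)
    N = len(grad_fns)
    x = x0.copy()
    for _ in range(steps):
        g = sum(gf(x) for gf in grad_fns) / N            # exact gradient
        idx = rng.choice(N, size=batch, replace=False)   # fixed Hessian sample
        H = sum(hess_fns[i](x) for i in idx) / batch     # subsampled Hessian
        x += cubic_newton_step(g, H, M)
    return x

# demo: average of N random convex quadratics f_i(x) = 0.5 x^T A_i x - b_i^T x
rng = np.random.default_rng(1)
N, d = 100, 5
As = [np.diag(rng.uniform(0.5, 2.0, d)) for _ in range(N)]
bs = [rng.standard_normal(d) for _ in range(N)]
grad_fns = [lambda x, A=A, b=b: A @ x - b for A, b in zip(As, bs)]
hess_fns = [lambda x, A=A: A for A in As]
x_hat = subsampled_cubic_newton(grad_fns, hess_fns, np.zeros(d))
x_exact = np.linalg.solve(sum(As) / N, sum(bs) / N)
print("distance to optimum:", np.linalg.norm(x_hat - x_exact))
```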


2021
Author(s): Kit Neikirk, Zer Vue, Prasanna Katti, Jianqiang Shao, Trace A. Christensen, ...

Autophagosomes and lysosomes work in tandem to conduct autophagy, an intracellular degradation system that is crucial for cellular homeostasis. Altered autophagy contributes to the pathophysiology of various diseases, including cancers and metabolic diseases. Although many studies have investigated autophagy to elucidate disease pathogenesis, specific identification of the various components of the cellular degradation machinery remains difficult. The goal of this paper is to describe an approach to reproducibly identify and distinguish subcellular structures involved in autophagy. We provide methods that avoid common pitfalls, including a detailed explanation of how to distinguish lysosomes from lipid droplets, and we discuss the differences between autophagosomes and inclusion bodies. These methods are based on transmission electron microscopy (TEM), which is capable of generating nanometer-scale micrographs of cellular degradation components in a fixed sample. In addition to TEM, we discuss other imaging techniques, such as immunofluorescence and immunogold labeling, which can be utilized for the reliable and accurate classification of cellular organelles. Our results show how these methods may be employed to accurately quantify the cellular degradation machinery under various conditions, such as treatment with the endoplasmic reticulum stressor thapsigargin or the ablation of dynamin-related protein 1.


2021 · Vol 9 (5) · pp. 15-22
Author(s): Moshe Sharabi, Ola Abu-Hasan Nabwani, Tal Shahor, Javier Simonovich

Purpose: The purpose of this study is to examine changes in the work centrality of individuals who experienced meaningful adverse occupational events (dismissal from the workplace, prolonged unemployment, and retirement), compared to employees who did not experience such events over 12 years. Methodology: Implementing a fixed-sample panel (longitudinal) design, 411 individuals were located 12 years after first completing the Meaning of Work questionnaire and were surveyed again. The respondents were asked about life and work events they had experienced between the two waves. The data were analysed by univariate and multivariate analysis of variance. Main Findings: The work centrality of individuals who experienced prolonged unemployment did not change, while it increased among those who did not experience these events. Experiencing dismissal from work increased work centrality. Unexpectedly, work centrality continued to increase among individuals after retirement. Applications: Several suggestions are offered for social and welfare policymakers regarding adverse occupational events and how policy may moderate the impact of these events on work centrality. Novelty/Originality: This is a unique longitudinal study, spanning twelve years, comparing the change in work centrality among individuals who did and did not experience adverse occupational events.


Biomolecules · 2021 · Vol 11 (5) · pp. 711
Author(s): Yuan Qin, Wenting Jiang, Anqi Li, Meng Gao, Hanyu Liu, ...

Mitochondria are highly dynamic organelles, constantly undergoing shape changes controlled by mitochondrial movement, fusion, and fission. Mitochondria play a pivotal role in various cellular processes under physiological and pathological conditions, including metabolism, superoxide generation, calcium homeostasis, and apoptosis. Abnormal mitochondrial morphology and mitochondrial protein expression are closely related to the health status of cells. Analysis of mitochondrial morphology and mitochondrial protein expression in situ is widely used to reflect abnormalities of cell function in chemically fixed samples. Paraformaldehyde (PFA), the most commonly used fixative in cellular immunostaining, has disadvantages, including loss of antigenicity and disruption of morphology during fixation. We tested the effect of ethanol (EtOH), PFA, and glutaraldehyde (GA) fixation on cellular mitochondria. The results showed that a combination of 3% PFA and 1.5% GA (PFA-GA) preserved mitochondrial morphology in situ better than either fixative alone. The mitochondrial network and protein antigenicity were well maintained, as indicated by preserved MitoTracker staining and mitochondrial immunostaining after PFA-GA fixation. Our results suggest that the PFA-GA combination is a valuable fixative for the study of mitochondria in situ.


2021 · Vol 73 (1) · pp. 62-67
Author(s): Ibrahim A. Ahmad, A. R. Mugdadi

For a sequence of independent, identically distributed random variables (iid rvs) [Formula: see text] and a sequence of integer-valued random variables [Formula: see text], define the random quantiles as [Formula: see text], where [Formula: see text] denotes the largest integer less than or equal to [Formula: see text], and [Formula: see text] denotes the [Formula: see text]th order statistic in a sample [Formula: see text] and [Formula: see text]. In this note, the limiting distribution and its exact-order approximation are obtained for [Formula: see text]. The limiting distribution result extends the work of several authors, including Wretman [Formula: see text]. The exact order of the normal approximation generalizes the fixed-sample-size results of Reiss [Formula: see text]. AMS 2000 subject classification: 60F12; 60F05; 62G30.
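The [Formula: see text] placeholders elide the precise statements, but the basic phenomenon can be checked by simulation: a central order statistic computed at a random sample size remains asymptotically normal under the classical scaling. A minimal Monte Carlo sketch, assuming a Poisson random index and standard normal observations (both illustrative choices, not the note's conditions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p, n, reps = 0.5, 2000, 5000
z = np.empty(reps)
for r in range(reps):
    N = rng.poisson(n)                    # random sample size with N/n -> 1
    x = rng.standard_normal(N)
    q = np.sort(x)[int(np.floor(p * N))]  # order statistic at rank [pN]
    # classical scaling: sqrt(n) (q - F^{-1}(p)) f(F^{-1}(p)) / sqrt(p(1-p))
    f_at_xi = stats.norm.pdf(stats.norm.ppf(p))
    z[r] = np.sqrt(n) * (q - stats.norm.ppf(p)) * f_at_xi / np.sqrt(p * (1 - p))
print("standardized mean and sd (should be near 0 and 1):",
      z.mean().round(3), z.std().round(3))
```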


Author(s): K.G. Kim, S. Toepfer

First-event sampling models for monitoring the diamondback moth Plutella xylostella (Lepidoptera: Plutellidae) and the small white butterfly Pieris rapae (Pieridae) are used in integrated production systems of cabbage. The decision-making accuracy and reduced labour needs of those models, compared to fixed-sample monitoring, were unknown. We addressed this through computer simulations of the first-event sampling plan currently most used for cabbage in DPR Korea. This sampling plan, applied in five subplots of a cabbage field with a sampling limit of at most 10 plants each, was indeed less labour-intensive than many fixed-sample monitoring plans. However, it achieved only medium accuracy in infestation estimates and in correct decision-making for or against pest control, particularly at high pest densities. If such medium accuracy is acceptable, the current sampling plan could be reduced from five to three subplots with a sampling limit of 10 plants each, or to a maximum of five assessed plants in each of five subplots, that is, without further loss of accuracy and while saving labour. Such sampling requires little investment in time and might therefore be applied and validated across more cabbage production systems of East Asia. Ultimately, first-event sampling, like other sampling plans, will remain a compromise between accuracy and practicability.
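A minimal Monte Carlo sketch of the kind of comparison described, assuming a simple Bernoulli infestation model, a treat-on-any-event decision rule, and the five-subplot/10-plant limits quoted above; all of these are illustrative assumptions, not the paper's simulation model:

```python
import numpy as np

def first_event_plan(infest_p, subplots=5, limit=10, rng=None):
    """First-event sampling: in each subplot, inspect plants until the
    first infested plant is found or the limit is reached; decide 'treat'
    if an event occurred in any subplot (illustrative decision rule)."""
    rng = rng or np.random.default_rng()
    inspected, events = 0, 0
    for _ in range(subplots):
        hits = rng.random(limit) < infest_p
        first = np.argmax(hits) if hits.any() else limit
        inspected += min(first + 1, limit)
        events += int(hits.any())
    return events > 0, inspected

def fixed_sample_plan(infest_p, n_plants=50, threshold=0.1, rng=None):
    """Fixed-sample monitoring: inspect all n_plants, treat if the
    observed infestation rate exceeds the threshold."""
    rng = rng or np.random.default_rng()
    infested = (rng.random(n_plants) < infest_p).sum()
    return infested / n_plants > threshold, n_plants

rng = np.random.default_rng(0)
for p in (0.02, 0.1, 0.3):
    runs = [first_event_plan(p, rng=rng) for _ in range(2000)]
    treat = np.mean([d for d, _ in runs])
    labour = np.mean([n for _, n in runs])
    print(f"density {p}: first-event treat rate {treat:.2f}, "
          f"mean plants inspected {labour:.1f} (fixed plan: 50)")
```

The labour saving shows up directly in the mean number of plants inspected, which falls well below the fixed-sample count at higher pest densities.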


Author(s): Nikolai Hofmann, Jon Hasselgren, Petrik Clarberg, Jacob Munkberg

We combine state-of-the-art techniques into a system for high-quality, interactive rendering of participating media. We leverage unbiased volume path tracing with multiple scattering, temporally stable neural denoising, and NanoVDB [Museth 2021], a fast, sparse voxel-tree data structure for the GPU, to explore what performance and image quality can be obtained when rendering volumetric data. Additionally, we integrate neural adaptive sampling to significantly improve image quality at a fixed sample budget. Our system runs at interactive rates at 1920 × 1080 on a single GPU and produces high-quality results for complex dynamic volumes.
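The paper's sampler is neural, but the underlying idea of adaptive sampling at a fixed budget can be sketched with a classical variance-proportional allocation; the allocation rule, the variance map, and the budget below are illustrative assumptions, not the system's implementation.

```python
import numpy as np

def adaptive_allocate(var_est, budget):
    """Distribute a fixed per-frame sample budget across pixels in
    proportion to each pixel's estimated variance (a classical,
    non-neural stand-in for a learned sample-count predictor)."""
    w = var_est / var_est.sum()
    spp = np.maximum(1, np.floor(w * budget)).astype(int)  # >= 1 sample/pixel
    while spp.sum() > budget:      # trim overshoot from the hottest pixels
        spp[np.argmax(spp)] -= 1
    return spp

# demo: a high-variance region (e.g. a volume boundary) draws more samples
rng = np.random.default_rng(0)
var_map = rng.uniform(0.01, 0.05, (4, 4))
var_map[1:3, 1:3] = 1.0            # assumed hot region in the variance map
spp = adaptive_allocate(var_map.ravel(), budget=64).reshape(4, 4)
print(spp, "total samples:", spp.sum())
```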

