Novel Approach for Reducing Transformer Inrush Currents: Laboratory Measurements, Analytical Interpretation and Simulation Studies

2010 ◽  
Vol 25 (4) ◽  
pp. 2609-2616 ◽  
Author(s):  
Nicola Chiesa ◽  
Hans Kristian Høidalen


Author(s):
Amir M. Aboutaleb ◽  
Linkan Bian ◽  
Prahalad K. Rao ◽  
Mark A. Tschopp

Despite recent advances in improving the mechanical properties of parts fabricated by Additive Manufacturing (AM) systems, optimizing the geometric accuracy of AM parts remains a major challenge for pushing this cutting-edge technology into the mainstream. This work proposes a novel approach for improving the geometric accuracy of AM parts in a systematic and efficient manner. Initial experimental data show that different part geometric features are not necessarily positively correlated; hence, it may not be possible to optimize them simultaneously. The proposed methodology formulates geometric accuracy optimization as a multi-objective optimization problem, targeting the minimization of deviations in a part's major Geometric Dimensioning and Tolerancing (GD&T) features (i.e., flatness, circularity, cylindricity, concentricity, and thickness). First, principal component analysis (PCA) is applied to extract the key components of the parts' multi-geometric features. Then, experiments are sequentially designed in an accelerated, integrated framework to find sets of process parameters that yield acceptable levels of deviation in the principal components of those features. The efficiency of the proposed method is validated through simulation studies coupled with a real-world case study on the geometric accuracy optimization of parts fabricated by a fused filament fabrication (FFF) system. The results show that optimal designs are achieved with fewer experiments than existing methods require.
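The PCA step above can be sketched as follows; the deviation values and part count are illustrative placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical deviation measurements for 8 printed parts across the five
# GD&T features named in the abstract (flatness, circularity, cylindricity,
# concentricity, thickness); values are illustrative only.
rng = np.random.default_rng(0)
deviations = rng.normal(loc=0.05, scale=0.02, size=(8, 5))

# Standardize (the features are on different scales), then take
# principal components via SVD.
centered = (deviations - deviations.mean(axis=0)) / deviations.std(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Variance explained by each component; the sequential experiments would
# then optimize the leading components rather than the raw, possibly
# conflicting, geometric features.
explained = S**2 / np.sum(S**2)
scores = centered @ Vt.T[:, :2]   # project parts onto the first two PCs
print(scores.shape)               # (8, 2)
```

Working in the reduced component space is what lets the method treat the correlated, partially conflicting GD&T features as a small number of composite objectives.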


Author(s):  
El Sayed M. Tag Eldin

The role of a power transformer protective relay is to operate rapidly during internal faults and to block tripping during magnetizing inrush. This paper presents a new approach for classifying transient phenomena in power transformers, which may be implemented in digital relaying for transformer differential protection. Discrimination between internal faults, external faults with current transformer saturation, and magnetizing inrush currents is achieved by combining wavelet transforms and fuzzy logic. The wavelet transform is applied to the analysis of power transformer transients because of its ability to extract information from transient signals in both the time and frequency domains. Fuzzy logic is used because of the uncertainty in the differential current signals and relay settings. The MATLAB power system toolbox is used to generate current signals on both sides of a power transformer in a typical system under various conditions. The simulation results show that the new algorithm provides high operating sensitivity for internal faults while remaining stable for external faults and inrush currents.
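The time-frequency localization that the relay logic exploits can be illustrated with a minimal one-level Haar decomposition; the sampling rate, signal, and step disturbance below are assumptions for the sketch, not the paper's test cases:

```python
import numpy as np

# Minimal sketch (not the paper's implementation): a one-level Haar
# wavelet decomposition of a differential current signal, showing how
# detail coefficients localize a transient in time.
fs = 2000                                   # assumed sampling rate, Hz
t = np.arange(0, 0.1, 1 / fs)
current = np.sin(2 * np.pi * 50 * t)        # 50 Hz steady component
current[101:] += 0.8                        # step change mimicking a fault onset

# One-level Haar DWT: pairwise averages (approximation) and
# pairwise differences (detail).
approx = (current[0::2] + current[1::2]) / np.sqrt(2)
detail = (current[0::2] - current[1::2]) / np.sqrt(2)

# The largest detail coefficient coincides with the disturbance instant;
# smooth portions of the waveform produce only small differences.
onset_index = int(np.argmax(np.abs(detail)))
print(onset_index)  # 50, i.e. the coefficient pair containing sample 101
```

In the actual scheme, features of such wavelet coefficients feed the fuzzy-logic stage, which accommodates the uncertainty in the differential current and relay settings.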


2005 ◽  
Vol 05 (01) ◽  
pp. 191-207
Author(s):  
LITAO GANG ◽  
ALI N. AKANSU

In this paper, we investigate the general problem of data hiding and propose an approach for effective cover-noise interference rejection in oblivious applications. We first evaluate the performance of the commonly used direct-sequence modulation approach, in which a low-power signal is embedded into the original cover signal. The optimal detector is derived and its performance analyzed. Second, we study a novel approach to oblivious data hiding and evaluate its performance against existing algorithms. Both simulation studies and empirical data hiding results validate its efficiency in oblivious multimedia applications.
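The baseline direct-sequence scheme can be sketched as below; the sequence length, embedding strength, and Gaussian cover model are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

# Toy sketch of direct-sequence embedding: a low-power pseudorandom
# sequence carries one bit, and a blind (oblivious) detector correlates
# the received signal with the same sequence without access to the cover.
rng = np.random.default_rng(1)
n = 4096
cover = rng.normal(0, 1, n)          # cover signal (interference at the detector)
chips = rng.choice([-1.0, 1.0], n)   # shared pseudorandom spreading sequence
alpha = 0.1                          # embedding strength, kept low for imperceptibility

bit = -1                             # hidden bit in {-1, +1}
stego = cover + alpha * bit * chips  # embed

# Oblivious detection: the cover contributes zero-mean interference that
# averages out over n chips, leaving a term of magnitude alpha.
statistic = np.dot(stego, chips) / n
decoded = 1 if statistic > 0 else -1
print(decoded)                       # -1
```

This also makes the paper's motivation concrete: the cover signal itself acts as noise at the blind detector, which is precisely the interference the proposed approach aims to reject.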


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3245
Author(s):  
Ding-Geng Chen ◽  
Haipeng Gao ◽  
Chuanshu Ji

The purpose of this paper is to develop a data augmentation technique for statistical inference on the stochastic cusp catastrophe model in the presence of missing and partially observed data. We propose a Bayesian inference solution that naturally treats the missing observations as parameters, and we validate this novel approach through a series of Monte Carlo simulation studies that assume the cusp catastrophe model as the underlying model. We demonstrate that this Bayesian data augmentation technique can recover and estimate the underlying parameters of the stochastic cusp catastrophe model.
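A trajectory of the kind such simulation studies rest on can be generated with a simple Euler-Maruyama scheme; the drift form dy = (α + βy − y³)dt + σdW is the standard stochastic cusp, but the parameter values, step size, and seed here are arbitrary assumptions:

```python
import numpy as np

# Illustrative Euler-Maruyama simulation of a stochastic cusp catastrophe
# model, dy = (alpha + beta*y - y**3) dt + sigma dW. Masking entries of
# the resulting series would create the missing-data setting in which the
# Bayesian scheme treats the gaps as parameters.
rng = np.random.default_rng(42)
alpha, beta, sigma = 0.0, 1.0, 0.3   # assumed normal/splitting factors and noise
dt, n_steps = 0.01, 5000

y = np.empty(n_steps)
y[0] = 0.1
for k in range(1, n_steps):
    drift = alpha + beta * y[k - 1] - y[k - 1] ** 3
    y[k] = y[k - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

# With beta > 0 the deterministic skeleton is bistable (wells near +/-1),
# which is what makes the model's likelihood awkward and motivates the
# data augmentation approach.
print(y.shape)
```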


2018 ◽  
Vol 214 (3) ◽  
pp. 1783-1799 ◽  
Author(s):  
M S Devi ◽  
S Garambois ◽  
D Brito ◽  
M Dietrich ◽  
V Poydenot ◽  
...  

10.29007/xvx6 ◽  
2018 ◽  
Author(s):  
Prof. Dr. P. N. Tekwani ◽  
Kinjal Macwan ◽  
Vidhi Patel

This paper proposes a new topology for ac-to-ac power conversion based on a three-stage conversion. It comprises a diode rectifier (ac-to-dc), a buck-boost converter (dc-to-dc), and an H-bridge inverter (dc-to-ac) operating as an ac chopper. The topology works as a V/f drive in which the frequency is varied by the buck-boost converter and the voltage by the inverter acting as a chopper. It thus provides variable output voltage and frequency on all three phases, which can be used for V/f control of an induction motor. Compared with the conventional two-stage conversion, i.e., ac-dc-ac with an intermediate stiff dc link, the proposed topology improves the THD of the output voltage, since the input to the inverter is not a stiff dc but a pulsating dc supplied by the buck-boost converter. Moreover, the blocking voltage of each inverter switch is not constant but varies with the pulsating inverter input, so the stress on the switches, as well as on the machine windings, is reduced compared with the two-stage system. The proposed scheme offers linear variation of the output voltage from zero to rated, avoiding the nonlinear overmodulation range used in conventional inverters. Simulation studies are carried out in MATLAB/Simulink 2014 and various results are presented.
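The V/f law the drive follows can be sketched in a few lines; the rated values and the clamping behavior below are common conventions assumed for illustration, not taken from the paper:

```python
# Minimal V/f command sketch for a drive like the one described above:
# the voltage reference scales linearly with the frequency command so
# that the stator flux (proportional to V/f) stays roughly constant.
V_RATED = 400.0   # rated line voltage, V (assumed)
F_RATED = 50.0    # rated supply frequency, Hz (assumed)

def vf_reference(f_cmd: float) -> float:
    """Return the voltage reference for a commanded frequency
    (linear V/f law, saturating at the rated voltage)."""
    f_cmd = max(0.0, f_cmd)
    return min(V_RATED, V_RATED * f_cmd / F_RATED)

print(vf_reference(25.0))   # 200.0: half frequency -> half voltage
print(vf_reference(60.0))   # 400.0: clamped at rated voltage
```

In the proposed topology the frequency command would set the buck-boost converter's operation while the voltage reference shapes the chopping of the H-bridge, realizing this linear law without entering overmodulation.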


2016 ◽  
Author(s):  
Maarten van Iterson ◽  
Erik van Zwet ◽  
P. Eline Slagboom ◽  
Bastiaan T. Heijmans

Association studies on omic-level data other than genotypes (GWAS) are becoming increasingly common, i.e., epigenome- and transcriptome-wide association studies (EWAS/TWAS). However, a toolbox for the analysis of EWAS and TWAS is largely lacking, and approaches from GWAS are often applied despite the fact that epigenome and transcriptome data have very different characteristics than genotypes. Here, we show that EWASs and TWASs are prone not only to significant inflation but also to bias of the test statistics, and that these are not properly addressed by GWAS-based methodology (i.e., genomic control) or by state-of-the-art approaches to control for unmeasured confounding (i.e., RUV, sva, and cate). We developed a novel approach based on the estimation of the empirical null distribution using Bayesian statistics. Using simulation studies and empirical data, we demonstrate that our approach maximizes power while properly controlling the false positive rate. Finally, we illustrate the utility of our method in meta-analysis by performing EWASs and TWASs on age and smoking, which highlighted an overlap in the differential methylation and expression of associated genes. An implementation of our new method is available from http://bioconductor.org/packages/bacon/.
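The bias-and-inflation problem can be made concrete with a deliberately crude stand-in for the empirical-null idea (the bacon package itself fits a three-component Bayesian mixture via Gibbs sampling; the robust median/MAD estimator below is only an illustration, and the simulated statistics are invented):

```python
import numpy as np

# Simulated z-statistics from a hypothetical EWAS: most features are null
# but biased (location 0.3) and inflated (scale 1.4); a few carry signal.
rng = np.random.default_rng(7)
null = rng.normal(loc=0.3, scale=1.4, size=9500)
signal = rng.normal(loc=4.0, scale=1.0, size=500)
z = np.concatenate([null, signal])

# Crude empirical-null fit, assuming most features are null: robust
# location (median) estimates the bias, MAD estimates the inflation.
bias = np.median(z)
inflation = np.median(np.abs(z - bias)) / 0.6745   # MAD -> Gaussian sigma
z_corrected = (z - bias) / inflation               # rescaled statistics

print(round(float(bias), 2), round(float(inflation), 2))
```

Genomic control only rescales (it corrects inflation, not bias), which is one reason the abstract argues GWAS-style correction is insufficient for EWAS/TWAS.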


Author(s):  
Panagiotis Papastamoulis ◽  
James Hensman ◽  
Peter Glaus ◽  
Magnus Rattray

RNA-seq studies allow the quantification of transcript expression by aligning millions of short reads to a reference genome. However, transcripts share much of their sequence, so many reads map to more than one place and their origin remains uncertain. This problem can be dealt with using mixtures of distributions, whereby estimating transcript expression reduces to estimating the weights of the mixture. In this paper, variational Bayesian (VB) techniques are used to approximate the posterior distribution of transcript expression. VB has previously been shown to be more computationally efficient for this problem than Markov chain Monte Carlo. VB methodology can precisely estimate the posterior means but leads to variance underestimation. For this reason, a novel approach is introduced that integrates the latent allocation variables out of the VB approximation. It is shown that this modification leads to a better marginal likelihood bound and an improved estimate of the posterior variance. A set of simulation studies and applications to real RNA-seq datasets highlight the improved performance of the proposed method.
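The mixture structure can be made concrete with a toy example; note the paper's contribution is a variational Bayes treatment with the allocation variables integrated out, whereas the plain EM iteration below is only a minimal sketch of the same mixture-weight estimation problem, with an invented compatibility matrix:

```python
import numpy as np

# Toy mixture-weight estimation: 6 reads, 2 transcripts. A[i, j] = 1 when
# read i is compatible with transcript j; ambiguous reads map to both.
A = np.array([[1, 0], [1, 0], [1, 1], [1, 1], [0, 1], [1, 0]], dtype=float)

theta = np.full(2, 0.5)                       # initial mixture weights
for _ in range(200):
    # E-step: posterior allocation of each read over its compatible transcripts.
    resp = A * theta
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: new weights are the average responsibilities.
    theta = resp.mean(axis=0)

# 3 reads are unique to transcript 1, 1 to transcript 2, 2 are ambiguous;
# the fixed point splits the ambiguous reads in proportion to the weights.
print(np.round(theta, 3))   # [0.75 0.25]
```

It is exactly the latent allocation variables (`resp` here) that the proposed method integrates out of the VB approximation to repair the variance underestimation.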


Energies ◽  
2021 ◽  
Vol 14 (20) ◽  
pp. 6585
Author(s):  
Piotr Kuwałek

The current study presents a novel approach to the selective identification and localization of voltage fluctuation sources in power grids, considering individual disturbing loads that change their state with a frequency of up to 150 Hz. Implementing the proposed approach in the existing smart-metering infrastructure allows individual disturbance sources to be identified and localized in real time. The approach first estimates the modulating signal using a carrier-signal estimator, which allows modulating signals with frequencies greater than the power frequency to be estimated. Next, the estimated modulating signal is decomposed into component signals associated with individual sources of voltage fluctuations using an enhanced empirical wavelet transform. Finally, a statistical evaluation of the propagation of component signals with comparable fundamental frequencies is performed, which determines the supply point of a particular disturbing load. The proposed approach is verified in numerical simulation studies using MATLAB/Simulink and in experimental studies carried out in a real low-voltage power grid. The research shows that, unlike other methods currently used in practice, the proposed approach allows the selective identification and localization of voltage fluctuation sources that change their state with a frequency of up to 150 Hz.
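The first step, estimating the modulating signal, can be illustrated with classical envelope demodulation via the analytic signal; all parameters are invented, and note the deliberate limitation: the 5 Hz modulation below is well under the 50 Hz power frequency, since plain envelope detection breaks down above it, which is exactly the case the paper's carrier-signal estimator is designed to handle:

```python
import numpy as np

# Amplitude-modulated supply voltage: a 50 Hz carrier whose envelope is
# a slow 5 Hz fluctuation from a (hypothetical) disturbing load.
fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
modulation = 1 + 0.05 * np.sin(2 * np.pi * 5 * t)
voltage = modulation * np.sin(2 * np.pi * 50 * t)

# Analytic signal via FFT (same construction as scipy.signal.hilbert):
# zero out negative frequencies, double positive ones.
n = voltage.size
h = np.zeros(n)
h[0] = 1
h[1:n // 2] = 2
h[n // 2] = 1
envelope = np.abs(np.fft.ifft(np.fft.fft(voltage) * h))

# For this band-limited case the envelope recovers the modulating signal,
# which would then be decomposed into per-source components.
err = float(np.max(np.abs(envelope - modulation)))
print(err)
```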


2019 ◽  
Vol 79 (6) ◽  
pp. 1133-1155
Author(s):  
Emre Gönülateş

This article introduces the Quality of Item Pool (QIP) Index, a novel approach to quantifying the adequacy of an item pool of a computerized adaptive test for a given set of test specifications and examinee population. This index ranges from 0 to 1, with values close to 1 indicating the item pool presents optimum items to examinees throughout the test. This index can be used to compare different item pools or diagnose the deficiencies of a given item pool by quantifying the amount of deviation from a perfect item pool. Simulation studies were conducted to evaluate the capacity of this index for detecting the inadequacies of two simulated item pools. The value of this index was compared with the existing methods of evaluating the quality of computerized adaptive tests (CAT). Results of the study showed that the QIP Index can detect even slight deviations between a proposed item pool and an optimal item pool. It can also uncover shortcomings of an item pool that other outcomes of CAT cannot detect. CAT developers can use the QIP Index to diagnose the weaknesses of the item pool and as a guide for improving item pools.

