Temporal Convolution Network Based Joint Optimization of Acoustic-to-Articulatory Inversion

2021 ◽  
Vol 11 (19) ◽  
pp. 9056
Author(s):  
Guolun Sun ◽  
Zhihua Huang ◽  
Li Wang ◽  
Pengyuan Zhang

Articulatory features have proved effective in speech recognition and speech synthesis. However, acquiring articulatory features remains difficult and is an active research topic, so a lightweight and accurate articulatory model is of significant value. In this study, we propose a novel temporal convolution network-based acoustic-to-articulatory inversion system. The acoustic features are converted into a high-dimensional hidden-space feature map through temporal convolution, with frame-level feature correlations taken into account. Meanwhile, we construct a two-part objective function combining the prediction's Root Mean Square Error (RMSE) and the sequences' Pearson Correlation Coefficient (PCC) to jointly optimize the inversion model from both aspects. We further analyze the impact of the weighting between the two parts on the final performance of the inversion model. Extensive experiments show that our temporal convolution network (TCN) model outperformed the Bidirectional Long Short-Term Memory model, achieving 1.18 mm in RMSE and 0.845 in PCC with a quarter of the model parameters when the RMSE and PCC terms were weighted evenly.
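
To make the joint objective concrete, here is a minimal sketch of such a two-part loss in PyTorch (an assumption; the paper publishes no code). The weighting factor `alpha` and the tensor layout `(batch, frames, channels)` are illustrative choices; `alpha = 0.5` corresponds to weighting the RMSE and PCC terms evenly.

```python
import torch

def joint_rmse_pcc_loss(pred, target, alpha=0.5, eps=1e-8):
    """Two-part objective: alpha * RMSE + (1 - alpha) * (1 - PCC)."""
    # RMSE term: penalizes absolute deviation of the predicted trajectories.
    rmse = torch.sqrt(torch.mean((pred - target) ** 2))

    # PCC term: rewards linear correlation along the time (frame) axis.
    pred_c = pred - pred.mean(dim=1, keepdim=True)
    targ_c = target - target.mean(dim=1, keepdim=True)
    pcc = (pred_c * targ_c).sum(dim=1) / (
        pred_c.norm(dim=1) * targ_c.norm(dim=1) + eps
    )

    # PCC is to be maximized, so it enters the loss as (1 - PCC).
    return alpha * rmse + (1.0 - alpha) * (1.0 - pcc.mean())
```

Minimizing this combined loss pulls the predicted articulatory trajectories toward the targets in amplitude (via RMSE) and in shape (via PCC) simultaneously.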

Kerntechnik ◽  
2021 ◽  
Vol 86 (2) ◽  
pp. 152-163
Author(s):  
T.-C. Wang ◽  
M. Lee

Abstract In the present study, a methodology is developed to quantify the uncertainties of special model parameters of the integral severe accident analysis code MAAP5. Here, the in-vessel hydrogen production during a core melt accident at the Lungmen Nuclear Power Station of Taiwan Power Company, an advanced boiling water reactor, is analyzed. Sensitivity studies are performed to identify the parameters with an impact on the output parameter. For this, multiple MAAP5 calculations are performed with input combinations generated by Latin Hypercube Sampling (LHS). The results are analyzed to determine the 95th-percentile value, at the 95% confidence level, of the amount of in-vessel hydrogen production. The calculations show that the default model options for IOXIDE and FGBYPA are recommended. The Pearson Correlation Coefficient (PCC) was used to determine the impact of model parameters on the target output parameters and showed that the three parameters TCLMAX, FCO, and FOXBJ strongly influence the in-vessel hydrogen generation. Recommended values for these three parameters are given.
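
The sampling-and-tolerance-limit workflow can be sketched in Python as below. The `run_maap5` surrogate and the parameter ranges are hypothetical stand-ins; an actual study would launch one MAAP5 calculation per sample. The 59-run count follows Wilks' first-order, one-sided 95/95 formula.

```python
import numpy as np
from scipy.stats import qmc

param_ranges = {              # illustrative uncertain model parameters
    "TCLMAX": (2100.0, 2500.0),
    "FCO":    (0.2, 1.0),
    "FOXBJ":  (0.1, 1.0),
}

# Wilks (first order, one-sided): 59 runs suffice for a 95th-percentile
# estimate at 95% confidence, taken as the sample maximum.
n_runs = 59
sampler = qmc.LatinHypercube(d=len(param_ranges), seed=42)
lows = np.array([lo for lo, _ in param_ranges.values()])
highs = np.array([hi for _, hi in param_ranges.values()])
inputs = qmc.scale(sampler.random(n=n_runs), lows, highs)

def run_maap5(x):
    # Dummy surrogate standing in for one MAAP5 run; returns in-vessel
    # hydrogen production [kg] for one sampled input combination.
    tclmax, fco, foxbj = x
    return 300.0 + 0.05 * tclmax + 80.0 * fco + 60.0 * foxbj

h2 = np.array([run_maap5(x) for x in inputs])
h2_95_95 = h2.max()   # 95/95 bound on in-vessel hydrogen production
print(f"95/95 estimate: {h2_95_95:.1f} kg")
```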


2020 ◽  
Vol 2 (3) ◽  
pp. 256-270
Author(s):  
Shakti Goel ◽  
Rahul Bajpai

A Long Short-Term Memory (LSTM) based sales model has been developed to forecast the global sales of the hotel business of Travel Boutique Online Holidays (TBO Holidays). The LSTM model is multivariate; input to the model includes several independent variables in addition to a dependent variable, viz., sales from the previous step. One of the input variables, "number of active bookers per day", is estimated for the same day as sales. This requires the development of another LSTM model to predict the number of active bookers per day, whose predicted value is then used as an input to the sales forecasting model. Using a predicted variable as an input to another model increases the chance of uncertainty entering the system. This paper discusses the degree of variability observed in sales predictions under various uncertainties, or noise, due to the estimation of the number of active bookers. For the purposes of this study, different noise distributions such as normal, uniform, and logistic distributions are used, among others. Analyses of predictions demonstrate that adding uncertainty to the number of active bookers via dropouts, as well as to the lagged sales variables, leads to model predictions that are close to the observations. The least-squared error between observations and predictions is higher for uncertainties modeled using the other distributions (without dropouts), with the worst predictions obtained for the Gumbel noise distribution. Gaussian noise added directly to the weights matrix yields the best results (minimum prediction errors). One explanation of this uncertainty could be that the global minimum of the least-squared objective function with respect to the model weight matrix is not reached, and therefore the model parameters are not optimal. The two LSTM models used in series are also used to study the impact of the coronavirus on global sales. By introducing a new variable called the coronavirus impact variable, the LSTM models can predict corona-affected sales within five percent (5%) of the actuals. The research discussed in this paper finds LSTM models to be effective tools for the travel industry, as they successfully model the trends in sales, and they can also be used reliably to simulate various hypothetical scenarios.
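
A minimal NumPy sketch of the perturbation experiment is given below. The `predict_sales` function is a hypothetical stand-in for the trained LSTM forecaster, and the noise scales are illustrative; the point is only to show how input noise from different distributions propagates into the spread of the sales predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_sales(bookers, lagged_sales):
    # Stand-in for the multivariate LSTM sales model.
    return 0.8 * lagged_sales + 1.5 * bookers

bookers_est = 1000.0     # output of the first LSTM (active bookers/day)
lagged_sales = 5000.0    # previous-step sales

noise = {                # scales chosen so each has std dev ~50
    "normal":   rng.normal(0.0, 50.0, 10_000),
    "uniform":  rng.uniform(-87.0, 87.0, 10_000),
    "logistic": rng.logistic(0.0, 27.6, 10_000),
    "gumbel":   rng.gumbel(-22.5, 39.0, 10_000),
}

for name, eps in noise.items():
    preds = predict_sales(bookers_est + eps, lagged_sales)
    print(f"{name:9s} mean={preds.mean():8.1f}  std={preds.std():6.1f}")
```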


2015 ◽  
Vol 15 (17) ◽  
pp. 24727-24749 ◽  
Author(s):  
N. J. Harvey ◽  
H. F. Dacre

Abstract. The decision to close airspace in the event of a volcanic eruption is based on hazard maps of predicted ash extent. These are produced using output from volcanic ash transport and dispersion (VATD) models. In this paper an objective metric to evaluate the spatial accuracy of VATD simulations relative to satellite retrievals of volcanic ash is presented. The metric is based on the fractions skill score (FSS). This measure of skill provides more information than traditional point-by-point metrics, such as the success index and the Pearson correlation coefficient, as it takes into account the spatial scale over which skill is being assessed. The FSS determines the scale over which a simulation has skill and can differentiate between a "near miss" and a forecast that is badly misplaced. The idealised scenarios presented show that even simulations with considerable displacement errors have useful skill when evaluated over neighbourhood scales of 200–700 km². This method could be used to compare forecasts produced by different VATD models or using different model parameters, to assess the impact of assimilating satellite-retrieved ash data, and to evaluate VATD forecasts over a long time period.
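
A compact sketch of the FSS computation is shown below (scipy's uniform filter plays the role of the neighbourhood averaging; the threshold and grids are illustrative, not the paper's retrieval processing). Note how a deliberately displaced forecast gains skill as the neighbourhood scale grows.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, scale):
    """Fractions skill score at one neighbourhood scale (in grid points)."""
    f_mask = (forecast >= threshold).astype(float)
    o_mask = (observed >= threshold).astype(float)
    # Fraction of threshold exceedances inside each neighbourhood.
    f_frac = uniform_filter(f_mask, size=scale, mode="constant")
    o_frac = uniform_filter(o_mask, size=scale, mode="constant")
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

# Idealised "near miss": the forecast ash cloud is shifted 10 cells east.
obs = np.zeros((100, 100)); obs[40:60, 40:60] = 1.0
fcst = np.zeros((100, 100)); fcst[40:60, 50:70] = 1.0
for scale in (1, 11, 31):
    print(f"scale {scale:2d}: FSS = {fss(fcst, obs, 0.5, scale):.3f}")
```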


Author(s):  
Xiongbin Peng ◽  
Yuwu Li

Abstract To address battery capacity regeneration, a phenomenon that leads to inaccurate prediction of lithium-ion battery state of health (SOH), a new fusion method based on ensemble empirical mode decomposition (EEMD), Pearson correlation analysis (PCA), an improved long short-term memory (LSTM) network, and a Gaussian function-trust region (GS-TR) algorithm is introduced to predict battery SOH. First, the EEMD method is adopted to process the battery SOH data and eliminate the impact of capacity recovery. Second, the decomposed signals are classified by the PCA method, and the signals classified as high frequency and low frequency are predicted by the improved LSTM algorithm and the GS-TR algorithm, respectively. Finally, the prediction results of the improved LSTM and GS-TR algorithms are integrated. The proposed fusion method avoids the complexity of a hybrid neural network model and improves prediction efficiency. Averaged over the three NASA data sets, the RMSE of the proposed algorithm is reduced by 9.56% compared with the improved LSTM with the EEMD algorithm, and by 37.57% compared with the improved LSTM without the EEMD algorithm. The results show that the proposed method has higher adaptability and prediction accuracy.
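
The decomposition-and-split step might look like the following sketch, which uses the PyEMD package (an assumption; the authors' tooling is not stated) and classifies each intrinsic mode function (IMF) by its Pearson correlation with the original SOH series. The correlation threshold is illustrative. The high-frequency part would then go to the improved LSTM and the low-frequency part to the GS-TR predictor.

```python
import numpy as np
from PyEMD import EEMD

rng = np.random.default_rng(1)
cycles = np.arange(200, dtype=float)
# Synthetic SOH curve: slow fade plus capacity-regeneration wiggles.
soh = (1.0 - 0.0015 * cycles
       + 0.01 * np.sin(cycles / 3.0)
       + 0.005 * rng.standard_normal(cycles.size))

imfs = EEMD(trials=50).eemd(soh, cycles)

high_freq = np.zeros_like(soh)   # fluctuation part -> improved LSTM
low_freq = np.zeros_like(soh)    # trend part -> GS-TR (Gaussian fit)
for imf in imfs:
    r = abs(np.corrcoef(imf, soh)[0, 1])
    # Fast IMFs correlate weakly with the trend-dominated raw signal.
    if r < 0.5:                  # illustrative threshold
        high_freq += imf
    else:
        low_freq += imf
```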


2016 ◽  
Vol 16 (2) ◽  
pp. 861-872 ◽  
Author(s):  
N. J. Harvey ◽  
H. F. Dacre

Abstract. The decision to close airspace in the event of a volcanic eruption is based on hazard maps of predicted ash extent. These are produced using output from volcanic ash transport and dispersion (VATD) models. In this paper the fractions skill score (FSS) has been used for the first time to evaluate the spatial accuracy of VATD simulations relative to satellite retrievals of volcanic ash. This objective measure of skill provides more information than traditional point-by-point metrics, such as the success index and the Pearson correlation coefficient, as it takes into account the spatial scale over which skill is being assessed. The FSS determines the scale over which a simulation has skill and can differentiate between a "near miss" and a forecast that is badly misplaced. The idealized scenarios presented show that even simulations with considerable displacement errors have useful skill when evaluated over neighbourhood scales of 200–700 km². This method could be used to compare forecasts produced by different VATD models or using different model parameters, to assess the impact of assimilating satellite-retrieved ash data, and to evaluate VATD forecasts over a long time period.


2019 ◽  
Vol 2019 (1) ◽  
pp. 331-338 ◽  
Author(s):  
Jérémie Gerhardt ◽  
Michael E. Miller ◽  
Hyunjin Yoo ◽  
Tara Akhavan

In this paper we discuss a model to estimate the power consumption and lifetime (LT) of an OLED display based on its pixel values and the brightness setting of the screen (scbr). This model is used to illustrate the effect of OLED aging on display color characteristics. Model parameters are based on power consumption measurements of a given display for a number of pixel and scbr combinations. OLED LT is often given for the most stressful display operating situation, i.e., a white image at maximum scbr, but the ability to predict the LT for other configurations is valuable for estimating the impact and quality of new image processing algorithms. After explaining our model, we present a use case illustrating how we use it to evaluate the impact of an image processing algorithm for brightness adaptation.
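
A toy version of such a pixel-value power model is sketched below; the per-channel coefficients, the power-law exponent, and the static overhead are hypothetical numbers standing in for the fitted measurements, not values from the paper.

```python
import numpy as np

K = {"r": 0.8, "g": 0.6, "b": 1.1}   # panel watts per fully driven channel
GAMMA = 2.2                          # assumed code-to-emission power law
P_STATIC = 0.15                      # driver electronics overhead [W]

def panel_power(rgb_image, scbr):
    """Estimate panel power [W] for an (H, W, 3) uint8 image at a given
    brightness setting scbr in [0, 1]."""
    norm = rgb_image.astype(float) / 255.0
    # Per-pixel emission cost, summed over subpixels, averaged over panel.
    emission = sum(K[c] * norm[..., i] ** GAMMA
                   for i, c in enumerate("rgb"))
    return P_STATIC + scbr * emission.mean()

# The most stressful case quoted above: white image at maximum scbr.
white = np.full((1080, 1920, 3), 255, dtype=np.uint8)
print(f"white @ max scbr: {panel_power(white, 1.0):.2f} W")
```

Since OLED emission (and hence aging) scales with the same per-subpixel drive, the same per-channel terms could feed a lifetime estimate for non-white content.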


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jie Zhu ◽  
Blanca Gallego

Abstract Epidemic models are being used by governments to inform public health strategies to reduce the spread of SARS-CoV-2. They simulate potential scenarios by manipulating model parameters that control processes of disease transmission and recovery. However, the validity of these parameters is challenged by the uncertainty of the impact of public health interventions on disease transmission, and the forecasting accuracy of these models is rarely investigated during an outbreak. We fitted a stochastic transmission model to reported cases, recoveries and deaths associated with SARS-CoV-2 infection across 101 countries. The dynamics of disease transmission were represented in terms of the daily effective reproduction number (R_t). The relationship between public health interventions and R_t was explored, firstly using a hierarchical clustering algorithm on initial R_t patterns, and secondly by computing the time-lagged cross-correlation among the daily number of policies implemented, R_t, and daily incidence counts in subsequent months. The impact on forecasting accuracy of updating R_t every time a prediction is made was also investigated. We identified 5 groups of countries with distinct transmission patterns during the first 6 months of the pandemic. Early adoption of social distancing measures and a shorter gap between interventions were associated with a reduction in the duration of outbreaks. The lagged correlation analysis revealed that increased policy volume was associated with lower future R_t (75-day lag), while lower R_t was associated with lower future policy volume (102-day lag). Lastly, the outbreak prediction accuracy of the model using a dynamically updated R_t produced an average AUROC of 0.72 (0.708, 0.723), compared to 0.56 (0.555, 0.568) when R_t was kept constant. Monitoring the evolution of R_t during an epidemic is an important complement to reported daily counts, recoveries and deaths, since it provides an early signal of the efficacy of containment measures. Using updated R_t values produces significantly better predictions of future outbreaks. Our results found variation in the effect of early public health interventions on the evolution of R_t over time and across countries, which could not be explained solely by the timing and number of the adopted interventions.
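
The time-lagged cross-correlation step can be sketched with pandas as below; the series, the lag window, and the built-in 75-day response of the toy R_t series are illustrative assumptions, not the study's data.

```python
import numpy as np
import pandas as pd

def lagged_xcorr(x: pd.Series, y: pd.Series, max_lag: int = 120):
    """Pearson correlation of x(t) with y(t + lag) for each lag."""
    return pd.Series({lag: x.corr(y.shift(-lag))
                      for lag in range(max_lag + 1)})

rng = np.random.default_rng(3)
days = pd.date_range("2020-03-01", periods=300, freq="D")
policies = pd.Series(rng.poisson(2.0, 300).astype(float), index=days)
# Toy R_t that responds to policy volume with a built-in 75-day delay.
r_t = (1.5 - 0.1 * policies.rolling(14).mean().shift(75)
       + 0.05 * pd.Series(rng.standard_normal(300), index=days))

xc = lagged_xcorr(policies, r_t)
best = xc.idxmin()   # lag with the strongest negative association
print(f"lag {best} days: r = {xc.min():.2f}")
```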


Hydrology ◽  
2021 ◽  
Vol 8 (3) ◽  
pp. 102
Author(s):  
Frauke Kachholz ◽  
Jens Tränckner

Land use changes influence the water balance and often increase surface runoff. The resulting impacts on river flow, water level, and flooding should be identified beforehand in the spatial planning phase. In two consecutive papers, we develop a model-based decision support system for quantifying the hydrological and stream-hydraulic impacts of land use changes. Part 1 presents the semi-automatic set-up of physically based hydrological and hydraulic models on the basis of geodata analysis for the current state. Appropriate hydrological model parameters for ungauged catchments are derived by transfer from a calibrated model. In the lowland river basins considered, parameters of surface and groundwater inflow turned out to be particularly important. While the calibration delivers very good to good model results for flow (Evol = 2.4%, R = 0.84, NSE = 0.84), the model performance is good to satisfactory (Evol = −9.6%, R = 0.88, NSE = 0.59) in a different river system parametrized with the transfer procedure. After transferring the concept to a larger area with various small rivers, the current state is analyzed by running simulations based on statistical rainfall scenarios. Results include watercourse-section-specific capacities and excess volumes in case of flooding. The developed approach can relatively quickly generate physically reliable and spatially high-resolution results. Part 2 builds on the data generated in Part 1 and presents the subsequent approach to assess the hydrologic and hydrodynamic impacts of potential land use changes.
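
For reference, the goodness-of-fit metrics quoted above can be computed as in the following sketch, using their standard definitions (relative volume error Evol, Pearson's R, and Nash-Sutcliffe efficiency NSE); the flow series is invented for illustration.

```python
import numpy as np

def evol(sim, obs):
    """Relative volume (balance) error in percent."""
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, < 0 is worse than the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.2, 1.5, 2.8, 4.1, 3.0, 2.2, 1.6])   # observed flow [m³/s]
sim = np.array([1.1, 1.6, 2.5, 4.4, 3.2, 2.0, 1.5])   # simulated flow
print(f"Evol = {evol(sim, obs):+.1f}%, "
      f"R = {np.corrcoef(sim, obs)[0, 1]:.2f}, "
      f"NSE = {nse(sim, obs):.2f}")
```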


Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 463
Author(s):  
Gopinathan R. Abhijith ◽  
Leonid Kadinski ◽  
Avi Ostfeld

The formation of bacterial regrowth and disinfection by-products is ubiquitous in chlorinated water distribution systems (WDSs) operated with organic loads. A generic, easy-to-use mechanistic model describing the fundamental processes governing the interrelationship between chlorine, total organic carbon (TOC), and bacteria, intended to analyze spatiotemporal water quality variations in WDSs, was developed using EPANET-MSX. The representation of multispecies reactions was simplified to minimize the number of interdependent model parameters, and physicochemical/biological processes that cannot be experimentally determined were neglected. The effects of source water characteristics and water residence time on controlling bacterial regrowth and trihalomethane (THM) formation in two well-tested systems under chlorinated and non-chlorinated conditions were analyzed by applying the model. The results established that a 100% increase in the free chlorine concentration and a 50% reduction in the TOC at the source produced a 5.87-log reduction in bacteriological activity at the expense of a 60% increase in THM formation. The sensitivity study showed the impact of the operating conditions and the network characteristics in determining parameter sensitivities to model outputs. The maximum specific growth rate constant for bulk-phase bacteria was found to be the most sensitive parameter for the predicted bacterial regrowth.
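
The flavour of such simplified chlorine-TOC-bacteria kinetics can be conveyed with a small ODE sketch; the rate laws and constants below are illustrative assumptions, not the EPANET-MSX model's calibrated values.

```python
import numpy as np
from scipy.integrate import solve_ivp

K_CL_TOC = 0.05   # chlorine decay by reaction with TOC [L/(mg·h)]
MU_MAX   = 0.20   # max specific growth rate of bulk bacteria [1/h]
K_S      = 1.0    # half-saturation TOC concentration [mg/L]
K_INH    = 2.0    # chlorine inhibition constant [L/mg]
Y        = 0.05   # biomass yield on consumed TOC

def rhs(t, y):
    cl, toc, bio = y
    # Monod growth on TOC, exponentially suppressed by free chlorine.
    growth = MU_MAX * toc / (K_S + toc) * np.exp(-K_INH * cl) * bio
    return [-K_CL_TOC * cl * toc,                 # chlorine consumption
            -K_CL_TOC * cl * toc - growth / Y,    # TOC oxidised + assimilated
            growth]                               # bacterial regrowth

sol = solve_ivp(rhs, (0.0, 72.0), [0.5, 2.0, 1e-3])  # 72 h residence time
print("after 72 h [Cl2, TOC, biomass]:", np.round(sol.y[:, -1], 4))
```

Doubling the initial chlorine in this toy system suppresses the biomass term while more chlorine reacts with the organic matter, mirroring the trade-off reported above.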


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 387
Author(s):  
Yiting Liang ◽  
Yuanhua Zhang ◽  
Yonggang Li

A mechanistic kinetic model of cobalt–hydrogen electrochemical competition for the cobalt removal process in zinc hydrometallurgy is proposed. In addition, to overcome the parameter estimation difficulties arising from the model nonlinearities and the lack of information on the possible value ranges of the parameters to be estimated, a constrained guided parameter estimation scheme was derived based on the model equations and experimental data. The proposed model and parameter estimation scheme have three advantages: (i) the model reflects for the first time the mechanism of the electrochemical competition between cobalt and hydrogen ions in the cobalt removal process in zinc hydrometallurgy; (ii) the proposed constrained parameter estimation scheme does not depend on information about the possible value ranges of the parameters to be estimated; (iii) the constraint conditions provided in the scheme directly link experimentally observable metrics to the model parameters, thereby providing deeper insight into the model parameters for model users. Numerical experiments showed that the proposed constrained parameter estimation algorithm significantly improved estimation efficiency. Meanwhile, the proposed cobalt–hydrogen electrochemical competition model allowed accurate simulation of the impact of hydrogen ions on the cobalt removal rate as well as of the trend of the hydrogen ion concentration, which should be helpful for the actual cobalt removal process in zinc hydrometallurgy.
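
The spirit of such a constrained, guided estimation can be sketched with scipy: instead of bounding the parameters directly, an equality constraint ties them to an experimentally observable quantity (here a hypothetical measured initial removal rate). The kinetic law, the data, and the constraint value are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

t_data  = np.array([0.0, 10.0, 20.0, 40.0, 60.0])    # time [min]
co_data = np.array([50.0, 33.0, 23.5, 13.5, 8.9])    # Co(II) [mg/L]

def co_model(t, k, n):
    # Toy kinetic law dC/dt = -k * C^n, solved analytically for n != 1.
    base = co_data[0] ** (1 - n) + (n - 1) * k * t
    return np.maximum(base, 1e-12) ** (1.0 / (1 - n))

def sse(p):
    k, n = p
    return np.sum((co_model(t_data, k, n) - co_data) ** 2)

# Constraint from an observable phenomenon metric: the initial removal
# rate k * C0^n must equal the (hypothetical) measured 2.3 mg/(L·min),
# guiding the search without prior ranges on k and n themselves.
cons = {"type": "eq",
        "fun": lambda p: p[0] * co_data[0] ** p[1] - 2.3}
fit = minimize(sse, x0=[0.01, 1.5], constraints=[cons], method="SLSQP")
print("k, n =", np.round(fit.x, 4))
```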

