Parameter Estimation Scheme
Recently Published Documents

TOTAL DOCUMENTS: 31 (FIVE YEARS: 8)
H-INDEX: 8 (FIVE YEARS: 1)

Quantum, 2021, Vol. 5, p. 490
Author(s): Junaid ur Rehman, Hyundong Shin

We propose a parameter estimation protocol for generalized Pauli channels acting on a d-dimensional Hilbert space. The salient features of the proposed method are product probe states and measurements, a number of measurement configurations that is linear in d, minimal post-processing, and mean-square-error scaling comparable to that of the entanglement-based parameter estimation scheme for generalized Pauli channels. We also show that, when measuring generalized Pauli operators, the errors caused by Pauli noise can be modeled as measurement errors. This makes it possible to use the measurement error mitigation framework to mitigate the errors caused by generalized Pauli channels. We use this result to mitigate noise on the probe states and recover the scaling of noiseless probes, up to a noise-strength-dependent constant factor. This way of modeling a Pauli channel as measurement noise can also be of independent interest in other NISQ tasks, e.g., state tomography, variational quantum algorithms, and other channel estimation problems in which Pauli measurements play a central role.
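The qubit (d = 2) case gives a quick feel for why Pauli noise seen through Pauli-basis measurements behaves like a classical measurement error: each error operator either flips the measured outcome or leaves it alone, so the error rates follow from flip frequencies by linear post-processing. The sketch below is an illustration under that simplification only, not the protocol from the paper, and all variable names are hypothetical.

```python
import numpy as np

# Single-qubit illustration: a Pauli channel with error probabilities
# (p_X, p_Y, p_Z) observed through X-, Y-, and Z-basis probes acts like a
# classical bit-flip (measurement) error in each basis.

rng = np.random.default_rng(0)

p_true = np.array([0.04, 0.02, 0.06])   # unknown (p_X, p_Y, p_Z); p_I = 1 - sum
shots = 100_000

# Flip probability of the measured bit for each probe basis:
#   X basis: Y or Z errors flip the outcome; Y basis: X or Z; Z basis: X or Y.
flip_prob = np.array([
    p_true[1] + p_true[2],   # X-basis probe
    p_true[0] + p_true[2],   # Y-basis probe
    p_true[0] + p_true[1],   # Z-basis probe
])

# Simulated experiment: observed flip frequencies per basis.
flip_freq = rng.binomial(shots, flip_prob) / shots

# Linear post-processing: solve A @ (p_X, p_Y, p_Z) = flip frequencies.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
p_est = np.linalg.solve(A, flip_freq)

print("true:", p_true, "estimated:", np.round(p_est, 4))
```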


Entropy, 2021, Vol. 23 (4), p. 387
Author(s): Yiting Liang, Yuanhua Zhang, Yonggang Li

A mechanistic kinetic model of cobalt–hydrogen electrochemical competition for the cobalt removal process in zinc hydrometallurgy was proposed. In addition, to overcome the parameter estimation difficulties arising from the model nonlinearities and the lack of information on the possible value ranges of the parameters to be estimated, a constrained guided parameter estimation scheme was derived from the model equations and experimental data. The proposed model and parameter estimation scheme have three advantages: (i) the model reflected, for the first time, the mechanism of the electrochemical competition between cobalt and hydrogen ions during cobalt removal in zinc hydrometallurgy; (ii) the proposed constrained parameter estimation scheme did not depend on information about the possible value ranges of the parameters to be estimated; (iii) the constraint conditions in that scheme directly linked experimental phenomenon metrics to the model parameters, thereby giving model users deeper insight into the parameters. Numerical experiments showed that the proposed constrained parameter estimation algorithm significantly improved estimation efficiency. Meanwhile, the proposed cobalt–hydrogen electrochemical competition model allowed accurate simulation of the impact of hydrogen ions on the cobalt removal rate as well as of the trend of the hydrogen ion concentration, which would be helpful for the actual cobalt removal process in zinc hydrometallurgy.
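The constrained-estimation idea can be illustrated with a generic bounded least-squares fit in which a physically motivated constraint replaces a guessed prior parameter range. The rate law, data, and names below are stand-ins for illustration only; they are not the authors' mechanistic cobalt–hydrogen competition model.

```python
import numpy as np
from scipy.optimize import least_squares

# Minimal sketch: bounded least-squares fit of a stand-in first-order removal
# model C(t) = C0 * exp(-k * t) to synthetic data. This is NOT the authors'
# mechanistic model; data and parameter names are hypothetical.

t_data = np.array([0.0, 10.0, 20.0, 40.0, 60.0])    # time, min
c0 = 30.0                                            # initial Co(II), mg/L
c_data = np.array([30.0, 18.5, 11.2, 4.3, 1.7])      # measured Co(II), mg/L

def residuals(params):
    k, = params                                      # removal rate constant, 1/min
    return c0 * np.exp(-k * t_data) - c_data

# A physical constraint (k must be positive) instead of a guessed value range.
fit = least_squares(residuals, x0=[0.01], bounds=(1e-6, np.inf))
print("estimated k [1/min]:", round(float(fit.x[0]), 4))
```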


2020
Author(s): Xiaohui Wang, Martin Verlaan, Hai Xiang Lin

Combined tide and surge models are very useful tools for issuing storm surge warnings and for assessing the potential impacts of sea-level rise. Over the past decade, a Global Tide and Surge Model (GTSM) has been developed by Deltares, with improvements in physics, grid resolution, and skill in each new version. Uncertainties in bathymetry and friction currently account for a major part of the remaining model uncertainty. Improved estimates of these parameters would be desirable, but the required computing speed and memory storage limit the possibilities at the moment. Here, we propose an efficient coarse-grid parameter estimation scheme for the high-resolution GTSM to estimate the bathymetry. The OpenDA software is combined with GTSM using the DUD (Doesn't Use Derivatives) algorithm, with the FES2014 dataset serving as deep-water observations. Even though the model simulation runs in parallel, calibrating the fine-grid model directly would still require too much computer time: for instance, it takes 9 hours on 20 cores to simulate 45 days, and calibration typically requires many such simulations. Therefore, a coarse-to-fine strategy is developed by replacing the fine grid with a coarse grid in the parameter estimation iterations, reducing the computing cost by 67%. Moreover, a sensitivity analysis allows us to reduce the parameter dimension from O(10^6) to O(10^2), which leads to a further reduction of the required computation and memory. The results of the estimation and model validation demonstrate that the parameter estimation scheme improves the accuracy of the model by approximately 30% with affordable computational and storage demands.
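The coarse-to-fine structure of the calibration can be sketched generically: the iterative estimation loop only evaluates a cheap coarse-grid surrogate, and the expensive fine-grid model appears once for validation. The toy functions below, and a finite-difference least-squares solver, merely stand in for GTSM, OpenDA, and the DUD algorithm; the bathymetry-correction parameter and data are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy sketch of coarse-to-fine calibration: estimate a bathymetry correction b
# using only the cheap coarse-grid stand-in, then validate once against the
# fine-grid stand-in. GTSM, OpenDA, and DUD are not used here.

rng = np.random.default_rng(1)

def coarse_model(b, forcing):
    # Placeholder for a coarse-grid tide model with bathymetry correction b.
    return forcing * (1.0 + 0.50 * np.tanh(b))

def fine_model(b, forcing):
    # Placeholder for the fine-grid model (assumed slower but more accurate).
    return forcing * (1.0 + 0.48 * np.tanh(b)) + 0.01

forcing = rng.uniform(0.5, 2.0, size=50)                     # toy tidal forcing
obs = fine_model(0.3, forcing) + rng.normal(0.0, 0.01, 50)   # FES2014-like "observations"

# Calibration loop on the coarse model only; finite-difference least squares
# stands in here for OpenDA's derivative-free DUD iterations.
fit = least_squares(lambda b: coarse_model(b[0], forcing) - obs, x0=[0.0])

b_hat = float(fit.x[0])
rmse = np.sqrt(np.mean((fine_model(b_hat, forcing) - obs) ** 2))
print("estimated correction:", round(b_hat, 3), "fine-grid RMSE:", round(rmse, 4))
```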

