regularization term
Recently Published Documents


TOTAL DOCUMENTS

104
(FIVE YEARS 42)

H-INDEX

11
(FIVE YEARS 3)

2022 ◽  
Vol 16 (4) ◽  
pp. 1-21
Author(s):  
Honghui Xu ◽  
Zhipeng Cai ◽  
Wei Li

Multi-label image recognition is an indispensable component of many real-world computer vision applications. However, existing studies have overlooked a severe threat of privacy leakage in multi-label image recognition. To fill this gap, two privacy-preserving models, Privacy-Preserving Multi-label Graph Convolutional Networks (P2-ML-GCN) and Robust P2-ML-GCN (RP2-ML-GCN), are developed in this article, in which a differential privacy mechanism is applied to the model's outputs to defend against black-box attacks while avoiding large aggregated noise. In particular, a regularization term is added to the loss function of RP2-ML-GCN to increase prediction accuracy and robustness. A suitable differential privacy mechanism is then designed to decrease the bias of the loss function in P2-ML-GCN and increase prediction accuracy. In addition, we show that a bounded global sensitivity can mitigate the side effects of excessive noise and improve multi-label image recognition performance in our models. Theoretical proofs show that both models guarantee differential privacy for the model's outputs, weights, and input features while preserving robustness. Finally, comprehensive experiments validate the advantages of the proposed models, including the implementation of differential privacy on the model's outputs, the incorporation of the regularization term into the loss function, and the adoption of a bounded global sensitivity for multi-label image recognition.
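The two core ingredients described above, output perturbation calibrated to a bounded global sensitivity and a regularized loss, can be sketched generically. This is a minimal illustration of the standard Laplace mechanism and an L2 penalty, not the authors' actual implementation; all names and constants are assumptions.

```python
import numpy as np

def dp_outputs(logits, sensitivity, epsilon, rng=None):
    """Release model outputs under epsilon-differential privacy by adding
    Laplace noise with scale b = sensitivity / epsilon. `sensitivity` is
    assumed to be a bounded (e.g. clipped) global sensitivity of the scores."""
    rng = np.random.default_rng(rng)
    scale = sensitivity / epsilon
    return logits + rng.laplace(0.0, scale, size=logits.shape)

def regularized_loss(base_loss, weights, lam=1e-3):
    """Base classification loss plus an L2 regularization term on the model
    weights, a generic stand-in for the robustness term used in RP2-ML-GCN."""
    return base_loss + lam * sum(np.sum(w * w) for w in weights)
```

A smaller `sensitivity` directly shrinks the injected noise, which is why bounding the global sensitivity mitigates the side effect of excessive noise.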


2022 ◽  
Vol 14 (2) ◽  
pp. 283
Author(s):  
Biao Qi ◽  
Longxu Jin ◽  
Guoning Li ◽  
Yu Zhang ◽  
Qiang Li ◽  
...  

This study combines the co-occurrence analysis shearlet transform (CAST), latent low-rank representation (LatLRR), and a regularization term based on counting zero crossings in differences to fuse heterogeneous images. First, the source images are decomposed by the CAST method into base-layer and detail-layer sub-images. Second, for the base-layer components with larger-scale intensity variation, LatLRR, an effective method for extracting salient information from source images, is applied to generate saliency maps that drive an adaptive weighted fusion of the base-layer images. Meanwhile, the count of zero crossings in differences, a classic optimization tool, is designed as the regularization term for the fusion of the detail-layer images. In this way, the gradient information concealed in the source images is extracted as fully as possible, so the fused image carries richer edge information. Quantitative and qualitative analyses on publicly available datasets demonstrate that, compared with other state-of-the-art algorithms, the proposed method better enhances contrast while achieving a faithful fusion result.
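The adaptive weighted fusion of base layers described above can be sketched as a per-pixel convex combination driven by saliency maps. This is a generic illustration, assuming the saliency maps are non-negative; it does not reproduce the LatLRR decomposition itself.

```python
import numpy as np

def fuse_base_layers(base_a, base_b, sal_a, sal_b, eps=1e-8):
    """Adaptively fuse two base-layer images: each pixel is a convex
    combination weighted by the (assumed non-negative) saliency maps.
    `eps` avoids division by zero where both saliencies vanish."""
    w_a = sal_a / (sal_a + sal_b + eps)
    return w_a * base_a + (1.0 - w_a) * base_b
```

Where one source is clearly more salient, its pixels dominate the fused base layer; equal saliency reduces to a plain average.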


Electronics ◽  
2022 ◽  
Vol 11 (2) ◽  
pp. 182
Author(s):  
Rongfang Wang ◽  
Yali Qin ◽  
Zhenbiao Wang ◽  
Huan Zheng

Achieving high-quality image reconstructions is the focus of research in image compressed sensing. Group sparse representation improves the quality of reconstructed images by exploiting the non-local similarity of images; however, block matching and dictionary learning during image-group construction lead to long reconstruction times and artifacts in the reconstructed images. To solve these problems, a joint regularized image reconstruction model based on group sparse representation (GSR-JR) is proposed. A group sparse coefficient regularization term ensures the sparsity of the group coefficients and reduces the complexity of the model, while a group sparse residual regularization term introduces prior information about the image to improve reconstruction quality. The alternating direction method of multipliers and an iterative thresholding algorithm are applied to solve the optimization problem. Simulation experiments confirm that the optimized GSR-JR model is superior to other advanced image reconstruction models in reconstructed image quality and visual effect. At a sensing rate of 0.1, compared with the group sparse residual constraint with nonlocal prior (GSRC-NLR) model, the gains in peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) reach up to 4.86 dB and 0.1189, respectively.
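The iterative thresholding algorithm mentioned above centers on the soft-thresholding operator, the proximal map of an L1 sparsity term. The following is a simplified single-regularizer sketch, assuming a linear sensing operator `A`; the GSR-JR model itself couples two regularization terms and also uses ADMM.

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft-thresholding: the proximal operator of tau * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista_step(alpha, A, y, lam, step):
    """One iterative-shrinkage step for min ||y - A @ alpha||^2 + lam * ||alpha||_1:
    a gradient step on the data-fit term followed by soft-thresholding."""
    grad = A.T @ (A @ alpha - y)
    return soft_threshold(alpha - step * grad, step * lam)
```

Repeating `ista_step` drives small coefficients to exactly zero, which is how the sparsity-inducing regularization term manifests in the solver.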


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Kai Si ◽  
Min Zhou ◽  
Yingfang Qiao

The rapid development of web technology has brought new problems and challenges to recommendation systems: on one hand, traditional collaborative filtering algorithms struggle to meet users' personalized recommendation needs; on the other hand, the massive data produced on the web provides more useful information for recommendation algorithms. How to extract features from this information, alleviate sparsity and dynamic-timeliness issues, and effectively improve recommendation quality is a hot topic in recommendation-algorithm research. In view of the lack of an effective multisource information fusion mechanism in existing work, an improved multisensor-node collaborative filtering recommendation algorithm for 5G multimedia precision marketing is proposed. By expanding the input vector field, features from users' social relations and review information are extracted and fused, solving the problem of jointly modelling these two kinds of important auxiliary information. The objective function is improved: a social regularization term and an internal regularization term in the vector domain are analysed and added from the perspectives of practical significance and vector structure, which alleviates the overfitting problem. Experiments on large real-world datasets show that the proposed method achieves higher recommendation quality than classical and mainstream baseline algorithms.
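An objective of the kind described, prediction error plus an internal (L2) regularization term plus a social regularization term tying each user's latent factors to those of linked users, can be written out as a loss. This is a generic matrix-factorization sketch under assumed names (`U`, `V` latent factors, `S` a social-link matrix), not the paper's exact model.

```python
import numpy as np

def social_mf_loss(R, mask, U, V, S, lam=0.1, beta=0.1):
    """Matrix-factorization objective with three parts:
    1) squared error on observed ratings (mask marks observed entries),
    2) internal L2 regularization on the latent factors,
    3) a social regularization term pulling each user's factor toward
       the factors of users they are linked to (S[i, j] > 0)."""
    pred_err = np.sum(mask * (R - U @ V.T) ** 2)
    internal = lam * (np.sum(U * U) + np.sum(V * V))
    social = beta * sum(
        S[i, j] * np.sum((U[i] - U[j]) ** 2)
        for i in range(len(U)) for j in range(len(U)) if S[i, j] > 0
    )
    return pred_err + internal + social
```

The social term is what injects the auxiliary relationship information: minimizing it keeps connected users' preference vectors close, which combats overfitting on sparse rating data.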


2021 ◽  
pp. 116004
Author(s):  
Cuneyt Akinlar ◽  
Hatice Kubra Kucukkartal ◽  
Cihan Topal

Geophysics ◽  
2021 ◽  
pp. 1-103
Author(s):  
Xiaolong Wei ◽  
Jiajia Sun

The non-uniqueness problem in geophysical inversion, especially potential-field inversion, is widely recognized. It is argued that uncertainty analysis of a recovered model should be as important as finding an optimal model. However, quantifying uncertainty remains challenging, especially for 3D inversions, in both deterministic and Bayesian frameworks. Our objective is to develop an efficient method to empirically quantify the uncertainty of the physical property models recovered from 3D potential-field inversion. We work in a deterministic framework in which an objective function consisting of a data misfit term and a regularization term is minimized. We perform inversions using a mixed Lp-norm formulation in which various combinations of Lp norms (0 <= p <= 2) can be implemented on different components of the regularization term. Specifically, we randomly sample the p-norm values multiple times and generate a large, diverse sequence of physical property models that all reproduce the observed geophysical data equally well. This suite of models offers practical insight into the uncertainty of the recovered model features. We quantify the uncertainty through standard deviations and interquartile ranges, as well as visualizations in box plots and histograms. Numerical results for a realistic synthetic density model, created based on a ring-shaped igneous intrusive body, quantitatively illustrate the uncertainty reduction achieved by imposing different amounts of prior information on the inversions. We also apply the method to a field data set over the Decorah area in northeastern Iowa, adopting an acceptance-rejection strategy to generate 31 equivalent models from which the uncertainties of the inverted models, as well as the volume and mass estimates, are quantified.
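The sampling strategy above, drawing p-norm values at random, running an inversion for each, and taking voxel-wise statistics across the resulting suite of models, can be sketched schematically. Here `invert(p)` is a hypothetical stand-in for a full mixed Lp-norm inversion, and the smoothed |m|^p regularizer follows a common differentiable approximation; both are assumptions, not the authors' code.

```python
import numpy as np

def lp_norm_term(m, p, eps=1e-6):
    """Approximate Lp regularizer sum(|m_i|^p) for 0 <= p <= 2, with a small
    eps keeping the term differentiable near zero (a common approximation)."""
    return np.sum((m * m + eps) ** (p / 2.0))

def sample_models(invert, n_samples, rng=None):
    """Randomly sample p in [0, 2], run `invert(p)` for each draw, and return
    the suite of recovered models plus their voxel-wise standard deviation,
    an empirical measure of model uncertainty."""
    rng = np.random.default_rng(rng)
    models = np.stack([invert(rng.uniform(0.0, 2.0)) for _ in range(n_samples)])
    return models, models.std(axis=0)
```

Features with small standard deviation across the suite are well constrained by the data; large spread flags the parts of the model most affected by non-uniqueness.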


2021 ◽  
Vol 11 (14) ◽  
pp. 6485
Author(s):  
Tao Song ◽  
Xing Hu ◽  
Wei Du ◽  
Lianzheng Cheng ◽  
Tiaojie Xiao ◽  
...  

As a popular population-based heuristic evolutionary algorithm, differential evolution (DE) has been widely applied to various science and engineering problems. Like other global nonlinear algorithms, such as the genetic algorithm, simulated annealing, and particle swarm optimization, DE is mostly applied to parametric inverse problems and has few applications in physical property inversion. To our knowledge, this is the first time DE has been applied to recover the physical property distribution from gravity data due to causative sources embedded in the subsurface. In this work, the search direction of DE is guided by better vectors, enhancing the exploration efficiency of the mutation strategy. In addition, to reduce the excessive randomness of the DE algorithm, the perturbation directions in the mutation operations are smoothed using a weighted moving-average technique, and an Lp-norm regularization term is implemented to sharpen the boundary of the density distribution. Meanwhile, during the DE search, the effect of the Lp-norm regularization term is controlled adaptively so that it always has an impact alongside the data misfit function. In the synthetic anomaly case, both noise-free and noisy data sets are considered. For the field case, gravity anomalies originating from the Shihe iron ore deposit in China are inverted and interpreted; the reconstructed density distribution is in good agreement with drill-hole information. Based on the tests in the present study, one can conclude that Lp-norm inversion using DE is a useful tool for recovering physical property distributions from gravity anomalies.
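A best-guided DE mutation whose perturbation direction is smoothed by a moving average, as described above, can be sketched as follows. The mutation form, scale factor, and uniform smoothing kernel are illustrative assumptions; the paper's weighted kernel and adaptive regularization control are not reproduced here.

```python
import numpy as np

def smoothed_mutation(pop, best, F=0.5, window=3, rng=None):
    """Best-guided DE mutation with a smoothed perturbation direction.
    For each individual, the direction (best - x_i) + (x_r1 - x_r2) is
    filtered with a moving average before being scaled by F and applied."""
    rng = np.random.default_rng(rng)
    n = len(pop)
    kernel = np.ones(window) / window          # illustrative uniform kernel
    mutants = np.empty_like(pop)
    for i in range(n):
        r1, r2 = rng.choice(n, size=2, replace=False)
        direction = best - pop[i] + pop[r1] - pop[r2]
        smooth = np.convolve(direction, kernel, mode="same")
        mutants[i] = pop[i] + F * smooth
    return mutants
```

Smoothing the perturbation along the model axis suppresses cell-to-cell jitter in the candidate density models, which is the stated motivation for taming DE's randomness.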

