Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds

Author(s): Shuailing Feng, Wen Huang, Lele Song, Shihui Ying, Tieyong Zeng
2020, Vol. 30(1), pp. 210-239

Author(s): Shixiang Chen, Shiqian Ma, Anthony Man-Cho So, Tong Zhang
2015, Vol. 56, pp. 160

Author(s): Jueyou Li, Changzhi Wu, Zhiyou Wu, Qiang Long, Xiangyu Wang

Author(s): Patrick Knöbelreiter, Thomas Pock

Abstract
In this work, we propose a learning-based method to denoise and refine disparity maps. The proposed variational network arises naturally from unrolling the iterates of a proximal gradient method applied to a variational energy defined in a joint disparity, color, and confidence image space. Our method allows us to learn a robust collaborative regularizer that leverages the joint statistics of the color image, the confidence map, and the disparity map. Due to the variational structure of our method, the individual steps can be easily visualized, making the method interpretable. We can therefore provide interesting insights into how our method refines and denoises disparity maps. To this end, we visualize and interpret the learned filters and activation functions and demonstrate the increased reliability of the predicted pixel-wise confidence maps. Furthermore, the optimization-based structure of our refinement module allows us to compute eigen disparity maps, which reveal structural properties of the refinement module. The efficiency of our method is demonstrated on the publicly available stereo benchmarks Middlebury 2014 and KITTI 2015.
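The abstract only sketches the unrolling idea, so the following is a minimal, hypothetical illustration in plain NumPy, not the paper's variational network: it unrolls a fixed number of proximal gradient steps on a toy energy with a quadratic data term and an l1 regularizer. The function names, the step size `step`, the weight `tau`, and the soft-thresholding prox are placeholders chosen for this sketch; in the learned method the filters, activation functions, and step sizes would be trainable and the regularizer would jointly couple disparity, color, and confidence.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (a stand-in for the learned regularizer)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unrolled_proximal_gradient(d0, d_noisy, step=0.5, tau=0.1, num_layers=8):
    """Unroll a fixed number of proximal gradient iterations.

    Minimizes the toy energy 0.5 * ||d - d_noisy||^2 + tau * ||d||_1:
    each 'layer' is a gradient step on the quadratic data term followed
    by the proximal map of the (here: fixed, hand-chosen) regularizer.
    """
    d = d0.copy()
    for _ in range(num_layers):
        grad = d - d_noisy                               # gradient of the data term
        d = soft_threshold(d - step * grad, step * tau)  # proximal step on the regularizer
    return d

# Toy usage: refine a noisy piecewise-constant 1-D "disparity" signal.
rng = np.random.default_rng(0)
clean = np.repeat([1.0, 3.0, 2.0], 20)
noisy = clean + 0.3 * rng.standard_normal(clean.size)
refined = unrolled_proximal_gradient(noisy, noisy)
```

Because the number of layers is fixed, each iteration can be treated as a network layer whose parameters are learned end to end, which is the sense in which the variational network "arises from unrolling" the proximal gradient iterations.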

