Abstract: Tensor decomposition is a powerful tool for extracting physically meaningful latent factors from multi-dimensional nonnegative data, and has attracted increasing interest in a variety of fields such as image processing, machine learning, and computer vision. In this talk, we propose a sparse nonnegative Tucker decomposition and completion approach for recovering underlying nonnegative data from incomplete and generally noisy observations. Here the underlying nonnegative data tensor is decomposed into a core tensor and several factor matrices, with all entries being nonnegative and the factor matrices being sparse. The loss function is derived from the maximum likelihood estimation of the noisy observations, and the $\ell_0$ norm is employed to enhance the sparsity of the factor matrices. We establish the error bound of the estimator of the proposed model under generic noise scenarios, which is then specialized to observations with additive Gaussian noise, additive Laplace noise, and Poisson observations, respectively. Our theoretical results improve upon those of existing tensor-based and matrix-based methods. Moreover, the minimax lower bounds are shown to match the derived upper bounds up to logarithmic factors. Numerical experiments on both synthetic and real-world data sets demonstrate the superiority of the proposed method for nonnegative tensor data completion.
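As a minimal sketch of the Tucker structure described above (all sizes and variable names are illustrative assumptions, not the speaker's implementation): a nonnegative data tensor is modeled as a core tensor multiplied along each mode by sparse, nonnegative factor matrices, so the reconstruction stays entrywise nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nonnegative core tensor and factor matrices (illustrative sizes).
G = rng.random((2, 3, 2))   # core tensor
A1 = rng.random((5, 2))     # mode-1 factor matrix
A2 = rng.random((6, 3))     # mode-2 factor matrix
A3 = rng.random((4, 2))     # mode-3 factor matrix

# Zero out small entries to mimic the sparsity that the $\ell_0$
# regularization enforces on the factor matrices.
for A in (A1, A2, A3):
    A[A < 0.5] = 0.0

# Reconstruction X = G x_1 A1 x_2 A2 x_3 A3 (mode-n products via einsum).
X = np.einsum('abc,ia,jb,kc->ijk', G, A1, A2, A3)

print(X.shape)        # (5, 6, 4)
print((X >= 0).all()) # True: nonnegativity of all factors is preserved
```

The completion problem then seeks such a factorization that agrees with the observed entries of a partially known tensor under the assumed noise model.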
Speaker bio: 张雄军 is an associate professor and doctoral supervisor at the School of Mathematics and Statistics, Central China Normal University. He received his Ph.D. from Hunan University in 2017, was an exchange Ph.D. student at Hong Kong Baptist University from 2015 to 2016, and was a postdoctoral fellow at the University of Hong Kong from 2020 to 2021. In 2019 he received the Hunan Province Outstanding Doctoral Dissertation Award. He has led a General Program project and a Young Scientists Fund project of the National Natural Science Foundation of China, as well as a Youth Program project of the Hubei Provincial Natural Science Foundation. His main research interests include image processing and tensor optimization, and he has published nearly 30 papers in journals including IEEE TPAMI, IEEE TIT, IEEE TNNLS, SIAM J. Imaging Sci., SIAM J. Sci. Comput., Appl. Comput. Harmon. Anal., and Inverse Problems.
Host: 曾燎原