Funding: This work was supported by the National Natural Science Foundation of China (62073087, 62071132, 61973090).
Abstract: Deep matrix factorization (DMF) has been demonstrated to be a powerful tool for capturing the complex hierarchical information of multi-view data. However, existing multi-view DMF methods mainly explore the consistency of multi-view data while neglecting the diversity among different views as well as the high-order relationships of the data, resulting in the loss of valuable complementary information. In this paper, we design a hypergraph regularized diverse deep matrix factorization (HDDMF) model for multi-view data representation (MDR), which jointly utilizes multi-view diversity and high-order manifold structure in a multi-layer factorization framework. A novel diversity enhancement term is designed to exploit the structural complementarity between different views of the data. Hypergraph regularization is utilized to preserve the high-order geometric structure of the data in each view. An efficient iterative optimization algorithm is developed to solve the proposed model, with theoretical convergence analysis. Experimental results on five real-world data sets demonstrate that the proposed method significantly outperforms state-of-the-art multi-view learning approaches.
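To make the multi-layer factorization with hypergraph regularization concrete, the following is a minimal sketch, assuming a k-nearest-neighbor hyperedge construction, a two-layer factorization of a single view, and a plain reconstruction-plus-trace objective; the layer sizes, the helper names (knn_hypergraph_laplacian, hddmf_like_objective), and the omission of the diversity term are illustrative assumptions, not the authors' exact HDDMF formulation.

```python
# Illustrative sketch only: a two-layer factorization of one view with a
# hypergraph Laplacian regularizer, loosely in the spirit of the HDDMF abstract.
import numpy as np

def knn_hypergraph_laplacian(X, k=5):
    """Normalized hypergraph Laplacian; one hyperedge per sample = its k-NN set."""
    n = X.shape[1]                                   # X is d x n (features x samples)
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)   # pairwise squared distances
    H = np.zeros((n, n))                             # incidence: vertices x hyperedges
    for j in range(n):
        H[np.argsort(d2[:, j])[:k + 1], j] = 1.0     # sample j plus its k nearest neighbors
    W = np.ones(n)                                   # unit hyperedge weights (assumption)
    Dv = H @ W                                       # vertex degrees
    De = H.sum(axis=0)                               # hyperedge degrees
    S = (H * W / De) @ H.T                           # H W De^{-1} H^T
    Dv_inv_sqrt = 1.0 / np.sqrt(Dv)
    return np.eye(n) - (Dv_inv_sqrt[:, None] * S) * Dv_inv_sqrt[None, :]

def hddmf_like_objective(X, Z1, Z2, V, L, lam=1.0):
    """||X - Z1 Z2 V||_F^2 + lam * tr(V L V^T) for one view (diversity term omitted)."""
    R = X - Z1 @ Z2 @ V
    return (R ** 2).sum() + lam * np.trace(V @ L @ V.T)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100))                   # one view: 50 features, 100 samples
Z1, Z2 = rng.standard_normal((50, 20)), rng.standard_normal((20, 10))
V = rng.standard_normal((10, 100))                   # final low-dimensional representation
L = knn_hypergraph_laplacian(X)
print(hddmf_like_objective(X, Z1, Z2, V, L))
```

In the full model, the factors of each view and the shared representation would be updated alternately to minimize such an objective, with an additional term encouraging the view-specific representations to differ from one another.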
Abstract: Diffusion models are effective purification methods, in which noise or adversarial attacks are removed using generative approaches before pre-existing classifiers perform classification tasks. However, the efficiency of diffusion models remains a concern, and existing solutions are based on knowledge distillation, which can jeopardize generation quality because of the small number of generation steps. Hence, we propose TendiffPure, a tensorized and compressed diffusion model for purification. Unlike knowledge distillation methods, we directly compress the U-Nets used as backbones of diffusion models via tensor-train decomposition, which reduces the number of parameters and captures more spatial information in multi-dimensional data such as images. The space complexity is reduced from O(N^2) to O(NR^2), with R ≤ 4 the tensor-train rank and N the number of channels. Experimental results show that TendiffPure obtains high-quality purification results more efficiently and outperforms the baseline purification methods on the CIFAR-10, Fashion-MNIST, and MNIST datasets under two types of noise and one adversarial attack.
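As a rough illustration of how a tensor-train factorization shrinks a channel-mixing weight, here is a minimal sketch of TT-SVD applied to an N x N matrix (such as a 1x1 convolution) with a rank cap R; the reshape into equal factors, the helper tt_svd, and the parameter-count comparison are assumptions for illustration, not the TendiffPure implementation.

```python
# Illustrative sketch only: TT-SVD compression of an N x N channel-mixing weight,
# showing how low-rank tensor-train cores shrink the parameter count.
import numpy as np

def tt_svd(A, max_rank):
    """Decompose tensor A (shape n1 x ... x nd) into TT cores with ranks <= max_rank."""
    dims = A.shape
    cores, r_prev = [], 1
    C = A.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        C = C.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))   # k-th TT core
        C = s[:r, None] * Vt[:r]                             # carry the remainder forward
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))             # last TT core
    return cores

N, R = 64, 4
W = np.random.default_rng(0).standard_normal((N, N))   # dense channel-mixing weight
cores = tt_svd(W.reshape(8, 8, 8, 8), max_rank=R)       # N*N entries reshaped into 8x8x8x8
full_params = N * N
tt_params = sum(c.size for c in cores)
print(full_params, tt_params)                           # 4096 dense entries vs. a few hundred
```

Truncating the ranks trades reconstruction accuracy for compression, which is why a small cap such as R ≤ 4 yields the large parameter savings quoted in the abstract.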
Abstract: Tensor decomposition and tensor networks (TNs) are factorizations of high-order tensors into a network of low-order tensors, and they have been studied in quantum physics, chemistry, and applied mathematics. In recent years, TNs have been increasingly investigated and applied in machine learning and AI, owing to their significant efficacy in modeling large-scale and high-order data, representing model parameters in deep neural networks, and accelerating computations for learning algorithms. In particular, TNs have been exploited to solve several challenging problems in data completion, model compression, multimodal fusion, multi-task knowledge sharing, and the theoretical analysis of deep neural networks. More potential technologies using TNs are rapidly emerging and finding many interesting applications in machine learning, such as modeling probability functions and probabilistic graphical models and implementing efficient TN computations on GPUs. However, the topic of TNs in machine learning is relatively young, and many open problems remain to be fully explored. This special topic aims to promote research and development related to innovative TN technology from the perspectives of fundamental theory and algorithms, novel approaches in machine learning and deep neural networks, and various applications in computer vision, biomedical image processing, and many other related fields.
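To make the phrase "a network of low-order tensors" concrete, the minimal sketch below contracts three small tensor-train cores back into a single third-order tensor with einsum; the shapes and ranks are arbitrary choices for illustration only.

```python
# Illustrative sketch only: contracting a chain ("train") of three low-order cores
# into one third-order tensor; the shared rank indices are summed out by einsum.
import numpy as np

rng = np.random.default_rng(0)
G1 = rng.standard_normal((1, 4, 3))   # core 1: (rank0, mode size, rank1)
G2 = rng.standard_normal((3, 5, 3))   # core 2: (rank1, mode size, rank2)
G3 = rng.standard_normal((3, 6, 1))   # core 3: (rank2, mode size, rank3)

# Contract the internal rank indices a, b, c, d; the free mode indices i, j, k remain.
T = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)
print(T.shape)                        # (4, 5, 6): 120 entries represented by 75 core entries
```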