Improving meta-learning model via meta-contrastive loss (Cited by: 2)
Authors: Pinzhuo TIAN, Yang GAO. Frontiers of Computer Science (SCIE, EI, CSCD), 2022, Issue 5, pp. 107-113 (7 pages)
Recently, addressing the few-shot learning problem within the meta-learning framework has achieved great success. Regularization is a powerful technique widely used to improve machine learning algorithms, yet little research focuses on designing appropriate meta-regularizations to further improve the generalization of meta-learning models in few-shot learning. In this paper, we propose a novel meta-contrastive loss that can be regarded as a regularization to fill this gap. Our method is motivated by the observation that the limited data in few-shot learning is only a small sample drawn from the whole data distribution, and different sampled subsets can therefore lead to biased representations of that distribution. Thus, the models trained on the few training data (support set) and test data (query set) may be misaligned in model space, so a model learned on the support set cannot generalize well to the query data. The proposed meta-contrastive loss is designed to align the models of the support and query sets to overcome this problem, thereby improving the performance of the meta-learning model in few-shot learning. Extensive experiments demonstrate that our method improves the performance of different gradient-based meta-learning models on various learning problems, e.g., few-shot regression and classification.
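To make the core idea concrete, the sketch below illustrates one plausible form of a contrastive alignment loss between support-set and query-set models: for each task, the pair of (support-trained, query-trained) parameter vectors is treated as a positive pair, and pairs drawn from different tasks as negatives, in an InfoNCE-style objective. This is a hypothetical minimal sketch under assumed design choices (cosine similarity over flattened parameters, the `meta_contrastive_loss` name, the temperature value), not the paper's actual formulation.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two flattened parameter vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def meta_contrastive_loss(support_params, query_params, temperature=0.5):
    """InfoNCE-style alignment loss over a batch of tasks.

    support_params, query_params: lists of flattened parameter vectors,
    one pair per task. The support/query pair from the same task is the
    positive; cross-task pairs serve as negatives.
    """
    n = len(support_params)
    losses = []
    for i in range(n):
        # Similarities of task i's support model to every query model.
        sims = np.array([cosine_sim(support_params[i], q) / temperature
                         for q in query_params])
        # Cross-entropy against the matching (positive) task index i.
        log_prob = sims[i] - np.log(np.exp(sims).sum())
        losses.append(-log_prob)
    return float(np.mean(losses))
```

Under this sketch, minimizing the loss pulls each task's support-set model toward its own query-set model while pushing it away from models of other tasks, which matches the paper's stated goal of aligning the two models in model space.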
Keywords: meta-learning, few-shot learning, meta-regularization, deep learning