Journal of Jilin University (Science Edition), 2023, Vol. 61, Issue (1): 111-117.


Few-Shot Learning Based on Contrastive Learning Method

FU Haitao1, LIU Shuo1, FENG Yuxuan1, ZHU Li1, ZHANG Jingji1, GUAN Lu2   

  1. College of Information Technology, Jilin Agricultural University, Changchun 130118, China; 2. School of Economics and Management, Changchun University of Science and Technology, Changchun 130022, China
  • Received: 2022-07-24 Online: 2023-01-26 Published: 2023-01-26
  • Corresponding author: GUAN Lu, E-mail: Guanlu2021@163.com


Abstract: Aiming at the problems existing in current few-shot learning, we designed a new network structure and a corresponding training method to improve few-shot learning. In the feature embedding part of the network, a convolutional network combined with a multi-scale sliding pooling method was used to enhance feature extraction. The main structure of the network was a Siamese-like network, so as to facilitate learning semantics from small-sample data through comparison between samples. The training method adopted nested-level parameter updating to ensure the stability of convergence. Comparative experiments with common visual models and state-of-the-art few-shot learning methods were conducted on two classical few-shot learning datasets. The results show that the proposed method significantly improves the accuracy of few-shot learning and can serve as a solution when samples are insufficient.
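
The network described in the abstract can be sketched briefly. The following PyTorch code is a minimal illustration under stated assumptions, not the authors' implementation: a shared convolutional embedding whose feature map is average-pooled by sliding windows at several scales (the multi-scale sliding pooling), followed by a Siamese-style comparison that scores each query against each support sample. All layer sizes, window sizes, and the cosine-similarity head are illustrative choices.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleSlidingPool(nn.Module):
    """Average-pool the feature map with sliding (stride-1) windows of
    several sizes, globally average each result, and concatenate."""
    def __init__(self, kernel_sizes=(2, 3, 5)):
        super().__init__()
        self.kernel_sizes = kernel_sizes

    def forward(self, x):                              # x: (B, C, H, W)
        vecs = [F.avg_pool2d(x, k, stride=1).mean(dim=(2, 3))
                for k in self.kernel_sizes]            # each (B, C)
        return torch.cat(vecs, dim=1)                  # (B, C * n_scales)

class Embedding(nn.Module):
    """Convolutional extractor followed by multi-scale sliding pooling."""
    def __init__(self, out_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.pool = MultiScaleSlidingPool()
        self.fc = nn.Linear(64 * 3, out_dim)           # 3 pooling scales

    def forward(self, x):
        return self.fc(self.pool(self.conv(x)))

class SiameseFewShot(nn.Module):
    """Embed support and query images with shared weights and score each
    query against each support sample by cosine similarity."""
    def __init__(self):
        super().__init__()
        self.embed = Embedding()

    def forward(self, support, query):
        s = F.normalize(self.embed(support), dim=1)    # (N*K, D)
        q = F.normalize(self.embed(query), dim=1)      # (Q, D)
        return q @ s.t()                               # (Q, N*K) similarities

model = SiameseFewShot()
support = torch.randn(5, 3, 84, 84)                    # e.g. 5-way 1-shot support
query = torch.randn(2, 3, 84, 84)
print(model(support, query).shape)                     # torch.Size([2, 5])

The abstract further mentions nested-level parameter updating as the training method. One common way to realise such an inner/outer (nested) update is a first-order meta-learning rule in the spirit of Reptile, sketched below as a continuation of the code above; the interpolation rule, the step sizes, and the episode_loss helper are assumptions for illustration, not the paper's exact procedure.

import copy

def outer_step(model, episode_loss, meta_lr=0.1, inner_lr=0.01, inner_steps=5):
    """Inner level: adapt a throwaway copy of the network on one episode.
    Outer level: move the shared weights a small step toward the copy,
    which damps the noise of individual episodes and stabilises training."""
    fast = copy.deepcopy(model)
    opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
    for _ in range(inner_steps):                       # nested inner loop
        opt.zero_grad()
        episode_loss(fast).backward()
        opt.step()
    with torch.no_grad():                              # outer update
        for p, q in zip(model.parameters(), fast.parameters()):
            p += meta_lr * (q - p)

def episode_loss(net):                                 # hypothetical episode sampler
    """Loss of one randomly drawn 5-way episode (dummy tensors here)."""
    support = torch.randn(5, 3, 84, 84)
    query = torch.randn(2, 3, 84, 84)
    target = torch.randint(0, 5, (2,))
    return F.cross_entropy(net(support, query), target)

for _ in range(10):                                    # outer training loop
    outer_step(model, episode_loss)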

Key words: few-shot learning, contrastive learning, Siamese network, sliding pooling

CLC number: 

  • TP181