Journal of Jilin University (Science Edition) ›› 2026, Vol. 64 ›› Issue (2): 301-310.


Application of a Lightweight Model Based on the Synergy of Multiple Compression Mechanisms in Classroom Behavior Detection

WU Jian1, WANG Xingwang1, SUN Yafeng1, YU Meiming2

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China;
    2. Public Computer Education and Research Center, Jilin University, Changchun 130012, China
  • Received: 2024-12-05; Online: 2026-03-26; Published: 2026-03-26
  • Corresponding author: YU Meiming, E-mail: yumm@jlu.edu.cn

Application of a Lightweight Model Based on the Synergy of Multiple Compression Mechanisms in Classroom Behavior Detection

WU Jian1, WANG Xingwang1, SUN Yafeng1, YU Meiming2   

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China;
    2. Public Computer Education and Research Center, Jilin University, Changchun 130012, China
  • Received: 2024-12-05; Online: 2026-03-26; Published: 2026-03-26

Abstract: To address the problem that existing high-accuracy object detection models are very large and computationally expensive, and therefore difficult to deploy widely in resource-constrained environments, we proposed a collaborative lightweighting method integrating knowledge distillation, model quantization, and network pruning, so as to construct an efficient behavior detection model for classroom scenarios. The method first achieved effective knowledge transfer from a large model to medium and small models through step-by-step distillation, then combined structured pruning to obtain a lightweight network comparable to small-scale models, and further adopted quantized feature distillation during quantization-aware training to improve the feature expression ability of the model. Experimental results show that, while keeping a small parameter size and computational cost, the improved lightweight model achieves significantly higher detection accuracy than the original small model, and its performance improves consistently on multiple classroom behavior datasets.
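The step-by-step distillation described above rests on the standard soft-target loss: the student matches the teacher's temperature-scaled output distribution while also fitting the ground-truth labels. The paper's exact loss and hyperparameters are not given here, so the following NumPy sketch uses the classic formulation with assumed placeholder values for the temperature `T` and mixing weight `alpha`:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Distillation loss: alpha * KL(teacher_T || student_T) * T^2
    plus (1 - alpha) * cross-entropy with the hard labels."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1)) * T * T
    p_hard = softmax(student_logits)
    hard = -np.mean(np.log(p_hard[np.arange(len(labels)), labels]))
    return alpha * soft + (1.0 - alpha) * hard
```

The `T * T` factor keeps the gradient magnitude of the soft term comparable across temperatures; in the paper's "large → medium → small" pipeline, each distillation stage would reuse this loss with a different teacher–student pair.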

Keywords: knowledge distillation, lightweight model, classroom behavior detection, model pruning, student model

Abstract: Existing high-accuracy object detection models are large and computationally expensive, which makes them difficult to deploy widely in resource-limited environments. To address this problem, we proposed a collaborative lightweight method that integrated knowledge distillation, model quantization, and network pruning to construct an efficient behavior detection model suitable for classroom scenarios. The method first achieved effective knowledge transfer from large to medium and small models through step-by-step distillation, then combined structured pruning to obtain a lightweight network comparable in size to small-scale models, and further used quantized feature distillation to enhance the feature expression ability of the model during quantization-aware training. Experimental results show that the improved lightweight model achieves significantly higher detection accuracy than the original small model while maintaining a small parameter size and low computational complexity, and its performance improves consistently on multiple classroom behavior datasets.
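Quantization-aware training, mentioned in the abstract, typically inserts "fake quantization" into the forward pass: activations and weights are rounded to a low-bit integer grid and immediately dequantized, so the network learns to tolerate quantization error before real integer deployment. A minimal sketch assuming uniform affine 8-bit quantization (the paper's actual quantization scheme is not detailed here):

```python
import numpy as np

def fake_quantize(x, num_bits=8):
    """Round x to a uniform affine integer grid, then dequantize.
    Simulates inference-time quantization error during training."""
    x = np.asarray(x, dtype=float)
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    if scale == 0:                       # constant tensor: nothing to quantize
        return x.copy()
    zero_point = np.round(qmin - x.min() / scale)
    q = np.clip(np.round(x / scale + zero_point), qmin, qmax)
    return (q - zero_point) * scale      # dequantize back to float
```

In the paper's quantized feature distillation, a distillation loss would additionally pull the fake-quantized student features toward the teacher's full-precision features, compensating for the representation capacity lost to rounding.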

Key words: knowledge distillation, lightweight model, classroom behavior detection, model pruning, student model
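Structured pruning, listed among the keywords, removes whole channels rather than individual weights, so the pruned network stays dense and hardware-friendly. The paper's ranking criterion is not specified on this page; the sketch below assumes the common L1-norm channel-importance heuristic:

```python
import numpy as np

def prune_channels(weight, keep_ratio=0.5):
    """Rank the output channels of a conv weight (out, in, kh, kw) by
    L1 norm and keep the strongest fraction.
    Returns the pruned weight and the kept channel indices."""
    scores = np.abs(weight).sum(axis=(1, 2, 3))        # L1 norm per output channel
    n_keep = max(1, int(round(weight.shape[0] * keep_ratio)))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])  # strongest channels, in order
    return weight[keep], keep
```

In a full pipeline, the kept indices would also be used to slice the input dimension of the following layer, so the whole network shrinks consistently before the distillation and quantization stages refine it.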

CLC number: TP391