Journal of Jilin University Science Edition ›› 2026, Vol. 64 ›› Issue (2): 301-310.


Application of Lightweight Model Based on Synergy of Multiple Compression Mechanisms in Classroom Behavior Detection

WU Jian1, WANG Xingwang1, SUN Yafeng1, YU Meiming2   

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China;
    2. Public Computer Education and Research Center, Jilin University, Changchun 130012, China
  • Received: 2024-12-05 Online: 2026-03-26 Published: 2026-03-26

Abstract: Existing high-accuracy object detection models are large in scale and computationally expensive, which makes them difficult to deploy widely in resource-limited environments. To address this problem, we proposed a collaborative lightweight method that integrated knowledge distillation, model quantization, and network pruning to construct an efficient behavior detection model suitable for classroom scenarios. The method first achieved effective knowledge transfer from large to medium and small models through step-by-step distillation, then applied structured pruning to obtain a lightweight network comparable in scale to small models, and further used quantized feature distillation to enhance the model's feature representation ability during quantization-aware training. Experimental results show that the improved lightweight model achieves significantly higher detection accuracy than the original small model while maintaining a small parameter count and low computational complexity, and its performance improves consistently on multiple classroom behavior datasets.
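The step-by-step distillation mentioned in the abstract transfers knowledge from a teacher model through softened output distributions. As a minimal illustrative sketch (not the paper's implementation — the temperature `T`, weight `alpha`, and all function names here are assumptions), a standard distillation loss combines a temperature-scaled KL term on soft targets with an ordinary hard-label cross-entropy:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """alpha-weighted sum of soft-target KL (scaled by T^2, as is conventional)
    and hard-label cross-entropy on the student's unscaled outputs."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = softmax(student_logits)  # T = 1 for the hard-label term
    ce = -np.log(hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains; in a cascade (large → medium → small), each stage reuses the previous stage's outputs as the teacher logits.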

Key words: knowledge distillation, lightweight model, classroom behavior detection, model pruning, student model

CLC Number: TP391