Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for LightWeight Neural Networks

Document type: Foreign-language journal article

First author: Chen, Haoxuan

Authors: Chen, Haoxuan; Peng, Yangyang; Zhou, Hui; Liu, Ming; Chen, Haoxuan; Huang, Huamao; Hu, Haiying

Author affiliations:

Keywords: Oudemansiella raphanipes; quality grading; knowledge distillation; multi-teacher model

Journal: AGRICULTURE-BASEL (impact factor: 3.6; 5-year impact factor: 3.8)

ISSN:

Year/Volume/Issue: 2025, Vol. 15, Issue 3

Pages:

Indexed in: SCI

Abstract: Oudemansiella raphanipes is valued for its rich nutritional content and medicinal properties, but traditional manual grading methods are time-consuming and labor-intensive. To address this, deep learning techniques are employed to automate the grading process, and knowledge distillation (KD) is used to enhance the accuracy of a small-parameter model while maintaining low resource usage and fast response times on resource-limited devices. This study employs a three-teacher KD framework and investigates three cascaded structures: the parallel model, the standard series model, and the series model with residual connections (residual-series model). The student model is a lightweight ShuffleNet V2 0.5x, while the teacher models are VGG16, ResNet50, and Xception. Our experiments show that the cascaded structures improve the performance indices compared with the traditional ensemble model with equal weights; in particular, the residual-series model outperforms the other models, achieving a grading accuracy of 99.7% on the testing dataset with an average inference time of 5.51 ms. The findings of this study support the broader application of KD in resource-limited environments for automated quality grading.
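For orientation, the sketch below shows a generic multi-teacher KD training step in PyTorch, corresponding only to the equal-weight ensemble baseline that the abstract compares against; the paper's parallel, series, and residual-series cascaded structures are not reproduced here. All module choices, the number of quality grades, the temperature, and the loss weighting are illustrative assumptions, not taken from the authors' implementation.

```python
# Minimal sketch of equal-weight multi-teacher knowledge distillation (KD),
# assuming a PyTorch setup. Names and hyperparameters are illustrative only.
import torch
import torch.nn.functional as F
from torchvision import models

NUM_GRADES = 4  # assumed number of quality grades

# Student and two of the teachers named in the abstract; torchvision has no
# Xception, so a third teacher would need another library (e.g. timm).
student = models.shufflenet_v2_x0_5(num_classes=NUM_GRADES)
teachers = [models.vgg16(num_classes=NUM_GRADES),
            models.resnet50(num_classes=NUM_GRADES)]
for t in teachers:
    t.eval()  # teachers are frozen during distillation

def kd_loss(student_logits, teacher_logits_list, labels, T=4.0, alpha=0.7):
    """Equal-weight multi-teacher KD loss: softened KL term plus hard CE term."""
    # Average the teachers' softened distributions (parallel / ensemble form).
    teacher_probs = torch.stack(
        [F.softmax(t_logits / T, dim=1) for t_logits in teacher_logits_list]
    ).mean(dim=0)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        teacher_probs,
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients stay comparable to hard ones
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# One illustrative training step on a dummy batch of images.
x = torch.randn(8, 3, 224, 224)
y = torch.randint(0, NUM_GRADES, (8,))
with torch.no_grad():
    teacher_logits = [t(x) for t in teachers]
loss = kd_loss(student(x), teacher_logits, y)
loss.backward()
```

The cascaded structures studied in the paper differ from this baseline in how the teachers' outputs are combined before distillation; the code above only fixes the notation for the standard KD objective.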

Classification code:
