
MobileNet network optimization based on convolutional block attention module


ZHAO Shuxu, MEN Shiyao, YUAN Lin


(School of Electronics and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China)


Abstract: Deep learning technology is widely used in computer vision. In deep learning, large amounts of data are generally used to train the model weights so as to obtain a model with higher accuracy, but massive data and complex model structures demand considerable computing resources. Since people in practical application scenarios can generally carry and use only mobile and portable devices, neural networks face limitations in computing resources, model size and power consumption. Therefore, this study adopts the efficient lightweight MobileNet model as the base network for optimization. First, the accuracy of the MobileNet model is improved by adding the convolutional block attention module (CBAM), expansion convolution and related methods. Then, the model is compressed with pruning and weight quantization algorithms based on weight magnitude. Afterwards, a garbage classification data set is built using Python web crawlers, data augmentation and similar methods. Based on the above model optimization strategy, a garbage classification application is deployed on mobile phones and Raspberry Pi devices, making it more convenient to complete the garbage classification task.
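To make the attention mechanism named in the abstract concrete, the following is a minimal sketch of a CBAM block (channel attention followed by spatial attention, after Woo et al. [7]) of the kind that could be inserted after a MobileNet depthwise-separable block. It is written in PyTorch purely for illustration; the class name, the reduction ratio of 16 and the 7×7 spatial kernel are illustrative assumptions, not details taken from this paper.

import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Minimal CBAM sketch: channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16, spatial_kernel=7):
        super().__init__()
        # Channel attention: shared MLP applied to global avg- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # Spatial attention: single conv over channel-wise avg and max maps.
        self.spatial_conv = nn.Conv2d(2, 1, spatial_kernel, padding=spatial_kernel // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        avg_desc = self.mlp(x.mean(dim=(2, 3)))        # (B, C) from average pooling
        max_desc = self.mlp(x.amax(dim=(2, 3)))        # (B, C) from max pooling
        channel_scale = torch.sigmoid(avg_desc + max_desc).view(b, c, 1, 1)
        x = x * channel_scale                          # channel-refined features

        avg_map = x.mean(dim=1, keepdim=True)          # (B, 1, H, W)
        max_map = x.amax(dim=1, keepdim=True)          # (B, 1, H, W)
        spatial_scale = torch.sigmoid(self.spatial_conv(torch.cat([avg_map, max_map], dim=1)))
        return x * spatial_scale                       # spatially-refined features

# Example: refine a feature map produced by an intermediate MobileNet block.
features = torch.randn(1, 64, 56, 56)
refined = CBAM(channels=64)(features)

Applying such a block after selected layers lets the network reweight informative channels and spatial positions at a small parameter cost, which is why it suits a lightweight backbone such as MobileNet.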

Key words: MobileNet; convolutional block attention module (CBAM); model pruning and quantization; edge machine learning
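As a companion illustration for the "model pruning and quantization" keyword, below is a small NumPy sketch of magnitude-based weight pruning followed by linear 8-bit weight quantization, in the spirit of Han [3] and Zhu et al. [11]. The function names, the 50% sparsity target and the per-tensor scale are illustrative assumptions, not details taken from the paper.

import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights until `sparsity` fraction are zero."""
    threshold = np.quantile(np.abs(weights).ravel(), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def quantize_int8(weights):
    """Linearly quantize float weights to int8 with a single per-tensor scale."""
    scale = np.max(np.abs(weights)) / 127.0 if np.any(weights) else 1.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

# Example: prune then quantize one convolution kernel (3x3, 32 in, 64 out channels).
w = np.random.randn(3, 3, 32, 64).astype(np.float32)
w_pruned, mask = magnitude_prune(w, sparsity=0.5)
w_q, scale = quantize_int8(w_pruned)
w_dequant = w_q.astype(np.float32) * scale   # approximate weights recovered at inference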


References


[1]SUN Z Y, LU C X, SHI Z Z, et al. Research and progress in deep learning. Computer Science, 2016, 43(2): 1-8.

[2]LEI J, GAO X, SONG J, et al. Overview of deep network model compression. Journal of Software, 2018, 29(2): 251-266.

[3]HAN S. Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding//The 4th International Conference on Learning Representations, May 2-4, 2016, San Juan, Puerto Rico, USA. OpenReview.net, 2016: 3-7.

[4]HOWARD A G, ZHU M, CHEN B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications. (2017-04-17)[2020-06-15]. https://arxiv.org/abs/1704.04861.

[5]ZHONG D J. Research on key issues of robustness recognition based on deep learning model transfer. Chengdu: University of Electronic Science and Technology, 2019.

[6]ZHANG G. Research on high-density image object counting algorithm based on density function estimation. Hefei: Anhui University, 2019.

[7]WOO S, PARK J, LEE J Y, et al. CBAM: convolutional block attention module. (2018-07-18)[2020-06-15]. https://arxiv.org/abs/1807.06521.

[8]NIELSEN M. Neural networks and deep learning. (2019-12-26)[2020-06-15]. http://neuralnetworksanddeeplearning.com/chap3.html#the_cross-entropy_cost_function.

[9]XU X L. The development and current situation of artificial neural networks. Microelectronics, 2017, 47(2): 239-242.

[10]KINGMA D P, BA J. Adam: a method for stochastic optimization. (2014-12-22)[2020-06-15]. https://arxiv.org/abs/1412.6980v9.

[11]ZHU M, GUPTA S. To prune, or not to prune: exploring the efficacy of pruning for model compression. (2017-10-05)[2020-06-15]. https://arxiv.org/abs/1710.01878.

[12]KRIZHEVSKY A, HINTON G. Learning multiple layers of features from tiny images. Handbook of Systemic Autoimmune Diseases, 2009, 1(3): 32-35.

[13]HE K, ZHANG X, REN S, et al. Delving deep into rectifiers: surpassing human-level performance on ImageNet classification//International Conference on Computer Vision, Dec. 13-16, 2015, Santiago, Chile. Washington: IEEE Computer Society, 2015: 1026-1034.

[14]CHEN F. Research and optimization of MobileNet compression model. Wuhan: Huazhong University of Science and Technology, 2018.

[15]PEREZ L, WANG J. The effectiveness of data augmentation in image classification using deep learning. (2017-12-13)[2020-06-15]. https://arxiv.org/abs/1712.04621.

[16]WU G W. Research and implementation of lightweight convolutional neural network based on mobile terminal. Xi'an: Xidian University, 2018.

[17]WANG A R, LIU W. Product recognition based on MobileNet model on TensorFlow platform. Information and Communication, 2018, 192(12): 54-55.

 


MobileNet network optimization based on convolutional block attention module


ZHAO Shuxu, MEN Shiyao, YUAN Lin


(School of Electronics and Information Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China)


Abstract: Deep learning technology is widely used in computer vision fields such as image classification. Training on large data sets usually requires substantial computing power. In practical application scenarios, people can generally carry and use only mobile and portable devices, which are therefore limited in computing power, capacity and power consumption. Taking the efficient lightweight MobileNet model as the base, this paper improves model accuracy by adding a convolutional block attention module and adjusting the network structure, and compresses the model through weight-magnitude-based pruning and quantization. A garbage classification mobile application built on the optimized model is implemented, completing the garbage classification task conveniently.

Key words: MobileNet; convolutional block attention module (CBAM); model pruning and quantization; edge machine learning



Citation format: ZHAO Shuxu, MEN Shiyao, YUAN Lin. MobileNet network optimization based on convolutional block attention module. Journal of Measurement Science and Instrumentation, 2022, 13(2): 225-234. DOI: 10.3969/j.issn.1674-8042.2022.02.012


