Laser & Optoelectronics Progress, 2018, 55(7): 071101. Published online: 2018-07-20
No-Reference Video Quality Assessment Based on Three-Dimensional Convolutional Neural Networks
Keywords: imaging systems; video quality assessment; no-reference; three-dimensional convolutional neural networks; spatiotemporal features; linear support vector regression
Abstract
To assess the quality of distorted videos accurately without reference videos, a universal no-reference video quality assessment algorithm is proposed that applies three-dimensional (3D) convolutional neural networks to extract spatiotemporal features of distorted videos. First, the convolutional neural network model 3D ConvNets is trained on a video quality database so that it learns features related to the degree of video distortion. Then, 3D ConvNets is used to extract features from the input distorted video, after which L2 normalization and principal component analysis (PCA) are applied to prevent overfitting and eliminate redundant features. Finally, linear support vector regression is used to predict the quality score of the distorted video from these quality features. Experimental results show that the proposed algorithm assesses video quality accurately across different kinds of distortion and maintains high accuracy when the test video database is changed. Moreover, the computational complexity of the quality assessment process is extremely low.
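The post-processing and regression stages described in the abstract (L2 normalization of the extracted features, PCA for redundancy removal, then linear support vector regression to map features to quality scores) can be sketched with scikit-learn. This is a minimal illustration, not the authors' implementation: the 3D ConvNets features are replaced by random stand-in vectors, and the feature dimension, PCA component count, and SVR settings are assumptions.

```python
import numpy as np
from sklearn.preprocessing import normalize
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVR

rng = np.random.default_rng(0)

# Stand-in for features extracted by 3D ConvNets:
# 100 distorted videos, each with a hypothetical 4096-dim feature vector.
features = rng.standard_normal((100, 4096))
# Stand-in for subjective quality scores (e.g., MOS/DMOS labels).
scores = rng.uniform(0.0, 100.0, size=100)

# Step 1: L2-normalize each video's feature vector (helps prevent overfitting).
features = normalize(features, norm="l2")

# Step 2: PCA to remove redundant feature dimensions (component count assumed).
pca = PCA(n_components=50)
reduced = pca.fit_transform(features)

# Step 3: linear support vector regression predicts a quality score per video.
svr = LinearSVR(C=1.0, max_iter=10000)
svr.fit(reduced, scores)
predicted = svr.predict(reduced)

print(reduced.shape)    # (100, 50)
print(predicted.shape)  # (100,)
```

In practice the SVR would be trained on one portion of a video quality database and evaluated on held-out videos, with correlation metrics such as SROCC/PLCC between predicted and subjective scores.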
Zhang Shufang, Guo Zhipeng. No-Reference Video Quality Assessment Based on Three-Dimensional Convolutional Neural Networks[J]. Laser & Optoelectronics Progress, 2018, 55(7): 071101.