Laser & Optoelectronics Progress, 2018, 55(1): 013301; published online 2018-09-10
Random Ferns Classifier for Pedestrian Detection Based on Thermal Imaging of Mobile Platform
Keywords: visual optics; pedestrian detection; random ferns classifier; feature extraction; thermal imaging; online learning
Abstract
Traffic accidents, security incidents, and crimes occur frequently from evening to early morning because of the low illumination. A thermal imaging camera suited to low-illumination environments is mounted on a mobile platform to expand the surveillance area. Pedestrian and background regions in the thermal images are first framed, and then the brightness feature and the oriented center symmetric-local binary pattern texture feature are extracted to train the random ferns classifier and to classify new samples. The detected targets are used to extend the training sample database and to update the posterior probability distribution of the classifier, so the classifier learns online automatically. In simulation tests, the algorithm processes the vehicle-mounted video in 242.18 s with a false detection rate of 9.53%, and the unmanned aerial vehicle video in 14.93 s with a false detection rate of 4.52%. The algorithm has a low false detection rate, classifies quickly, and is easy to port, making it suitable for applications with strict real-time requirements and of practical engineering value.
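The pipeline the abstract describes can be sketched in two pieces: a texture descriptor and a random ferns classifier whose per-class posterior counts are updated online. The sketch below is a simplification under stated assumptions: it uses plain center-symmetric LBP (the paper's oriented variant and the brightness feature are omitted), and all parameters (`n_ferns`, `n_tests`, the thresholds) are illustrative choices, not values from the paper.

```python
import numpy as np

def cs_lbp(patch, threshold=0.01):
    """Simplified center-symmetric LBP: compare the 4 opposite
    neighbour pairs around each interior pixel, giving a 4-bit
    code (0-15), then return a normalized 16-bin histogram."""
    p = patch.astype(np.float64)
    pairs = [                        # opposite neighbours of each centre pixel
        (p[:-2, :-2], p[2:, 2:]),    # NW vs SE
        (p[:-2, 1:-1], p[2:, 1:-1]), # N  vs S
        (p[:-2, 2:], p[2:, :-2]),    # NE vs SW
        (p[1:-1, 2:], p[1:-1, :-2]), # E  vs W
    ]
    code = np.zeros(p[1:-1, 1:-1].shape, dtype=np.int32)
    for bit, (a, b) in enumerate(pairs):
        code |= ((a - b) > threshold).astype(np.int32) << bit
    hist, _ = np.histogram(code, bins=16, range=(0, 16))
    return hist / max(hist.sum(), 1)

class RandomFerns:
    """Semi-naive Bayes over M ferns. Each fern applies S random
    binary comparisons between feature coordinates; the classifier
    keeps per-class counts of each fern's S-bit outcome -- the
    posterior table that the online-learning step updates."""
    def __init__(self, n_ferns=10, n_tests=8, dim=16, n_classes=2, seed=0):
        rng = np.random.default_rng(seed)
        self.S = n_tests
        # each binary test compares two random feature coordinates
        self.idx = rng.integers(0, dim, size=(n_ferns, n_tests, 2))
        # Laplace-smoothed counts: (fern, leaf, class)
        self.counts = np.ones((n_ferns, 2 ** n_tests, n_classes))

    def _leaf(self, x):
        bits = (x[self.idx[..., 0]] > x[self.idx[..., 1]]).astype(np.int64)
        return (bits << np.arange(self.S)).sum(axis=1)  # one leaf per fern

    def update(self, x, label):
        """Online step: a newly detected target extends the sample
        base by incrementing the posterior counts for its leaves."""
        self.counts[np.arange(len(self.counts)), self._leaf(x), label] += 1

    def predict(self, x):
        post = self.counts[np.arange(len(self.counts)), self._leaf(x)]
        post = post / post.sum(axis=1, keepdims=True)   # per-fern posteriors
        return int(np.argmax(np.log(post).sum(axis=0))) # product over ferns
```

In use, `cs_lbp` histograms of framed pedestrian and background regions would feed `update` during training, and `predict` would label new candidate regions; the count-table update is what makes the classifier cheap enough for the real-time, easily ported setting the abstract targets.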
Zhuge Linna, Zhang Lei. Random Ferns Classifier for Pedestrian Detection Based on Thermal Imaging of Mobile Platform[J]. Laser & Optoelectronics Progress, 2018, 55(1): 013301.