Opto-Electronic Engineering, 2017, 44(4): 442. Published online: 2017-07-10
Development of a modeling method for monitoring tunnel deformation based on active panoramic vision technique
Keywords: tunnel deformation monitoring; tunnel cross section; tunnel 3D reconstruction; active panoramic vision; sub-pixel
Abstract

Tunnels are long, maintenance inspections span long time periods, the time window available for inspection is short, and the deformations to be measured are small. To address these characteristics, this paper presents a full cross-section tunnel deformation detection method based on active panoramic vision. First, an active panoramic vision sensor (ASODVS) mounted on a tunnel inspection device scans the tunnel cross section to acquire panoramic slice images. Second, sub-pixel laser center points are extracted from each panoramic slice image with an improved Gaussian curve-fitting method and then smoothed with Bezier curves. Third, the geometry of the tunnel inner wall at each cross section is resolved from the calibration results of the omnidirectional vision sensor, and the resulting 3D point cloud is used to reconstruct the tunnel. Finally, the accuracy of the reconstructed tunnel model is analyzed. Experiments show that the method offers fast acquisition, good real-time performance, comprehensive data, and strong visualization, meeting the need for rapid qualitative and quantitative analysis of long, narrow tunnels.
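The two image-processing steps named in the abstract, sub-pixel laser center extraction by Gaussian curve fitting and Bezier smoothing of the extracted centerline, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses the standard three-point Gaussian (log-parabolic) fit for the sub-pixel peak and a quadratic Bezier blend for smoothing, and all function names are hypothetical.

```python
import numpy as np

def subpixel_center(profile):
    """Sub-pixel laser peak location for a 1-D intensity profile.

    Uses the classic three-point Gaussian fit: the log of a Gaussian is a
    parabola, so the vertex of the parabola through the peak pixel and its
    two neighbors gives the sub-pixel offset.
    """
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:  # peak on the border: no neighbors to fit
        return float(i)
    eps = 1e-9  # guard against log(0)
    lm = np.log(profile[i - 1] + eps)
    l0 = np.log(profile[i] + eps)
    lp = np.log(profile[i + 1] + eps)
    denom = lm - 2.0 * l0 + lp
    if denom >= 0.0:  # flat or non-peaked neighborhood: fall back to the pixel peak
        return float(i)
    return i + 0.5 * (lm - lp) / denom

def bezier_smooth(points, n=20):
    """Smooth a polyline of center points with quadratic Bezier segments.

    Each consecutive triple (p0, p1, p2) is blended as
    B(t) = (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2, t in [0, 1].
    """
    pts = np.asarray(points, dtype=float)
    t = np.linspace(0.0, 1.0, n)[:, None]
    segments = [
        (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2
        for p0, p1, p2 in zip(pts[:-2], pts[1:-1], pts[2:])
    ]
    return np.vstack(segments)
```

For an ideal Gaussian stripe profile the three-point fit is exact, which is why it is a common baseline for laser-stripe center extraction; real profiles are asymmetric and saturated, which is presumably what the paper's "improved" fitting addresses.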
Yiping Tang, Gongping Yuan, Qi Chen, Guodong Han, Kegang Hu. Development of a modeling method for monitoring tunnel deformation based on active panoramic vision technique[J]. Opto-Electronic Engineering, 2017, 44(4): 442.