ACTA PHOTONICA SINICA (光子学报), 2019, 48(8): 0810001. Online publication: 2019-11-28
Adaptive Phase Unwrapping Method Based on Geometric Constraint
Keywords: Phase unwrapping; Adaptive imaging; Fringe analysis; Three-dimensional image reconstruction; Structured light; Calibration; Geometric constraint
Abstract
A method is proposed that unwraps the phase using the spatial geometric constraints between the camera and the projector. It requires only a calibration of the structured light measurement system, without conventional temporal or spatial phase unwrapping. The approximate height of the object is obtained by projecting a single-period fringe pattern, which determines the virtual depth plane; at this virtual plane z0min, the minimum absolute phase map is created from the calibration parameters of the structured light system. The wrapped phase of the object is then compared pixel by pixel with the minimum absolute phase map to determine the fringe order and thereby unwrap the phase. The proposed method is robust and undemanding of hardware; it acquires only a few images and needs no additional object to obtain z0min, so it can realize adaptive dynamic measurement. Experimental results show that, under the same conditions, the relative error of the proposed method is 14.33% lower than that of the conventional temporal phase unwrapping method, while the measurement procedure is simplified; the method can effectively measure the three-dimensional shape of objects.
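The pixel-wise fringe-order determination described above can be sketched in a few lines. The following Python/NumPy fragment is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the minimum absolute phase map (phi_min) at the plane z0min has already been rendered from the system's calibration parameters, and that phi_wrapped is the wrapped phase in (-pi, pi] obtained by phase-shifting analysis; the helper name unwrap_with_min_phase_map is hypothetical.

    import numpy as np

    def unwrap_with_min_phase_map(phi_wrapped, phi_min):
        # Fringe order k: the smallest integer such that the unwrapped
        # phase phi_wrapped + 2*pi*k lies in [phi_min, phi_min + 2*pi),
        # i.e. k = ceil((phi_min - phi_wrapped) / (2*pi)).
        k = np.ceil((phi_min - phi_wrapped) / (2.0 * np.pi))
        return phi_wrapped + 2.0 * np.pi * k

    # Synthetic self-check: recover a known absolute phase that lies
    # within one fringe period above the minimum phase map.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        phi_min = rng.uniform(0.0, 40.0, size=(4, 4))   # stand-in for the rendered minimum phase map
        phi_true = phi_min + rng.uniform(0.0, 2.0 * np.pi, size=(4, 4))
        phi_wrapped = np.angle(np.exp(1j * phi_true))   # wrap to (-pi, pi]
        recovered = unwrap_with_min_phase_map(phi_wrapped, phi_min)
        print(np.allclose(recovered, phi_true))         # expected: True

This sketch covers only the fringe-order step; constructing phi_min itself would require back-projecting each camera pixel onto the plane z = z0min and evaluating the projector phase there, using the calibrated camera and projector projection matrices.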
LI Wen, CAI Ning, LIN Bin, CAO Xiang-qun. Adaptive Phase Unwrapping Method Based on Geometric Constraint[J]. ACTA PHOTONICA SINICA, 2019, 48(8): 0810001.