

A Calibration Method of Focused Light Field Cameras Based on Light Field Images



Abstract

In scene three-dimensional depth reconstruction based on light field photography, the geometric parameters of the light field camera must be calibrated. In this paper, a calibration method for focused light field cameras based on raw light field images is proposed. Raw light field images of a calibration board at different orientations are captured. According to the conjugation relationship between image points and virtual image points (the conjugate points of the image points with respect to the microlenses), the coordinates of the virtual image points are calculated. A calibration model for focused light field cameras is then established from the conjugation relationship between the corner points on the calibration board and the virtual image points, and the model is solved with the Levenberg-Marquardt algorithm. Calibration experiments are carried out, and the parameters obtained with the proposed method are compared with those of a calibration method based on totally focused images. Experimental results show that the error between the virtual image points obtained from the raw light field images and those (the conjugate points of the corner points with respect to the main lens) from the totally focused images is less than 21 pixels, and the relative calibration errors of the corner points are less than 3%. The configuration parameters and external parameters calibrated from the raw light field images agree well with those from the totally focused images, which demonstrates the feasibility of the proposed method.
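The core geometric step, recovering a virtual image point from its projections through several microlenses, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an ideal pinhole model per microlens, known microlens centers, and a known microlens-to-sensor distance `b`, and all function names are invented for the example. A closed-form estimate from two microlenses seeds a Levenberg-Marquardt refinement over all observations, mirroring the paper's use of that algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

def project_through_microlens(V, c, b):
    """Project the virtual image point V = (x, y, a), where a is its depth
    in front of the microlens plane, through a microlens centred at c onto
    the sensor located a distance b behind the microlens plane."""
    x, y, a = V
    return c + (b / a) * (c - np.array([x, y]))

def triangulate_virtual_point(centers, pixels, b):
    """Estimate a virtual image point from its sensor projections under
    several microlenses (hypothetical helper, not the paper's code).

    A linear estimate from the first two microlenses: the projection
    disparity satisfies |p1 - p2| / |c1 - c2| = 1 + b/a, which gives the
    depth a; back-projecting one observation gives the lateral position.
    The estimate is then refined over all observations with
    Levenberg-Marquardt."""
    c1, c2 = centers[0], centers[1]
    p1, p2 = pixels[0], pixels[1]
    ratio = np.linalg.norm(p1 - p2) / np.linalg.norm(c1 - c2)  # = 1 + b/a
    a0 = b / (ratio - 1.0)
    xy0 = c1 - (a0 / b) * (p1 - c1)
    V0 = np.array([xy0[0], xy0[1], a0])

    def residuals(V):
        # Reprojection error stacked over every observing microlens.
        return np.concatenate([project_through_microlens(V, c, b) - p
                               for c, p in zip(centers, pixels)])

    return least_squares(residuals, V0, method="lm").x
```

Applying the same recovery to every corner point of the calibration board yields the virtual image points that enter the calibration model described above.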


CLC number: TB391.4

DOI:10.3788/aos201737.0515002

Section: Machine Vision

Supported by: National Natural Science Foundation of China (51676044, 51506030, 51327803); Natural Science Foundation of Jiangsu Province for Distinguished Young Scholars (JL150501)

Received: 2016-11-30

Revised: 2017-01-06

Published online: --

Author affiliations

Sun Junyang: School of Energy and Environment, Southeast University, Nanjing 210096, Jiangsu, China
Sun Jun: School of Energy and Environment, Southeast University, Nanjing 210096, Jiangsu, China
Xu Chuanlong: School of Energy and Environment, Southeast University, Nanjing 210096, Jiangsu, China
Zhang Biao: School of Energy and Environment, Southeast University, Nanjing 210096, Jiangsu, China
Wang Shimin: School of Energy and Environment, Southeast University, Nanjing 210096, Jiangsu, China

Corresponding author: Sun Junyang (1039227684@qq.com)

Biography: Sun Junyang (1993—), male, M.S. candidate; his research focuses on three-dimensional flame temperature field measurement.


Cite this paper

Sun Junyang, Sun Jun, Xu Chuanlong, Zhang Biao, Wang Shimin. A calibration method of focused light field cameras based on light field images[J]. Acta Optica Sinica, 2017, 37(5): 0515002.
