
Durham Research Online
On Depth Error from Spherical Camera Calibration within Omnidirectional Stereo Vision

Groom, M. and Breckon, T.P. (2022) 'On Depth Error from Spherical Camera Calibration within Omnidirectional Stereo Vision.', 26th International Conference on Pattern Recognition, Montreal, Québec, 21-25 Aug 2022.


As a depth sensing approach, stereo vision provides a good compromise between accuracy and cost, but a key limitation is the restricted field of view of the conventional cameras used within most stereo configurations. By contrast, the use of spherical cameras within a stereo configuration offers omnidirectional stereo sensing. However, despite the presence of significant image distortion in spherical camera images, only very limited attempts have been made to study and quantify omnidirectional stereo depth accuracy. In this paper we construct such an omnidirectional stereo system, capable of real-time 360° disparity map reconstruction, as the basis for such a study. We first investigate the accuracy of using a standard spherical camera model for calibration combined with a longitude-latitude projection for omnidirectional stereo, and show that the depth error increases significantly as the angle from the camera optical axis approaches the limits of the camera field of view. We then consider an alternative calibration approach via perspective undistortion with a conventional pinhole camera model, allowing omnidirectional cameras to be mapped to a conventional rectilinear stereo formulation. We find that this proposed approach conversely exhibits improved depth accuracy at large angles from the camera optical axis when compared to omnidirectional stereo depth based on a spherical camera model calibration.
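To make the depth-error behaviour described above concrete, the following sketch uses the classical rectilinear (pinhole) stereo relation Z = f·B/d, for which a disparity error Δd propagates to a depth error of approximately Z²·Δd/(f·B). This is the textbook error model for the rectilinear formulation the paper maps omnidirectional cameras onto; it is illustrative only, with hypothetical focal length and baseline values, and does not reproduce the paper's spherical (longitude-latitude) error analysis.

```python
# Classical pinhole stereo: depth Z = f * B / d, where f is the focal
# length in pixels, B the baseline in metres, and d the disparity in pixels.
# Differentiating gives the first-order depth error for a disparity error
# delta_d:  delta_Z ~= Z**2 * delta_d / (f * B), i.e. error grows
# quadratically with depth. (Illustrative sketch; not the paper's
# spherical-model derivation, where error also grows with off-axis angle.)

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth in metres for a rectilinear stereo pair."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float, disparity_px: float,
                disparity_err_px: float = 1.0) -> float:
    """First-order depth error for a given disparity error (default 1 px)."""
    z = depth_from_disparity(f_px, baseline_m, disparity_px)
    return z ** 2 * disparity_err_px / (f_px * baseline_m)

if __name__ == "__main__":
    f, B = 700.0, 0.12  # hypothetical focal length (px) and baseline (m)
    for d in (70.0, 35.0, 14.0):
        z = depth_from_disparity(f, B, d)
        print(f"disparity {d:5.1f} px -> depth {z:5.2f} m, +/- {depth_error(f, B, d):.3f} m")
```

Halving the disparity doubles the depth but quadruples the per-pixel depth error, which is why accuracy at the edges of a wide field of view, where matching is hardest, is the critical question the paper studies.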

Item Type: Conference item (Paper)
Full text: (AM) Accepted Manuscript
Publisher statement: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Date accepted: 17 May 2022
Date deposited: 15 June 2022
Date of first online publication: 22 August 2022
Date first made open access: 25 August 2022

