
Durham Research Online

Eliminating the blind spot: adapting 3D object detection and monocular depth estimation to 360° panoramic imagery.

Payen de La Garanderie, Grégoire and Atapour Abarghouei, Amir and Breckon, Toby P. (2018) 'Eliminating the blind spot: adapting 3D object detection and monocular depth estimation to 360° panoramic imagery.', in Computer Vision – ECCV 2018: 15th European Conference, Munich, Germany, September 8-14, 2018, Proceedings, Part XII. Cham: Springer, pp. 812-830. Lecture Notes in Computer Science (11217).


Recent automotive vision work has focused almost exclusively on processing forward-facing cameras. However, future autonomous vehicles will not be viable without more comprehensive surround sensing, akin to a human driver, as can be provided by 360° panoramic cameras. We present an approach to adapt contemporary deep network architectures developed on conventional rectilinear imagery to work on equirectangular 360° panoramic imagery. To address the lack of annotated panoramic automotive datasets, we adapt a contemporary automotive dataset, via style and projection transformations, to facilitate the cross-domain retraining of contemporary algorithms for panoramic imagery. Following this approach we retrain and adapt existing architectures to recover scene depth and 3D pose of vehicles from monocular panoramic imagery without any panoramic training labels or calibration parameters. Our approach is evaluated qualitatively on crowd-sourced panoramic images and quantitatively using an automotive environment simulator to provide the first benchmark for such techniques within panoramic imagery.
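The projection transformation mentioned in the abstract can be illustrated by reprojecting a rectilinear (pinhole-camera) image onto an equirectangular canvas, where each output pixel corresponds to a longitude/latitude direction on the sphere. The sketch below is a minimal, hypothetical NumPy implementation of that idea (the function name, field-of-view parameter, and output size are assumptions for illustration, not the authors' exact method):

```python
import numpy as np

def rectilinear_to_equirectangular(src, fov_deg=90.0, out_h=256, out_w=512):
    """Reproject a rectilinear image onto an equirectangular canvas.

    Hypothetical helper illustrating a projection transformation of the
    kind described in the abstract; not the paper's exact implementation.
    Pixels whose viewing rays fall outside the pinhole frustum are left black.
    """
    h, w = src.shape[:2]
    # Pinhole intrinsics derived from a horizontal field of view.
    f = (w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    cx, cy = w / 2.0, h / 2.0

    # Longitude/latitude of every output pixel centre.
    lon = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi   # [-pi, pi]
    lat = np.pi / 2 - (np.arange(out_h) + 0.5) / out_h * np.pi   # [pi/2, -pi/2]
    lon, lat = np.meshgrid(lon, lat)

    # Unit viewing ray for each output pixel (camera looks along +z).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Project rays in front of the camera into the source image plane
    # (image v-axis points down, hence the sign flip on y).
    valid = z > 1e-6
    safe_z = np.where(valid, z, 1.0)
    u = np.where(valid, f * x / safe_z + cx, -1.0)
    v = np.where(valid, f * -y / safe_z + cy, -1.0)

    out = np.zeros((out_h, out_w) + src.shape[2:], dtype=src.dtype)
    inside = valid & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out[inside] = src[v[inside].astype(int), u[inside].astype(int)]
    return out
```

For example, a forward-facing 90° crop maps to roughly the central quarter of the panorama's width, while directions behind the camera have no source pixels and stay empty, which is exactly the "blind spot" the paper's panoramic pipeline removes.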

Item Type: Book chapter
Full text: (AM) Accepted Manuscript
Publisher Web site:
Publisher statement: The final publication is available at Springer via
Date accepted: No date available
Date deposited: 15 October 2018
Date of first online publication: 06 October 2018
Date first made open access: No date available

