Durham Research Online

Dense gradient-based features (DeGraF) for computationally efficient and invariant feature extraction in real-time applications.

Katramados, I. and Breckon, T.P. (2016) 'Dense gradient-based features (DeGraF) for computationally efficient and invariant feature extraction in real-time applications.', in Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), 25-28 September 2016, Phoenix, Arizona, USA. Piscataway, NJ: IEEE, pp. 300-304.

Abstract

We propose a computationally efficient approach for the extraction of dense gradient-based features based on the use of localized intensity-weighted centroids within the image. Whilst prior work concentrates on sparse feature derivations or computationally expensive dense scene sensing, we show that Dense Gradient-based Features (DeGraF) can be derived based on initial multi-scale division of Gaussians preprocessing, weighted centroid gradient calculation and either local saliency (DeGraF-α) or signal-to-noise-inspired (DeGraF-β) final-stage filtering. We present two variants (DeGraF-α / DeGraF-β), of which the signal-to-noise-based approach is shown to perform favourably against the state of the art in terms of feature density, computational efficiency and feature stability. Our approach is evaluated under a range of environmental conditions typical of automotive sensing applications with strong feature density requirements.
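
The central step of the pipeline is the weighted centroid gradient calculation: the image is divided into small cells and, within each cell, the offset of the intensity-weighted centroid from the cell's geometric centre gives one dense gradient vector. The Python sketch below illustrates only this step under simplifying assumptions; the cell size, function name and use of raw intensities (rather than the division-of-Gaussians preprocessed image) are illustrative choices, and the saliency or signal-to-noise final-stage filtering described above is omitted.

# Minimal sketch (not the authors' reference code) of a weighted-centroid
# gradient calculation over a grid of image cells. Each cell yields one
# (dy, dx) vector: the offset of the intensity-weighted centroid from the
# cell's geometric centre. Cell size and plain-intensity weighting are
# assumptions made for brevity.
import numpy as np

def degraf_like_gradients(image: np.ndarray, cell: int = 8) -> np.ndarray:
    """Return an (H//cell, W//cell, 2) array of (dy, dx) centroid offsets."""
    h, w = image.shape
    gh, gw = h // cell, w // cell
    grads = np.zeros((gh, gw, 2), dtype=np.float64)
    ys, xs = np.mgrid[0:cell, 0:cell]          # local pixel coordinates
    centre = (cell - 1) / 2.0                  # geometric centre of a cell
    for i in range(gh):
        for j in range(gw):
            patch = image[i * cell:(i + 1) * cell,
                          j * cell:(j + 1) * cell].astype(np.float64)
            mass = patch.sum()
            if mass <= 0:                       # flat/dark cell: no gradient
                continue
            cy = (ys * patch).sum() / mass      # intensity-weighted centroid
            cx = (xs * patch).sum() / mass
            grads[i, j] = (cy - centre, cx - centre)
    return grads

# Example: a random test image produces one dense gradient vector per cell.
if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(64, 64)).astype(np.uint8)
    g = degraf_like_gradients(img, cell=8)
    print(g.shape)                              # (8, 8, 2)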

Item Type: Book chapter
Full text: (AM) Accepted Manuscript, PDF (1418 KB)
Status: Peer-reviewed
Publisher Web site: http://dx.doi.org/10.1109/ICIP.2016.7532367
Publisher statement: © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Date accepted: 12 July 2016
Date deposited: 03 October 2016
Date of first online publication: 19 August 2016
Date first made open access: No date available
