
Durham Research Online

Evaluating Gaussian Grasp Maps for Generative Grasping Models

Prew, W. and Breckon, T.P. and Bordewich, M.J.R. and Beierholm, U. (2022) 'Evaluating Gaussian Grasp Maps for Generative Grasping Models.', Proc. Int. Joint Conf. Neural Networks, Padova, Italy, 18-23 July 2022.


Generalising robotic grasping to previously unseen objects is a key task in general robotic manipulation. The current method for training many antipodal generative grasping models relies on a binary ground truth grasp map generated from the centre thirds of correctly labelled grasp rectangles. However, these binary maps do not accurately reflect the positions at which a robotic arm can correctly grasp a given object. We propose a continuous Gaussian representation of annotated grasps for generating ground truth training data, which achieves a higher success rate on a simulated robotic grasping benchmark. Three modern generative grasping networks are trained with either binary or Gaussian grasp maps, along with recent advancements from the robotic grasping literature, such as discretisation of grasp angles into bins and an attentional loss function. Despite a negligible difference according to the standard rectangle metric, Gaussian maps better reproduce the training data and therefore improve success rates when tested on the same simulated robot arm by avoiding collisions with the object, achieving 87.94% accuracy. Furthermore, the best performing model is shown to operate with a high success rate when transferred to a real robotic arm, at high inference speeds, without the need for transfer learning. The system is then shown to be capable of performing grasps on an antagonistic physical object dataset benchmark.
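The contrast between the two ground-truth representations can be sketched in a few lines. The snippet below (not the authors' code; function names, the centre-third analogue, and the sigma heuristic are illustrative assumptions) builds, for a single annotated grasp centre, a binary quality map that is 1 in a small region around the centre and 0 elsewhere, versus a continuous map given by an unnormalised 2D Gaussian that peaks at 1 on the grasp centre and decays smoothly with distance:

```python
import numpy as np

def binary_map(shape, centre, half_extent):
    """Binary ground truth: 1 inside a small square around the grasp
    centre (analogous to the centre third of a grasp rectangle), 0 elsewhere."""
    q = np.zeros(shape, dtype=np.float32)
    r, c = centre
    q[max(0, r - half_extent):r + half_extent + 1,
      max(0, c - half_extent):c + half_extent + 1] = 1.0
    return q

def gaussian_map(shape, centre, sigma):
    """Continuous ground truth: unnormalised 2D Gaussian peaking at 1 on
    the grasp centre, decaying smoothly with squared distance."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    d2 = (rows - centre[0]) ** 2 + (cols - centre[1]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2)).astype(np.float32)

# Illustrative 32x32 quality maps for one grasp centred at (16, 16).
q_bin = binary_map((32, 32), (16, 16), half_extent=3)
q_gauss = gaussian_map((32, 32), (16, 16), sigma=3.0)
```

A network regressing towards `q_gauss` receives a smooth gradient of grasp quality away from the annotated centre, whereas `q_bin` treats every pixel in the centre region as equally graspable and everything outside it as equally bad, which is one intuition for why the continuous target can yield fewer object collisions at execution time.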

Item Type: Conference item (Paper)
Full text: (AM) Accepted Manuscript, available under a Creative Commons Attribution 4.0 licence.
Publisher statement: This work was funded by UKRI EPSRC. For the purpose of open access, the authors have applied a Creative Commons Attribution (CC BY) licence to the Accepted Manuscript version arising.
Date accepted: 26 April 2022
Date deposited: 06 June 2022
Date of first online publication: 18 July 2022
Date first made open access: 06 June 2022

