

Durham Research Online

Multiclass-SGCN: Sparse Graph-based Trajectory Prediction with Agent Class Embedding

Li, Ruochen and Katsigiannis, Stamos and Shum, Hubert P. H. (2022) 'Multiclass-SGCN: Sparse Graph-based Trajectory Prediction with Agent Class Embedding', in 2022 IEEE International Conference on Image Processing (ICIP) Proceedings, pp. 2346-2350.

Abstract

Trajectory prediction of road users in real-world scenarios is challenging because their movement patterns are stochastic and complex. Previous pedestrian-oriented works have been successful in modelling the complex interactions among pedestrians, but fail to predict trajectories when other types of road users are involved (e.g., cars, cyclists), because they ignore user types. Although a few recent works construct densely connected graphs with user label information, they suffer from superfluous spatial interactions and temporal dependencies. To address these issues, we propose Multiclass-SGCN, a sparse graph convolution network-based approach for multi-class trajectory prediction that takes velocity and agent label information into consideration and uses a novel interaction mask to adaptively decide the spatial and temporal connections of agents based on their interaction scores. The proposed approach significantly outperformed state-of-the-art approaches on the Stanford Drone Dataset, providing more realistic and plausible trajectory predictions.
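To make the core idea of the abstract concrete, the sketch below illustrates, in generic PyTorch, how agent class embeddings can be combined with position/velocity features and how an interaction mask can sparsify a dense pairwise score matrix. All class names, dimensions, and the thresholding rule here are illustrative assumptions for exposition, not the authors' implementation of Multiclass-SGCN.

    # Illustrative sketch only: class embedding + velocity features feeding a
    # thresholded interaction mask that yields a sparse adjacency. Hypothetical
    # names and hyperparameters; not the paper's actual code.
    import torch
    import torch.nn as nn

    class SparseInteractionLayer(nn.Module):
        def __init__(self, num_classes=6, class_dim=8, feat_dim=32, threshold=0.5):
            super().__init__()
            # Learnable embedding for agent classes (pedestrian, car, cyclist, ...)
            self.class_emb = nn.Embedding(num_classes, class_dim)
            # Project [x, y, vx, vy] plus the class embedding into a shared space
            self.proj = nn.Linear(4 + class_dim, feat_dim)
            self.score = nn.Linear(2 * feat_dim, 1)  # pairwise interaction score
            self.threshold = threshold

        def forward(self, pos_vel, labels):
            # pos_vel: (N, 4) positions and velocities; labels: (N,) class indices
            feats = self.proj(torch.cat([pos_vel, self.class_emb(labels)], dim=-1))
            N = feats.size(0)
            pairs = torch.cat([feats.unsqueeze(1).expand(N, N, -1),
                               feats.unsqueeze(0).expand(N, N, -1)], dim=-1)
            scores = torch.sigmoid(self.score(pairs)).squeeze(-1)  # (N, N)
            # Interaction mask: keep only salient connections, zero out the rest
            mask = (scores > self.threshold).float()
            return scores * mask  # sparse adjacency for a downstream GCN

    # Usage: three agents with random states and distinct class labels
    layer = SparseInteractionLayer()
    adj = layer(torch.randn(3, 4), torch.tensor([0, 1, 2]))
    print(adj.shape)  # torch.Size([3, 3])

In the paper, such a sparse adjacency would feed a graph convolution over the agents at each time step; the sketch stops at the masking step because that is the mechanism the abstract describes.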

Item Type: Book chapter
Full text: (AM) Accepted Manuscript, PDF (20479 KB)
Status: Peer-reviewed
Publisher Web site: https://doi.org/10.1109/ICIP46576.2022.9897644
Publisher statement: © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Date accepted: 20 June 2022
Date deposited: 29 June 2022
Date of first online publication: 18 October 2022
Date first made open access: 20 October 2022
