Spotlight Poster
Motion Forecasting in Continuous Driving
Nan Song · Bozhou Zhang · Xiatian Zhu · Li Zhang
East Exhibit Hall A-C #3902
Motion forecasting for agents in autonomous driving is highly challenging due to the numerous possibilities for each agent's next action and their complex interactions in space and time. In real applications, motion forecasting takes place repeatedly and continuously as the self-driving car moves. However, existing forecasting methods typically process each driving scene within a certain range independently, totally ignoring the situational and contextual relationships between successive driving scenes. This significantly simplifies the forecasting task, making the solutions unrealistic. To address this fundamental limitation, we propose a novel motion forecasting framework for continuous driving, named RealMotion. It comprises two integral streams, both at the scene level: (1) The scene context stream progressively accumulates historical scene information up to the present moment, capturing long-term interactive relationships among scene elements. (2) The agent trajectory stream optimizes the current forecast by sequentially relaying past predictions. In addition, we introduce a data reorganization strategy, consistent with our network design, to narrow the gap between existing benchmarks and real-world applications. Together, these components exploit the situational and progressive cues of dynamic motion across space and time. Extensive experiments on the Argoverse series under different settings demonstrate that RealMotion achieves state-of-the-art performance, along with the advantage of efficient real-world inference.
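The abstract does not include implementation details, but the two-stream, frame-by-frame idea can be illustrated with a minimal, hypothetical PyTorch sketch. The module names (SceneContextStream, AgentTrajectoryStream), tensor shapes, fusion mechanism, and prediction head below are assumptions chosen for illustration only; they are not the paper's actual architecture.

# Hypothetical sketch of one continuous-forecasting step per driving frame.
# All names, shapes, and fusion choices are illustrative assumptions,
# not the RealMotion implementation described in the paper.
import torch
import torch.nn as nn


class SceneContextStream(nn.Module):
    # Assumed design: accumulate scene tokens across successive frames
    # so the current scene can attend to long-term context.
    def __init__(self, dim: int = 128):
        super().__init__()
        self.fuse = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, scene_feats, memory):
        if memory is None:
            return scene_feats
        fused, _ = self.fuse(scene_feats, memory, memory)
        # Append the context-fused tokens to the running scene memory.
        return torch.cat([memory, scene_feats + fused], dim=1)


class AgentTrajectoryStream(nn.Module):
    # Assumed design: relay the previous frame's prediction state forward
    # to refine the current multi-modal trajectory forecast.
    def __init__(self, dim: int = 128, horizon: int = 60, modes: int = 6):
        super().__init__()
        self.relay = nn.GRUCell(dim, dim)
        self.head = nn.Linear(dim, horizon * modes * 2)  # (x, y) per step per mode
        self.horizon, self.modes = horizon, modes

    def forward(self, agent_feat, prev_state):
        state = self.relay(agent_feat, prev_state)
        traj = self.head(state).view(-1, self.modes, self.horizon, 2)
        return traj, state


# Usage: run both streams frame by frame as the ego vehicle drives,
# instead of forecasting each scene independently.
scene_stream, agent_stream = SceneContextStream(), AgentTrajectoryStream()
memory, agent_state = None, None
for frame in range(3):  # three successive driving scenes
    scene_feats = torch.randn(1, 32, 128)  # placeholder encoded scene tokens
    agent_feat = torch.randn(1, 128)       # placeholder encoded target-agent feature
    memory = scene_stream(scene_feats, memory)
    traj, agent_state = agent_stream(agent_feat, agent_state)
    print(frame, memory.shape, traj.shape)

The key design point the sketch tries to convey is that both the scene memory and the prediction state persist across frames, so each new forecast reuses the situational context and the past predictions rather than starting from scratch.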