TrafficGen: Learning to Generate Diverse and Realistic Traffic Scenarios

| Webpage | Video | Code | Paper |
This work introduces TrafficGen, a data-driven method for traffic scenario generation. It learns from fragmented human driving data collected in the real world and can then generate realistic traffic scenarios. TrafficGen is an autoregressive generative model with an encoder-decoder architecture. In each autoregressive iteration, it first encodes the current traffic context with an attention mechanism, then decodes a vehicle's initial state, and finally generates that vehicle's long trajectory.
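The iteration loop can be sketched as follows. This is a minimal PyTorch sketch rather than the released implementation: the module names (ContextEncoder, VehicleDecoder), the feature dimensions, and the pooling step are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ContextEncoder(nn.Module):
    """Encodes the map and already-placed vehicles with self-attention."""
    def __init__(self, dim=128, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, tokens):                     # tokens: (B, T, dim)
        out, _ = self.attn(tokens, tokens, tokens)
        return out

class VehicleDecoder(nn.Module):
    """Decodes one vehicle's initial state from the encoded context."""
    def __init__(self, dim=128, state_dim=7):      # e.g. x, y, heading, speed, size
        super().__init__()
        self.head = nn.Linear(dim, state_dim)
        self.embed = nn.Linear(state_dim, dim)     # re-embed the state as a token

    def forward(self, context):                    # context: (B, T, dim)
        return self.head(context.mean(dim=1))      # pool context, predict one state

def generate_snapshot(map_tokens, encoder, decoder, num_vehicles):
    """One encode/decode iteration per vehicle, conditioned on all previous ones."""
    tokens, states = map_tokens, []
    for _ in range(num_vehicles):
        context = encoder(tokens)                  # encode current traffic context
        state = decoder(context)                   # decode the next vehicle's state
        states.append(state)
        # Append the new vehicle as a token so later iterations attend to it.
        tokens = torch.cat([tokens, decoder.embed(state).unsqueeze(1)], dim=1)
    return torch.stack(states, dim=1)              # (B, N, state_dim)

encoder, decoder = ContextEncoder(), VehicleDecoder()
snapshot = generate_snapshot(torch.randn(1, 50, 128), encoder, decoder, num_vehicles=8)
```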
TrafficGen synthesizes a complete traffic scenario autoregressively. The first step is to place vehicles on the given map. At each iteration, a vehicle is placed by sampling from the spatial probability distribution over all regions in the scene. We visualize the current spatial distribution as an iteration-varying heatmap in the figure above. After sampling, a set of N vehicles has been placed on the map, forming the traffic snapshot.
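Conceptually, each placement is a draw from a categorical distribution over map regions that is re-evaluated after every vehicle. The sketch below assumes the model outputs one logit per region; `region_logits`, `sample_vehicle_region`, and the occupancy mask are hypothetical names for illustration, not the actual interface.

```python
import torch

def sample_vehicle_region(region_logits: torch.Tensor,
                          occupied: torch.Tensor) -> int:
    """Sample one region index from the spatial probability distribution."""
    logits = region_logits.masked_fill(occupied, float("-inf"))
    probs = torch.softmax(logits, dim=-1)      # the "heatmap" at this iteration
    return torch.multinomial(probs, num_samples=1).item()

# Place N vehicles one by one; updating the occupancy mask is what makes
# the spatial distribution change from iteration to iteration.
num_regions, N = 1000, 32
occupied = torch.zeros(num_regions, dtype=torch.bool)
for _ in range(N):
    region_logits = torch.randn(num_regions)   # stand-in for the model's output
    idx = sample_vehicle_region(region_logits, occupied)
    occupied[idx] = True
```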
We then use a motion forecasting model as the trajectory generator, since such models can output diverse multi-modal samples. However, motion forecasting models cannot be directly applied to long trajectory generation because they are brittle to distributional shift. We sample one possible future trajectory for each vehicle, but to mitigate long-term cumulative error, only the first several steps of that trajectory are used. The long trajectory of each vehicle is then generated through several such rollouts.
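The rollout scheme can be sketched as below, assuming a forecaster that returns multi-modal futures of shape (vehicles, modes, horizon, state_dim). The function name, the mode-selection rule, and the dummy forecaster are illustrative assumptions, not the actual TrafficGen code.

```python
import torch

def rollout_long_trajectory(forecast_model, states, total_steps, keep_steps=5):
    """Chain short forecast segments into one long trajectory per vehicle."""
    segments, steps_done = [], 0
    while steps_done < total_steps:
        futures = forecast_model(states)        # (V, M, H, D) multi-modal samples
        mode = torch.randint(futures.size(1), (futures.size(0),))
        chosen = futures[torch.arange(futures.size(0)), mode]   # (V, H, D)
        segment = chosen[:, :keep_steps]        # keep only the first few steps...
        segments.append(segment)                # ...to limit cumulative error
        states = segment[:, -1]                 # re-forecast from the new state
        steps_done += keep_steps
    return torch.cat(segments, dim=1)           # (V, total_steps, D)

# Dummy forecaster for illustration: 6 modes, 80-step horizon, 4-D state.
def dummy_forecaster(states):
    noise = torch.randn(states.size(0), 6, 80, states.size(1)).cumsum(dim=2)
    return states[:, None, None, :] + 0.1 * noise

long_traj = rollout_long_trajectory(dummy_forecaster, torch.zeros(8, 4), total_steps=90)
```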
With different HD maps and traffic light signals as input, TrafficGen can generate diverse traffic scenarios, as shown in the video below.
The following video shows that by sampling many times on the same map, TrafficGen can generate diverse traffic flows.
@inproceedings{feng2023trafficgen,
  title={TrafficGen: Learning to Generate Diverse and Realistic Traffic Scenarios},
  author={Feng, Lan and Li, Quanyi and Peng, Zhenghao and Tan, Shuhan and Zhou, Bolei},
  booktitle={2023 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={3567--3575},
  year={2023},
  organization={IEEE}
}