
DriveTransformer: Unified Transformer for Scalable End-to-End Autonomous Driving

drivetransformer_bench2drive.mp4

Official implementation of paper DriveTransformer: Unified Transformer for Scalable End-to-End Autonomous Driving. Xiaosong Jia, Junqi You, Zhiyuan Zhang, Junchi Yan. ICLR 2025

DriveTransformer offers a unified, parallel, and synergistic approach to end-to-end autonomous driving, making it easier to train and to scale. The framework is composed of three unified operations (task self-attention, sensor cross-attention, and temporal cross-attention) and has three key features:

  • Task Parallelism: All agent, map, and planning queries directly interact with each other in every block.
  • Sparse Representation: Task queries directly interact with raw sensor features.
  • Streaming Processing: Task queries are stored and passed as history information.
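
The three unified operations above can be sketched as one transformer block. This is a hypothetical illustration, not the official implementation: the class name, dimensions, and sublayer ordering are assumptions for exposition; see the repository code for the real architecture.

```python
import torch
import torch.nn as nn

class DriveTransformerBlock(nn.Module):
    """Illustrative sketch of one DriveTransformer block built from the
    three unified operations described above (names and sizes are
    assumptions, not the official code)."""

    def __init__(self, d_model: int = 256, n_heads: int = 8):
        super().__init__()
        # Task self-attention: agent, map, and planning queries attend to each other.
        self.task_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Sensor cross-attention: task queries attend to raw sensor features.
        self.sensor_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Temporal cross-attention: task queries attend to stored history queries.
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norms = nn.ModuleList(nn.LayerNorm(d_model) for _ in range(3))

    def forward(self, task_q, sensor_feats, history_q):
        # Each sublayer is residual + layer norm, as in a standard transformer.
        x = self.norms[0](task_q + self.task_attn(task_q, task_q, task_q)[0])
        x = self.norms[1](x + self.sensor_attn(x, sensor_feats, sensor_feats)[0])
        x = self.norms[2](x + self.temporal_attn(x, history_q, history_q)[0])
        return x

block = DriveTransformerBlock()
task_q = torch.randn(2, 10, 256)        # agent + map + planning queries
sensor_feats = torch.randn(2, 100, 256) # flattened sensor tokens
history_q = torch.randn(2, 10, 256)     # task queries stored from past frames
out = block(task_q, sensor_feats, history_q)  # same shape as task_q
```

Because all three operations run inside every block, the tasks are processed in parallel rather than in a sequential pipeline, which is what enables the parallelism and streaming properties listed above.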

Overall architecture (figure).

Getting Started

Model and Results

| Model | Driving Score | Success Rate (%) | Efficiency | Comfortness | Latency | Config | Download |
| --- | --- | --- | --- | --- | --- | --- | --- |
| DriveTransformer-Large | 63.46 | 35.01 | 100.64 | 20.78 | 211.7 ms | config | Google Drive / Baidu Cloud |

Citation

@inproceedings{jia2025drivetransformer,
  title={DriveTransformer: Unified Transformer for Scalable End-to-End Autonomous Driving},
  author={Xiaosong Jia and Junqi You and Zhiyuan Zhang and Junchi Yan},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2025}
}

