Lattice Boltzmann Model

Like tracking anything? Want efficiency? Try LBM!

Lattice Boltzmann Model for Learning Real-World Pixel Dynamicity (NeurIPS 2025)
Guangze Zheng, Shijie Lin, Haobo Zuo, Si Si, Ming-Shan Wang, Changhong Fu, and Jia Pan
arXiv | Project Page

The features of LBM include:

  • physics-inspired: built on the lattice Boltzmann method from fluid dynamics.
  • online: runs in a frame-by-frame feed-forward manner.
  • real-time: ~50 FPS on NVIDIA Jetson Orin NX (TensorRT FP16).
  • robust: tolerates detection failures in 2D object tracking.

📌 News

  • 2025.09 LBM is accepted by NeurIPS 2025.
  • 2025.06 LBM TensorRT is available. LBM can now also track 3D points by lifting.
  • 2025.04 LBM is proposed for online, real-time 2D point tracking and object tracking in dynamic scenes, with only 18M parameters, achieving SOTA performance.

📚 Tutorial

  • Train: train LBM from scratch. About 2 days on 4 NVIDIA H800 GPUs.
  • Eval: evaluate LBM to reproduce the results in the paper.
  • TensorRT: run LBM on NVIDIA Jetson devices as fast as on an RTX 4090! 49 FPS on NVIDIA Jetson Orin NX.

🛠️ Prepare

  • Clone this repo:

    git clone https://github.com/George-Zhuang/lbm.git
    cd lbm
  • Basic packages:

    conda create -n lbm python=3.10
    conda activate lbm
    pip install torch torchvision --index-url https://download.pytorch.org/whl/cu124 # adjust for your CUDA version
    pip install -r requirements.txt
  • [Optional] Demo with Ultralytics for detection in 2D object tracking:

    pip install ultralytics
    pip install --no-cache-dir git+https://github.com/ultralytics/CLIP.git
  • Download the pretrained weights for the demo and evaluation from HuggingFace and put them in the checkpoints folder. For example:

    huggingface-cli download ZhengGuangze/LBM lbm.pt --local-dir checkpoints
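Before running the demos, a quick preflight can catch the two most common setup mistakes: a wrong interpreter and a missing checkpoint. This is a minimal sketch, not part of the repo; the checkpoints/lbm.pt path follows the download command above, and the torch check degrades gracefully if torch is not installed yet.

```python
# Preflight sketch (not part of the repo): check the interpreter, torch,
# and the downloaded checkpoint before running the demo scripts.
import importlib.util
import sys
from pathlib import Path

def preflight(ckpt: str = "checkpoints/lbm.pt") -> dict:
    """Return a report of environment and checkpoint status."""
    path = Path(ckpt)
    return {
        # Python 3.10 matches the conda command above.
        "python_ok": sys.version_info[:2] >= (3, 10),
        # torch is installed by the pip command above; only probed, not imported.
        "torch_installed": importlib.util.find_spec("torch") is not None,
        # The weights must exist and be non-empty.
        "checkpoint_ok": path.is_file() and path.stat().st_size > 0,
    }

if __name__ == "__main__":
    for key, ok in preflight().items():
        print(f"{key}: {'ok' if ok else 'MISSING'}")
```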

▶️ Demo

  • Click for point tracking

    Simply run the following:

    python tools/demo_click.py --video_path data/demo.mp4

    The demo uses cv2 for visualization. Please click a few points to track and press q to quit the cv2 window.

  • Object tracking

    This demo corresponds to Section 4.5 in the paper. Simply run the following and ultralytics will download YOLOE and MobileCLIP weights automatically:

    python tools/demo_box.py --video_path data/demo.mp4 --prompt bird
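To run the object-tracking demo over a whole folder of clips rather than one file, a small wrapper can build one command per video. This is a hypothetical convenience script, not part of the repo; it only assumes the tools/demo_box.py CLI shown above, and the data folder and bird prompt are illustrative.

```python
# Sketch: run the object-tracking demo on every .mp4 in a folder.
# Assumes the tools/demo_box.py command-line interface shown above.
import subprocess
import sys
from pathlib import Path

def demo_commands(video_dir: str, prompt: str = "bird") -> list[list[str]]:
    """Build one demo_box.py invocation per video file, in sorted order."""
    return [
        [sys.executable, "tools/demo_box.py", "--video_path", str(v), "--prompt", prompt]
        for v in sorted(Path(video_dir).glob("*.mp4"))
    ]

if __name__ == "__main__":
    for cmd in demo_commands("data"):
        subprocess.run(cmd, check=False)
```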

😊 Acknowledgements

Thanks to these great repositories: Track-On, CoTracker, DELTA, TAPNet, and many other inspiring works in the community.

🎫 License

The model is licensed under the Apache 2.0 license.
