A list of uv environment templates for LLM development.
Updated Sep 19, 2025
Prebuilt FlashAttention wheel for CUDA 13.2 (PyTorch 2.12 nightly)
🌐 Streamline LLM development with ready-to-use environment templates for efficient setup and deployment.
Custom flash-attn build for CUDA 13 + PyTorch 2.12 (cu130) + Triton 3.7
Collection of FlashAttention 3 precompiled wheels for direct install on Hopper-series NVIDIA GPUs (H100, H200; sm_90a)
Compilation environment for PyTorch with CUDA
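The entries above center on installing prebuilt flash-attn wheels into a uv-managed environment. A minimal sketch of that workflow is below; the wheel path is a placeholder for whichever prebuilt asset matches your CUDA, PyTorch, and Python versions, and the pinned versions are illustrative assumptions, not values taken from these repositories.

```shell
# Sketch: set up a uv environment and install flash-attn from a prebuilt wheel.
# Create a project-local virtual environment with a specific Python version.
uv venv --python 3.12

# Install a CUDA-enabled PyTorch build first (index URL is PyTorch's official
# wheel index; pick the cuXXX variant matching your driver/toolkit).
uv pip install torch --index-url https://download.pytorch.org/whl/cu121

# Install flash-attn from a locally downloaded prebuilt wheel (placeholder
# filename) so no source compilation is needed.
uv pip install ./flash_attn-<version>-cp312-cp312-linux_x86_64.whl

# Alternatively, build from source; --no-build-isolation lets the build see
# the already-installed torch instead of a fresh isolated environment.
uv pip install flash-attn --no-build-isolation
```

Installing a wheel that was precompiled for the exact CUDA/PyTorch/Python ABI combination avoids the long local nvcc build, which is the point of the prebuilt-wheel repositories listed above.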