vLLM
    Installation

    vLLM supports the following hardware platforms:

    • GPU
      • NVIDIA CUDA
      • AMD ROCm
      • Intel XPU
    • CPU
      • Intel/AMD x86
      • ARM AArch64
      • Apple silicon
      • IBM Z (S390X)
    • Google TPU
    • Intel Gaudi
    • AWS Neuron
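
    As a minimal starting point on the most common platform (NVIDIA CUDA), vLLM can be installed from PyPI; the other platforms listed above each have their own build or wheel instructions on their respective pages. This is a sketch of the typical GPU install, assuming a working Python environment and a CUDA-capable GPU:

    ```shell
    # Create and activate a fresh virtual environment (optional but recommended).
    python -m venv vllm-env
    source vllm-env/bin/activate

    # Install vLLM from PyPI. The prebuilt wheels target NVIDIA CUDA;
    # ROCm, XPU, CPU, TPU, Gaudi, and Neuron require the platform-specific
    # instructions linked above.
    pip install vllm
    ```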