
Become a sponsor to vLLM

Your contribution will help fund the development and testing of the vLLM project. We strive to maintain vLLM as the best open-source, community-owned project for LLM inference. However, developing it on GPUs is expensive, and ensuring that it is production-ready requires considerable resources. Please help us sustain it!

Current sponsors (11)

@robertgshaw2-redhat
@vincentkoc
@dvlpjrs
Private Sponsor
@mhupfauer
@terrytangyuan
@HiddenPeak
@comet-ml
@imkero
@Stack-ML
@GabrielBianconi
Past sponsors (19)
@upstash
@AnyISalIn
@lukalafaye
@mgoin
@peakji
@youkaichao
@maxdebayser
@yangalan123
Private Sponsor
@massif-01
@davedgd
@fterrazzoni
@AlpinDale
@kiritoxkiriko
@adheep04
@trianxy
@LEE5J
@AnantChandra

Featured work

  1. vllm-project/vllm

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 48,661 stars
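
    For context on what sponsorship supports, vLLM exposes a Python API for offline batch inference in addition to its serving engine. Below is a minimal sketch, assuming the vllm package is installed and using facebook/opt-125m purely as an illustrative model name; the sampling values are arbitrary, not recommendations.

        from vllm import LLM, SamplingParams

        # Sampling settings for generation (values here are illustrative).
        sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

        # Load a small model; swap in whichever model you actually serve.
        llm = LLM(model="facebook/opt-125m")

        # Generate completions for a batch of prompts in one call.
        outputs = llm.generate(["Hello, my name is"], sampling_params)
        for output in outputs:
            print(output.prompt, "->", output.outputs[0].text)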
