vllm-project/vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

76,099 stars · 15,439 forks · Python
SINT Score: 40

Score Breakdown

Activity: 15%
Quality: 31%
Response: 50%
Trust: 83%
Agent Ready: 13%
Relevance: 80%

Badge for README

SINT badge preview
[![SINT Verified Live](https://discovery-api-production-19d9.up.railway.app/badge/project/9.svg)](https://discovery-api-production-19d9.up.railway.app/badge/redirect/9)
