Pull requests: triton-inference-server/vllm_backend


- Add input and output tokens to response (#41, opened May 16, 2024 by kebe7jun)
- feat: Report more vllm metrics (#92, opened May 13, 2025 by Pavloveuge; label: enhancement; 3 of 10 tasks)
- Add support for priority in vllm backend (#88, opened Apr 24, 2025 by TheCodeWrangler; 2 of 5 tasks)
- Followup with some fixes (#77, opened Dec 20, 2024 by oandreeva-nv; draft)
- docs: Update README.md (#63, opened Sep 6, 2024 by yinggeh; label: documentation; draft)
- add multimodal support for qwen2.5 (#90, opened May 6, 2025 by abdulazizab2)