
feature: Support self-hosted s3 compatible storage #361

Open
erhwenkuo opened this issue Jan 3, 2025 · 1 comment

Comments

@erhwenkuo

Many companies host their own S3-compatible storage on premises (e.g., MinIO). For them, it is very important to be able to retrieve LLM model files from that on-premise S3 storage.

To support this, the system would need to allow setting three S3 parameters:

  1. AWS_ACCESS_KEY_ID
  2. AWS_SECRET_ACCESS_KEY
  3. AWS_ENDPOINT_URL

#1 and #2 are already covered by the current KubeAI design. I would suggest adding #3 (AWS_ENDPOINT_URL) as well; that would make KubeAI much more attractive for enterprise adoption.
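A minimal sketch of what this could look like in a model-serving pod spec (the env var names are the standard AWS SDK ones; the container, image, and Secret names here are hypothetical, not part of the current KubeAI chart):

```yaml
# Hypothetical container spec fragment: credentials come from a Secret,
# and AWS_ENDPOINT_URL points the S3 client at a self-hosted MinIO endpoint.
containers:
  - name: model-loader            # hypothetical name
    image: example/model-loader   # hypothetical image
    env:
      - name: AWS_ACCESS_KEY_ID
        valueFrom:
          secretKeyRef:
            name: s3-credentials  # hypothetical Secret
            key: access-key-id
      - name: AWS_SECRET_ACCESS_KEY
        valueFrom:
          secretKeyRef:
            name: s3-credentials
            key: secret-access-key
      - name: AWS_ENDPOINT_URL
        value: "https://minio.internal.example.com:9000"  # on-premise endpoint
```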

@samos123
Contributor

samos123 commented Jan 3, 2025

vLLM 0.6.6 now supports S3-compatible storage via the Run:ai Model Streamer. That would allow setting all of those parameters through environment variables: https://github.com/run-ai/runai-model-streamer/blob/master/docs/src/env-vars.md
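Roughly, a container spec along these lines might work once the chart is updated (an untested sketch; the model URI, Secret name, and endpoint are placeholders, and the exact env vars the streamer honors should be checked against the linked docs):

```yaml
# Rough sketch: vLLM loading a model from S3-compatible storage
# with the Run:ai Model Streamer load format.
containers:
  - name: vllm
    image: vllm/vllm-openai:v0.6.6
    args:
      - --model=s3://models/llama-3.1-8b-instruct   # hypothetical bucket/path
      - --load-format=runai_streamer
    env:
      - name: AWS_ACCESS_KEY_ID
        valueFrom:
          secretKeyRef:
            name: s3-credentials   # hypothetical Secret
            key: access-key-id
      - name: AWS_SECRET_ACCESS_KEY
        valueFrom:
          secretKeyRef:
            name: s3-credentials
            key: secret-access-key
      - name: AWS_ENDPOINT_URL
        value: "https://minio.internal.example.com:9000"  # self-hosted MinIO
```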

I will test the new vLLM release and update the Helm chart.
