Many companies host their own S3-compatible storage on premise (e.g., MinIO), so it is important to be able to retrieve LLM model files from on-premise S3 storage.
To support this, the system needs to allow configuring three S3 parameters:
1. AWS_ACCESS_KEY_ID
2. AWS_SECRET_ACCESS_KEY
3. AWS_ENDPOINT_URL
Parameter #1 is already covered in the current KubeAI design. I would suggest adding #3 (AWS_ENDPOINT_URL) to the design as well, which would make KubeAI more attractive for enterprise adoption.
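To make the request concrete, here is a minimal Python sketch (using boto3 directly, not KubeAI's actual loader code) of how the three parameters together let a client fetch a model file from an on-premise MinIO endpoint. The endpoint URL, credentials, bucket, and object names are hypothetical placeholders.

```python
import os
import boto3

# The three parameters from the list above; values are hypothetical
# examples for an on-premise MinIO deployment.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "minio-access-key")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "minio-secret-key")
os.environ.setdefault("AWS_ENDPOINT_URL", "http://minio.internal:9000")

# Recent botocore versions honour AWS_ENDPOINT_URL automatically; passing
# endpoint_url explicitly also works on older versions.
s3 = boto3.client("s3", endpoint_url=os.environ["AWS_ENDPOINT_URL"])

# Download a model file from the on-premise bucket (placeholder names).
s3.download_file("models", "my-llm/model.safetensors", "/models/model.safetensors")
```

If KubeAI exposed AWS_ENDPOINT_URL alongside the two credential variables, the same s3:// model references would resolve against the on-premise endpoint instead of AWS.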