Feat: Support AWS Bedrock base models #25
base: main
Conversation
Hey @nirga, quick question about integrating Stability AI models in our Hub. I'm looking at AWS Bedrock's Stable Diffusion 3.5 integration (from their model catalog). I'm not sure about the best format to implement this:
Would appreciate your thoughts on this. Thanks!
I think it should be in a new API @detunjiSamuel
This is a draft. I'm cleaning up the comments, notes, and external links on my machine.
AWS Bedrock Provider Integration
Added support for AWS Bedrock as a new LLM provider:
Key Changes
Added Bedrock provider implementation with model-specific handlers:
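Since Bedrock's `InvokeModel` API expects a different JSON request body per model family, the model-specific handlers presumably abstract over those schemas. A minimal sketch of that idea, with hypothetical function names and dispatch logic (the body shapes follow Bedrock's publicly documented Anthropic and Amazon Titan request formats, not this PR's actual code):

```python
import json

def build_anthropic_body(prompt: str, max_tokens: int = 512) -> str:
    # Anthropic Claude models on Bedrock use the Messages API schema.
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

def build_titan_body(prompt: str, max_tokens: int = 512) -> str:
    # Amazon Titan text models use a flat inputText schema.
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })

# Hypothetical dispatch table keyed on the model-family prefix of the
# model ID (e.g. "anthropic.claude-3-sonnet-..." -> "anthropic").
HANDLERS = {
    "anthropic": build_anthropic_body,
    "amazon": build_titan_body,
}

def build_request_body(model_id: str, prompt: str) -> str:
    family = model_id.split(".")[0]
    return HANDLERS[family](prompt)
```

A handler's output would then be passed as the `body` of `boto3.client("bedrock-runtime").invoke_model(modelId=model_id, body=...)`.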
Testing Notes
All tests pass using AWS credentials in the us-east-1 and us-east-2 regions
Verified error handling for invalid credentials/models
Tested non-streaming responses (models in Bedrock don't seem to have streaming types)
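With non-streaming calls, `InvokeModel` returns the whole completion in a single JSON body, but that body's shape also varies per model family. A hedged sketch of per-family response parsing (hypothetical helper name; the response shapes follow the documented Anthropic and Titan formats, not this PR's code):

```python
import json

def parse_completion(model_id: str, body_bytes: bytes) -> str:
    # Non-streaming InvokeModel returns the entire completion at once.
    data = json.loads(body_bytes)
    family = model_id.split(".")[0]
    if family == "anthropic":
        # Claude Messages responses carry a list of content blocks.
        return "".join(block["text"] for block in data["content"]
                       if block.get("type") == "text")
    if family == "amazon":
        # Titan responses carry a list of results with outputText.
        return data["results"][0]["outputText"]
    raise ValueError(f"No handler for model family: {family}")
```

In a real call, `body_bytes` would come from `response["body"].read()` on the `invoke_model` result.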
Review notes
The model ID from the AWS link does not work consistently. Instead, use the Inference profile ARN or Inference profile ID from the cross-region reference tab as your model_id.

Issue: #20