Hades invests a lot of time figuring out how to distribute machine learning models on the cloud, and we are making all of our DevOps and MLOps work open source so you don't have to. Use this custom image to serve your machine learning models on AWS Lambda.
To get started, clone the repo and cd into the hades_ml_lambda folder, then run the following commands in your terminal:
- Build the image using Docker.
docker image build -t hades/ml-lambda:latest .
- Run the docker container using the image you just built.
docker run -p 9000:8080 hades/ml-lambda:latest
- Change your working directory to the
tester folder
cd tester
- Run our custom file for testing the lambda function.
python test.py
If you're still using our saved model and the unchanged test script, you should get a response of 9 as your prediction, which means it's working!
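Under the hood, the test script is just sending an HTTP POST to the Lambda Runtime Interface Emulator that ships in the AWS Lambda base images, which listens on the port you mapped with docker run. As a rough sketch, the equivalent raw request looks like this (the empty `{}` event is a placeholder; the actual event shape your handler expects is handler-specific):

```shell
# The container's Runtime Interface Emulator exposes this AWS-documented
# invoke path; port 9000 matches the -p 9000:8080 mapping above.
PORT=9000
URL="http://localhost:${PORT}/2015-03-31/functions/function/invocations"

# With the container running, invoke the function with an empty event:
# curl -XPOST "$URL" -d '{}'
echo "$URL"
```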
We used a simple pretrained CNN model on the standard MNIST dataset, which can be described as follows in Keras:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Dense, Flatten

input_shape = (28, 28, 1)  # MNIST images: 28x28 grayscale
no_classes = 10            # digits 0-9

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=input_shape))
model.add(Conv2D(64, kernel_size=(3, 3), activation='relu'))
model.add(Conv2D(128, kernel_size=(3, 3), activation='relu'))
model.add(Flatten())
model.add(Dense(256, activation='relu'))
model.add(Dense(no_classes, activation='softmax'))
To use your own model, all you need to do is replace the model saved in the app folder with your own and you're good to go. It really is that easy!
When AWS announced in December 2020 that Lambda would support custom container images, our first idea was to figure out how to run machine learning workloads on it. At a high level, this is how it works.
First, we package our image with Docker, including all the dependencies needed to load and serve TensorFlow models, so you can run inference on AWS Lambda.
That's pretty much it! Next, we upload the pre-built image to Amazon ECR so it can run on AWS Lambda and start making predictions in the cloud. Follow the instructions here to get your pre-built Lambda image into the cloud: https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/
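As a rough sketch, the ECR upload in the linked guide boils down to the commands below. The account ID, region, and repository name are placeholders, and the commands that need AWS credentials are commented out:

```shell
ACCOUNT_ID="123456789012"   # placeholder: your AWS account ID
REGION="us-east-1"          # placeholder: your AWS region
REPO="hades-ml-lambda"      # placeholder: your ECR repository name
REGISTRY="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"
IMAGE_URI="${REGISTRY}/${REPO}:latest"

# Create the repository, log Docker in to ECR, then tag and push the image:
# aws ecr create-repository --repository-name "$REPO" --region "$REGION"
# aws ecr get-login-password --region "$REGION" | docker login --username AWS --password-stdin "$REGISTRY"
# docker tag hades/ml-lambda:latest "$IMAGE_URI"
# docker push "$IMAGE_URI"
echo "$IMAGE_URI"
```

Once the image is pushed, you create the Lambda function from that image URI in the console or via the CLI, as described in the linked AWS post.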
- We suggest saving your model in .h5 format to ensure compatibility.
- You may need git-lfs if your model is too large.
To use this container you'll need:
- Docker Desktop installed: https://www.docker.com/products/docker-desktop
- The Docker CLI installed on your local machine.
- Your model must be trained with TensorFlow 2 or above and, ideally, saved with Keras.
All of our work is open-source and contributions are welcome so feel free to submit pull requests and post issues on our GitHub organisation!