A service to remotely operationalize and manage models from a BentoML Repo in Docker Containers or tmux Sessions
Addressed Issues • Target Group • Setup • ToDos
## Addressed Issues

This repo coordinates the deployment of ML models via BentoML, adding the following capabilities (a sketch of the resulting workflow follows the list):
- Start the deployment with custom parameters
- Check whether the deployment was successful
- Automatically retire old versions of the same model
- Roll back if the deployment was unsuccessful
- Add the service to Prometheus
- Add Airflow DAGs for batch prediction
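
The following is a minimal sketch of how such a deploy → health-check → retire/rollback cycle could look for the Docker case. All function names, the container naming scheme, the `/healthz` path, and the container port `3000` are assumptions for illustration, not this repo's actual API.

```python
"""Hypothetical deploy/check/retire/rollback flow (illustrative only)."""
import subprocess
import time
import urllib.request


def start_container(image: str, name: str, port: int) -> None:
    # Launch the BentoML service image as a detached Docker container.
    # Container port 3000 is an assumption (recent BentoML default).
    subprocess.run(
        ["docker", "run", "-d", "--name", name, "-p", f"{port}:3000", image],
        check=True,
    )


def is_healthy(port: int, retries: int = 10, delay: float = 3.0) -> bool:
    # Poll the service until it answers or the retries are exhausted.
    # The /healthz path is an assumed health endpoint.
    for _ in range(retries):
        try:
            with urllib.request.urlopen(f"http://localhost:{port}/healthz", timeout=2):
                return True
        except OSError:
            time.sleep(delay)
    return False


def remove_container(name: str) -> None:
    # Stop and remove a container; ignore errors if it does not exist.
    subprocess.run(["docker", "rm", "-f", name], check=False)


def deploy(image: str, new_name: str, old_name: str, port: int) -> bool:
    """Start the new container, verify health, then retire the old one or roll back."""
    start_container(image, new_name, port)
    if is_healthy(port):
        remove_container(old_name)  # retire the previous version of the same model
        return True
    remove_container(new_name)  # roll back the failed deployment
    return False
```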
## Target Group

This repo is for engineers and data scientists who have run into the same problems when using BentoML in an end-to-end workflow.
## Setup

- Make sure that either Docker is installed and the user has Docker rights, or that tmux is installed
- Airflow and Prometheus are optional
- Create a conda environment from `environment.yml`
- Start the API with gunicorn:
  `gunicorn app.main:app -w 4 -k uvicorn.workers.UvicornWorker -t 320 -b 0.0.0.0:8000`
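
The gunicorn command above targets `app.main:app` with the Uvicorn ASGI worker. A minimal sketch of such an entry point, assuming FastAPI, might look like the following; the route shown is illustrative and not necessarily part of this repo's actual API.

```python
# Hypothetical app/main.py entry point served by gunicorn/uvicorn.
from fastapi import FastAPI

app = FastAPI(title="BentoML deployment manager")


@app.get("/health")
def health() -> dict:
    # Simple liveness probe for monitoring the running workers.
    return {"status": "ok"}
```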
## ToDos

- Logic to check whether tmux/Docker is installed (a possible starting point is sketched below)
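
A possible starting point for that check, using only the standard library; note that finding `docker` on the `PATH` does not confirm the user actually has Docker rights.

```python
# Detect which runtimes are available on the host.
import shutil


def available_runtimes() -> dict[str, bool]:
    # shutil.which returns the executable's path if it is on PATH, else None.
    return {
        "docker": shutil.which("docker") is not None,
        "tmux": shutil.which("tmux") is not None,
    }


if __name__ == "__main__":
    print(available_runtimes())
```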