
# Serve a Custom Model using Standalone Containers

With IBM Watson NLP, IBM introduced a common library for natural language processing, document understanding, translation, and trust. IBM Watson NLP brings everything under one umbrella for consistency and ease of development and deployment. This tutorial explains how to take a Watson NLP model that you trained in IBM Watson Studio, package it into a container image together with the Watson NLP Runtime, and run this container with Docker. When the container runs, it exposes REST and gRPC endpoints that client programs can use to make inference requests.
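
As a rough sketch of the packaging step, a Dockerfile along the lines below layers an exported model on top of the Watson NLP Runtime image. The base image reference, the `LOCAL_MODELS_DIR` environment variable, and the `models/` directory layout are assumptions based on the common Watson NLP for Embed packaging pattern, not values taken from this tutorial; substitute the image tag and paths from your own entitlement and model export.

```dockerfile
# Sketch only: the image name/tag, environment variable, and paths below are
# assumptions -- replace them with the values from your entitlement and export.
ARG RUNTIME_BASE="cp.icr.io/cp/ai/watson-nlp-runtime:latest"
FROM ${RUNTIME_BASE}

# Directory the runtime is expected to scan for models at startup (assumed).
ENV LOCAL_MODELS_DIR=/app/models

# Copy the model you exported from Watson Studio into that directory.
COPY models /app/models
```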

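Once the image is built, running it with Docker publishes the inference endpoints on localhost. The commands below are a sketch: the image tag is a placeholder, and the port numbers (8080 for REST, 8085 for gRPC) are assumptions, so confirm the actual ports against the Watson NLP Runtime documentation before sending requests.

```sh
# Build the container image from the Dockerfile above.
docker build -t watson-nlp-custom:latest .

# Run it locally, publishing the assumed REST (8080) and gRPC (8085) ports.
docker run --rm -it \
  -p 8080:8080 \
  -p 8085:8085 \
  watson-nlp-custom:latest
```

With the container running, client programs can send inference requests to the REST or gRPC endpoint; the exact request paths and payloads depend on the model type and are described in the Watson NLP Runtime API documentation.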