# Machine Learning Microservice

Use of the kdb Insights Machine Learning functionality as a standalone service

The functionality provided by the kxi-ml Docker image/microservice surfaces in two principal areas:

  1. It provides a development environment for users wishing to deploy ML models to kdb Insights via the Stream Processor. The image has the same Python requirements as those used by the Stream Processor.
  2. It is used by the Stream Processor as part of Microservice and Enterprise offerings to provide Machine Learning functionality.
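As a rough illustration of the standalone workflow described above, the commands below sketch pulling and running the image interactively. The registry path, image tag, mount point, and license variable name are assumptions for illustration only; consult your kdb Insights portal credentials and release notes for the actual values.

```shell
# Hypothetical registry path and tag -- substitute the values for your deployment.
docker pull portal.dl.kx.com/kxi-ml:latest

# Run interactively, passing a license and mounting the working directory
# so local models and data are visible inside the container.
docker run -it --rm \
    -e KDB_LICENSE_B64="$KDB_LICENSE_B64" \
    -v "$(pwd)":/data \
    portal.dl.kx.com/kxi-ml:latest
```

The environment variable and mount path shown here are placeholders; the Docker quickstart linked below documents the supported configuration.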

The documentation in this section is intended to give users an understanding of the types of problems that can be solved with this functionality and how to get up and running. It is split across two sections:

  1. Quickstart guides:
    1. Docker
    2. Kubernetes
  2. Examples:
    1. Model generation and deployment