Ingest and Transform Overview
Data ingestion brings external data into kdb Insights Enterprise. Ingestion comes in two forms: streaming data and batch data. Both use the same building blocks, because batch ingestion is simply a bounded case of streaming ingestion. Ingestion and transformation are powered by the kdb Insights Stream Processor. There are three main methods for building ingestion or transformation pipelines. Click the links below to learn more:
Using the UI
The import wizard lets you connect external data sources to a kdb Insights database.
Pipelines let you connect data sources to data sinks, and transform and analyze data, in a drag-and-drop UI.
Using APIs
Pipelines can be written using the pipeline API and submitted as part of an assembly using the kdb Insights CLI.
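As a minimal sketch of what a pipeline written with the q pipeline API can look like (the table schema and the `publish` callback name here are illustrative assumptions, not part of this document):

```q
// Sketch of a Stream Processor pipeline using the q pipeline API.
// Reads data published to the `publish callback, filters out rows
// with a non-positive price, and writes the result to the console.
.qsp.run
  .qsp.read.fromCallback[`publish]
  .qsp.map[{select from x where price > 0}]
  .qsp.write.toConsole[]
```

A pipeline like this is typically packaged in an assembly specification and deployed with the kdb Insights CLI; see the CLI documentation for the exact commands for your installation.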
Examples
Use the pipeline examples below to get up and running:
- S3 Ingestion - Import data from an S3 bucket
- Kafka - Import data from a Kafka stream
- PostgreSQL - Query data from PostgreSQL