
Choose from the list of available writers below.



Console

Writes events to the kdb+ console. Can be used for simple streams running on a local deployment. By default, all vectors (lists of the same type) are printed on a single line. When the Split Lists option is enabled, vector items are printed on separate lines. Lists of mixed type are always printed on separate lines.


item description
Prefix A string for output messages.
Timestamp Choose between Local Time, UTC, or No Timestamp.
Split Lists When enabled, each item in a list appears on a new line.
QLog When enabled, prints all console logs to a QLog stream.
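In the Stream Processor's q API, the console writer corresponds to .qsp.write.toConsole. A minimal sketch follows; the option names passed through .qsp.use mirror the fields above and are illustrative assumptions, not confirmed parameter names.

```q
/ Minimal sketch, assuming the Stream Processor q API (.qsp).
/ Option names (prefix, timestamp) mirror the fields above and are assumptions.
.qsp.run
  .qsp.read.fromCallback[`publish]                         / receive data via a callback
  .qsp.write.toConsole[.qsp.use `prefix`timestamp!("out> "; `local)]
```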


Kafka

Publish data on a Kafka topic. Kafka is a distributed event streaming platform. A Kafka producer publishes data to a Kafka broker, which can then be consumed by any downstream listeners. All data published to Kafka must be encoded as strings or serialized as bytes. Any data reaching the Kafka writer that is not already encoded is serialized using the q IPC representation.


item description
Broker Define Kafka connection details as host:port information - for example: localhost:9092.
Topic The name of the Kafka topic to publish to.
Use TLS Enable TLS.
Kubernetes Secret The name of a Kubernetes secret that is already available in the cluster and contains your TLS certificates.
Certificate Password TLS certificate password, if required.
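Because Kafka payloads must be strings or bytes, a common pattern is to encode records before the write stage. A minimal sketch, assuming the Stream Processor q API (.qsp); the topic name and option names are placeholders:

```q
/ Minimal sketch, assuming the Stream Processor q API (.qsp).
/ Records are JSON-encoded with .j.j so the payload reaches Kafka as strings.
/ Topic name (readings) and the brokers option name are assumptions.
.qsp.run
  .qsp.read.fromCallback[`publish]
  .qsp.map[.j.j]                                           / encode each message as JSON
  .qsp.write.toKafka[.qsp.use `topic`brokers!(`readings; "localhost:9092")]
```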


Process

Writes events to another kdb+ process, either upserting the data to a table or invoking a function with that data.


item description
Mode Select between Upsert to table or Call function.
Handle A q-style IPC handle to the destination process, including port details - for example: :localhost:5000.
Table The name of the table to upsert data to.
Asynchronous When enabled, pushes to the output process asynchronously. This increases throughput by allowing multiple messages to be queued without blocking processing.
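The fields above map onto a writer configuration in the Stream Processor's q API. A minimal sketch, assuming the .qsp API; the option names and the destination handle are illustrative assumptions:

```q
/ Minimal sketch, assuming the Stream Processor q API (.qsp).
/ Option names (handle, mode, target, async) mirror the fields above
/ and are assumptions; :localhost:5000 is a placeholder destination.
.qsp.run
  .qsp.read.fromCallback[`publish]
  .qsp.write.toProcess[.qsp.use `handle`mode`target`async!(`:localhost:5000; `upsert; `output; 1b)]
```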

KX Insights Stream

Writes events to a KX Insights stream.


item description
Table Categorize stream messages by table name.
Deduplicate Stream When enabled, duplicate messages caused by failure events are deduplicated before being sent downstream. With this enabled, the pipeline must be deterministic to avoid any undefined behavior.
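In the Stream Processor's q API, this writer corresponds to .qsp.write.toStream, which takes the table name used to categorize outgoing messages. A minimal sketch; the table name is a placeholder:

```q
/ Minimal sketch, assuming the Stream Processor q API (.qsp).
/ trades is a placeholder table name used to tag outgoing stream messages.
.qsp.run
  .qsp.read.fromCallback[`publish]
  .qsp.write.toStream[`trades]
```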

KX Insights Database

Writes events to a KX Insights database.


item description
Database Select from the available databases.
Table Select a table from the schema associated with the database.
Deduplicate Stream When enabled, duplicate messages caused by failure events are deduplicated before being sent to any downstream consumers. With this enabled, the pipeline must be deterministic to avoid any undefined behavior.
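In the Stream Processor's q API, this writer corresponds to .qsp.write.toDatabase, which takes the target table and database. A minimal sketch; the table and database names are placeholders:

```q
/ Minimal sketch, assuming the Stream Processor q API (.qsp).
/ trades (table) and taq (database/assembly) are placeholder names.
.qsp.run
  .qsp.read.fromCallback[`publish]
  .qsp.write.toDatabase[`trades; `taq]
```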