Testing a custom analytic

This document outlines an environment-agnostic approach to testing custom analytics within kdb Insights Enterprise.

Testing is presently supported for analytics targeting the Stream Processor and custom query APIs. Limitations on the APIs used for query are outlined below. This guide provides you with an understanding of:

  1. The prerequisites for completing this workflow
  2. How and where you can develop analytics for deployment to kdb Insights Enterprise
  3. The expected workflow for executing and cleaning up a test
  4. How to publish a validated group of analytics to a production environment


This use-case is presently limited to custom query APIs that target the Data Access Processes. In a future release, it is intended that this will be extended to include aggregation functions.

Custom query APIs are currently limited to being written in q. In a future release, Data Access Processes will support Python.


The following are required to complete this workflow, regardless of the environment from which it is orchestrated:

  • You must have installed the kdb Insights CLI following the instructions outlined here.
  • You must have access to a running installation of kdb Insights Enterprise, against which:
    • You have completed the CLI configuration steps outlined here, or
    • You have available, at ~/.insights/cli-config within the running process, a configuration file that includes at minimum the hostname and client.secret.
  • You must have an assembly file that can be modified, against which your custom analytic may be tested, with the proviso that it has access to data suitable for validating the operation of the custom analytic.
  • This workflow assumes that your environment consists of a "Test" or "UAT" environment in addition to a "Production" deployment of kdb Insights Enterprise. Within this paradigm, your custom analytics are tested against the Test/UAT environment prior to deployment to the Production instance of kdb Insights Enterprise.
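Before starting, it can be useful to sanity-check the CLI configuration file described above. The following is a minimal sketch: the file location comes from the prerequisites, but the check itself and the literal key spellings are assumptions for illustration, not documented CLI behaviour.

```shell
# Illustrative pre-flight check: confirm a kdb Insights CLI configuration
# file exists and contains the minimum required keys.
check_cli_config() {
    [ -f "$1" ] && grep -q "hostname" "$1" && grep -q "client.secret" "$1"
}

# Demonstrate against a sample config file (values are placeholders)
sample=$(mktemp)
printf 'hostname: https://insights.example.com\nclient.secret: example\n' > "$sample"
if check_cli_config "$sample"; then result="CLI config OK"; else result="CLI config incomplete"; fi
echo "$result"
rm -f "$sample"
```

In a real run you would point the check at ~/.insights/cli-config instead of the temporary sample file.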

Creating an analytic

The environment in which a user writes the analytic code to be included in their test is entirely at their discretion. The code being tested, however, must follow the APIs associated with the target location in which the analytic is to be run.

For example, should you wish to make use of a User Defined Function (UDF) within the Stream Processor, you must follow the requirements of the API outlined here, which are expanded upon in the Stream Processor API documentation for both q and Python.

Similarly, the generation of custom query APIs should follow the guide outlined here.

In both cases, and particularly for custom query APIs, testing locally may require stub functions for the registration of APIs and the definition of metadata; these registrations are actioned only when the analytic is tested on the kdb Insights Enterprise deployment.

The following examples show the expected formatting of analytics for deployment to the Data Access Processes and the Stream Processor. These examples are provided here because the analytics they define are used within the sections which follow.

Custom query API for the Data Access Processes (q); the return statement and registration wrapper are reconstructed around the fragments shown:

    // @desc An example 'select' analytic defined in custom code.
    // @param table   {symbol}          Table to query.
    // @param columns {symbol|symbol[]} Column(s) to select.
    .example.api:{[table;columns]
        columns:$[-11h = type columns;enlist columns;columns];
        filter:enlist (<;`i;100); // Note for partitioned tables, will return first 100 per date
        :?[table;filter;0b;columns!columns]
        };

    // NOTE: Alternatively, .example.api can take a single parameter called `args`.
    // This will result in a dictionary with keys `table and `columns.
    // Uncomment me for an alternative example:
    // .example.api:{[args]
    //    show args `table`columns
    //    };

    // Register the API and its metadata with the Data Access Process
    .da.registerAPI[`.example.api;
        .sapi.metaDescription["Simple 'select ... from ...' API."],
        .sapi.metaParam[`name`type`isReq`description!(`table;-11h;1b;"Table to query")],
        .sapi.metaParam[`name`type`isReq`default`description!(`columns;11 -11h;0b;`sym`time;"Column(s) to select.")],
        .sapi.metaReturn[`type`description!(98h;"Result of the select.")],
        .sapi.metaMisc[enlist[`safe]!enlist 1b]];

User Defined Function (UDF) for the Stream Processor (q); the function name and body are illustrative, matching the description in the annotation:

    // @udf.name("sp_udf")
    // @udf.description("This UDF adds 1 to any data it's passed")
    .sp.udf.example:{[data;params]
        data+1
        };

Generating a test package containing your code

To run a test, you must first have a folder that conforms to the requirements of the kdb Insights CLI packaging mechanism; namely, your codebase must contain at its root an appropriate manifest.json file, the format for which is defined here.

This file outlines the configuration of your custom code and is used specifically to denote the code that is loaded when a package is used.

For example, the default entrypoint specifies how code loading is handled when a package is loaded using the Packaging q API, outlined in greater detail here. Similarly, when loading custom analytics, the Data Access Processes, as outlined here, require the code path used to load your custom code to be defined in the data-access entrypoint.
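As a sketch of what such a package layout might look like, the following writes a minimal manifest.json with default and data-access entrypoints. The entrypoint names come from the text above, but the file names and exact schema here are illustrative; consult the manifest format documentation for the authoritative fields.

```shell
# Illustrative: create a package folder with a minimal manifest.json.
# Entrypoint file names and the exact schema are examples only.
mkdir -p qpackage
cat > qpackage/manifest.json <<'EOF'
{
    "name": "qpackage",
    "version": "1.0.0",
    "entrypoints": {
        "default": "init.q",
        "data-access": "src/da.q"
    }
}
EOF
grep -c "entrypoints" qpackage/manifest.json
```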

Once you are happy that your test package has been structured appropriately, with your entrypoints defined, you can create a test package either within CI or locally. The generation of a test package is done using the kdb Insights CLI with the following command:

kxi package packit path/to/package --version 1.0.0

For example, the following shows the output of generating a test package:

kxi package packit qpackage --version 1.0.0
Creating package from qpackage
Package created: /tmp/artifacts/qpackage-1.0.0_61432767.kxi

The artifact name in the output of this packaging command is the name of the test artifact used throughout the rest of this process: qpackage denotes the name of the package, while the version of the package is 1.0.0_61432767. Use these values when referencing the test package within your deployed assembly, as discussed in more detail below.
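In CI it can be convenient to derive those two values from the artifact filename rather than hard-coding them. A minimal sketch using POSIX parameter expansion, assuming the `<name>-<version>.kxi` pattern shown above and that the package name itself contains no hyphen:

```shell
# Illustrative: split a test artifact filename into the package name and
# version used when referencing it in an assembly.
artifact="qpackage-1.0.0_61432767.kxi"
base="${artifact%.kxi}"   # strip the .kxi extension
name="${base%%-*}"        # text before the first '-'
version="${base#*-}"      # text after the first '-'
echo "package=$name version=$version"
```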

Running your test

The following outlines the steps required to run the test of a custom query API.

  1. Upload a test package to kdb Insights Enterprise.
  2. Verify the package is available for use.
  3. Update your assembly file to reference the development/test hash of your package.
  4. Deploy the assembly file to kdb Insights Enterprise.
  5. Wait until the assembly has been fully deployed.
  6. Verify the custom API is available and operates as expected.

To run your test, first upload the generated test package to your installation of kdb Insights Enterprise using the kdb Insights CLI command kxi package push. The following shows this command applied to the sample test package generated above:

kxi package push qpackage-1.0.0_61432767.kxi
{
    "qpackage": [
        {
            "version": "1.0.0_61432767",
            "_status": "InstallationStatus.SUCCESS"
        }
    ]
}

You can verify that a package has been uploaded correctly by listing all packages available on your installation as follows:

kxi package remote-list
{
    "qpackage": [
        {
            "version": "1.0.0_61432767",
            "fields": {}
        }
    ]
}

When your package has been uploaded successfully and verified, you can modify your assembly file to reference the test hash version associated with your package. This means making the following changes to the configuration associated with your Data Access Processes, for example updating qpackage from version 0.9.0 to the test version 1.0.0_61432767:

            - name: KXI_PACKAGES
-              value: "qpackage:0.9.0"
+              value: "qpackage:1.0.0_61432767"
            - name: KXI_PACKAGES
-              value: "qpackage:0.9.0"
+              value: "qpackage:1.0.0_61432767"
            - name: KXI_PACKAGES
-              value: "qpackage:0.9.0"
+              value: "qpackage:1.0.0_61432767"
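Rather than editing the assembly by hand, the version bump can be scripted. A minimal sketch using sed; the `qpackage:<version>` value format mirrors the diff above, and a sample file is created here purely so the command can be demonstrated end to end:

```shell
# Illustrative: rewrite every qpackage version reference in an assembly file.
# The sample file stands in for a real test_assembly.yaml.
printf '            - name: KXI_PACKAGES\n              value: "qpackage:0.9.0"\n' > test_assembly.yaml
sed -i.bak 's/qpackage:0\.9\.0/qpackage:1.0.0_61432767/g' test_assembly.yaml
grep "value" test_assembly.yaml
```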

Once updated, use the CLI to deploy the assembly file to your kdb Insights Enterprise deployment, following the instructions here.

For example:

kxi assembly deploy --filepath test_assembly.yaml

Teardown old assemblies

If an assembly with this name is already running within your deployment, you need to tear down that assembly before redeploying it.

When an assembly is first deployed, not all of its components are immediately callable while services come online. To verify that the assembly is fully ready for use, use the list and status sub-commands under kxi assembly within the kdb Insights CLI.
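This wait can be automated with a small polling loop. The helper below is a generic sketch: in practice the readiness command would be a `kxi assembly status` invocation (exact flags per the CLI documentation), shown here only in the comment.

```shell
# Illustrative polling helper: retry a readiness check until it succeeds or
# the attempt budget runs out. In practice the command might be, for example,
# `kxi assembly status --name <assembly>` (a hypothetical invocation).
wait_ready() {
    cmd="$1"; attempts="$2"; delay="$3"
    i=1
    while [ "$i" -le "$attempts" ]; do
        if sh -c "$cmd" >/dev/null 2>&1; then
            echo "ready after $i attempt(s)"
            return 0
        fi
        sleep "$delay"
        i=$((i+1))
    done
    echo "timed out after $attempts attempts" >&2
    return 1
}

wait_ready "true" 5 0   # demo: a command that succeeds immediately
```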

Once it has been established that all services are available, you can begin to run tests against your APIs. For example:

  • Test for the existence of the custom API using a curl request. The following example runs a curl request against the getMeta API, retrieving and defining INSIGHTS_HOSTNAME and INSIGHTS_TOKEN in accordance with the documentation here. Specifically, this test checks for the existence of the named API `.example.api using jq:

        export API_NAME=".example.api"
        curl -X POST -H "Content-Type: application/json" \
            -H "Accept: application/json" \
            -H "Authorization: Bearer $INSIGHTS_TOKEN" \
            "https://${INSIGHTS_HOSTNAME}/servicegateway/kxi/getMeta" \
            | jq --arg api "$API_NAME" '.payload.api[] | .api | index($api)'
  • Test the return of queries applied using your custom API:

        startTS=$(date -u '+%Y.%m.%dD%H:00:00')
        endTS=$(date -u '+%Y.%m.%dD%H:%M:%S')
        curl -X POST "https://${INSIGHTS_HOSTNAME}/servicegateway/example/api" \
            -H "Content-Type: application/json" \
            -H "Accept: application/json" \
            -H "Authorization: Bearer $INSIGHTS_TOKEN" \
            -d "$(jq -n --arg startTS "$startTS" --arg endTS "$endTS" \
                '{
                    table   : "trades",
                    columns : ["sym", "price"],
                    startTS : $startTS,
                    endTS   : $endTS
                }')" | jq .

Cleaning up a test

Once you have run tests and you are happy with the results, you can fully clean up a test as follows:

  1. Teardown the running assembly

    kxi assembly teardown --name test_assembly
  2. Remove the test artifact and validate its removal by listing the remaining remote packages

    kxi package remote-remove qpackage-1.0.0_61432767.kxi
    kxi package remote-list

Creating and uploading your tested artifact

Once your package has been tested, you can create a 'tagged' version of your package, which denotes that it is ready for use in a production environment. This is done using the --tag option when creating a packaged entity, as follows:

kxi package packit qpackage --version 1.0.0 --tag
Package created: /tmp/artifacts/qpackage-1.0.0.kxi

Once generated, this package can be uploaded to your chosen kdb Insights Enterprise deployment and used by deploying an assembly that references it.


In summary, this workflow enables you, within a CI or local development environment, to package, upload, and validate your APIs prior to the release of a package or its publication to a production environment.