
Bloomberg Equities Analytics Accelerator Quickstart Guide

This page provides a quickstart guide for the Bloomberg Equities Analytics Accelerator, an FSI Accelerator package that ships with ready-made Pipelines, Views, and Schemas.

Prerequisites

This quickstart guide is not intended to be a comprehensive kdb Insights installation guide. For more information on installing kdb Insights, refer to the kdb Insights Enterprise documentation. This guide assumes the following prerequisites:

  • kdb Insights Enterprise is installed
  • Credentials have been obtained and configured:
    • For Insights users:
      • GUI User
      • API Client
  • KX Downloads Portal bearer token to download packages/charts (represented by BEARER in this guide).
  • Tools used:
    • Access to *nix command-line
    • kdb Insights CLI (kxi)
  • Kubernetes tools:
    • kubectl
    • K9s
  • Helm installed and logged in to Nexus
  • VSCode plugins:
    • KX kdb
    • Jupyter

Quick Start Guide

To download the required FSI packages from the KX Downloads Portal, run the following:

# Download fsi-lib - the Core FSI Library
curl -s --fail-with-body -D /dev/stderr --oauth2-bearer ${BEARER} -L -OJ https://portal.dl.kx.com/assets/raw/kxi-accelerators/fsi/fsi-lib/packages/1.1.3/fsi-lib-1.1.3.kxi

# Download fsi-app-bbg-eqea - the Bloomberg Equities Analytics Accelerator
curl -s --fail-with-body -D /dev/stderr --oauth2-bearer ${BEARER} -L -OJ https://portal.dl.kx.com/assets/raw/kxi-accelerators/fsi/fsi-bbg-eqea/packages/1.0.0/fsi-app-bbg-eqea-1.0.0.kxi
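Before moving on, it is worth confirming both packages downloaded cleanly, since a failed curl can leave an empty or missing file. A minimal sketch (the check_kxi helper is illustrative, not part of the tooling):

```shell
# check_kxi FILE: report whether a downloaded package file exists and is non-empty
check_kxi() {
  if [ -s "$1" ]; then echo "OK: $1"; else echo "MISSING: $1"; fi
}

# File names match the curl commands above
check_kxi fsi-lib-1.1.3.kxi
check_kxi fsi-app-bbg-eqea-1.0.0.kxi
```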

Unpack

Run the command below to unpack fsi-app-bbg-eqea:

kxi package unpack fsi-app-bbg-eqea-1.0.0.kxi

Configure

Configure your pipelines to target your input files. In this quickstart, we configure the minimum set of pipelines; for full functionality, refer to Additional pipeline configuration.

Order ingest

First, we configure the Order ingest pipeline.

Time-zone normalization is enabled by setting the .eqea.useTimezoneNormalization boolean to true (1b). In this example, we leave it disabled (0b):

sed -i 's|\.eqea\.useTimezoneNormalization\:.*|.eqea.useTimezoneNormalization:0b;|g' fsi-app-bbg-eqea/src/eqeaorderingest-pipeline-spec.q

With normalization disabled, orders are ingested as-is, with no strike-time calculation or time-zone adjustment.
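The sed substitution above can be sanity-checked offline before editing the real spec file; a small sketch, assuming the spec contains a line of the form shown:

```shell
# Demonstrate the substitution on a sample line rather than the real spec file:
# the pattern matches the whole setting line and rewrites it to 0b
echo '.eqea.useTimezoneNormalization:1b;' \
  | sed 's|\.eqea\.useTimezoneNormalization\:.*|.eqea.useTimezoneNormalization:0b;|g'
# prints: .eqea.useTimezoneNormalization:0b;
```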

Set the target file as follows:

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/order_data.csv";|g' fsi-app-bbg-eqea/src/eqeaorderingest-pipeline-spec.q

fxRates

If you are using fxRates to calculate values in USD, set the target file as follows:

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/fxRates.csv";|g' fsi-app-bbg-eqea/src/eqeafxrates-pipeline-spec.q

fxRates are required to calculate the value in USD, which is the foundation of the dashboard metrics.

Market data

Market data for the time period covered by the orders is required.

Bloomberg historical data is provided as separate files for Trades and Quotes, so we set the files in two pipelines. We also have the option of either CSV or Parquet. In this example, we use the CSV pipelines.

We set the file path in the Bloomberg market data ingestion pipeline to the corresponding Bloomberg historical files:

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/trades.csv.gz";|g' fsi-app-bbg-eqea/src/bbgtradeingest-pipeline-spec.q

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/quotes.csv.gz";|g' fsi-app-bbg-eqea/src/bbgquoteingest-pipeline-spec.q
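After editing each spec, it can help to confirm the substitution landed. A minimal sketch (the show_filepath helper is illustrative, not part of the accelerator):

```shell
# show_filepath FILE: print the configured .fsi.filePath line from a pipeline spec
show_filepath() { grep '^\.fsi\.filePath' "$1"; }

# For example, after the edits above:
# show_filepath fsi-app-bbg-eqea/src/bbgtradeingest-pipeline-spec.q
```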

Pack and deploy

To pack the package, run the following command:

kxi package packit fsi-app-bbg-eqea --tag --version 1.0.0

Push the packages to Insights and deploy the fsi-app-bbg-eqea assembly:

# Install the packages
kxi package push --force fsi-lib/1.1.3
kxi package push --force fsi-app-bbg-eqea/1.0.0
# Deploy the assembly
kxi package deploy fsi-app-bbg-eqea/1.0.0

Note

At this point, you should see the pipelines starting up in the GUI or in k9s.

Access APIs

Once deployed, we can query the tables using getData or getTicks, then use the generation API to preview the order analytics.
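As a sketch of querying via getData through the kdb Insights Service Gateway: the table name, time range, INSIGHTS_HOSTNAME, and INSIGHTS_TOKEN below are all placeholder assumptions for your deployment, so substitute your own values.

```shell
# Build a getData request body; table and timestamps are illustrative placeholders
cat > getdata.json <<'EOF'
{
  "table": "trade",
  "startTS": "2024-01-02T00:00:00.000000000",
  "endTS": "2024-01-03T00:00:00.000000000"
}
EOF

# POST it to the Service Gateway (hostname and token are assumptions):
# curl -X POST "https://${INSIGHTS_HOSTNAME}/servicegateway/kxi/getData" \
#      -H "Authorization: Bearer ${INSIGHTS_TOKEN}" \
#      -H "Content-Type: application/json" \
#      -d @getdata.json
```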

Run the generation pipeline to persist the order analytics

Now that data is ingested and the API for generation is working, we can run an instance of the nightly generation pipeline that persists the order analytics. Refer to the guide for generation and persistence of orderAnalytics.

View summary of order analytics

Once orderAnalytics has been written to the database, we can query the table and use the summary rollup functions. These summary rollups power the example dashboards.

API

Call the summary API on the orderAnalytics table to retrieve rollup aggregations.

Dashboards

Log in to the dashboards to investigate order analytics. The dashboard is named Equity-Execution-Analysis.

The dashboard serves as an example of rendering some metrics from the summary rollup calculations.

Additional pipeline configuration

To use the full feature set of the accelerator, the following additional reference pipelines are required:

Exchange reference data

If normalizing Order data, exchange reference data is required to calculate strike time:

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/exchangeHours.csv";|g' fsi-app-bbg-eqea/src/eqeaexchange-refdata-pipeline-spec.q

Composite mapping

If composite tickers are used for market data but orders use local tickers, we must ingest a composite ticker map, as follows:

sed -i 's|\.fsi\.filePath\:.*|.fsi.filePath:`$":s3://my-bucket/compositeTickerMap.csv";|g' fsi-app-bbg-eqea/src/eqeacompositeticker-pipeline-spec.q

This feature also requires the following boolean setting to be enabled (it is on by default), set in a custom file in an overlay:

.eqea.mapToCompositeTicker:1b;
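One way to supply this is a small custom file added to the package; a minimal sketch, where the file name custom.q and the src/ location are assumptions, so follow your package's overlay convention:

```shell
# Write the flag into a custom overlay file; the name custom.q and the
# src/ location are assumptions, not an accelerator convention
mkdir -p fsi-app-bbg-eqea/src
cat > fsi-app-bbg-eqea/src/custom.q <<'EOF'
.eqea.mapToCompositeTicker:1b;
EOF
```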

Next steps - nightly schedules

Once you are happy with all the pipelines, the next step is to set up a recurring schedule of ingest pipelines that consumes all data nightly, followed by a pipeline that generates the analytics.

Refer to the scheduling pipelines documentation for more details on setting up a recurring schedule.