
REST client API

Communicate with other cloud services

Our examples load the Kurl library and use async and sync requests to query public datasets.
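
For orientation, both request styles share the same shape. A sketch, using a placeholder URL rather than one of the services below: .kurl.sync takes a list of URL, HTTP method and options, and returns a two-item list of HTTP status code and response body; .kurl.async returns immediately and passes that same pair to the callback supplied in its options.

// Sync: returns (status code; response body)
resp:.kurl.sync ("https://example.com/data"; `GET; ::)
if[200 <> first resp; 'last resp]
show last resp

// Async: returns immediately; the callback receives (status code; response body)
.kurl.async ("https://example.com/data"; `GET; ``callback!(::; {show last x}))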

Blank Slate

A sample application that does nothing but load the REST client

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "client": {
        "entry": [ "client.q" ],
        "depends": [ "kurl" ]
    }
}

Create client.q.

// Enter your code here!

Build and run.

qp build client
qp run client

Quickstart with service accounts

For running an application on a virtual machine in the cloud

Two steps to using service accounts:

  • Make a service account and give it the operations (roles or permissions) it needs
  • Launch VM instances and assign them service accounts (IAM roles)

To run any of these examples, simply SSH into a VM (of the same vendor as the example) and follow along.
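
As a quick sanity check from such a VM, you can make a single call before starting an example. A sketch, reusing the BigQuery projects endpoint from the examples below and assuming the attached service account has a role that permits it:

// Sketch: run from a GCP VM with a service account attached.
// Kurl should pick up the instance credentials automatically,
// so a 200 status here means authentication is working.
resp:.kurl.sync ("https://bigquery.googleapis.com/bigquery/v2/projects"; `GET; ::)
show first resp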

Finding public datasets in Google BigQuery using the async API

A sample application that demonstrates the power of the async API to list public datasets

Datasets are returned as alphabetically sorted ‘paged’ results, and page tokens are used to get the next pages of results.

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "listdata": {
        "entry": [ "listdata.q" ],
        "depends": [ "kurl" ]
    }
}

Create listdata.q.

// Find the utility_us public dataset.
// This makes several calls to get paged results of datasets, 
// and stops when an exact match is found
datasetID:"utility_us"
publicDatasetEndpoint:
  "https://bigquery.googleapis.com/bigquery/v2/projects/bigquery-public-data/datasets"

findDataset:{[datasetID;resp]
  if[200 <> first resp; 'last resp];
  listing:.j.k last resp;
  datasets:`s#{x . `datasetReference`datasetId} each listing `datasets;
  if[datasetID in datasets;
    show "Found ",datasetID;
    :(::);
    ];
  if[`nextPageToken in key listing;
    show "Making subsequent listing starting with ",listing `nextPageToken;
    nextQuery:publicDatasetEndpoint,"?pageToken=",listing `nextPageToken;
    .kurl.async (nextQuery; `GET; ``callback!(::;.z.s[datasetID;]))
    ]; }

.kurl.async (publicDatasetEndpoint; `GET; ``callback!(::;findDataset[datasetID;]))

Build and run.

qp build listdata
qp run listdata

SQL queries against a Covid19 table in Google BigQuery

A sample application that uses SQL to query a public Covid19 dataset hosted by Google's BigQuery

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "covid": {
        "entry": [ "covid.q" ],
        "depends": [ "kurl" ]
    }
}

Create covid.q.

// Select the first available projectID you have for Google Cloud
resp: .kurl.sync ("https://bigquery.googleapis.com/bigquery/v2/projects";`GET;::)
if[200 <> first resp; ' last resp]
projects:.j.k[last resp] `projects
if[0 = count projects;'"You need access to at least one project"]
projectID:first projects `id
-1 "Using bigquery for project: ", projectID;

// Run an SQL query against a known dataset by name
// The full name of the table is bigquery-public-data.covid19_italy.national_trends
sql_table:"`bigquery-public-data.covid19_italy.national_trends`"

query:"SELECT tests_performed, total_confirmed_cases",
     " FROM ",sql_table,
     " WHERE recovered > deaths"

body:.j.j `useLegacySql`query!(0b; query)
headers:enlist["Content-Type"]!enlist "application/json"

queryEndpoint:"https://bigquery.googleapis.com/bigquery/v2/projects/",
  projectID,"/queries/"

resp:.kurl.sync (queryEndpoint; `POST;`headers`body!(headers;body))
if[200 <> first resp; 'last resp]

// Now convert the JSON into kdb+
mapIn:("INTEGER";"FLOAT";"STRING")!("J"$;"F"$;::)
obj:.j.k last resp
rowData:{x `v} each obj . `rows`f
typeData: obj . `schema`fields
tout:flip (`$typeData `name)! typeData[`type] {mapIn[x] @ y}' flip rowData
show 10#tout
-1 "...";

Build and run.

qp build covid
qp run covid

Upload non-kdb+ data to Azure Storage

For using kdb+ databases in Azure Storage, use the objstor component in kdb+ Cloud Edition, not Kurl directly.

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "storage": {
        "entry": [ "storage.q" ],
        "depends": [ "kurl" ]
    }
}

Create storage.q, replacing <account> with the lowercase name of your storage account.

show "Creating a new container"
headers:enlist ["x-ms-version"]!enlist "2017-11-09"
opts:enlist[`headers]!enlist headers

u1:"https://<account>.blob.core.windows.net.blob.core.net/kxcecontainer1",
  "?restype=container"

resp:.kurl.sync (u1; `PUT; opts)
if[201 <> first resp; 'last resp]
show "Uploading hello.txt"

opts[`headers]:("x-ms-version";"x-ms-blob-type";"Content-Type")!
  ("2017-11-09";"BlockBlob";"text/plain")

opts[`body]:"hello world"
u2:"https://<account>.blob.core.windows.net/kxcecontainer1/hello.txt"
resp:.kurl.sync (u2; `PUT; opts)
if[201 <> first resp; 'last resp]

Build and run.

qp build storage
qp run storage

Create a namespace in AWS Cloud Map

Use AWS Cloud Map to create an HTTP namespace and wait for it to come online

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "service": {
        "entry": [ "service.q" ],
        "depends": [ "kurl" ]
    }
}

Create service.q.

// Create a new namespace; this returns an operation ID
body:.j.j enlist[`Name]!enlist "kxceNamespace"

headers:("Content-Type";"x-amz-target")!
  ("application/x-amz-json-1.1";"Route53AutoNaming_v20170314.CreateHttpNamespace")

URL:"https://servicediscovery.us-east-2.amazonaws.com"
resp:.kurl.sync (URL;`POST;`headers`body!(headers;body))
if[200 <> first resp; 'last resp]

checkOperation:{[resp]
  if[200 <> first resp; ' last resp];
  j:.j.k last resp;
  if["SUCCESS" ~ j . `Operation`Status; show "Namespace created"; .test.done:1b]; }

getOperation:{[url;id]
  show "Calling getOperation to see if namespace has been created...";
  headers:("Content-Type";"x-amz-target")!
    ("application/x-amz-json-1.1";"Route53AutoNaming_v20170314.GetOperation");
  opts:`headers`body`callback!(headers;id; checkOperation);
  .kurl.async (url;`POST;opts)
  }[URL;]

// The response of creating a namespace returns an input to GetOperation
.test.id:last resp
.test.done:0b
.z.ts:{ if[.test.done; system "t 0"; :(::)]; getOperation .test.id; }
\t 500

Build and run.

qp build service
qp run service

Interactive OAuth2 login

You can add applications as trusted clients within Google and Azure (Active Directory).

You can bundle client secrets with your application to enable Single Sign On workflows.

The examples use an HTTP redirect URI for simplicity.

To use HTTPS, you must include TLS certificates and environment variables with your QPacker deployment.

You will also need to open port 1234 for incoming requests so that you can use a browser window to log in.
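
Both login examples below use the same entry point. Its argument shape, as used in the examples that follow, is sketched here:

// .kurl.oauth2.startLoginFlow[domain; client; options; callback]
//   domain   - endpoint(s) the resulting login is valid for, e.g. "*.googleapis.com"
//   client   - dictionary of registered client secrets
//              (client_id, client_secret, auth_uri, token_uri, redirect_uris)
//   options  - OAuth2 parameters such as scope, access_type, prompt and redirect_uri
//   callback - invoked once a user has logged in; requests made with the tenant
//              it receives are authenticated as that user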

Google People API

Use Google’s People API to list your contact groups

These are the groups you may or may not have in https://contacts.google.com/.

Complete list of Google’s OAuth2 scopes

Create client credentials: in the Google Cloud Console, open API & Services > Credentials and create a new OAuth client ID.

Pick the web application type and give it a Redirect URI of http://<hostname>:1234, where <hostname> is the hostname or IPv4 address of your VM or local machine.

Enable the People API: under API & Services > Library, browse for the Google People API and enable it. Wait a few minutes before proceeding to the next step.

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "oauth2": {
        "entry": [ "oauth2.q" ],
        "depends": [ "kurl" ]
    }
}

Download the client.json file, or copy the values presented by Google and uncomment the hardcoded block below.

Replace <hostname> with your machine’s or instance’s public IPv4 address or hostname.

Create oauth2.q.

// Read in downloaded JSON and parse it
client:.j.k "c"$read1`:/path/to/client.json

// UNCOMMENT TO USE HARDCODED VALUES
//// Registered client application secrets
//client:`client_id`client_secret`auth_uri`token_uri`redirect_uris!(
//    "<Client ID>";
//    "<Client Secret>";
//    "https://accounts.google.com/o/oauth2/auth";
//    "https://oauth2.googleapis.com/token";
//    enlist "http://<hostname>:1234"
//    );

callback:{[tenant; auth_response]
  .test.resp:.kurl.sync 
    ("https://people.googleapis.com/v1/contactGroups";`GET;``tenant!(::;tenant));
  show .test.resp; }

uri:"https://www.googleapis.com/auth/contacts.readonly"

.kurl.oauth2.startLoginFlow[
  "*.googleapis.com"; // Google endpoint/s the login is valid for
  client;
  `scope`access_type`prompt`redirect_uri!
    ("openid email ",uri;"offline";"consent";"http://<hostname>:1234");
  callback]

Build and run.

qp build oauth2
qp run oauth2

Have yourself and your colleagues each open a browser window to

http://<hostname>:1234

and log in. (Or use incognito windows yourself.)

Microsoft Graph API

You can use Azure Active Directory to register a trusted application secret that you may bundle with your deployment. Trusted applications can start interactive Single Sign On (SSO) workflows on behalf of users.

Azure Services compatible with Active Directory

Notice Microsoft Graph is not under the Azure umbrella above, but is still supported by Active Directory.

First, create a new Azure Active Directory and client secret.

Using the Azure Portal, go to the Azure Active Directory service. In its Manage section, find the App Registrations menu.

Create a new application and give it a Redirect URI of http://<hostname>:1234, where <hostname> is the hostname or IPv4 address of your VM or local machine.

Create a client secret under the Certificates & Secrets menu.

Go to the API Permissions section of the App Registration, and ensure you have User.Read permission for Microsoft Graph. You may also need to Grant admin consent, which will add a green checkmark.

Next deploy the OAuth2 application as follows.

Create a new directory and copy to it kurl.qpk from kdb+ Cloud Edition.

Create qp.json.

{
    "oauth2": {
        "entry": [ "oauth2.q" ],
        "depends": [ "kurl" ]
    }
}

Create oauth2.q as below, replacing:

  • <Secret Value> with the value of the client secret you created
  • <Application (client) ID> and <Directory (tenant) ID> with the values from the App Overview
  • <hostname> with your machine’s or instance’s public IPv4 address or hostname

// Registered client application secrets
client:`client_id`client_secret`auth_uri`token_uri`redirect_uris!(
  "<Application (client) ID>";
  "<Secret Value>";
  "https://login.microsoftonline.com/<Directory (tenant) ID>/oauth2/v2.0/authorize";
  "https://login.microsoftonline.com/<Directory (tenant) ID>/oauth2/v2.0/token";
  enlist "http://<hostname>:1234" )

callback:{[tenant; auth_response]
  .test.resp:.kurl.sync 
    ("https://graph.microsoft.com/v1.0/me";`GET;``tenant!(::;tenant));
  show .test.resp;
  // Add here any further API calls you like!  You would need Outlook enabled
  // or other Microsoft Apps to do anything more interesting.
  }

// Set up the login flow: users log in via a web browser, and the callback is invoked when done
optns:`scope`access_type`prompt`redirect_uri!
  ("openid offline_access https://graph.microsoft.com/mail.read";
   "offline";
   "consent";
   "http://<hostname>:1234")

.kurl.oauth2.startLoginFlow["https://graph.microsoft.com"; client; optns; callback]

Build and run.

qp build oauth2
qp run oauth2

Have yourself and your colleagues each open a browser window to

http://<hostname>:1234

and log in. (Or use incognito windows yourself.)


Kurl documentation