Client data upload

The client file-upload system allows you to upload CSV files to a specified location in Refinery. In the dashboard, you select the directory to upload the file to. The current options are:

secmaster   Security Master/Reference Data
trth        Tick History TAS
orderExec   Order and Execution/Trade Data
stats       Pre-aggregated bars from DSS

Based on the directory that you choose and the postfix of the CSV, the data will be loaded into the system in different ways.

secMaster

Select secMaster to upload any of the following data types:

Type of data        File-name pattern     Example                    Schema
Corporate Actions   *corax.csv            2017corax.csv              Corporate Actions Standard Events - DataScope Select
MIC                 *MICInfo.csv          manualMICInfo.csv          ISO 10383 Market Identifier Codes
Reference           *RefData.csv          ordersRefData.csv          Symbol Cross Reference - DataScope Select
Symbology Tracking  *SymbologyTrack.csv   futuresSymbologyTrack.csv  See example below
Calendar            *Calendar.csv         apacVenuesCalendar.csv     See example below

For corporate action, MIC and reference data, provided the file matches the relevant schema, the client file-upload filewatcher in the back end loads the data using .daas.infra.csvLoader and routes it to the necessary processes. For symbology tracking, the filewatcher loads and routes the data using .daas.filewatcher.trackSymCSVLoader.
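The routing step above can be sketched as a mapping from filename patterns to loaders. This is an illustrative Python sketch only: the loader labels stand in for the .daas.infra.csvLoader and .daas.filewatcher.trackSymCSVLoader q functions, and the dispatch logic is assumed, not taken from the real filewatcher.

```python
import fnmatch

# Hypothetical routing table for secMaster uploads: first matching
# filename pattern decides which loader handles the file.
ROUTES = [
    ("*corax.csv",          "csvLoader"),          # corporate actions
    ("*MICInfo.csv",        "csvLoader"),          # MIC data
    ("*RefData.csv",        "csvLoader"),          # reference data
    ("*SymbologyTrack.csv", "trackSymCSVLoader"),  # symbology tracking
    ("*Calendar.csv",       "csvLoader"),          # calendar data
]

def route(filename):
    """Return the loader label for the first pattern the filename
    matches, or None if no secMaster pattern applies."""
    for pattern, loader in ROUTES:
        if fnmatch.fnmatch(filename, pattern):
            return loader
    return None
```

For example, `route("2017corax.csv")` would select the CSV loader, while a file matching no pattern is left untouched.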

MIC data example

An example of the MIC Data is as follows:

Market MIC,Instrument ID
XLON,AZN.L
XSHG,600111.SS
XSHG,600988.SS
XSHG,600392.SS
XSHG,601118.SS
XSHG,600837.SS

Reference data example

Instrument ID,Issuer Name,Asset Type,Asset Type Description,Currency Code,Exchange Code,Exchange Description,ISIN,SEDOL,Issuer OrgID,Security Description,PILC,Asset SubType,Asset SubType Description,Market MIC,CUSIP,Common Code,Wertpapier,Round Lot Size,Trading Status,GICS Industry Code,Exchange Region Code,Main Sector,Market Capitalization,Primary Quotation Exchange Country Code,CFI Code,Market Segment Name,MiFID Bond Type,MiFID Bond Type Description,Trading Symbol,RIC Root,Thomson Reuters Classification Scheme,PE Code,Primary Chain or Tile,Security Long Description,Base Currency Code,Secondary Currency Code,Maturity Date,First Trade Date,Start Date,Term Start,Term Maturity,Number of Days to Maturity,ToTV - DSB,ToTV Effective Date - DSB,uToTV - DSB,uToTV Effective Date - DSB,MiFID Post Trade LIS Threshold Value,MiFID Post Trade LIS Threshold Floor,MiFID Post Trade SSTI Threshold Value,MiFID Post Trade SSTI Threshold Floor,MiFID Standard Market Size - ESMA
AEGN.AS,Aegon NV,EQTY,Equities,EUR,AEX,Euronext Amsterdam,NL0000303709,5927375,16740,AEGON ORD,188334,ODSH,Ordinary shares,XAMS,,025340655,A0JL2Y,1,1,40301020,E,AEX Financials Financial Index,"14,586,015,405.499462128",NL,ESVTFN,Segment A,,,NL0000303709,,ORD,8132,,Aegon Ord Shs,,,,1969.07.07,,,,,,,,,"7,000,000",,,,"10,000"
ENI.MI,Eni SpA,EQTY,Equities,EUR,MIL,Milan Stock Exchange,IT0003132476,7145056,40871,ENI ORD,2434275,ODSH,Ordinary shares,MTAA,,026082196,897791,1,1,10102010,E,FTSE Italia All-Share Oil & Gas Index,"71,142,809,611.084152222",IT,ESVUFR,MB1,,,308,,ORD,3614,,ENI Ord Shs,,,,1995.11.28,,,,,,,,,"10,000,000",,,,"10,000"
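Note that fields containing commas, such as the market capitalization above, are enclosed in double quotes. A standard CSV parser handles this correctly, whereas a naive comma split would break the record; the sketch below shows the difference using Python's csv module on a shortened illustrative row.

```python
import csv
import io

# A shortened reference-data row: the quoted market-capitalization
# field contains embedded commas and must not be split on them.
row = 'AEGN.AS,Aegon NV,EQTY,"14,586,015,405.499462128",EUR'

# csv.reader honours the RFC 4180 quoting rules.
fields = next(csv.reader(io.StringIO(row)))

# A naive split produces too many fields.
naive = row.split(",")
```

Here `fields` has the expected five values with the quoted number intact, while `naive` has eight.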

Symbology tracking data example

sym,date,time,gmtOffset,oldRIC,newRIC,dataSource
J,05/16/2018,2018-06-18D00:21:59.579954000,50,STO,EQNR.K,NASDAQ
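The time column above uses a kdb+-style timestamp with a "D" date-time separator and nanosecond precision. A minimal Python sketch for parsing it is below; note that Python datetimes only carry microseconds, so the nanosecond tail is truncated, and this helper is an illustration, not the system's actual parser.

```python
from datetime import datetime

def parse_kdb_timestamp(ts):
    """Parse a kdb+-style timestamp such as
    2018-06-18D00:21:59.579954000 into a Python datetime.
    The fractional part is truncated to microseconds."""
    date_part, time_part = ts.split("D")
    whole, frac = time_part.split(".")
    return datetime.strptime(
        f"{date_part} {whole}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f"
    )

t = parse_kdb_timestamp("2018-06-18D00:21:59.579954000")
```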

Calendar data example

MIC,event,validFrom,holiday,dayOfWeek,time,status
XCAI,tradeOpen,8/1/2008,0,mon|tue|wed|thu|fri,10:00:00,open
XCAI,tradeClose,8/1/2008,0,mon|tue|wed|thu|fri,14:30:00,open
XCAI,Coptic Christmas Day,1/7/2018,1,,,closed
XCAI,Revolution Day,1/25/2018,1,,,closed
XCAI,Coptic Easter Sunday,4/8/2018,1,,,closed
XCAI,Sham el Nessim,4/9/2018,1,,,closed
XCAI,Sinai Liberation Day,4/25/2018,1,,,closed
XCAI,Labour Day,5/1/2018,1,,,closed
XCAI,Eid al-Fitr,6/15/2018,1,,,closed
XCAI,Eid al-Fitr,6/16/2018,1,,,closed
XCAI,Revolution Day,7/23/2018,1,,,closed
XCAI,Arafat Day,8/20/2018,1,,,closed
XCAI,Eid al-Adha,8/21/2018,1,,,closed
XCAI,Eid al-Adha,8/22/2018,1,,,closed
XCAI,Islamic New Year,9/11/2018,1,,,closed
XCAI,Armed Forces Day,10/6/2018,1,,,closed
XCAI,Prophet Muhammad's Birthday,11/20/2018,1,,,closed
XDUB,tradeOpen,11/1/2007,0,mon|tue|wed|thu|fri,07:50:00,open
XDUB,tradeClose,11/1/2007,0,mon|tue|wed|thu|fri,16:30:00,open
XDUB,New Year's Day,1/1/2018,1,,,closed
XDUB,St Patrick's Day,3/19/2018,1,,,closed
XDUB,Good Friday,3/30/2018,1,,,closed
XDUB,Easter Monday,4/2/2018,1,,,closed
XDUB,May Bank Holiday,5/7/2018,1,,,closed
XDUB,June Bank Holiday,6/4/2018,1,,,closed
XDUB,August Bank Holiday,8/6/2018,1,,,closed
XDUB,October Bank Holiday,10/29/2018,1,,,closed
XDUB,Christmas Eve - Half Day,12/24/2018,1,,13:15:00,open
XDUB,Christmas Day,12/25/2018,1,,,closed
XDUB,St. Stephen's Day,12/26/2018,1,,,closed
XDUB,New Year's Eve - Half Day,12/31/2018,1,,13:15:00,open

See Calendar Data for a non-standard format.
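A simple holiday lookup against calendar rows of this shape can be sketched as follows. This is illustrative only: it checks full-day closures keyed on the validFrom date and ignores half days and the recurring tradeOpen/tradeClose session rows.

```python
import csv
import io

# A cut-down sample in the calendar schema shown above.
CALENDAR_CSV = """MIC,event,validFrom,holiday,dayOfWeek,time,status
XDUB,tradeOpen,11/1/2007,0,mon|tue|wed|thu|fri,07:50:00,open
XDUB,New Year's Day,1/1/2018,1,,,closed
"""

rows = list(csv.DictReader(io.StringIO(CALENDAR_CSV)))

def is_full_day_closure(rows, mic, date):
    """True if the calendar lists a full-day closure for `mic` on
    `date` (M/D/YYYY, matching the sample data)."""
    for r in rows:
        if (r["MIC"] == mic and r["holiday"] == "1"
                and r["validFrom"] == date and r["status"] == "closed"):
            return True
    return False
```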

Order/execution data

In the Client Upload dashboard, select Order/Execution from the dropdown to upload:

  • Order Data
    • Files must have the postfix *orders.csv
  • Execution/Trade Data
    • Files must have the postfix *executions.csv

The schemas for these files can be viewed in the DataType Codes section of the Appendix. Currently, files loaded into the system must have their fields in the correct order or the file will be rejected.
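The field-order requirement amounts to an exact header comparison. The sketch below illustrates the idea with a made-up schema; the real field list lives in the Appendix, not here.

```python
# Illustrative expected schema; the actual order/execution fields are
# defined in the DataType Codes section of the Appendix.
EXPECTED_ORDER = ["sym", "time", "orderID", "side", "price", "quantity"]

def header_ok(header):
    """Accept only a header whose fields appear in exactly the
    expected order; anything else means the file is rejected."""
    return header == EXPECTED_ORDER
```

Note that even a file with all the right fields is rejected if two of them are swapped.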

In addition to the loaded schema, the system will generate the following columns for internally identifying the transactions:

Order

  • internalTimestamp: Date and time the transaction enters the system

  • internalOrderID: Unique internally generated identifier

  • internalOrderVersion: Internally generated versioning

Execution

  • internalTimestamp: Date and time the transaction enters the system

  • internalOrderID: Unique internally generated identifier to link back to order

  • internalExecutionID: Unique internally generated identifier

  • internalExecutionVersion: Internally generated versioning

Provided the user data matches the appropriate schema, the transaction data filewatcher will load the data into the system and publish the real-time data to the transaction data pipeline.
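The internal columns described above can be pictured as a stamping step applied to each incoming record. This Python sketch is purely illustrative: a plain counter stands in for the system's ID scheme, which is not documented here.

```python
import itertools
from datetime import datetime, timezone

# Hypothetical ID source; the real system's identifier scheme
# is internal and not described in this document.
_next_id = itertools.count(1)

def stamp_order(order, version=1):
    """Return a copy of the order record with the internally
    generated columns added."""
    stamped = dict(order)
    stamped["internalTimestamp"] = datetime.now(timezone.utc)
    stamped["internalOrderID"] = next(_next_id)
    stamped["internalOrderVersion"] = version
    return stamped

first = stamp_order({"sym": "AEGN.AS"})
second = stamp_order({"sym": "ENI.MI"})
```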

Data validation

Validation is performed on the incoming files to ensure that the data has a minimum number of fields populated. The configuration for this validation can be found in the .daas.orderExec.fwRequiredFieldTypes parameter. If any of the mandatory fields are null, the system will reject the file and output the reason why the file was rejected in the logs of the transaction data filewatcher process.

When a file is rejected, none of the data contained in the file is loaded into the system and the file is moved into the rejected directory within fileStore/transactionData.
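The all-or-nothing validation described above can be sketched as below. The required field names here are placeholders only; the actual list comes from the .daas.orderExec.fwRequiredFieldTypes parameter.

```python
# Placeholder mandatory fields; the real list is configured in
# .daas.orderExec.fwRequiredFieldTypes.
REQUIRED = ["sym", "time", "orderID"]

def validate_file(rows):
    """Return (ok, reason). The first null or empty required field in
    any row rejects the entire file, so nothing is partially loaded."""
    for i, row in enumerate(rows):
        for field in REQUIRED:
            if not row.get(field):
                return False, f"row {i}: required field '{field}' is null"
    return True, ""
```

A rejection reason like the one returned here is what you would look for in the transaction data filewatcher logs.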

Historical batch loading

If the file loaded into the system contains historical dates, the system will use the merge framework to add the historical data to the HDB. The default configuration for the merge is set in the .daas.cfg.fwAcceptedMergeHeaders:DEFAULT parameter. The first row in this parameter is the default for each of the options. Any further rows in the table are accepted values.

If the file loaded into the system does not contain any merge headers, the default merge configuration will be set. If the file loaded contains some but not all of the merge headers, the defaults will be set for the others. However, if the file contains any merge headers set to values that are not in the accepted values table, the file will be moved into the rejected directory and the logs will record which of the headers caused the file to be rejected.

NB: if the file that is rejected contains both historical data and real-time data, none of the data will be added to the system, so the entire file can be reloaded when the issue has been fixed.
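The merge-header handling above follows a defaults-plus-validation pattern, sketched below. The header names and accepted values are invented for illustration; the real defaults and accepted values live in the .daas.cfg.fwAcceptedMergeHeaders:DEFAULT parameter.

```python
# Illustrative accepted values and defaults; the real tables come from
# .daas.cfg.fwAcceptedMergeHeaders:DEFAULT.
ACCEPTED = {"mergeMethod": ["part", "col"], "compression": ["none", "gzip"]}
DEFAULTS = {"mergeMethod": "part", "compression": "none"}

def resolve_merge_headers(headers):
    """Return (config, bad). Missing headers fall back to the defaults;
    a header set to an unaccepted value rejects the whole file, and
    `bad` names the offending headers for the logs."""
    bad = [k for k, v in headers.items()
           if k in ACCEPTED and v not in ACCEPTED[k]]
    if bad:
        return None, bad
    return {**DEFAULTS, **headers}, []
```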

DSS_bars

Select DSS_Bars to upload pre-calculated bars from DSS. The filename pattern is *DSS_EOD.csv. The file must have the correct column headings in the correct order; otherwise the data will not be loaded. Within the file the sym and date columns must be populated; otherwise the row will not be included.
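Unlike the all-or-nothing transaction-data validation, the DSS bars check is per row: a row missing sym or date is dropped while the rest of the file still loads. A minimal sketch of that filter, assuming rows are parsed into dicts keyed by column name:

```python
def usable_rows(rows):
    """Keep only rows where both sym and date are populated; other
    rows are silently dropped rather than rejecting the file."""
    return [r for r in rows if r.get("sym") and r.get("date")]

bars = [
    {"sym": "ENI.MI", "date": "2018.06.18", "open": "15.9"},
    {"sym": "", "date": "2018.06.18", "open": "15.9"},   # dropped: no sym
    {"sym": "AEGN.AS", "date": "", "open": "5.6"},       # dropped: no date
]
```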

For the files to be picked up, the filewatcher process daas_fw_dss_barsData must be running. For the data to be queried, the HDB process emea_stats_dss_marketData_<ASSET_CLASS>_exchNoFilter_0 must be running. These processes do not appear in workflows and must be started manually by the user in Delta Control.

To add the DSS stats HDB processes to a workflow, go to the workflow of the asset class stack, right-click on the starter process and click add existing process.

Then tick the box for the process to add, in this case emea_stats_dss_marketData_<ASSET_CLASS>_exchNoFilter_0_<ENV>, and click OK.

Once that workflow is saved, the DSS HDB will start up with that workflow.