Creating Sandbox

In order to run and debug KBC dockerized applications (including Custom Science, R and Python Transformations) on your own computer, you need to manually supply the application with a data folder and configuration file.

To create a sample data folder, use the Docker Runner API. There are three calls available.

The API calls will resolve and validate the input mapping and create a configuration file. Then they will archive the whole /data/ folder and upload it to your KBC project. None of these API calls will write any tables or files other than the archive, so they are very safe to run.

The body structure of the first two API calls is the same. Before you start, you need a KBC project. We recommend that you use Apiary or Postman to call the API.

Create Sandbox API Call


Create a table in KBC Storage which contains a column named number. You can use the sample table. In the following example, the table is stored in the in.c-main bucket and the table name is test. The table ID is therefore in.c-main.test.

Storage Screenshot
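If you prefer to generate the sample table yourself rather than downloading it, a CSV with a single number column can be produced with a short script and then uploaded through the Storage UI. This is only an illustration; the file name and the values are arbitrary:

```python
import csv

# Create a small sample CSV with a single "number" column.
# The values are arbitrary -- any numbers will do for the sample component.
with open("test.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["number"])  # header row: the column the example expects
    for value in (1, 2, 3, 4, 5):
        writer.writerow([value])
```

Uploading this file as a table into the in.c-main bucket under the name test yields the in.c-main.test table ID used below.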

You also need your Storage API token which can be found by clicking the icon at the top right corner.

Create the API Request

Get a collection of sample requests in Postman here: Run in Postman. There is a Sandbox introduction request with the following JSON contents in its body:

    {
        "configData": {
            "storage": {
                "input": {
                    "tables": [
                        {
                            "source": "in.c-main.test",
                            "destination": "source.csv"
                        }
                    ]
                }
            },
            "parameters": {
                "multiplier": 4
            }
        }
    }
The source node refers to the existing table ID (the table created in the previous step) in Storage. The destination node refers to the destination to which the table will be downloaded for the application; it will therefore be the source for the application. For registered components with a UI, the entire storage node is generated by the UI. The parameters node contains arbitrary parameters which are passed to the application.
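For scripted use, the same body can be assembled in Python. This is a sketch mirroring the JSON above; the table ID and multiplier are taken from the example:

```python
import json

# Build the Sandbox request body; values come from the example above.
body = {
    "configData": {
        "storage": {
            "input": {
                "tables": [
                    {
                        "source": "in.c-main.test",   # existing table in Storage
                        "destination": "source.csv",  # file the application will read
                    }
                ]
            }
        },
        "parameters": {
            "multiplier": 4,  # arbitrary parameters passed to the application
        },
    }
}

payload = json.dumps(body)
```

The payload string is what you would send as the POST body, together with the X-StorageAPI-Token header.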

The URL of the request is The request body is in JSON.

Enter your Storage API token into Headers - X-StorageAPI-Token.

Run the API Request

When running the request with valid parameters, you should receive a response similar to this:

    {
        "id": "176883685",
        "url": "",
        "status": "waiting"
    }

This means an asynchronous job which will prepare the sandbox has been created. If curious, view the job progress under Jobs in KBC:

Job progress screenshot
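Because the job is asynchronous, a script would poll its status until it reaches a terminal state. A minimal sketch of the polling loop follows; the HTTP call is abstracted into a callable so the logic is visible, and the exact set of terminal status names is an assumption — check the job payloads in your own project:

```python
import time

# Statuses that mean the job is finished. "success" and "error" are
# assumptions here -- verify against the actual job payloads in your project.
TERMINAL_STATUSES = {"success", "error", "cancelled", "terminated"}

def wait_for_job(fetch_status, poll_interval=2.0, max_attempts=30):
    """Poll fetch_status() until it returns a terminal job status.

    fetch_status is any callable returning the current status string,
    e.g. a small wrapper around GET <job url> sent with the
    X-StorageAPI-Token header.
    """
    for _ in range(max_attempts):
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(poll_interval)
    raise TimeoutError("job did not finish in time")
```

With a real HTTP wrapper plugged in, this waits for the sandbox job to finish before you look for the archive in File Uploads.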

The job will usually be executed very quickly, so you might as well go straight to Storage - File Uploads in KBC. There you will find a file with a sample data folder. You can now use this folder with your Docker extension or Custom Science application.
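Once downloaded, the archive just needs to be unpacked into a local folder that you can mount into the application container. A small sketch, assuming the archive is a ZIP file (adjust if your project produces a different format):

```python
import pathlib
import zipfile

def extract_data_folder(archive_path, target_dir="data"):
    """Unpack the sandbox archive from Storage - File Uploads into a
    local folder that can be mounted into the application container."""
    target = pathlib.Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(target)
    return target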

Input Data API Call

The Input data API call differs in that it must be used with an existing component. It requires a componentId obtained from the component registration. This also means that this call can be used both with existing configurations and with ad-hoc configurations (as in the Sandbox request above).


We assume you have the same in.c-main.test source table as in the previous request. You can then create a configuration for our sample component by visiting the following URL:{projectId}/applications/

Where you replace {projectId} with the ID of the project in KBC (you can find it in the URL). Then create the configuration. The equivalent of what we used in the Sandbox call above would be:

Configuration screenshot

Run the API Request

When you created the configuration, it was assigned a configuration ID (sample-configuration-27 in our example). Use this ID instead of manually crafting the request body.

You can see an Input data Introduction sample request in our collection of requests in Postman
Run in Postman.

The following is the request body:

    {
        "config": "sample-configuration-27"
    }

Where you need to replace sample-configuration-27 with your own configuration ID. The request URL is

Where is the component ID (you can replace that with your own component if you like). Again, do not forget to enter your Storage API token into Headers - X-StorageAPI-Token.

As with the Sandbox call, running the API call will create a job which will execute and produce a file in Storage - File Uploads.

Important: If you actually want to run the above sample-configuration-27 configuration, you also need to set the output mapping from destination.csv to some table.


  • For unregistered components, use the Sandbox call:
    • the whole configuration must be passed as the body (configData node) of the API call in JSON format
    • the source data is limited to 50 rows
  • For registered components, use the Input data call:
    • the configuration can be either passed as the body (configData node), or it can refer to an existing configuration (config node)
    • the source data is exported in full, without a row limit; this can lead to large data folders!
  • Both the Sandbox and Input calls create a job (automatically executed) which produces a file in your Storage - File Uploads
    • the folder can be extracted and mapped to your dockerized application
    • the folder contains the input tables and files, their manifests and the configuration file
    • the folder does not contain any data in the out folder; your application has to produce it
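The choice between the two request bodies can be captured in a small helper. This is a sketch under the structure shown in this section; the function name build_run_body is hypothetical, not part of any Keboola API:

```python
def build_run_body(config_id=None, config_data=None):
    """Build the request body for either call.

    Pass exactly one argument:
    - config_id   -> Input data call, referencing an existing configuration
                     (config node)
    - config_data -> Sandbox call, carrying an ad-hoc configuration
                     (configData node)
    """
    if (config_id is None) == (config_data is None):
        raise ValueError("provide exactly one of config_id or config_data")
    if config_id is not None:
        return {"config": config_id}
    return {"configData": config_data}
```

For example, build_run_body(config_id="sample-configuration-27") yields the Input data body shown earlier, while passing config_data with a storage and parameters mapping yields the Sandbox body.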