Use the /mcp endpoints instead of /sse: Streamable HTTP provides bidirectional streaming for improved performance and reliability.
The Keboola MCP Server is available at github.com/keboola/mcp-server. If this integration works well for you, please consider giving the repository a ⭐️!
This section describes how to integrate with Keboola using the Model Context Protocol (MCP). For information on using MCP within the Keboola UI, please see help.keboola.com/ai/mcp-server/.
Integration with an existing MCP client typically involves configuring the client with your Keboola project details and API tokens or OAuth provider; popular MCP clients such as Claude and Cursor are covered below.
The Keboola MCP Server facilitates this by acting as a bridge, translating natural language queries from your client into actions within your Keboola environment. For a comprehensive list of clients supporting the Model Context Protocol, please see the list of available clients.
Anthropic offers a beta feature, the MCP connector, which enables you to connect to remote MCP servers (such as the Keboola MCP Server) directly through Claude’s Messages API. This method bypasses the need for a separate, standalone MCP client if you are already using the Claude Messages API.
Key features of this integration:
- Remote MCP servers (such as the Keboola MCP Server) are declared via the mcp_servers parameter in your API requests to Claude.

This approach can simplify your architecture if you’re building applications that programmatically interact with Claude and need to leverage MCP-enabled tools without managing an additional client layer.
For complete details, API examples, and configuration options, please consult the official Anthropic MCP connector documentation.
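As an illustration, a Messages API request body that attaches the Keboola MCP Server through the connector might be built like this. The beta header name, the exact field names in mcp_servers, the model ID, and the region URL are assumptions to verify against Anthropic’s MCP connector documentation:

```python
import json

# Sketch of a Messages API request body using the MCP connector beta.
# Field names and the region URL below are assumptions; check Anthropic's
# MCP connector docs for the current shape and required beta header
# (e.g., "anthropic-beta: mcp-client-2025-04-04").
payload = {
    "model": "claude-sonnet-4-20250514",  # example model ID
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "List the buckets in my Keboola project."}
    ],
    # Remote MCP servers are declared directly in the request:
    "mcp_servers": [
        {
            "type": "url",
            "url": "https://mcp.YOUR_REGION.keboola.com/mcp",
            "name": "keboola",
        }
    ],
}

body = json.dumps(payload)  # POST this to /v1/messages with your API key
```

Sending the request additionally requires your Anthropic API key and the MCP connector beta header.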
Modern AI agent frameworks can connect directly to the Keboola MCP Server and expose all of its tools inside your agents. This unlocks fully automated data workflows driven by natural-language instructions.
The OpenAI Agents SDK ships with first-class MCP support. Start the Keboola MCP Server (locally via uvx or remotely over Streamable HTTP) and register it with the SDK:

```python
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStdio


async def main() -> None:
    # Launch the Keboola MCP Server as a subprocess over stdio.
    async with MCPServerStdio(
        params={"command": "uvx", "args": ["keboola_mcp_server"]}
    ) as mcp:
        agent = Agent(
            name="Assistant",
            instructions="Use the Keboola tools to achieve the task",
            mcp_servers=[mcp],
        )
        result = await Runner.run(agent, "Load yesterday's CSV into Snowflake")
        print(result.final_output)


asyncio.run(main())
```
The SDK automatically calls list_tools() on the server, making every Keboola operation available to the model.
LangChain does not yet include a built-in MCP connector, but you can integrate by:

1. Starting the Keboola MCP Server locally, or connecting to the hosted instance at https://mcp.REGION.keboola.com.
2. Mapping each tool returned by list_tools() to a Tool in LangChain.
3. Registering those tools with an AgentExecutor.

Because the server returns standard JSON schemas, the mapping is straightforward and can be handled with a lightweight wrapper. Native MCP support is already under discussion in the LangChain community.
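The mapping step can be sketched as follows. MCPToolSpec and Tool below are simplified stand-ins, not the real MCP or LangChain classes; they only illustrate the shape of the wrapper:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class MCPToolSpec:
    """Simplified stand-in for one entry of an MCP list_tools() response."""
    name: str
    description: str
    input_schema: dict  # a standard JSON Schema


@dataclass
class Tool:
    """Simplified stand-in for LangChain's Tool interface."""
    name: str
    description: str
    func: Callable[..., Any]


def to_langchain_tool(
    spec: MCPToolSpec, call_tool: Callable[[str, dict], Any]
) -> Tool:
    """Wrap an MCP tool so an agent framework can invoke it by name."""
    def _run(**kwargs: Any) -> Any:
        # Forward the call to the MCP session's tool-call method.
        return call_tool(spec.name, kwargs)
    return Tool(name=spec.name, description=spec.description, func=_run)


# Example: wrap a hypothetical get_buckets spec with a dummy transport.
spec = MCPToolSpec("get_buckets", "List storage buckets", {"type": "object"})
tool = to_langchain_tool(spec, call_tool=lambda name, args: f"called {name}")
```

In a real integration, call_tool would delegate to an MCP client session, and Tool would be LangChain’s own tool class.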
If you are developing your own MCP client or integrating MCP capabilities into a custom application, you can connect to the Keboola MCP Server. The server supports standard MCP communication protocols.
For detailed instructions and SDKs for building your own MCP client, refer to the official Model Context Protocol documentation for client developers.
Information on supported transports (e.g., stdio, Streamable HTTP) is provided in the ‘MCP Server Capabilities’ section below. For more details on the Keboola MCP server, including how it can be run and configured for custom client integration, please refer to its GitHub repository.
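A minimal custom client can use the official MCP Python SDK (the mcp package) to launch the server over stdio and enumerate its tools. This is a sketch under that assumption; the token and schema values are placeholders:

```python
import asyncio

# Connection parameters for launching the Keboola MCP Server over stdio.
# Replace the placeholder token and schema with real values.
SERVER_PARAMS = {
    "command": "uvx",
    "args": [
        "keboola_mcp_server",
        "--api-url", "https://connection.YOUR_REGION.keboola.com",
    ],
    "env": {
        "KBC_STORAGE_TOKEN": "YOUR_KEBOOLA_STORAGE_TOKEN",
        "KBC_WORKSPACE_SCHEMA": "YOUR_WORKSPACE_SCHEMA",
    },
}


async def list_keboola_tools() -> list:
    # The official MCP Python SDK; imported lazily so the configuration
    # above can be inspected without the package installed.
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async with stdio_client(StdioServerParameters(**SERVER_PARAMS)) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            return [tool.name for tool in result.tools]

# To run against a live server:
# print(asyncio.run(list_keboola_tools()))
```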
The Keboola MCP Server supports several core concepts of the Model Context Protocol. Here’s a summary:
| Concept | Supported | Notes |
|---|---|---|
| Transports | ✅ | Supports stdio and Streamable HTTP (recommended) for client communication. SSE is deprecated and will be removed on 01.04.2026. |
| Prompts | ✅ | Processes natural language prompts from MCP clients to interact with Keboola. |
| Tools | ✅ | Provides a rich set of tools for storage operations, component management, SQL execution, and job control. |
| Resources | ❌ | Exposing Keboola project entities (data, configurations, etc.) as formal MCP Resources is not currently supported. |
| Sampling | ❌ | Advanced sampling techniques are not explicitly supported by the server itself. |
| Roots | ❌ | The concept of ‘Roots’ as defined in general MCP is not a specific feature of the Keboola MCP server. |
When connecting to the Keboola MCP Server via HTTP-based transports (Streamable HTTP recommended; SSE is deprecated), you can control which tools are available to clients using HTTP headers. This is useful for restricting AI agent capabilities, enforcing compliance policies, or providing customer-specific access controls.
stdio transport for local execution.
The following HTTP headers control tool access:
| Header | Description | Example Value |
|---|---|---|
| X-Allowed-Tools | Comma-separated list of tool names to allow. Only these tools will be available. | get_configs,get_buckets,query_data |
| X-Disallowed-Tools | Comma-separated list of tool names to exclude. These tools will be removed from the available set. | create_config,run_job |
| X-Read-Only-Mode | When set to true, 1, or yes, restricts access to read-only tools only. | true |
When multiple headers are present, filters are applied in the following order:
1. If X-Allowed-Tools is specified, only those tools are initially available.
2. If X-Read-Only-Mode is enabled, the available tools are intersected with the read-only tools set.
3. Any tools listed in X-Disallowed-Tools are removed from the final set.

Empty headers are treated as no restriction/exclusion (backward-compatible behavior).
The following 15 tools are classified as read-only (they do not modify data):
| Category | Tools |
|---|---|
| Components | get_configs, get_components, get_config_examples |
| Flows | get_flows, get_flow_examples, get_flow_schema |
| Storage | get_buckets, get_tables |
| SQL | query_data |
| Data Apps | get_data_apps |
| Jobs | get_jobs |
| Search | search, find_component_id |
| Project | get_project_info |
| Documentation | docs_query |
AI Agent Restrictions: When integrating AI agents (like Devin, Cursor, or custom agents) with your Keboola project, you may want to limit their capabilities. For example, allowing an agent to query and explore data but preventing it from creating or modifying configurations:
```
X-Read-Only-Mode: true
```
Compliance and Security: For environments with strict data governance requirements, you can create customer-specific access profiles. For example, allowing only specific tools while explicitly blocking others:
```
X-Allowed-Tools: get_buckets,get_tables,query_data,search
X-Disallowed-Tools: run_job
```
Combined Restrictions: You can combine all three headers for fine-grained control. For example, to allow only a subset of read-only tools:
```
X-Allowed-Tools: get_configs,get_buckets,get_tables,query_data,create_config
X-Read-Only-Mode: true
X-Disallowed-Tools: query_data
```
This configuration would result in only get_configs, get_buckets, and get_tables being available (the intersection of allowed and read-only, minus the disallowed).
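The documented precedence can be sketched in a few lines of Python. The tool names come from the read-only table above; the function is an illustration of the filter order, not the server’s actual implementation:

```python
# The 15 read-only tools listed in the documentation.
READ_ONLY_TOOLS = {
    "get_configs", "get_components", "get_config_examples",
    "get_flows", "get_flow_examples", "get_flow_schema",
    "get_buckets", "get_tables", "query_data", "get_data_apps",
    "get_jobs", "search", "find_component_id", "get_project_info",
    "docs_query",
}


def filter_tools(all_tools, allowed=None, read_only=False, disallowed=None):
    """Apply the documented precedence: allow -> read-only -> disallow."""
    tools = set(allowed) if allowed else set(all_tools)
    if read_only:
        tools &= READ_ONLY_TOOLS          # intersect with read-only set
    tools -= set(disallowed or ())        # remove disallowed tools last
    return tools


# The combined-restrictions example from above:
result = filter_tools(
    READ_ONLY_TOOLS | {"create_config", "run_job"},
    allowed=["get_configs", "get_buckets", "get_tables",
             "query_data", "create_config"],
    read_only=True,
    disallowed=["query_data"],
)
# result is {"get_configs", "get_buckets", "get_tables"}
```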
For a consistent and isolated environment, running the Keboola MCP Server via Docker is often the recommended approach for local execution, especially if you don’t want to manage Python environments directly or are integrating with clients that can manage Docker containers. Docker allows applications to be packaged with all their dependencies into a standardized unit for software development.
Before proceeding, ensure you have Docker installed on your system. You can find installation guides on the official Docker website.
```shell
docker pull keboola/mcp-server:latest
```
Run the Docker container:
```shell
docker run -it --rm \
  -e KBC_STORAGE_TOKEN="YOUR_KEBOOLA_STORAGE_TOKEN" \
  -e KBC_WORKSPACE_SCHEMA="YOUR_WORKSPACE_SCHEMA" \
  keboola/mcp-server:latest \
  --api-url https://connection.YOUR_REGION.keboola.com
```
Replace YOUR_KEBOOLA_STORAGE_TOKEN, YOUR_WORKSPACE_SCHEMA, and https://connection.YOUR_REGION.keboola.com with your actual values.
```shell
# Ensure your Google Cloud credentials JSON file is accessible
docker run -it --rm \
  -e KBC_STORAGE_TOKEN="YOUR_KEBOOLA_STORAGE_TOKEN" \
  -e KBC_WORKSPACE_SCHEMA="YOUR_WORKSPACE_SCHEMA" \
  -e GOOGLE_APPLICATION_CREDENTIALS="/creds/credentials.json" \
  -v /local/path/to/your/credentials.json:/creds/credentials.json \
  keboola/mcp-server:latest \
  --api-url https://connection.YOUR_REGION.keboola.com
```
Replace placeholders and ensure /local/path/to/your/credentials.json points to your actual credentials file on your host machine.
The --rm flag ensures the container is removed when it stops. The server inside Docker will typically listen on stdio by default, which is suitable for clients that can invoke and manage Docker commands.
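For a client that launches the container itself, the stdio wiring can be sketched in Python. The token values are placeholders; the client keeps the container’s stdin/stdout pipes and speaks MCP over them:

```python
import shlex

# Build the docker invocation an MCP client would manage. -i keeps stdin
# open so the container can speak MCP over stdin/stdout.
argv = [
    "docker", "run", "-i", "--rm",
    "-e", "KBC_STORAGE_TOKEN=YOUR_KEBOOLA_STORAGE_TOKEN",  # placeholder
    "-e", "KBC_WORKSPACE_SCHEMA=YOUR_WORKSPACE_SCHEMA",    # placeholder
    "keboola/mcp-server:latest",
    "--api-url", "https://connection.YOUR_REGION.keboola.com",
]

# An MCP client would start this process with subprocess.Popen(argv,
# stdin=PIPE, stdout=PIPE) and exchange JSON-RPC messages over the pipes;
# this sketch only constructs the command line it would run.
print(shlex.join(argv))
```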
Example: Configuring Cursor IDE to use Docker for Keboola MCP Server:
If your MCP client (like Cursor) supports defining a Docker command for an MCP server, the configuration might look like this:
```json
{
  "mcpServers": {
    "keboola": {
      "command": "docker",
      "args": [
        "run",
        "-it",
        "--rm",
        "-e", "KBC_STORAGE_TOKEN",
        "-e", "KBC_WORKSPACE_SCHEMA",
        "keboola/mcp-server:latest",
        "--api-url", "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "YOUR_KEBOOLA_STORAGE_TOKEN",
        "KBC_WORKSPACE_SCHEMA": "YOUR_WORKSPACE_SCHEMA"
      }
    }
  }
}
```
Note:
- Replace YOUR_KEBOOLA_STORAGE_TOKEN, YOUR_WORKSPACE_SCHEMA, and the Keboola API URL with your actual values.
- Cursor passes KBC_STORAGE_TOKEN and KBC_WORKSPACE_SCHEMA from its env block to the docker run command through the -e flags. The --api-url is passed directly as an argument to the keboola/mcp-server entrypoint.

While MCP clients like Cursor or Claude typically manage the MCP server automatically, you might want to run the Keboola MCP Server locally for development, testing, or when using a custom client.
The primary way to run the server locally is by using uv or uvx to execute the keboola_mcp_server package. More information about the server is available in its Keboola MCP Server GitHub repository. Make sure you have Python 3.10+ and uv installed.
Set the following environment variables:

- KBC_STORAGE_TOKEN: Your Keboola Storage API token.
- KBC_WORKSPACE_SCHEMA: Your Keboola project’s workspace schema (used for SQL queries).
- KBC_API_URL: Your Keboola instance API URL (e.g., https://connection.keboola.com or https://connection.YOUR_REGION.keboola.com).

Refer to the Keboola Tokens and Keboola workspace manipulation documentation for detailed instructions on obtaining these values.
1.1. Additional Setup for BigQuery Users
If your Keboola project uses BigQuery as its backend, you will also need to set up the GOOGLE_APPLICATION_CREDENTIALS environment variable. This variable should point to the JSON file containing your Google Cloud service account key that has the necessary permissions to access your BigQuery data.
Example:
```shell
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/credentials.json"
```
```shell
uvx keboola_mcp_server --api-url $KBC_API_URL
```
This example assumes KBC_API_URL is set as an environment variable, but you can also pass the URL directly. The command starts the server communicating via stdio. To run the server in Streamable HTTP mode (listening on a network host/port such as localhost:8000), pass the appropriate flags to keboola_mcp_server. For day-to-day use with clients like Claude or Cursor, you usually do not need to run this command manually, as they handle the server lifecycle.
When you run the Keboola MCP Server manually, it will typically listen on stdio or on a specific HTTP port if configured for Streamable HTTP.
- stdio-based clients: Configure the client application to launch the local keboola_mcp_server executable and communicate over standard input/output.
- Streamable HTTP-based clients: If you start the server in HTTP mode, your client should connect to the specified host and port (e.g., http://localhost:8000/mcp?storage_token=XXX&workspace_schema=YYY).

The Keboola MCP Server is also hosted in every multi-tenant stack with OAuth authentication support. If your AI assistant supports remote connections and OAuth, you can connect to Keboola’s MCP Server as follows:
- The remote server URL is https://mcp.<YOUR_REGION>.keboola.com/mcp.
- You can find the exact URL for your project under Users & Settings > MCP Server.
Some AI assistants or MCP clients do not yet support remote OAuth connections. In that case, you can still connect to the remote instance using the mcp-remote adapter. The adapter runs on Node.js, which you can install, for example, with Homebrew:

```shell
brew install node
```
Configure your client using mcp.json:
```json
{
  "mcpServers": {
    "keboola": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.<YOUR_REGION>.keboola.com/mcp"
      ]
    }
  }
}
```
Once you save the settings and refresh your AI assistant, you will be prompted to authenticate with your Keboola account and select the project you want to connect to.
If you are running the Keboola MCP Server locally using uvx, you can configure Cursor IDE to connect to this local instance. This is useful for development or testing with a custom server build.
Manual setup:

Add the server to your Cursor MCP configuration, supplying your KBC_STORAGE_TOKEN, KBC_WORKSPACE_SCHEMA, and the API URL.

Example mcp_servers.json snippet:
```json
{
  "mcpServers": {
    "keboola": {
      "command": "uvx",
      "args": [
        "keboola_mcp_server",
        "--api-url", "https://connection.YOUR_REGION.keboola.com"
      ],
      "env": {
        "KBC_STORAGE_TOKEN": "your_keboola_storage_token",
        "KBC_WORKSPACE_SCHEMA": "your_workspace_schema"
      }
    }
  }
}
```
Remote setup

Alternatively, use the region-specific Cursor deeplink to connect to the remote deployment of your stack:

- US Virginia AWS (default)
- US Virginia GCP (us-east4)
- EU Frankfurt AWS (eu-central-1)
- EU Ireland Azure (north-europe)
- EU Frankfurt GCP (europe-west3)
Always refer to your MCP client’s documentation for the most up-to-date instructions on configuring external MCP servers.