ClickHouse

This page describes how to integrate WarpStream with ClickHouse, ingest data from WarpStream into ClickHouse, and query the data in ClickHouse.

Prerequisites

  1. WarpStream account - get access to WarpStream by registering here.
  2. ClickHouse account - get access to ClickHouse by registering here.
  3. A WarpStream cluster that is up and running.

Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and clicking the Connect tab. If you don't have SASL credentials, you can also create a set of credentials from the console.

Store these values as environment variables for easy reference:

export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;

Then, if you don't already have a topic available, create one with the WarpStream CLI (shown below) or in the WarpStream console, and proceed to Step 2:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type create-topic -topic clickhouse_demo

You should see the following output in your terminal:

Created topic clickhouse_demo.

Step 2: Produce some records

Using the WarpStream CLI, produce several messages to your topic:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic clickhouse_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_2", "page_id": "home"}'

Note that the WarpStream CLI uses double commas (,,) as a delimiter between JSON records.
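
Optionally, you can read the records back to verify the produce succeeded. This is a minimal sketch that assumes the same fetch subcommand used in WarpStream's "Hello World" guide:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type fetch -topic clickhouse_demo -offset 0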

Step 3: Connect ClickHouse to WarpStream

In the ClickHouse dashboard, navigate to "Data sources" and then click "Get started".

Select WarpStream from the list of options.

Fill in the connection fields using the broker address and credentials from Step 1.

Next, select the incoming data format (JSON or Avro) and then the topic. For this example, we use the JSON format and the clickhouse_demo topic created in Step 1; ClickHouse will display a sample record from the topic to confirm the connection.

The data is then parsed, and you will have the opportunity to make additional changes to the table that ClickHouse will create.
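
For reference, with the records produced in Step 2, the resulting table could look roughly like the following. This is a hypothetical sketch: the actual table name, column types, and engine are determined by the choices you make in the ClickHouse UI.

-- Hypothetical table backing the pipe; columns mirror the JSON fields
-- (action, user_id, page_id) from the records produced in Step 2.
CREATE TABLE clickhouse_demo
(
    action  String,
    user_id String,
    page_id String
)
ENGINE = MergeTree
ORDER BY (user_id, page_id);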

Step 4: Query your data in ClickHouse

Your WarpStream pipe is now ingesting data into ClickHouse. Next, select 'SQL Console' in the left-hand navigation to interact with your new ClickHouse table.

We can now take full advantage of ClickHouse's power with WarpStream data. A simple SQL SELECT command is illustrated below:
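
This is a minimal sketch that assumes the hypothetical clickhouse_demo table from Step 3; substitute your actual table name.

-- Count how many events of each action type were ingested.
SELECT
    action,
    count() AS events
FROM clickhouse_demo
GROUP BY action
ORDER BY events DESC;

With the sample records from Step 2, this returns four clicks, one hover, and one scroll.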

Next Steps

Congratulations! You've set up a stream processing pipeline between WarpStream and ClickHouse and performed a basic SQL query. This is just the beginning of what you can do.

Next, check out the WarpStream docs for configuring the WarpStream Agent, or review the ClickHouse docs to learn more about what is possible with WarpStream and ClickHouse!