Tinybird

This page describes how to integrate WarpStream with Tinybird, ingest data into Tinybird from WarpStream, and then create an API endpoint for your applications to access the result set.





Prerequisites

  1. A Serverless WarpStream cluster up and running.
  2. A WarpStream account - get access to WarpStream by registering here.
  3. A Tinybird account - get access to Tinybird by registering here.

Step 1: Create a topic in your WarpStream cluster

Create a topic in the WarpStream console if you don't already have one. Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and clicking the Connect tab. If you don't have SASL credentials yet, you can also create a set of credentials from the console.

Store these values for easy reference; they will be needed in Tinybird. If you are going to produce records to your topic from the command line, export them in a terminal window:

export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;
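If you prefer to stay on the command line, the topic can also be created with the WarpStream CLI. This is a sketch using the environment variables exported above; the topic name `decodable_demo` simply matches the produce example in Step 2:

```
# Create the topic using the credentials exported above.
# Requires a reachable WarpStream cluster; "decodable_demo" is the
# example topic name used in Step 2 below.
warpstream kcmd \
  -bootstrap-host $BOOTSTRAP_HOST \
  -tls \
  -username $SASL_USERNAME \
  -password $SASL_PASSWORD \
  -type create-topic \
  -topic decodable_demo
```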

Step 2: Produce some records

You can use the WarpStream CLI to produce messages to your topic if you don't already have an active topic to work with:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic decodable_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_2", "page_id": "home"}'

Note that the WarpStream CLI uses double commas (,,) as a delimiter between JSON records.
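Because the double-comma delimiter is easy to get wrong, here is a small local sketch (plain bash, no cluster required) showing how a `,,`-delimited string breaks apart into individual JSON records, one per line:

```shell
# Three sample records, delimited by ",," as the WarpStream CLI expects.
records='{"action": "click"},,{"action": "hover"},,{"action": "scroll"}'

# Replace each double-comma delimiter with a newline (bash substitution),
# yielding one JSON record per line.
printf '%s\n' "${records//,,/$'\n'}"
```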

Step 3: Connect Tinybird to WarpStream

In the Tinybird Web Console, next to "DATA PROJECT," click "+", select "Data Source," select "Kafka," and fill in the connection information.

Walk through the next steps to select your Topic and set your Configuration. Name the data source, and then click "Create Data Source," which will present a data preview and a graph of ingested data.

Step 4: Configure your Tinybird pipeline

Once again, next to "DATA PROJECT," click "+" and select "Pipe." Here you will write the SQL that powers your Tinybird API Endpoint. Tinybird validates your code and shows a subset of the qualifying dataset. In this example, we are looking for any "action" that does not have a status of "ACCEPT"; we've named this pipe "rejects." Once satisfied with the results, click "Create API Endpoint."
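The pipe's SQL is not reproduced on this page. As a rough sketch only: the data source name `warpstream_clickstream` is a placeholder for whatever you named the data source in Step 3, and the `status` column is assumed from the example's description. A pipe node selecting the "rejects" could look like:

```sql
-- Hypothetical pipe node: keep only events whose status is not ACCEPT.
-- "warpstream_clickstream" stands in for the data source created in Step 3.
SELECT *
FROM warpstream_clickstream
WHERE status != 'ACCEPT'
```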

Step 5: Deploy your Tinybird endpoint

Tinybird will present various code snippets to enable you to deploy your endpoint using the pipeline you just created.

Taking the HTTP option and copying it into a browser returns the endpoint's result set as JSON.
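The endpoint can also be queried from the command line. The URL below is illustrative only: `rejects` is the example pipe name from Step 4, and you should copy the exact URL and token from the snippet Tinybird shows you:

```
# Illustrative only: substitute the real pipe URL and token from the
# Tinybird UI; "rejects" is the example pipe name from Step 4.
curl "https://api.tinybird.co/v0/pipes/rejects.json?token=<YOUR_TINYBIRD_TOKEN>"
```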

Next Steps

Congratulations! You've set up a stream processing pipeline between WarpStream and Tinybird.

Next, check out the WarpStream docs for configuring the WarpStream Agent, or review the Tinybird Docs to learn more about what is possible with WarpStream and Tinybird!
