Decodable

This page describes how to integrate WarpStream with Decodable, ingest data into Decodable from WarpStream, and query the data in Decodable.



Prerequisites

  1. WarpStream account - get access to WarpStream by registering here.
  2. Decodable account - get access to Decodable by registering here.
  3. WarpStream cluster is up and running.

Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and clicking the Connect tab. If you don't have SASL credentials yet, you can also create a set of credentials from the console.

Store these values as environment variables for easy reference:

export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;

Then, create a topic using the WarpStream CLI:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type create-topic -topic decodable_demo

You should see the following output in your Terminal:

Created topic decodable_demo.

Step 2: Produce some records

Using the WarpStream CLI, produce several messages to your topic:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic decodable_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_2", "page_id": "home"}'

Note that the WarpStream CLI uses double commas (,,) as a delimiter between JSON records.

Step 3: Connect Decodable to WarpStream

In the Decodable Web Console, under the Start Building heading, click "Connect to a source". Select "Apache Kafka", and configure Decodable to read from your WarpStream topic.
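The exact form fields depend on the connector version, but the settings map directly to the values from Step 1; a configuration along these lines should work (SASL_SSL matches the -tls flag used with the CLI above; PLAIN is assumed as the SASL mechanism, and 9092 is the default Kafka port if your Bootstrap Broker value does not already include one):

Bootstrap Servers: $BOOTSTRAP_HOST:9092
Security Protocol: SASL_SSL
SASL Mechanism: PLAIN
SASL Username: $SASL_USERNAME
SASL Password: $SASL_PASSWORD
Topic: decodable_demo
Value Format: JSON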

Click Next.

Create a new stream named decodable_demo.

Specify the schema of your data:
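Based on the records produced in Step 2, the schema consists of three string fields:

action STRING
user_id STRING
page_id STRING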

And finally, finish creating the Connection.

Step 4: Start the Connection and verify that Decodable is ingesting data from WarpStream

In the upper right corner of the Connection overview screen, click "Start", then start the connection with the default settings.

After a few moments, the Decodable Connection should transition from Starting to Running, and the Output Metrics will show that Decodable is consuming from WarpStream.

Navigate to Streams > decodable_demo to view a sample of your data in Decodable.

Step 5: Create a Pipeline

In the Decodable Web Console, navigate to Pipelines > Create your first Pipeline and select decodable_demo from the list of Streams.

Write a query that creates a new Stream and inserts only the records that have a user_id of user_0.
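Decodable pipelines are written in SQL, with an INSERT INTO statement that targets the output Stream. A minimal sketch, assuming a hypothetical output Stream named decodable_demo_filtered:

INSERT INTO decodable_demo_filtered
SELECT *
FROM decodable_demo
WHERE user_id = 'user_0'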

Click Next.

Click Create Stream, then click Next.

Name your Pipeline and click Create Pipeline.

Produce some more records to WarpStream from your Terminal (you can add more records if you want; just remember to use the ,, delimiter in the WarpStream CLI):

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic decodable_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_1", "page_id": "home"},,{"action": "scroll", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "", "page_id": "home"},,{"action": "click", "user_id": "", "page_id": "home"},,{"action": "click", "user_id": "user_0", "page_id": "home"}'
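Of the six records above, only the two with a user_id of user_0 should pass the filter and appear in the Pipeline's output Stream.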

Then click "Start" on your Pipeline, configuring it to read from the earliest offset.

After a few moments, you should be able to observe the Input and Output Metrics, confirming that the Pipeline is working as expected!

Next Steps

Congratulations! You've set up a stream processing pipeline between WarpStream and Decodable and created a simple Pipeline that filters data out of a firehose of events.

Next, check out the WarpStream docs for configuring the WarpStream Agent, or review the Decodable docs to learn more about what is possible with WarpStream and Decodable!
