Streambased

Instructions on integrating WarpStream with Streambased.


The Streambased folks have a great tutorial on integrating WarpStream with the command-line version of Streambased. What follows is a description of using their Analytics Service for Kafka (A.S.K.) product, which is free and requires no downloads or setup. A video walkthrough is also available.

Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and clicking the Connect tab. If you don't have SASL credentials yet, you must also create a set of credentials from the console by clicking on the Credentials tab.

Save these three values, as you will need them for the bootstrap.servers and sasl.jaas.config parameters in the next step.
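If you also want to create the topic programmatically rather than in the console, the following is a minimal sketch using Kafka's standard Java AdminClient together with the three values saved above. The topic name demo_topic and the partition and replication settings are illustrative assumptions, not part of the Streambased tutorial.

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        // The same three values you saved above.
        Properties props = new Properties();
        props.put("bootstrap.servers", "<YOUR_BOOTSTRAP_BROKER>");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username='<YOUR_SASL_USERNAME>' password='<YOUR_SASL_PASSWORD>';");

        try (AdminClient admin = AdminClient.create(props)) {
            // "demo_topic" is a hypothetical name; 1 partition, replication factor 3.
            NewTopic topic = new NewTopic("demo_topic", 1, (short) 3);
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}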

Step 2: Create your SQL connection string

Construct the connection string below using the values you saved in Step 1. Executing the resulting SET SESSION command in the query or BI tool of your choice establishes the connection to your WarpStream cluster. From that point, you can interact with your stream like any other SQL-compatible data source.

SET SESSION streambased_connection='{
"bootstrap.servers":"<YOUR_BOOTSTRAP_BROKER>",
"security.protocol":"SASL_SSL",
"sasl.jaas.config":"org.apache.kafka.common.security.plain.PlainLoginModule required username=''<YOUR_SASL_USERNAME>'' password=''<YOUR_SASL_PASSWORD>'';",
"sasl.mechanism":"PLAIN",
"schema.registry.url":"<SCHEMA_REGISTRY_ENDPOINT_URL>",
"basic.auth.credentials.source":"USER_INFO",
"basic.auth.user.info":"<SCHEMA_REGISTRY_KEY>:<SCHEMA_REGISTRY_SECRET>"
}';

Note that sasl.jaas.config has a bit of a gotcha in that the single quotes have to be escaped (doubled, as shown above). Streambased has provided a tool on their website to properly format your information and ensure correctness.

Next Steps

Using the SET SESSION command configured above, you now have everything you need to connect with the SQL or BI tool of your choice and query your WarpStream stream.
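As a concrete illustration, here is a minimal sketch that issues the session command and then a query over plain JDBC, assuming your tool of choice connects via JDBC. The JDBC URL placeholder and the topic name demo_topic are hypothetical details for illustration, not values from the Streambased tutorial; substitute whatever connection details Streambased A.S.K. or your SQL tool provides.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class QueryStream {
    public static void main(String[] args) throws Exception {
        // "<STREAMBASED_JDBC_URL>" is a placeholder for the endpoint your tool exposes.
        try (Connection conn = DriverManager.getConnection("<STREAMBASED_JDBC_URL>");
             Statement stmt = conn.createStatement()) {

            // Establish the Kafka connection for this session using the full
            // JSON string from Step 2 (elided here for brevity).
            stmt.execute("SET SESSION streambased_connection='{...}'");

            // Query the topic like any other SQL table; "demo_topic" is hypothetical.
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM demo_topic LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}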
