OpenTelemetry Collector

This page describes how to connect the OpenTelemetry Collector to WarpStream using the Kafka exporter.

The Kafka exporter is no longer bundled as part of the standard OpenTelemetry Collector distribution, so you'll need to use one of the contrib releases.
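
For example, a minimal sketch of running the contrib distribution with Docker Compose might look like the following. The image tag and the otel-collector-config.yaml file name are illustrative; mount whichever of the configuration files shown below you end up using.

services:
  otel-collector:
    # The contrib image bundles components, like the Kafka exporter,
    # that the core distribution no longer includes.
    image: otel/opentelemetry-collector-contrib:0.92.0
    command: ["--config=/etc/otelcol-contrib/config.yaml"]
    volumes:
      # otel-collector-config.yaml is a placeholder name for one of the
      # collector configuration files from this page.
      - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml
    ports:
      - "4318:4318" # OTLP over HTTP receiver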

Unauthenticated / BYOC WarpStream Cluster (Example)

The sample configuration file below demonstrates how to configure the OpenTelemetry collector with a Kafka exporter that writes to an unauthenticated WarpStream cluster. This setup makes the most sense when you're self-hosting the WarpStream Agents in your own cloud account / VPC using the BYOC product.

receivers:
  otlp:
    protocols:
      http:
        cors:
          allowed_origins:
            - "*"

processors:
  batch:

exporters:
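  # Note: newer collector releases deprecate the logging exporter
  # in favor of the debug exporter.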
  logging:
    loglevel: debug
  kafka:
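    # Replace <YOUR_WARPSTREAM_BOOTSTRAP_URL> with the bootstrap URL
    # for your WarpStream cluster.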
    brokers: ["<YOUR_WARPSTREAM_BOOTSTRAP_URL>:9092"]
    topic: "traces"
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, kafka]
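
With this configuration, the collector accepts OTLP data over HTTP (port 4318 by default) and writes the traces to the "traces" topic in your WarpStream cluster.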

Authenticated WarpStream Cluster (Example)

The sample configuration file below demonstrates how to configure the OpenTelemetry collector with a Kafka exporter that writes to an authenticated WarpStream cluster.

receivers:
  otlp:
    protocols:
      http:
        cors:
          allowed_origins:
            - "*"

processors:
  batch:

exporters:
  logging:
    loglevel: debug
  kafka:
    brokers: ["<YOUR_WARPSTREAM_BOOTSTRAP_URL>:9092"]
    topic: "traces"
    auth:
      sasl:
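        # $USERNAME and $PASSWORD are expanded from environment variables;
        # use SASL credentials created for your WarpStream virtual cluster.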
        username: $USERNAME
        password: $PASSWORD
        mechanism: PLAIN
        version: 1
      tls:
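        # insecure: true disables TLS, so the SASL credentials are sent
        # in plaintext; only do this on a trusted network.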
        insecure: true
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, kafka]

Configuring Zone Aware Routing

As of v0.92.0, the collector supports specifying Kafka client IDs like so:

receivers:
  otlp:
    protocols:
      http:
        cors:
          allowed_origins:
            - "*"

processors:
  batch:

exporters:
  logging:
    loglevel: debug
  kafka:
    brokers: ["<YOUR_WARPSTREAM_BOOTSTRAP_URL>:9092"]
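    # warpstream_az=<zone> tells the WarpStream Agents which availability
    # zone this client is in, enabling zone-aware routing that avoids
    # interzone networking costs. Set it to the zone where this collector
    # actually runs.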
    client_id: "warpstream_az=us-east-1a"
    topic: "traces"
    auth:
      sasl:
        username: $USERNAME
        password: $PASSWORD
        mechanism: PLAIN
        version: 1
      tls:
        insecure: true
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, kafka]

See our documentation on configuring WarpStream features via the Kafka Client ID for more details.