Schema Validation

This page describes how to configure the topic and agent to perform schema validation.


How Schema Validation Works

Historically, schemas stored in schema registries have been used only by clients to serialize/deserialize and validate messages. With WarpStream, you can configure the Agents to validate not only that a record contains a valid schema ID, but also that the record actually conforms to the corresponding schema. The Agent can then reject invalid records, or emit metrics when it receives them.

Note that enabling schema validation will increase the CPU usage of the agent.

Currently, WarpStream supports two types of schema registries:

  • Kafka-compatible Schema Registry

  • AWS Glue Schema Registry

Here is a brief overview of how schema validation works in WarpStream:

  • The producer serializes data with the schema retrieved from the schema registry.

  • The producer sends the data to a WarpStream Agent.

  • On receiving the message, the WarpStream Agent decodes the message to obtain some form of schema identifier that points to a remote schema.

  • The Agent uses the schema identifier to fetch the remote schema.

  • Finally, the Agent verifies that the data actually conforms to the schema and rejects (or emits metrics for) any invalid records.

This process is illustrated in the diagram below:

Currently, WarpStream supports the following schema formats: Avro and JSON Schema (with Protobuf coming soon).
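For Kafka-compatible registries, the schema identifier the Agent extracts is typically the standard Confluent wire-format prefix: a zero magic byte followed by a 4-byte big-endian schema ID. The sketch below illustrates that decoding step; it is a simplified example, not the Agent's actual implementation.

```python
import struct

def extract_schema_id(record: bytes) -> int:
    """Decode the Confluent wire-format prefix: 0x00 magic byte + 4-byte big-endian schema ID."""
    if len(record) < 5 or record[0] != 0:
        raise ValueError("record is not in schema registry wire format")
    (schema_id,) = struct.unpack(">I", record[1:5])
    return schema_id

# Example payload produced by a registry-aware serializer: prefix + encoded body.
payload = bytes([0]) + (42).to_bytes(4, "big") + b"<avro-encoded body>"
print(extract_schema_id(payload))  # 42
```

Records whose first byte is not zero are not in the wire format at all, which is why validation requires producers to use a registry-aware serializer.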

Check out this overview video to learn more:

Topic Level Configurations for Schema Validation

Schema validation is configurable per topic. The following configurations can be provided when a topic is created or altered. Note that there are additional configurations depending on the schema registry type, which will be discussed in the next section.

  • warpstream.key.schema.validation: Boolean config that indicates whether to validate the record key.

  • warpstream.value.schema.validation: Boolean config that indicates whether to validate the record value.

  • warpstream.schema.validation.warning.only: When an invalid record is detected, the Agent allows the record to be written, but emits a metric indicating that the record is invalid instead of rejecting it. The metric (counter) emitted is schema_registry_validation_invalid_record. Defaults to true.

  • warpstream.schema.registry.type: The type of schema registry that the schemas live in. Supported values are "STANDARD" (any schema registry that is compatible with Confluent's Schema Registry) and "AWS_GLUE" (AWS Glue's Schema Registry). Defaults to "STANDARD".
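These settings are ordinary topic configs, so they can be supplied through any Kafka admin client's create-topics or alter-configs call. Below is an illustrative sketch of assembling such a config map in Python; the chosen values are examples, not defaults.

```python
# Illustrative topic config enabling value validation in strict (rejecting) mode.
topic_config = {
    "warpstream.value.schema.validation": "true",
    "warpstream.schema.validation.warning.only": "false",  # reject instead of only warning
    "warpstream.schema.registry.type": "STANDARD",
}

def check_registry_type(config: dict) -> str:
    """Sanity-check the registry type against the supported values."""
    registry_type = config.get("warpstream.schema.registry.type", "STANDARD")
    if registry_type not in ("STANDARD", "AWS_GLUE"):
        raise ValueError(f"unsupported registry type: {registry_type}")
    return registry_type

print(check_registry_type(topic_config))  # STANDARD
```

With warning.only set to "false", invalid records are rejected at produce time rather than written with a metric emitted.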

Topic Level Configurations for Kafka-compatible Schema Registries

Below are topic-level configurations for Kafka-compatible schema registries.

  • warpstream.key.subject.name.strategy: Determines which schemas are allowed for the record key. Allowed values: TopicNameStrategy, RecordNameStrategy, TopicRecordNameStrategy. See more details below.

  • warpstream.value.subject.name.strategy: Determines which schemas are allowed for the record value. Allowed values: TopicNameStrategy, RecordNameStrategy, TopicRecordNameStrategy. See more details below.

Subject Name Strategy:

Each schema in the Schema Registry is registered under a subject. During schema validation, the agent looks up the subject for the schema ID and verifies that the subject conforms to the subject name strategy.

There are three subject name strategies:

  • TopicNameStrategy: The subject is derived from the topic name, with the format <topic name>-key for the record key and <topic name>-value for the record value.

  • RecordNameStrategy: The subject is the schema's fully-qualified record name.

  • TopicRecordNameStrategy: The subject is a combination of the topic name and the record name, with the format <topic name>-<fully-qualified record name>.

The fully-qualified record name for Avro is the record’s namespace + record name. For JSON Schema, the record name is the title.
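The three strategies amount to a small lookup from topic and record name to expected subject. The function below is an illustrative sketch (the topic and record names used are example inputs):

```python
def expected_subject(strategy: str, topic: str, record_name: str, is_key: bool) -> str:
    """Return the subject a record's schema must be registered under for each strategy."""
    if strategy == "TopicNameStrategy":
        return f"{topic}-{'key' if is_key else 'value'}"
    if strategy == "RecordNameStrategy":
        return record_name  # fully-qualified record name (for Avro: namespace + record name)
    if strategy == "TopicRecordNameStrategy":
        return f"{topic}-{record_name}"
    raise ValueError(f"unknown strategy: {strategy}")

print(expected_subject("TopicNameStrategy", "orders", "com.example.Order", is_key=False))
# orders-value
print(expected_subject("TopicRecordNameStrategy", "orders", "com.example.Order", is_key=False))
# orders-com.example.Order
```

During validation, the Agent checks that the subject resolved from the record's schema ID matches the subject the configured strategy produces.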

Using WarpStream's BYOC Schema Registry for Schema Validation

Our goal is to make it as easy as possible to perform schema validation with WarpStream's BYOC Schema Registry. To configure the agent to perform schema validation using schemas from your BYOC Schema Registry cluster, you need to do two things:

  • Set the -schemaValidationVirtualClusterID flag to your Schema Registry's Virtual Cluster ID when deploying your agent. Alternatively, you can set the environment variable WARPSTREAM_SCHEMA_VALIDATION_VIRTUAL_CLUSTER_ID to your Virtual Cluster ID.

  • Configure your agent to have the permission to read existing files from the object storage bucket that holds the schemas for your BYOC Schema Registry. In the case of AWS, this is the GetObject permission. In the case of GCP, this is the storage.objects.get permission.

Once the flag is set and the object storage permissions are provided, the agent will automatically fetch schemas from the object storage that holds the schemas for your BYOC Schema Registry when performing schema validation.

Connecting to External Kafka-Compatible Schema Registry

To allow the agent to connect to a Kafka-specific schema registry, set the -schemaRegistryURL flag to the URL of the schema registry. Alternatively, you can also set the WARPSTREAM_SCHEMA_REGISTRY_URL environment variable.
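Confluent-compatible registries expose schemas over REST at the path /schemas/ids/{id}, which is how a schema ID is resolved to a schema. The Agent handles this lookup internally; the sketch below only illustrates how the lookup URL is formed from the configured registry URL (the hostname is an example).

```python
def schema_fetch_url(registry_url: str, schema_id: int) -> str:
    """Build the standard Confluent-compatible REST path for fetching a schema by ID."""
    return f"{registry_url.rstrip('/')}/schemas/ids/{schema_id}"

print(schema_fetch_url("https://registry.example.com/", 42))
# https://registry.example.com/schemas/ids/42
```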

Authentication

Most schema registry implementations support some form of authentication. WarpStream supports connecting to external schema registries with mTLS, TLS, or basic authentication.

Basic Authentication

For basic authentication, supply the username and password as follows:

  • Set the -externalSchemaRegistryBasicAuthUsername flag to the schema registry username. Alternatively, set the WARPSTREAM_EXTERNAL_SCHEMA_REGISTRY_BASIC_AUTH_USERNAME environment variable.

  • Set the -externalSchemaRegistryBasicAuthPassword flag to the schema registry password. Alternatively, set the WARPSTREAM_EXTERNAL_SCHEMA_REGISTRY_BASIC_AUTH_PASSWORD environment variable.

TLS/MTLS

For mTLS, the agent needs both a certificate and a private key to enable the schema registry server to authenticate the agent.

You can use the -externalSchemaRegistryTlsClientCertFile and -externalSchemaRegistryTlsClientPrivateKeyFile flags to pass in the file paths to the Agent certificate and private key, respectively. Alternatively, you can use the WARPSTREAM_EXTERNAL_SCHEMA_REGISTRY_TLS_CLIENT_CERT_FILE and WARPSTREAM_EXTERNAL_SCHEMA_REGISTRY_TLS_CLIENT_PRIVATE_KEY_FILE environment variables.

For TLS and mTLS, you can optionally add a file path to the root certificate authority certificate file which the Agent will use to verify the schema registry server's certificate. Use the -externalSchemaRegistryTlsServerCACertFile flag, or the WARPSTREAM_EXTERNAL_SCHEMA_REGISTRY_TLS_SERVER_CA_CERT_FILE environment variable.

Connecting to AWS Glue Schema Registry

The Agent must be deployed in AWS to connect to an AWS Glue schema registry. You'll also need to make sure the Agent containers have the appropriate permissions to read from the schema registry.

Below is an example Terraform configuration for an AWS IAM policy document that provides WarpStream with the appropriate permissions to access an AWS Glue schema registry.

data "aws_iam_policy_document" "warpstream_aws_glue_policy_document" {
  statement {
    sid = "AWSGlueSchemaRegistryReadonlyAccess"
    actions = [
      "glue:GetSchemaVersion"
    ]

    effect    = "Allow"
    resources = [ "*" ]
  }
}

Note that there is currently a Terraform bug that prevents providing the specific registry ARN in the IAM policy for AWS Glue. You have to use "*" instead, as shown in the example.