Configure Different Object Stores

The bucketURL flag specifies the URL of the object storage bucket that the WarpStream Agent should write to. See the sections below for how to configure it for different object store implementations.
Note that the WarpStream Agents automatically write all of their data under a top-level warpstream prefix in the bucket. In addition, each cluster writes its data to a cluster-specific prefix (derived from the cluster ID) within the warpstream prefix, so multiple WarpStream clusters can co-exist in the same object storage bucket without issue.
Figure: An S3 bucket with 16 different cluster prefixes under the top-level warpstream prefix.
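Assuming an AWS S3 bucket, you can inspect this layout with the AWS CLI (the bucket name below is the illustrative one from this document):

```shell
# List the cluster-specific prefixes that live under the
# top-level warpstream prefix. Requires AWS credentials with
# read access to the bucket.
aws s3 ls s3://my_warpstream_bucket_123/warpstream/
```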
AWS S3
Format: s3://$BUCKET_NAME?region=$BUCKET_REGION
Example: s3://my_warpstream_bucket_123?region=us-east-1
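For example, the Agent can be pointed at an S3 bucket like so. This is a sketch: only the bucketURL flag is taken from this document; the warpstream agent invocation itself is an assumption about how the binary is launched in your deployment.

```shell
# Sketch: point the Agent at an S3 bucket in us-east-1.
# Quote the URL so the shell does not interpret any ? or & characters.
warpstream agent \
  -bucketURL 's3://my_warpstream_bucket_123?region=us-east-1'
```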
If you want to use an AssumeRole provider to authenticate, you can add the following query parameters to the bucket URL:
  • assumeRoleARN: IAM Role ARN to be assumed
  • assumeRoleDurationMinutes: optional integer configuring the expiry duration of the STS credentials
Example: s3://my_warpstream_bucket_123?region=us-east-1&assumeRoleARN=arn%3Aaws%3Aiam%3A%3AXYZ%3Arole%2FViewLogsPlease&assumeRoleDurationMinutes=10
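Note that the role ARN must be URL-encoded before it is embedded as a query parameter (the : and / characters become %3A and %2F, as in the example above). One quick way to encode it, using a hypothetical placeholder ARN:

```shell
# URL-encode a (hypothetical) role ARN for use in the bucketURL query string.
ROLE_ARN='arn:aws:iam::123456789012:role/WarpStreamAgentRole'
ENCODED=$(python3 -c 'import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$ROLE_ARN")

# Compose the final bucket URL with the encoded ARN.
echo "s3://my_warpstream_bucket_123?region=us-east-1&assumeRoleARN=${ENCODED}&assumeRoleDurationMinutes=15"
```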
Google Cloud Storage (GCS)
Format: gs://$BUCKET_NAME
Example: gs://my_warpstream_bucket_123
The WarpStream Agent embeds the official GCP Golang SDK so authentication/authorization with the storage bucket can be handled in any of the expected ways.
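For example, with service-account credentials the standard GOOGLE_APPLICATION_CREDENTIALS environment variable works as it would for any GCP SDK client. The credentials path is a placeholder, and the warpstream agent invocation is an assumption about how the binary is launched:

```shell
# Standard GCP SDK credential discovery; the path is a placeholder.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
warpstream agent -bucketURL 'gs://my_warpstream_bucket_123'
```

On GCP compute (GCE, GKE), the attached service account is picked up automatically and no environment variable is needed.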
Azure Blob Storage
Format: azblob://$CONTAINER_NAME
Example: azblob://my_warpstream_container_123
The WarpStream Agent embeds the official Azure Golang SDK, which expects the AZURE_STORAGE_ACCOUNT environment variable to be set, along with one of the following two environment variables: AZURE_STORAGE_KEY or AZURE_STORAGE_SAS_TOKEN.
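Putting that together, a sketch of an Azure-backed launch. Both values are placeholders, and the warpstream agent invocation is an assumption about how the binary is launched:

```shell
# Placeholders: substitute your storage account name and access key.
# Use AZURE_STORAGE_SAS_TOKEN instead of AZURE_STORAGE_KEY if you
# authenticate with a SAS token.
export AZURE_STORAGE_ACCOUNT=mystorageaccount
export AZURE_STORAGE_KEY=base64keyhere
warpstream agent -bucketURL 'azblob://my_warpstream_container_123'
```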
In-Memory
Format: mem://$BUCKET_NAME
For testing and local development only. All data will be lost once the Agent shuts down.
Example: mem://my_memory_bucket
Local File System
For testing and local development only. The file store implementation is not robust.
Format: file://$PATH_TO_DIRECTORY
Example: file:///tmp/warpstream_tmp_123
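A local-development sketch using the file store. The directory name comes from the example above; the warpstream agent invocation is an assumption about how the binary is launched:

```shell
# Create a scratch directory and point the Agent at it.
# Note the three slashes: file:// scheme + absolute path /tmp/...
mkdir -p /tmp/warpstream_tmp_123
warpstream agent -bucketURL 'file:///tmp/warpstream_tmp_123'
```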
If you're using an "S3-compatible" object store that is not actually AWS S3, such as MinIO or Oracle Cloud Object Storage, then read this reference on how to configure the bucket URL.
Apache, Apache Kafka, Kafka, and associated open source project names are trademarks of the Apache Software Foundation. Kinesis is a trademark of Amazon Web Services.