Configure the Agent bucketURL Flag for Different Object Stores
The bucketURL flag is the URL of the object storage bucket that the WarpStream Agent should write to. See the sections below for how to configure it for each supported object store implementation: AWS S3, GCP GCS, Azure Blob Storage, Memory, and File.

AWS S3
Format:
s3://$BUCKET_NAME?region=$BUCKET_REGION
Example:
s3://my_warpstream_bucket_123?region=us-east-1
The WarpStream Agent embeds the official AWS Golang SDK V2, so authentication/authorization with the specified S3 bucket can be handled in any of the expected ways, such as a shared credentials file, environment variables, or simply running the Agents in an environment with an IAM role that has Read/Write/List permissions on the S3 bucket.
If you want to use an AssumeRole provider to authenticate, you can add the following query parameters to the bucket URL:

assumeRoleARN: the IAM role ARN to be assumed
assumeRoleDurationMinutes: an optional integer configuring the expiry duration of the STS credentials
Example:
s3://my_warpstream_bucket_123?region=us-east-1&assumeRoleARN=arn%3Aaws%3Aiam%3A%3AXYZ%3Arole%2FViewLogsPlease&assumeRoleDurationMinutes=10
GCP GCS

Format:
gs://$BUCKET_NAME
Example:
gs://my_warpstream_bucket_123
The WarpStream Agent embeds the official GCP Golang SDK so authentication/authorization with the storage bucket can be handled in any of the expected ways.
Azure Blob Storage

Format:
azblob://$CONTAINER_NAME
Example:
azblob://my_warpstream_container_123
The WarpStream Agent embeds the official Azure Golang SDK, which expects the AZURE_STORAGE_ACCOUNT environment variable to be set, along with one of the following two environment variables: AZURE_STORAGE_KEY or AZURE_STORAGE_SAS_TOKEN.

Memory

For testing and local development only. All data will be lost once the Agent shuts down.
Example:
mem://my_memory_bucket
File

For testing and local development only. The file store implementation is not robust.
Format:
file://$PATH_TO_DIRECTORY
Example:
file:///tmp/warpstream_tmp_123
If you're using an "S3 compatible" object store that is not actually S3, like MinIO or Oracle Cloud Object Store, then read this reference on how to configure the bucket URL.