AWS Lambda Triggers

This page explains how to use WarpStream with AWS's Self Managed Apache Kafka Trigger to ingest data from a WarpStream cluster and process the records as events in Lambda.

First, you'll need a deployed WarpStream cluster. You can follow our instructions on how to deploy the WarpStream Agents in production, or use our Fly.io or Railway templates.

If the WarpStream Agents are deployed outside of your AWS account (on Fly.io or Railway, for example), you'll also want to familiarize yourself with our instructions on configuring authentication for the WarpStream Agents.

Once the cluster is up and running, navigate to the WarpStream console and click the "credentials" button for your virtual cluster.

Next, click the "Create Credentials" button to create a new set of credentials.

Pick a name for your credentials, then submit. The next screen will present you with your username and password. Save those values temporarily, because we're going to store them in AWS Secrets Manager next.

Go to AWS Secrets Manager in the AWS console and create a new secret. Make sure you store the key/value pairs as username and password.
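If you'd rather script this step, a minimal AWS CLI sketch might look like the following (the secret name and credential values are placeholders; substitute the credentials you just created):

# Hypothetical secret name and placeholder credentials; store the
# WarpStream credentials under "username" and "password" keys.
aws secretsmanager create-secret \
  --name warpstream/lambda-credentials \
  --secret-string '{"username":"ccun_XXXXXXXXX","password":"ccp_XXXXXXXXXX"}'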

Once the secret is created, go to AWS Lambda in the AWS console and create a new function.
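This step can also be scripted. As a hedged CLI sketch (the function name, role ARN, and zip file below are placeholders, and an execution role is assumed to already exist):

# Hypothetical names; assumes the handler code in index.mjs has been
# zipped as function.zip and the execution role already exists.
aws lambda create-function \
  --function-name warpstream-lambda-demo \
  --runtime nodejs20.x \
  --handler index.handler \
  --role arn:aws:iam::123456789012:role/warpstream-lambda-demo-role \
  --zip-file fileb://function.zip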

Edit the code of your Lambda function to print out the event.

export const handler = async (event) => {
  // Log the raw event so it shows up in the function's CloudWatch logs.
  console.log("my event", JSON.stringify(event));
  const response = {
    statusCode: 200,
    body: JSON.stringify(event),
  };
  return response;
};

Then click "Deploy".
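For reference, each invocation's event groups records by topic-partition, with keys and values base64-encoded (per AWS's self-managed Kafka event format). A minimal sketch of a handler that decodes the values, rather than just printing the raw event, might look like:

export const handler = async (event) => {
  // Records arrive keyed by "topic-partition", e.g. "test-0".
  for (const [topicPartition, records] of Object.entries(event.records ?? {})) {
    for (const record of records) {
      // Keys and values are base64-encoded by the trigger.
      const value = Buffer.from(record.value, "base64").toString("utf8");
      console.log(topicPartition, record.offset, value);
    }
  }
  return { statusCode: 200 };
};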

Next, navigate to the "Configuration" tab and then select "Permissions".

From there, click on the Lambda's role.

Click "Add permissions" then "Attach policies"

Search for SecretsManagerReadWrite and click "Add permissions".

This will give the Lambda the ability to read all secrets in the account, so you may want to scope the permission down to just the secret we created above, as sketched below.
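For example, a scoped inline policy via the AWS CLI (the role name, policy name, and secret ARN are placeholders; Secrets Manager appends a random suffix to secret ARNs, hence the trailing wildcard):

# Hypothetical role/policy names and secret ARN; grants read access to a
# single secret instead of the account-wide SecretsManagerReadWrite policy.
aws iam put-role-policy \
  --role-name warpstream-lambda-demo-role \
  --policy-name read-warpstream-secret \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:warpstream/lambda-credentials-*"
    }]
  }'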

The permission should be added successfully.

Once that's done, navigate back to the Lambda and click "Add trigger".

Then select the Kafka source and fill in the bootstrap host, topic name, and authentication configuration. Make sure you use BASIC_AUTH as the authentication mechanism, and select the secret we created in the previous steps.

Finally, click "Add".
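Under the hood, the trigger is a Lambda event source mapping. As a hedged CLI sketch of the same configuration (the topic and secret ARN are placeholders, and the default Kafka port 9092 is assumed):

# Hypothetical values; BASIC_AUTH points the trigger at the Secrets
# Manager secret created earlier.
aws lambda create-event-source-mapping \
  --function-name warpstream-lambda-demo \
  --topics test \
  --self-managed-event-source '{"Endpoints":{"KAFKA_BOOTSTRAP_SERVERS":["matano-test.fly.dev:9092"]}}' \
  --source-access-configurations '[{"Type":"BASIC_AUTH","URI":"arn:aws:secretsmanager:us-east-1:123456789012:secret:warpstream/lambda-credentials-AbCdEf"}]'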

Now that everything is set up, you can produce data to the topic. In this example, we'll use the WarpStream Agent binary (which includes a CLI tool) to produce data.

warpstream kcmd -bootstrap-host matano-test.fly.dev -tls -username ccun_XXXXXXXXX -password ccp_XXXXXXXXXX -type produce -topic test -records hello,world

After producing the data, you should be able to see it in the Lambda's CloudWatch logs.

You should also now be able to monitor the state of the AWS Lambda's consumer group in the WarpStream console.
