Upsolver

This page describes how to integrate WarpStream with Upsolver: ingest data from WarpStream into Upsolver, then process it and write it to one of the available targets in Upsolver.

A video walkthrough can be found here.

Prerequisites

  1. WarpStream account - get access to WarpStream by registering here.

  2. Upsolver account - get access to Upsolver by registering here.

  3. Serverless WarpStream cluster up and running.

Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and then clicking the Connect tab. If you don't have SASL credentials yet, you can also create a set of credentials from the console.

Store these values for easy reference; they will be needed in Upsolver. If you are going to produce records to your topic from the command line, then export them as environment variables in a terminal window:

export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;

Then, create a topic in the WarpStream console if you don't already have one.

Step 2: Produce some records

You can use the WarpStream CLI to produce messages to your topic:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic upsolver_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_2", "page_id": "home"}'

Note that the WarpStream CLI uses double commas (,,) as a delimiter between JSON records.
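Because of this double-comma convention, you can assemble the `--records` argument programmatically instead of typing it by hand. A minimal sketch in Python (the event contents mirror the example above and are purely illustrative):

```python
import json

# Sample clickstream events, matching the records produced above.
events = [
    {"action": "click", "user_id": "user_0", "page_id": "home"},
    {"action": "hover", "user_id": "user_0", "page_id": "home"},
    {"action": "scroll", "user_id": "user_0", "page_id": "home"},
]

# The WarpStream CLI expects JSON records joined by double commas (,,).
records_arg = ",,".join(json.dumps(e) for e in events)
print(records_arg)
```

The printed string can then be passed directly as the value of `--records` in the `warpstream kcmd` command shown above.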

Step 3: Connect Upsolver to WarpStream

In the Upsolver Web Console, under Jobs, click "+ New Job". Select "Kafka" as the source and an applicable target.

A standard connection requires you to configure a Kafka host and Consumer Properties, which take the form:

bootstrap.servers = HOST:PORT
security.protocol = SASL_SSL
sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
ssl.endpoint.identification.algorithm = https
sasl.mechanism = PLAIN

Fill these in with the credentials from Step 1, and make sure to include port 9092 in the bootstrap server address.
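Filled in, the Consumer Properties would look like the following sketch. The broker address here is a placeholder; substitute your own Bootstrap Broker and SASL credentials from Step 1:

```properties
bootstrap.servers = YOUR_BOOTSTRAP_BROKER:9092
security.protocol = SASL_SSL
sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="YOUR_SASL_USERNAME" password="YOUR_SASL_PASSWORD";
ssl.endpoint.identification.algorithm = https
sasl.mechanism = PLAIN
```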

Name your connection and then click "Test Connection". Once the connection succeeds, you'll be able to select from a list of available topics and preview the incoming data.

Step 4: Configure your target in Upsolver

From this point, all remaining setup happens in Upsolver, where you configure your target and any transformations.

Next Steps

Congratulations! You've set up a stream processing pipeline between WarpStream and Upsolver.

Next, check out the WarpStream docs for configuring the WarpStream Agent, or review the Upsolver docs to learn more about what is possible with WarpStream and Upsolver!


Apache, Apache Kafka, Kafka, and associated open source project names are trademarks of the Apache Software Foundation. Kinesis is a trademark of Amazon Web Services.