Imply

This page describes how to connect Imply to WarpStream and how to ingest and query WarpStream data in Imply. Imply is powered by Apache Druid, a real-time analytics database.

Prerequisites

  1. WarpStream account - get access to WarpStream by registering here.

  2. Imply account - get access to Imply by registering here.

  3. A running WarpStream cluster.

Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and clicking the Connect tab. If you don't have SASL credentials, you can also create a set of credentials from the console.

Store these values as environment variables for easy reference:

export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;

Then, if you don't already have an available topic, create one using the WarpStream CLI or the UI, and proceed to Step 2:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type create-topic -topic imply_demo

You should see the following output in your terminal:

Created topic imply_demo.

Step 2: Produce some records

Using the WarpStream CLI, produce several messages to your topic:

warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic imply_demo --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_1", "page_id": "home"},,{"action": "click", "user_id": "user_2", "page_id": "home"}'

Note that the WarpStream CLI uses double commas (,,) as a delimiter between JSON records.
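If you are scripting record production, you can build the double-comma-delimited payload programmatically. A minimal sketch in Python (the `build_records` helper name is our own; the events mirror the example above):

```python
import json

def build_records(events):
    """Join JSON-encoded events with the double-comma delimiter
    expected by `warpstream kcmd -type produce --records`."""
    return ",,".join(json.dumps(e) for e in events)

events = [
    {"action": "click", "user_id": "user_0", "page_id": "home"},
    {"action": "hover", "user_id": "user_0", "page_id": "home"},
]

payload = build_records(events)
print(payload)
```

You can then pass `"$payload"` as the `--records` argument of the produce command shown above.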

Step 3: Connect Imply to WarpStream

In the Imply dashboard, navigate to "Sources", click "Create source" in the upper right, and find and select "Kafka/MSK":

Fill in the connection information, using the bootstrap broker and SASL credentials from Step 1:

Step 4: Ingest data from WarpStream to Imply

With the connection created, select the source to insert data into a table. On the "Sources" page, click the three-dot (...) menu for the source and select "Insert data", as seen below:

Name the table, in this case, "orders":

On the next screen, select your source; in this case, it is named "Warp_Orders". Then click "Next ->":

Imply will provide a preview of your data, column names, and types. It will try to infer the input format and layout from the data, but you can override the inference by selecting the "Input format" explicitly.
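For JSON records like the ones produced above, the inferred format corresponds to a Druid ingestion-spec fragment along these lines (a sketch only; the surrounding spec fields depend on your cluster and topic):

```json
"ioConfig": {
  "type": "kafka",
  "topic": "imply_demo",
  "inputFormat": { "type": "json" }
}
```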

The final step before ingestion lets you make any last-minute changes to how the topic is read and how the destination table is configured. Once you have finished with these settings, click "Start ingestion":

Step 5: Query and visualize your data in Imply

Once your data is loaded, you can use any of Imply's features to build data cubes, dashboards, and reports, or run SQL queries, as seen below. This example joins data imported from two topics, "customers" and "orders," which share a joining field, "customerId":
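The join described above can be expressed in Druid SQL roughly as follows (a sketch; it assumes the "customers" and "orders" tables with the shared "customerId" column described above):

```sql
-- Join orders to their customers on the shared customerId column
SELECT *
FROM "orders" AS o
JOIN "customers" AS c
  ON o."customerId" = c."customerId"
LIMIT 10;
```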

Next Steps

Congratulations! You've set up a streaming ingestion pipeline between WarpStream and Imply and performed a basic SQL query. This is just the beginning of what is possible.

Next, check out the WarpStream docs for configuring the WarpStream Agent, or review the Imply docs to learn more about what can be done with WarpStream and Imply!
