# Materialize

A video walkthrough can be found below:

{% embed url="https://youtu.be/Jz6N3Hvfe5I" %}

## Prerequisites

1. WarpStream account - get access to WarpStream by registering [here](https://console.warpstream.com/signup).
2. Materialize account - get access to Materialize by registering [here](https://materialize.com/register/).
3. A WarpStream cluster that is up and running.

## Step 1: Create a topic in your WarpStream cluster

Obtain the Bootstrap Broker from the WarpStream console by navigating to your cluster and then clicking the Connect tab. If you don't have SASL credentials yet, you can also [create a set of credentials](https://docs.warpstream.com/warpstream/kafka/manage-security/sasl-authentication#creating-credentials) from the console.

Store these values as environment variables for easy reference:

```bash
export BOOTSTRAP_HOST=<YOUR_BOOTSTRAP_BROKER> \
SASL_USERNAME=<YOUR_SASL_USERNAME> \
SASL_PASSWORD=<YOUR_SASL_PASSWORD>;
```

Then, create a topic using the WarpStream CLI:

{% code overflow="wrap" %}

```bash
warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type create-topic -topic materialize_click_streams
```

{% endcode %}

You should see the following output in your Terminal:

`Created topic materialize_click_streams.`

## Step 2: Produce some records

Using the WarpStream CLI, produce several messages to your topic:

{% code overflow="wrap" %}

```bash
warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic materialize_click_streams --records '{"action": "click", "user_id": "user_0", "page_id": "home"},,{"action": "hover", "user_id": "user_0", "page_id": "home"},,{"action": "scroll", "user_id": "user_0", "page_id": "home"}'
```

{% endcode %}

Note that the WarpStream CLI uses double commas (`,,`) as the delimiter between JSON records.
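For illustration only (this is not part of the WarpStream CLI itself), splitting the `--records` string on the double-comma delimiter yields the individual JSON records:

```python
import json

# The value passed to --records in the command above,
# with ",," separating the three JSON records.
raw = ('{"action": "click", "user_id": "user_0", "page_id": "home"},,'
       '{"action": "hover", "user_id": "user_0", "page_id": "home"},,'
       '{"action": "scroll", "user_id": "user_0", "page_id": "home"}')

# Split on the double-comma delimiter and parse each chunk as JSON.
# Single commas inside each object are never doubled, so the split is safe.
records = [json.loads(chunk) for chunk in raw.split(",,")]
print(len(records))          # 3
print(records[0]["action"])  # click
```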

## Step 3: Connect Materialize with WarpStream

First, store your WarpStream credentials as `SECRET` objects in the Materialize Console:

```sql
CREATE SECRET warpstream_username AS 'ccun_XXXXXXXXXX';
CREATE SECRET warpstream_password AS 'ccp_XXXXXXXXXX';
```

Then, establish a `CONNECTION` object (note: replace `$BOOTSTRAP_HOST` with its value from Step 1):

```sql
CREATE CONNECTION warpstream_kafka TO KAFKA (
    BROKER '$BOOTSTRAP_HOST',
    SASL MECHANISMS = 'PLAIN',
    SASL USERNAME = SECRET warpstream_username,
    SASL PASSWORD = SECRET warpstream_password
);
```

Finally, create a `SOURCE` object to begin consuming messages from your WarpStream topic:

{% code overflow="wrap" %}

```sql
CREATE SOURCE warpstream_click_stream_source
  FROM KAFKA CONNECTION warpstream_kafka (TOPIC 'materialize_click_streams')
  FORMAT JSON;
```

{% endcode %}

## Step 4: Verify that Materialize is ingesting data from WarpStream

In the Materialize Console, run:

```sql
SELECT * FROM warpstream_click_stream_source;
```

You should see the results displaying the records that you produced in Step 2.

## Step 5: Create a materialized view

First, create the materialized view:

```sql
CREATE MATERIALIZED VIEW
    warpstream_click_stream_mv
AS
    SELECT (data::jsonb)['user_id'], COUNT(*)
    FROM warpstream_click_stream_source
    GROUP BY (data::jsonb)['user_id'];
```

Then, query the result:

```sql
SELECT * FROM warpstream_click_stream_mv;
```

You should see a result showing `user_0` with a count of 3 records.
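Conceptually, the view maintains a running count of records per `user_id`. A minimal Python sketch of the same aggregation over the three records from Step 2 (for illustration only; Materialize maintains this result for you as new data arrives):

```python
from collections import Counter

# The three records produced in Step 2.
events = [
    {"action": "click", "user_id": "user_0", "page_id": "home"},
    {"action": "hover", "user_id": "user_0", "page_id": "home"},
    {"action": "scroll", "user_id": "user_0", "page_id": "home"},
]

# Equivalent of: SELECT user_id, COUNT(*) ... GROUP BY user_id
counts = Counter(e["user_id"] for e in events)
print(counts)  # Counter({'user_0': 3})
```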

## Step 6: Produce another record to your WarpStream topic

In your Terminal, produce another record to your topic:

{% code overflow="wrap" %}

```bash
warpstream kcmd -bootstrap-host $BOOTSTRAP_HOST -tls -username $SASL_USERNAME -password $SASL_PASSWORD -type produce -topic materialize_click_streams --records '{"action": "click", "user_id": "user_1", "page_id": "home"}'
```

{% endcode %}

In the Materialize Console, query the materialized view again:

```sql
SELECT * FROM warpstream_click_stream_mv;
```

You should now see two records: the record for `user_0` with a `count` of 3, and a new record for `user_1` with a `count` of 1.
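The key point is that the view updates incrementally as the new record arrives, rather than rescanning the whole topic. A rough sketch of that incremental effect, using the event data from Steps 2 and 6 (illustration only; Materialize performs this update automatically):

```python
from collections import Counter

# State of the aggregation after Step 2: three events for user_0.
counts = Counter({"user_0": 3})

# The new record from Step 6 arrives; only the affected group changes.
new_event = {"action": "click", "user_id": "user_1", "page_id": "home"}
counts[new_event["user_id"]] += 1

print(dict(counts))  # {'user_0': 3, 'user_1': 1}
```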

## Next Steps

After validating that your WarpStream cluster is connected to Materialize and that Materialize can consume and process data from WarpStream, you are ready to set up an actual data pipeline between WarpStream and Materialize.

For more information on how to set up and configure Materialize, head over to the [Materialize Docs](https://materialize.com/docs/) page.
