diff --git a/aggregating-count/ksql/README.md b/aggregating-count/ksql/README.md index 1779102d..4459ce13 100644 --- a/aggregating-count/ksql/README.md +++ b/aggregating-count/ksql/README.md @@ -165,9 +165,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/aggregating-minmax/ksql/README.md b/aggregating-minmax/ksql/README.md index 500ad6e9..0c4417bf 100644 --- a/aggregating-minmax/ksql/README.md +++ b/aggregating-minmax/ksql/README.md @@ -166,9 +166,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. 
diff --git a/aggregating-sum/ksql/README.md b/aggregating-sum/ksql/README.md index ea0d64e9..a516187b 100644 --- a/aggregating-sum/ksql/README.md +++ b/aggregating-sum/ksql/README.md @@ -165,9 +165,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/anomaly-detection/ksql/README.md b/anomaly-detection/ksql/README.md index a1cce2d8..7edb2544 100644 --- a/anomaly-detection/ksql/README.md +++ b/anomaly-detection/ksql/README.md @@ -232,9 +232,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. 
diff --git a/change-topic-partitions/ksql/README.md b/change-topic-partitions/ksql/README.md index 6ba78c25..acbb04d0 100644 --- a/change-topic-partitions/ksql/README.md +++ b/change-topic-partitions/ksql/README.md @@ -156,9 +156,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. @@ -194,7 +194,7 @@ You can run the example backing this tutorial in one of two ways: locally with t ``` Observe the expected number of partitions for the `topic` and `topic2` topics when you navigate - to `Topics` in the lefthand navigation of the Confluent Cloud Console. + to `Topics` in the left-hand navigation of the Confluent Cloud Console. ### Clean up diff --git a/column-difference/ksql/README.md b/column-difference/ksql/README.md index bdf85c6d..0dadd6ef 100644 --- a/column-difference/ksql/README.md +++ b/column-difference/ksql/README.md @@ -177,9 +177,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. 
+ select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/concatenation/ksql/README.md b/concatenation/ksql/README.md index d3e41c58..7a1bd95f 100644 --- a/concatenation/ksql/README.md +++ b/concatenation/ksql/README.md @@ -186,9 +186,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/confluent-cloud-connector-aws-privatelink/kafka/README.md b/confluent-cloud-connector-aws-privatelink/kafka/README.md index e43b730a..c881c80f 100644 --- a/confluent-cloud-connector-aws-privatelink/kafka/README.md +++ b/confluent-cloud-connector-aws-privatelink/kafka/README.md @@ -57,7 +57,7 @@ confluent api-key create --resource ```noformat confluent connect cluster create --config-file /tmp/datagen-connector.json ``` -* After a minute or so, validate in the Confluent Cloud Console that the connector is running. In the lefhand navigation, select `Environments`, click into the environment, then click the PrivateLink cluster. In the lefthand navigation, select `Connectors` and verify that the connector state is `Running` and generating messages: +* After a minute or so, validate in the Confluent Cloud Console that the connector is running. 
In the left-hand navigation, select `Environments`, click into the environment, then click the PrivateLink cluster. In the left-hand navigation, select `Connectors` and verify that the connector state is `Running` and generating messages: ![Datagen](https://raw.githubusercontent.com/confluentinc/tutorials/master/confluent-cloud-connector-aws-privatelink/kafka/img/cc-datagen.png) diff --git a/convert-timestamp-timezone/ksql/README.md b/convert-timestamp-timezone/ksql/README.md index 71e7554f..357f1296 100644 --- a/convert-timestamp-timezone/ksql/README.md +++ b/convert-timestamp-timezone/ksql/README.md @@ -161,9 +161,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/count-messages/ksql/README.md b/count-messages/ksql/README.md index 18d54da2..bcd4ba20 100644 --- a/count-messages/ksql/README.md +++ b/count-messages/ksql/README.md @@ -160,9 +160,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. 
+ select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/deduplication-windowed/ksql/README.md b/deduplication-windowed/ksql/README.md index f50a636b..8978786b 100644 --- a/deduplication-windowed/ksql/README.md +++ b/deduplication-windowed/ksql/README.md @@ -219,9 +219,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/deserialization-errors/ksql/README.md b/deserialization-errors/ksql/README.md index e5093c7b..72c7af61 100644 --- a/deserialization-errors/ksql/README.md +++ b/deserialization-errors/ksql/README.md @@ -69,9 +69,9 @@ confluent ksql cluster create ksqldb-tutorial \ ### Run the commands -Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, +Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then select -`Topics` in the lefthand navigation. Create a topic called `sensors-raw` with 1 partition, and in the `Messages` tab, +`Topics` in the left-hand navigation. 
Create a topic called `sensors-raw` with 1 partition, and in the `Messages` tab, produce the following two events, one at a time. ```noformat @@ -82,7 +82,7 @@ produce the following two events, one at a time. {"id": "1a076a64-4a84-40cb-a2e8-2190f3b37465", "timestamp": "2020-01-15 02:30:30", "enabled": "true"} ``` -Next, select `ksqlDB` in the lefthand navigation. +Next, select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/filtering/flinksql/README.md b/filtering/flinksql/README.md index 3884f612..a8608520 100644 --- a/filtering/flinksql/README.md +++ b/filtering/flinksql/README.md @@ -3,24 +3,50 @@ # How to filter messages in a Kafka topic with Flink SQL -Consider a topic with events that represent book publications. In this tutorial, we'll use Flink SQL to find only the -publications written by a particular author. +### Overview -## Setup +In this tutorial, we'll use Flink SQL to filter messages in a Kafka topic. -Let's assume the following DDL for our base `publication_events` table: +### Prerequisites + +* A [Confluent Cloud](https://confluent.cloud/signup) account +* A Flink compute pool created in Confluent Cloud. Follow [this](https://docs.confluent.io/cloud/current/flink/get-started/quick-start-cloud-console.html) quick start to create one. + +### Tutorial steps + +The following steps use Confluent Cloud. See the section at the bottom to run it locally with Docker. + +In the [Confluent Cloud Console](https://confluent.cloud/), navigate to your environment and select `Flink` in the left-hand navigation. Then click the `Open SQL Workspace` button for the compute pool that you have created. + +Select the default catalog (Confluent Cloud environment) and database (Kafka cluster) to use with the dropdowns at the top right. 
+ +Run the following SQL statement to create a table named `publication_events` that represents book publications: ```sql CREATE TABLE publication_events ( book_id INT, author STRING, - title STRING + title STRING ); ``` -## Filter events +Populate the table with test data: -Given the `publication_events` table definition above, we can filter to the publications by a particular author using a `WHERE` clause: +```sql +INSERT INTO publication_events VALUES + (0, 'C.S. Lewis', 'The Silver Chair'), + (1, 'George R. R. Martin', 'A Song of Ice and Fire'), + (2, 'C.S. Lewis', 'Perelandra'), + (3, 'George R. R. Martin', 'Fire & Blood'), + (4, 'J. R. R. Tolkien', 'The Hobbit'), + (5, 'J. R. R. Tolkien', 'The Lord of the Rings'), + (6, 'George R. R. Martin', 'A Dream of Spring'), + (7, 'J. R. R. Tolkien', 'The Fellowship of the Ring'), + (8, 'George R. R. Martin', 'The Ice Dragon'), + (9, 'Mario Puzo', 'The Godfather'); +``` + +Use the [`WHERE` clause](https://docs.confluent.io/cloud/current/flink/reference/queries/select.html#where-clause) to filter the rows down to the publications written by George R. R. Martin: ```sql SELECT * @@ -28,40 +54,13 @@ FROM publication_events WHERE author = 'George R. R. Martin'; ``` -## Running the example +The query output should look like this: -You can run the example backing this tutorial in one of three ways: a Flink Table API-based JUnit test, locally with the Flink SQL Client -against Flink and Kafka running in Docker, or with Confluent Cloud. +![Query output](https://raw.githubusercontent.com/confluentinc/tutorials/master/filtering/flinksql/img/query-output.png) -
- Flink Table API-based test - - ### Prerequisites - - * Java 17, e.g., follow the OpenJDK installation instructions [here](https://openjdk.org/install/) if you don't have Java. - * Docker running via [Docker Desktop](https://docs.docker.com/desktop/) or [Docker Engine](https://docs.docker.com/engine/install/) - - ### Run the test - - Clone the `confluentinc/tutorials` GitHub repository (if you haven't already) and navigate to the `tutorials` directory: - - ```shell - git clone git@github.com:confluentinc/tutorials.git - cd tutorials - ``` - - Run the following command to execute [FlinkSqlFilteringTest#testFilter](https://github.com/confluentinc/tutorials/blob/master/filtering/flinksql/src/test/java/io/confluent/developer/FlinkSqlFilteringTest.java): - - ```plaintext - ./gradlew clean :filtering:flinksql:test - ``` - - The test starts Kafka and Schema Registry with [Testcontainers](https://testcontainers.com/), runs the Flink SQL commands - above against a local Flink `StreamExecutionEnvironment`, and ensures that the filter results are what we expect. -
- Flink SQL Client CLI + Docker instructions ### Prerequisites @@ -89,8 +88,7 @@ against Flink and Kafka running in Docker, or with Confluent Cloud. docker exec -it flink-sql-client sql-client.sh ``` - Finally, run following SQL statements to create the `publication_events` table backed by Kafka running in Docker, populate it with - test data, and run the filter query. + Run the following SQL statement to create a table named `publication_events` that represents book publications: ```sql CREATE TABLE publication_events ( @@ -110,6 +108,8 @@ against Flink and Kafka running in Docker, or with Confluent Cloud. ); ``` + Populate the table with test data: + ```sql INSERT INTO publication_events VALUES (0, 'C.S. Lewis', 'The Silver Chair'), @@ -124,6 +124,8 @@ against Flink and Kafka running in Docker, or with Confluent Cloud. (9, 'Mario Puzo', 'The Godfather'); ``` + Use the [`WHERE` clause](https://docs.confluent.io/cloud/current/flink/reference/queries/select.html#where-clause) to filter the rows down to the publications written by George R. R. Martin: + ```sql SELECT * FROM publication_events @@ -147,54 +149,3 @@ against Flink and Kafka running in Docker, or with Confluent Cloud. ```
- -
- Confluent Cloud - - ### Prerequisites - - * A [Confluent Cloud](https://confluent.cloud/signup) account - * A Flink compute pool created in Confluent Cloud. Follow [this](https://docs.confluent.io/cloud/current/flink/get-started/quick-start-cloud-console.html) quick start to create one. - - ### Run the commands - - In the Confluent Cloud Console, navigate to your environment and then click the `Open SQL Workspace` button for the compute - pool that you have created. - - Select the default catalog (Confluent Cloud environment) and database (Kafka cluster) to use with the dropdowns at the top right. - - Finally, run following SQL statements to create the `publication_events` table, populate it with test data, and run the filter query. - - ```sql - CREATE TABLE publication_events ( - book_id INT, - author STRING, - title STRING - ); - ``` - - ```sql - INSERT INTO publication_events VALUES - (0, 'C.S. Lewis', 'The Silver Chair'), - (1, 'George R. R. Martin', 'A Song of Ice and Fire'), - (2, 'C.S. Lewis', 'Perelandra'), - (3, 'George R. R. Martin', 'Fire & Blood'), - (4, 'J. R. R. Tolkien', 'The Hobbit'), - (5, 'J. R. R. Tolkien', 'The Lord of the Rings'), - (6, 'George R. R. Martin', 'A Dream of Spring'), - (7, 'J. R. R. Tolkien', 'The Fellowship of the Ring'), - (8, 'George R. R. Martin', 'The Ice Dragon'), - (9, 'Mario Puzo', 'The Godfather'); - ``` - - ```sql - SELECT * - FROM publication_events - WHERE author = 'George R. R. Martin'; - ``` - - The query output should look like this: - - ![Query output](https://raw.githubusercontent.com/confluentinc/tutorials/master/filtering/flinksql/img/query-output.png) - -
diff --git a/filtering/flinksql/build.gradle b/filtering/flinksql/build.gradle deleted file mode 100644 index b2b95b81..00000000 --- a/filtering/flinksql/build.gradle +++ /dev/null @@ -1,37 +0,0 @@ -buildscript { - repositories { - mavenCentral() - } -} - -plugins { - id 'java' - id 'idea' -} - -java { - sourceCompatibility = JavaVersion.VERSION_17 - targetCompatibility = JavaVersion.VERSION_17 -} -version = "0.0.1" - -repositories { - mavenCentral() -} - -dependencies { - testImplementation project(path: ':common', configuration: 'testArtifacts') - testImplementation 'com.google.guava:guava:31.1-jre' - testImplementation 'junit:junit:4.13.2' - testImplementation 'org.testcontainers:testcontainers:1.19.3' - testImplementation 'org.testcontainers:kafka:1.19.3' - testImplementation 'commons-codec:commons-codec:1.17.0' - testImplementation 'org.apache.flink:flink-sql-connector-kafka:3.2.0-1.19' - testImplementation 'org.apache.flink:flink-connector-base:1.19.1' - testImplementation 'org.apache.flink:flink-sql-avro-confluent-registry:1.19.1' - testImplementation 'org.apache.flink:flink-test-utils:1.19.1' - testImplementation 'org.apache.flink:flink-test-utils-junit:1.19.1' - testImplementation 'org.apache.flink:flink-table-api-java-bridge:1.19.1' - testImplementation 'org.apache.flink:flink-table-planner_2.12:1.19.1' - testImplementation 'org.apache.flink:flink-table-planner_2.12:1.19.1:tests' -} diff --git a/filtering/flinksql/img/query-output.png b/filtering/flinksql/img/query-output.png index 0b75e084..2cee980b 100644 Binary files a/filtering/flinksql/img/query-output.png and b/filtering/flinksql/img/query-output.png differ diff --git a/filtering/flinksql/settings.gradle b/filtering/flinksql/settings.gradle deleted file mode 100644 index 5b5dfa8c..00000000 --- a/filtering/flinksql/settings.gradle +++ /dev/null @@ -1,12 +0,0 @@ -/* - * This file was generated by the Gradle 'init' task. 
- * - * The settings file is used to specify which projects to include in your build. - * - * Detailed information about configuring a multi-project build in Gradle can be found - * in the user manual at https://docs.gradle.org/6.7.1/userguide/multi_project_builds.html - */ - -rootProject.name = 'filtering' -include ':common' -project(':common').projectDir = file('../../common') diff --git a/filtering/flinksql/src/test/java/io/confluent/developer/FlinkSqlFilteringTest.java b/filtering/flinksql/src/test/java/io/confluent/developer/FlinkSqlFilteringTest.java deleted file mode 100644 index e5723ff4..00000000 --- a/filtering/flinksql/src/test/java/io/confluent/developer/FlinkSqlFilteringTest.java +++ /dev/null @@ -1,42 +0,0 @@ -package io.confluent.developer; - - -import org.apache.flink.table.api.TableResult; -import org.apache.flink.types.Row; -import org.apache.flink.types.RowKind; -import org.junit.Test; - -import java.util.ArrayList; -import java.util.List; -import java.util.Optional; - -import static io.confluent.developer.TestUtils.rowObjectsFromTableResult; -import static org.junit.Assert.assertEquals; - -public class FlinkSqlFilteringTest extends AbstractFlinkKafkaTest { - - @Test - public void testFilter() throws Exception { - // create base publications table and populate with test data - streamTableEnv.executeSql(getResourceFileContents("create-all-publications.sql.template", - Optional.of(kafkaPort),Optional.of(schemaRegistryPort))).await(); - streamTableEnv.executeSql(getResourceFileContents("populate-publication-events.sql")).await(); - - // execute query on result table that should have publications for 1 author - TableResult tableResult = streamTableEnv.executeSql(getResourceFileContents("query-publications-by-author.sql")); - - // Compare actual and expected results - List actualResults = rowObjectsFromTableResult(tableResult); - List expectedRowResults = getExpectedFinalUpdateRowObjects(); - assertEquals(actualResults, expectedRowResults); - } - - 
private List getExpectedFinalUpdateRowObjects() { - List rowList = new ArrayList<>(); - rowList.add(Row.ofKind(RowKind.INSERT, 1, "A Song of Ice and Fire")); - rowList.add(Row.ofKind(RowKind.INSERT, 3, "Fire & Blood")); - rowList.add(Row.ofKind(RowKind.INSERT, 6, "A Dream of Spring")); - rowList.add(Row.ofKind(RowKind.INSERT, 8, "The Ice Dragon")); - return rowList; - } -} diff --git a/filtering/flinksql/src/test/resources/create-all-publications.sql.template b/filtering/flinksql/src/test/resources/create-all-publications.sql.template deleted file mode 100644 index 7cff5ef3..00000000 --- a/filtering/flinksql/src/test/resources/create-all-publications.sql.template +++ /dev/null @@ -1,16 +0,0 @@ -CREATE TABLE publication_events ( - book_id INT, - author STRING, - title STRING -) WITH ( - 'connector' = 'kafka', - 'topic' = 'publication_events', - 'properties.bootstrap.servers' = 'localhost:KAFKA_PORT', - 'scan.startup.mode' = 'earliest-offset', - 'scan.bounded.mode' = 'latest-offset', - 'key.format' = 'raw', - 'key.fields' = 'book_id', - 'value.format' = 'avro-confluent', - 'value.avro-confluent.url' = 'http://localhost:SCHEMA_REGISTRY_PORT', - 'value.fields-include' = 'EXCEPT_KEY' -); diff --git a/filtering/flinksql/src/test/resources/populate-publication-events.sql b/filtering/flinksql/src/test/resources/populate-publication-events.sql deleted file mode 100644 index 9060fda0..00000000 --- a/filtering/flinksql/src/test/resources/populate-publication-events.sql +++ /dev/null @@ -1,11 +0,0 @@ -INSERT INTO publication_events VALUES - (0, 'C.S. Lewis', 'The Silver Chair'), - (1, 'George R. R. Martin', 'A Song of Ice and Fire'), - (2, 'C.S. Lewis', 'Perelandra'), - (3, 'George R. R. Martin', 'Fire & Blood'), - (4, 'J. R. R. Tolkien', 'The Hobbit'), - (5, 'J. R. R. Tolkien', 'The Lord of the Rings'), - (6, 'George R. R. Martin', 'A Dream of Spring'), - (7, 'J. R. R. Tolkien', 'The Fellowship of the Ring'), - (8, 'George R. R. 
Martin', 'The Ice Dragon'), - (9, 'Mario Puzo', 'The Godfather'); diff --git a/filtering/flinksql/src/test/resources/query-publications-by-author.sql b/filtering/flinksql/src/test/resources/query-publications-by-author.sql deleted file mode 100644 index 4c504d46..00000000 --- a/filtering/flinksql/src/test/resources/query-publications-by-author.sql +++ /dev/null @@ -1,5 +0,0 @@ -SELECT - book_id, - title -FROM publication_events -WHERE author = 'George R. R. Martin'; diff --git a/filtering/ksql/README.md b/filtering/ksql/README.md index cfbed552..8e5617de 100644 --- a/filtering/ksql/README.md +++ b/filtering/ksql/README.md @@ -175,9 +175,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/flatten-nested-data/ksql/README.md b/flatten-nested-data/ksql/README.md index c4f4b7ec..2f5078b6 100644 --- a/flatten-nested-data/ksql/README.md +++ b/flatten-nested-data/ksql/README.md @@ -298,9 +298,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. 
Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/geo-distance/ksql/README.md b/geo-distance/ksql/README.md index 814c5822..9ff4346f 100644 --- a/geo-distance/ksql/README.md +++ b/geo-distance/ksql/README.md @@ -203,9 +203,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. Once its status is `Up`, click the cluster name and scroll down to the editor. diff --git a/hopping-windows/ksql/README.md b/hopping-windows/ksql/README.md index db64ed57..5d968195 100644 --- a/hopping-windows/ksql/README.md +++ b/hopping-windows/ksql/README.md @@ -193,9 +193,9 @@ You can run the example backing this tutorial in one of two ways: locally with t ### Run the commands - Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation, + Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation, and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then - select `ksqlDB` in the lefthand navigation. + select `ksqlDB` in the left-hand navigation. The cluster may take a few minutes to be provisioned. 
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/joining-stream-stream/ksql/README.md b/joining-stream-stream/ksql/README.md
index 1d519ecb..93535d17 100644
--- a/joining-stream-stream/ksql/README.md
+++ b/joining-stream-stream/ksql/README.md
@@ -215,9 +215,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/joining-stream-table/ksql/README.md b/joining-stream-table/ksql/README.md
index 2fdf3516..4c1984d2 100644
--- a/joining-stream-table/ksql/README.md
+++ b/joining-stream-table/ksql/README.md
@@ -198,9 +198,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/joining-table-table/ksql/README.md b/joining-table-table/ksql/README.md
index 65125614..f556572b 100644
--- a/joining-table-table/ksql/README.md
+++ b/joining-table-table/ksql/README.md
@@ -202,9 +202,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/ksql-heterogeneous-json/ksql/README.md b/ksql-heterogeneous-json/ksql/README.md
index 2a74e91a..4da062d6 100644
--- a/ksql-heterogeneous-json/ksql/README.md
+++ b/ksql-heterogeneous-json/ksql/README.md
@@ -205,9 +205,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then select
-   `Topics` in the lefthand navigation. Create a topic called `data_stream` with 1 partition, and in the `Messages` tab,
+   `Topics` in the left-hand navigation. Create a topic called `data_stream` with 1 partition, and in the `Messages` tab,
    produce the following four events as the `Value`, one at a time.
 
    ```noformat
@@ -302,7 +302,7 @@ You can run the example backing this tutorial in one of two ways: locally with t
    }
    ```
 
-   Next, select `ksqlDB` in the lefthand navigation.
+   Next, select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/masking-data/ksql/README.md b/masking-data/ksql/README.md
index 6329d260..13151b5c 100644
--- a/masking-data/ksql/README.md
+++ b/masking-data/ksql/README.md
@@ -172,9 +172,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/merging/ksql/README.md b/merging/ksql/README.md
index fe6773e4..a0ee0650 100644
--- a/merging/ksql/README.md
+++ b/merging/ksql/README.md
@@ -190,9 +190,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/multi-joins/ksql/README.md b/multi-joins/ksql/README.md
index b1c7da3c..1c5981b5 100644
--- a/multi-joins/ksql/README.md
+++ b/multi-joins/ksql/README.md
@@ -212,9 +212,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/multiple-event-types-avro/kafka/README.md b/multiple-event-types-avro/kafka/README.md
index aa55b1d5..c241447f 100644
--- a/multiple-event-types-avro/kafka/README.md
+++ b/multiple-event-types-avro/kafka/README.md
@@ -146,7 +146,7 @@ Using the Confluent Cloud Console, create a topic with default settings called `
 
 ### Generate client configuration
 
-In the Confluent Cloud Console, navigate to the Cluster Overview page. Select `Clients` in the lefthand navigation and create a new `Java` client. Generate API keys during this step, and download the generated client configuration. Place it at `multiple-event-types-avro/kafka/cloud.properties`.
+In the Confluent Cloud Console, navigate to the Cluster Overview page. Select `Clients` in the left-hand navigation and create a new `Java` client. Generate API keys during this step, and download the generated client configuration. Place it at `multiple-event-types-avro/kafka/cloud.properties`.
 
 ### Register schemas
 
@@ -156,7 +156,7 @@ Run the following task to register the schemas in Schema Registry:
 ./gradlew :multiple-event-types-avro:kafka:registerSchemasTask
 ```
 
-In the Confluent Cloud Console, navigate to `Topics` in the lefthand navigation, select the `avro-events` topic, and click `Schema`. Validate that a `Value` schema has been set.
+In the Confluent Cloud Console, navigate to `Topics` in the left-hand navigation, select the `avro-events` topic, and click `Schema`. Validate that a `Value` schema has been set.
 
 ### Build the application
diff --git a/multiple-event-types-protobuf/kafka/README.md b/multiple-event-types-protobuf/kafka/README.md
index bd6663c8..c7361823 100644
--- a/multiple-event-types-protobuf/kafka/README.md
+++ b/multiple-event-types-protobuf/kafka/README.md
@@ -131,7 +131,7 @@ Using the Confluent Cloud Console, create a topic with default settings called `
 
 ### Generate client configuration
 
-In the Confluent Cloud Console, navigate to the Cluster Overview page. Select `Clients` in the lefthand navigation and create a new `Java` client. Generate API keys during this step, and download the generated client configuration. Place it at `multiple-event-types-protobuf/kafka/cloud.properties`.
+In the Confluent Cloud Console, navigate to the Cluster Overview page. Select `Clients` in the left-hand navigation and create a new `Java` client. Generate API keys during this step, and download the generated client configuration. Place it at `multiple-event-types-protobuf/kafka/cloud.properties`.
 
 ### Register schemas
 
@@ -141,7 +141,7 @@ Run the following task to register the schemas in Schema Registry:
 ./gradlew :multiple-event-types-protobuf:kafka:registerSchemasTask
 ```
 
-In the Confluent Cloud Console, navigate to `Topics` in the lefthand navigation, select the `proto-events` topic, and click `Schema`. Validate that a `Value` schema has been set.
+In the Confluent Cloud Console, navigate to `Topics` in the left-hand navigation, select the `proto-events` topic, and click `Schema`. Validate that a `Value` schema has been set.
 
 ### Build the application
diff --git a/optimize-producer-throughput/kafka/README.md b/optimize-producer-throughput/kafka/README.md
index eabe9e0d..1b753134 100644
--- a/optimize-producer-throughput/kafka/README.md
+++ b/optimize-producer-throughput/kafka/README.md
@@ -96,9 +96,9 @@ Now run the same test but with producer configuration tuned for higher throughpu
 
 First, create a cluster if you haven't already. You can do this in the Confluent Cloud Console by navigating to your environment and then clicking `Add cluster`.
 
-Once you have a cluster running, navigate to `Topics` in the lefthand navigation and create a topic `topic-perf` with the default topic configuration.
+Once you have a cluster running, navigate to `Topics` in the left-hand navigation and create a topic `topic-perf` with the default topic configuration.
 
-Next, go to the Cluster Overview page and click `Clients` in the lefthand navigation. Click `Java` and generate a configuration file that includes API keys.
+Next, go to the Cluster Overview page and click `Clients` in the left-hand navigation. Click `Java` and generate a configuration file that includes API keys.
 pool that you have created. Copy the configuration file locally to `optimize-producer-throughput/kafka/cloud.properties`.
 
 Now, run a baseline performance test with Docker:
diff --git a/rekeying/ksql/README.md b/rekeying/ksql/README.md
index 5293c3e7..ddd898ba 100644
--- a/rekeying/ksql/README.md
+++ b/rekeying/ksql/README.md
@@ -183,9 +183,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/serialization/ksql/README.md b/serialization/ksql/README.md
index 43175853..6af0e965 100644
--- a/serialization/ksql/README.md
+++ b/serialization/ksql/README.md
@@ -145,9 +145,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/session-windows/ksql/README.md b/session-windows/ksql/README.md
index d52adc85..005d04c4 100644
--- a/session-windows/ksql/README.md
+++ b/session-windows/ksql/README.md
@@ -182,9 +182,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/splitting/ksql/README.md b/splitting/ksql/README.md
index dc698755..c926d937 100644
--- a/splitting/ksql/README.md
+++ b/splitting/ksql/README.md
@@ -183,9 +183,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/time-concepts/ksql/README.md b/time-concepts/ksql/README.md
index b358ebb5..63321ac5 100644
--- a/time-concepts/ksql/README.md
+++ b/time-concepts/ksql/README.md
@@ -168,9 +168,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/transforming/ksql/README.md b/transforming/ksql/README.md
index f6d782e5..4940faed 100644
--- a/transforming/ksql/README.md
+++ b/transforming/ksql/README.md
@@ -165,9 +165,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.
diff --git a/tumbling-windows/ksql/README.md b/tumbling-windows/ksql/README.md
index 49d3d674..de5a0c23 100644
--- a/tumbling-windows/ksql/README.md
+++ b/tumbling-windows/ksql/README.md
@@ -191,9 +191,9 @@ You can run the example backing this tutorial in one of two ways: locally with t
 
 ### Run the commands
 
-   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the lefthand navigation,
+   Login to the [Confluent Cloud Console](https://confluent.cloud/). Select `Environments` in the left-hand navigation,
    and then click the `ksqldb-tutorial` environment tile. Click the `ksqldb-tutorial` Kafka cluster tile, and then
-   select `ksqlDB` in the lefthand navigation.
+   select `ksqlDB` in the left-hand navigation.
 
    The cluster may take a few minutes to be provisioned.
    Once its status is `Up`, click the cluster name and scroll down to the editor.