
GH-2423: Upgrade to Kafka 3.3.1, Streams Proc. API #2427


Merged: 2 commits, Oct 3, 2022
2 changes: 1 addition & 1 deletion build.gradle
@@ -64,7 +64,7 @@ ext {
jaywayJsonPathVersion = '2.6.0'
junit4Version = '4.13.2'
junitJupiterVersion = '5.9.0'
-kafkaVersion = '3.2.3'
+kafkaVersion = '3.3.1'
log4jVersion = '2.18.0'
micrometerVersion = '1.10.0-M6'
micrometerTracingVersion = '1.0.0-M8'
2 changes: 1 addition & 1 deletion spring-kafka-docs/src/main/asciidoc/appendix.adoc
@@ -9,7 +9,7 @@ If you wish to use a different version of `kafka-clients` or `kafka-streams`, an
.Maven
----
<properties>
-    <kafka.version>3.2.3</kafka.version>
+    <kafka.version>3.3.1</kafka.version>
</properties>

<dependency>
2 changes: 1 addition & 1 deletion spring-kafka-docs/src/main/asciidoc/quick-tour.adoc
@@ -51,7 +51,7 @@ However, the quickest way to get started is to use https://start.spring.io[start

This quick tour works with the following versions:

-* Apache Kafka Clients 3.2.x
+* Apache Kafka Clients 3.3.x
* Spring Framework 6.0.x
* Minimum Java version: 17

24 changes: 13 additions & 11 deletions spring-kafka-docs/src/main/asciidoc/streams.adoc
@@ -239,12 +239,13 @@ Starting with version 2.7, the default is to never clean up local state.
[[streams-header-enricher]]
==== Header Enricher

-Version 2.3 added the `HeaderEnricher` implementation of `Transformer`.
+Version 3.0 added the `HeaderEnricherProcessor` extension of `ContextualProcessor`, providing the same functionality as the deprecated `HeaderEnricher`, which implemented the deprecated `Transformer` interface.
This can be used to add headers within the stream processing; the header values are SpEL expressions; the root object of the expression evaluation has 3 properties:

-* `context` - the `ProcessorContext`, allowing access to the current record metadata
+* `record` - the `org.apache.kafka.streams.processor.api.Record` (`key`, `value`, `timestamp`, `headers`)
* `key` - the key of the current record
* `value` - the value of the current record
+* `context` - the `ProcessorContext`, allowing access to the current record metadata

The expressions must return a `byte[]` or a `String` (which will be converted to `byte[]` using `UTF-8`).

@@ -253,18 +254,18 @@ To use the enricher within a stream:
====
[source, java]
----
-.transform(() -> enricher)
+.process(() -> new HeaderEnricherProcessor(expressions))
----
====

-The transformer does not change the `key` or `value`; it simply adds headers.
+The processor does not change the `key` or `value`; it simply adds headers.

-IMPORTANT: If your stream is multi-threaded, you need a new instance for each record.
+IMPORTANT: You need a new instance for each record.

====
[source, java]
----
-.transform(() -> new HeaderEnricher<..., ...>(expressionMap))
+.process(() -> new HeaderEnricherProcessor<..., ...>(expressionMap))
----
====

@@ -276,19 +277,20 @@ Here is a simple example, adding one literal header and one variable:
Map<String, Expression> headers = new HashMap<>();
headers.put("header1", new LiteralExpression("value1"));
SpelExpressionParser parser = new SpelExpressionParser();
headers.put("header2", parser.parseExpression("context.timestamp() + ' @' + context.offset()"));
HeaderEnricher<String, String> enricher = new HeaderEnricher<>(headers);
headers.put("header2", parser.parseExpression("record.timestamp() + ' @' + record.offset()"));
ProcessorSupplier supplier = () -> new HeaderEnricher<String, String> enricher = new HeaderEnricher<>(headers);
KStream<String, String> stream = builder.stream(INPUT);
stream
-.transform(() -> enricher)
+.process(supplier)
.to(OUTPUT);
----
====
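For reference, a hedged sketch (not part of this change) of verifying the enriched headers with `kafka-streams-test-utils`; the topic names, `String` serdes, and the single `header1` expression are assumptions for illustration only:

====
[source, java]
----
StreamsBuilder builder = new StreamsBuilder();
Map<String, Expression> headerMap = new HashMap<>();
headerMap.put("header1", new LiteralExpression("value1"));
ProcessorSupplier<String, String, String, String> enricher =
        () -> new HeaderEnricherProcessor<>(headerMap);

builder.stream("input", Consumed.with(Serdes.String(), Serdes.String()))
        .process(enricher)
        .to("output", Produced.with(Serdes.String(), Serdes.String()));

try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), new Properties())) {
    TestInputTopic<String, String> input = driver.createInputTopic("input",
            new StringSerializer(), new StringSerializer());
    TestOutputTopic<String, String> output = driver.createOutputTopic("output",
            new StringDeserializer(), new StringDeserializer());
    input.pipeInput("key", "value");
    Header header = output.readRecord().headers().lastHeader("header1");
    // header.value() holds the UTF-8 bytes of "value1"
}
----
====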

[[streams-messaging]]
-==== `MessagingTransformer`
+==== `MessagingProcessor`

-Version 2.3 added the `MessagingTransformer` this allows a Kafka Streams topology to interact with a Spring Messaging component, such as a Spring Integration flow.
+Version 3.0 added the `MessagingProcessor` extension of `ContextualProcessor`, providing the same functionality as the deprecated `MessagingTransformer`, which implemented the deprecated `Transformer` interface.
+This allows a Kafka Streams topology to interact with a Spring Messaging component, such as a Spring Integration flow.
The transformer requires an implementation of `MessagingFunction`.
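For example, a hedged wiring sketch (not part of this change), assuming a `KStream<String, String>` named `stream` and an output topic `OUTPUT`; the function upper-cases the payload and sets the outbound record key through the `KafkaHeaders.KEY` header, which the processor forwards as the key:

====
[source, java]
----
MessagingMessageConverter converter = new MessagingMessageConverter();
MessagingFunction function = message -> MessageBuilder
        .withPayload(message.getPayload().toString().toUpperCase())
        .copyHeaders(message.getHeaders())
        .setHeader(KafkaHeaders.KEY, "demo-key") // forwarded as the output record key
        .build();

stream
        .process(() -> new MessagingProcessor<String, String, String, String>(function, converter))
        .to(OUTPUT);
----
====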

====
2 changes: 1 addition & 1 deletion spring-kafka-docs/src/main/asciidoc/whats-new.adoc
@@ -6,7 +6,7 @@ For changes in earlier version, see <<history>>.
[[x30-kafka-client]]
==== Kafka Client Version

-This version requires the 3.2.0 `kafka-clients`.
+This version requires the 3.3.1 `kafka-clients`.

[[x30-eos]]
==== Exactly Once Semantics
KafkaRuntimeHints.java
@@ -27,8 +27,6 @@
import org.apache.kafka.clients.consumer.StickyAssignor;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.RoundRobinPartitioner;
-import org.apache.kafka.clients.producer.UniformStickyPartitioner;
-import org.apache.kafka.clients.producer.internals.DefaultPartitioner;
import org.apache.kafka.common.message.CreateTopicsRequestData.CreatableTopic;
import org.apache.kafka.common.protocol.Message;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
@@ -97,6 +95,7 @@
*/
public class KafkaRuntimeHints implements RuntimeHintsRegistrar {

@SuppressWarnings("deprecation")
@Override
public void registerHints(RuntimeHints hints, @Nullable ClassLoader classLoader) {
ReflectionHints reflectionHints = hints.reflection();
Expand Down Expand Up @@ -147,9 +146,9 @@ public void registerHints(RuntimeHints hints, @Nullable ClassLoader classLoader)
RoundRobinAssignor.class,
StickyAssignor.class,
// standard partitioners
-DefaultPartitioner.class,
+org.apache.kafka.clients.producer.internals.DefaultPartitioner.class,
RoundRobinPartitioner.class,
-UniformStickyPartitioner.class,
+org.apache.kafka.clients.producer.UniformStickyPartitioner.class,
// standard serialization
ByteArrayDeserializer.class,
ByteArraySerializer.class,
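Side note, not what this change does: a hedged alternative is to register the deprecated partitioners by name with `TypeReference`, avoiding any compile-time reference to the deprecated types (and the `@SuppressWarnings`); the `INVOKE_DECLARED_CONSTRUCTORS` member category below is an assumption and may differ from what the hints actually need:

[source, java]
----
hints.reflection().registerType(
        TypeReference.of("org.apache.kafka.clients.producer.internals.DefaultPartitioner"),
        MemberCategory.INVOKE_DECLARED_CONSTRUCTORS);
hints.reflection().registerType(
        TypeReference.of("org.apache.kafka.clients.producer.UniformStickyPartitioner"),
        MemberCategory.INVOKE_DECLARED_CONSTRUCTORS);
----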
HeaderEnricher.java
@@ -36,8 +36,10 @@
*
* @author Gary Russell
* @since 2.3
+* @deprecated in favor of {@link HeaderEnricherProcessor}.
*
*/
+@Deprecated
public class HeaderEnricher<K, V> implements Transformer<K, V, KeyValue<K, V>> {

private final Map<String, Expression> headerExpressions = new HashMap<>();
HeaderEnricherProcessor.java (new file)
@@ -0,0 +1,117 @@
/*
* Copyright 2019-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.kafka.streams;

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.apache.kafka.streams.processor.api.ContextualProcessor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

import org.springframework.expression.Expression;

/**
* Manipulate the headers.
*
* @param <K> the input key type.
* @param <V> the input value type.
*
* @author Gary Russell
* @since 3.0
*
*/
public class HeaderEnricherProcessor<K, V> extends ContextualProcessor<K, V, K, V> {

private final Map<String, Expression> headerExpressions = new HashMap<>();

/**
* Construct an instance with the provided header expressions.
* @param headerExpressions the header expressions; name:expression.
*/
public HeaderEnricherProcessor(Map<String, Expression> headerExpressions) {
this.headerExpressions.putAll(headerExpressions);
}

@Override
public void process(Record<K, V> record) {
Headers headers = record.headers();
Container<K, V> container = new Container<>(context(), record.key(), record.value(), record);
this.headerExpressions.forEach((name, expression) -> {
Object headerValue = expression.getValue(container);
if (headerValue instanceof String) {
headerValue = ((String) headerValue).getBytes(StandardCharsets.UTF_8);
}
else if (!(headerValue instanceof byte[])) {
throw new IllegalStateException("Invalid header value type: " + headerValue.getClass());
}
headers.add(new RecordHeader(name, (byte[]) headerValue));
});
context().forward(record);
}

@Override
public void close() {
// NO-OP
}

/**
* Container object for SpEL evaluation.
*
* @param <K> the key type.
* @param <V> the value type.
*
*/
public static final class Container<K, V> {

private final ProcessorContext<K, V> context;

private final K key;

private final V value;

private final Record record;

Container(ProcessorContext<K, V> context, K key, V value, Record record) {
this.context = context;
this.key = key;
this.value = value;
this.record = record;
}

public ProcessorContext getContext() {
return this.context;
}

public K getKey() {
return this.key;
}

public V getValue() {
return this.value;
}

public Record getRecord() {
return this.record;
}

}

}
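Tying the `Container` above back to the documentation: a hedged illustration (not part of this change), assuming `String` keys and values, of expressions that use each SpEL root property; every expression must evaluate to a `String` or `byte[]`:

[source, java]
----
SpelExpressionParser parser = new SpelExpressionParser();
Map<String, Expression> expressions = new HashMap<>();
// a literal header value
expressions.put("source", new LiteralExpression("enricher-demo"));
// 'record' - the Streams Record (key, value, timestamp, headers)
expressions.put("ts", parser.parseExpression("record.timestamp().toString()"));
// 'context' - the ProcessorContext; topic/partition/offset come from its record metadata
expressions.put("topic", parser.parseExpression("context.recordMetadata().get().topic()"));
// 'key' / 'value' - the current key and value (assumed to be Strings here)
expressions.put("original-key", parser.parseExpression("key"));
----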
MessagingProcessor.java (new file)
@@ -0,0 +1,103 @@
/*
* Copyright 2019-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.kafka.streams.messaging;

import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.record.TimestampType;
import org.apache.kafka.streams.processor.api.ContextualProcessor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.processor.api.RecordMetadata;

import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.kafka.support.converter.MessagingMessageConverter;
import org.springframework.messaging.Message;
import org.springframework.util.Assert;

/**
* A {@link ContextualProcessor} implementation that invokes a {@link MessagingFunction}
* converting to/from spring-messaging {@link Message}. Can be used, for example,
* to invoke a Spring Integration flow.
*
* @param <Kin> the input key type.
* @param <Vin> the input value type.
* @param <Kout> the output key type.
* @param <Vout> the output value type.
*
* @author Gary Russell
* @since 3.0
*
*/
public class MessagingProcessor<Kin, Vin, Kout, Vout> extends ContextualProcessor<Kin, Vin, Kout, Vout> {

private final MessagingFunction function;

private final MessagingMessageConverter converter;

/**
* Construct an instance with the provided function and converter.
* @param function the function.
* @param converter the converter.
*/
public MessagingProcessor(MessagingFunction function, MessagingMessageConverter converter) {
Assert.notNull(function, "'function' cannot be null");
Assert.notNull(converter, "'converter' cannot be null");
this.function = function;
this.converter = converter;
}

@SuppressWarnings("unchecked")
@Override
public void process(Record<Kin, Vin> record) {
ProcessorContext<Kout, Vout> context = context();
RecordMetadata meta = context.recordMetadata().orElse(null);
Assert.state(meta != null, "No record metadata present");
Headers headers = record.headers();
ConsumerRecord<Object, Object> rebuilt = new ConsumerRecord<Object, Object>(meta.topic(),
meta.partition(), meta.offset(),
record.timestamp(), TimestampType.NO_TIMESTAMP_TYPE,
0, 0,
record.key(), record.value(),
headers, Optional.empty());
Message<?> message = this.converter.toMessage(rebuilt, null, null, null);
message = this.function.exchange(message);
List<String> headerList = new ArrayList<>();
headers.forEach(header -> headerList.add(header.key()));
headerList.forEach(name -> headers.remove(name));
ProducerRecord<?, ?> fromMessage = this.converter.fromMessage(message, "dummy");
fromMessage.headers().forEach(header -> {
if (!header.key().equals(KafkaHeaders.TOPIC)) {
headers.add(header);
}
});
context.forward(new Record<>((Kout) message.getHeaders().get(KafkaHeaders.KEY), (Vout) message.getPayload(),
record.timestamp(), headers));
}

@Override
public void close() {
// NO-OP
}

}
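One detail worth illustrating: the `MessagingMessageConverter` controls how Kafka headers map to and from `MessageHeaders`. A hedged customization sketch (not part of this change); `SimpleKafkaHeaderMapper` keeps header values as raw `byte[]`:

[source, java]
----
MessagingMessageConverter converter = new MessagingMessageConverter();
converter.setHeaderMapper(new SimpleKafkaHeaderMapper()); // map headers without JSON conversion
MessagingFunction passThrough = message -> message;       // pass-through, for illustration only
MessagingProcessor<String, String, String, String> processor =
        new MessagingProcessor<>(passThrough, converter);
----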
MessagingTransformer.java
@@ -44,8 +44,10 @@
*
* @author Gary Russell
* @since 2.3
+* @deprecated in favor of {@link MessagingProcessor}.
*
*/
+@Deprecated
public class MessagingTransformer<K, V, R> implements Transformer<K, V, KeyValue<K, R>> {

private final MessagingFunction function;