SpringBoot @KafkaListener got MessageConversionException: Cannot convert from A to B
Question

I am using Spring Boot 2.7.5 and ran into a problem where the @KafkaListener fails with a MessageConversionException. The full error log is:

```
Bean [example.package.api.kafka.HelloKafkaListener@30e312a2]; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [example.packageapi.ExampleEvent] to [example.packageapi.ExampleEvent] for GenericMessage [payload={"exampleField": "HELLO WORLD"}, headers={kafka_offset=37, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@3bb288b4, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=my-topic, kafka_receivedTimestamp=1678197991282, kafka_groupId=my-topic:HelloKafkaListener}], failedMessage=GenericMessage [payload={"exampleField": "HELLO WORLD"}, headers={kafka_offset=37, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@3bb288b4, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=my-topic, kafka_receivedTimestamp=1678197991282, kafka_groupId=my-topic:HelloKafkaListener}]; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [example.packageapi.ExampleEvent] to [example.packageapi.ExampleEvent] for GenericMessage [payload={"exampleField": "HELLO WORLD"}, headers={kafka_offset=37, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@3bb288b4, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=my-topic, kafka_receivedTimestamp=1678197991282, kafka_groupId=my-topic:HelloKafkaListener}], failedMessage=GenericMessage [payload={"exampleField": "HELLO WORLD"}, headers={kafka_offset=37, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@3bb288b4, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=my-topic, kafka_receivedTimestamp=1678197991282, kafka_groupId=my-topic:HelloKafkaListener}]
```

The consumer configuration:

```yaml
consumer:
  group-id: my-topic:HelloKafkaListener
  auto-offset-reset: latest
  key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
  value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
  properties:
    spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
    spring.deserializer.value.delegate.class: io.confluent.kafka.serializers.KafkaAvroDeserializer
    request.timeout.ms: 5000
```
The versions I am using:

```xml
<confluent.version>7.3.0</confluent.version>
<avro.version>1.11.1</avro.version>
<spring.kafka.version>2.8.10</spring.kafka.version>
```
My pom.xml:

```xml
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-schema-registry-client</artifactId>
    <version>${confluent.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
        </exclusion>
        <exclusion>
            <groupId>io.swagger.core.v3</groupId>
            <artifactId>swagger-annotations</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>${avro.version}</version>
</dependency>
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>${confluent.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-streams-avro-serde</artifactId>
    <version>${confluent.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

My event.avsc:

```json
{
    "namespace": "my.name.space",
    "type": "record",
    "name": "ExampleEvent",
    "doc": "A sample event",
    "fields": [
        {
            "name": "exampleField",
            "type": "string"
        }
    ]
}
```

I send the message from the terminal with the Confluent console producer:

```bash
./kafka-avro-console-producer \
  --broker-list localhost:59092 --topic my-topic \
  --property schema.registry.url=http://kafka-schema:80 \
  --property value.schema="$(cat ~/codebase/avro/event.avsc)"
{"exampleField":"HELLO WORLD"}
```

If I read the messages back with `./kafka-avro-console-consumer`, it works fine.
My listener:

```java
public class HelloKafkaListener {

    private final HelloKafkaService helloKafkaService;

    @Autowired
    public HelloKafkaListener(HelloKafkaService helloKafkaService) {
        this.helloKafkaService = helloKafkaService;
    }

    @KafkaListener(
            topics = "my-topic",
            groupId = "my-topic:HelloKafkaListener"
    )
    // public void process(ConsumerRecord record) {
    //     log.info(record.value().toString()); // it works: {"exampleField": "HELLO WORLD"}
    // }
    public void process(@Payload ExampleEvent event) {
        this.helloKafkaService.handleMessage(event);
        log.info("Processing event: " + event.getExampleField());
    }
}
```
If I use `ConsumerRecord record`, it works and `record.value().toString()` is `{"exampleField": "HELLO WORLD"}`, but when I switch to `ExampleEvent event` it throws the exception above. Does anyone know why this happens? Many thanks!

Thank you for the help! After reviewing the comments, I created a `DefaultKafkaConsumerFactory` and a `ConcurrentKafkaListenerContainerFactory` like this:

```java
@Bean
public DefaultKafkaConsumerFactory<String, ExampleEvent> helloConsumerFactory(
        @Value(value = "${spring.kafka.bootstrap-servers}") String bootstrapAddress,
        @Value(value = "${spring.kafka.properties.schema.registry.url}") String schemaRegistryUrl
) {
    var props = Map.<String, Object>of(
            ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress,
            ConsumerConfig.GROUP_ID_CONFIG, groupId
    );

    var keyDeserializer = new ErrorHandlingDeserializer<>(new StringDeserializer());

    var config = new HashMap<String, Object>();
    config.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
    config.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
    var jsonDeserializer = new KafkaAvroDeserializer(null, config);
    var valueDeserializer = new ErrorHandlingDeserializer<>(jsonDeserializer);

    var factory = new DefaultKafkaConsumerFactory(props, keyDeserializer, valueDeserializer, false);
    return factory;
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, ExampleEvent> helloKafkaListenerContainerFactory(
        ConsumerFactory<String, ExampleEvent> helloConsumerFactory) {
    ConcurrentKafkaListenerContainerFactory helloContainerFactory =
            new ConcurrentKafkaListenerContainerFactory<>();
    helloContainerFactory.setConsumerFactory(helloConsumerFactory);
    return helloContainerFactory;
}
```

Updated listener:

```java
@KafkaListener(
        topics = "my-topic",
        groupId = "my-topic:HelloKafkaListener",
        containerFactory = "helloKafkaListenerContainerFactory"
)
// public void process(ConsumerRecord record) {
//     log.info(record.value().toString()); // it works: {"exampleField": "HELLO WORLD"}
// }
public void process(@Payload ExampleEvent event) {
    this.helloKafkaService.handleMessage(event);
    log.info("Processing event: " + event.getExampleField());
}
```

However, I still get the same issue.
# Answer 1

**Score:** 2

Check this [article][1]. You have to provide a message converter for your custom Java object `ExampleEvent`; `@Payload` by itself can only receive a plain `String` message.
Try with:

```java
@Bean
public ConsumerFactory<String, ExampleEvent> exampleConsumerFactory() {
    // props: your consumer properties (bootstrap servers, group id, ...) — see the sketch below
    return new DefaultKafkaConsumerFactory<>(
            props,
            new StringDeserializer(),
            new JsonDeserializer<>(ExampleEvent.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, ExampleEvent> exampleKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, ExampleEvent> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(exampleConsumerFactory());
    return factory;
}
```
And then register this by

```java
@KafkaListener(
        topics = "my-topic",
        groupId = "my-topic:HelloKafkaListener",
        containerFactory = "exampleKafkaListenerContainerFactory" //<-------
)
public void process(ExampleEvent event) {
    this.helloKafkaService.handleMessage(event);
    log.info("Processing event: " + event.getExampleField());
}
```
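For completeness: the `props` map referenced in `exampleConsumerFactory()` is not shown in the answer. Below is a minimal sketch of what it could contain; the helper name `consumerProps` is hypothetical, and the broker address is taken from the console-producer command in the question, so adjust both to your environment.

```java
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;

// Hypothetical helper supplying the `props` used above; the broker address and
// group id mirror the values shown in the question.
private Map<String, Object> consumerProps() {
    return Map.of(
            ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:59092",
            ConsumerConfig.GROUP_ID_CONFIG, "my-topic:HelloKafkaListener");
}
```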
Also ensure that your dependencies already include the following; otherwise add it manually:

```xml
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
</dependency>
```
# Answer 2

**Score:** 1
> [example.packageapi.ExampleEvent] to [example.packageapi.ExampleEvent]

Given that the packages are the same, this usually means that the deserializer was loaded by a different class loader than the one used to load the listener, for example because Spring Dev Tools is being used.

When specified via a property, the deserializer is loaded by the Kafka clients rather than by Spring.

One way to avoid that is to add the deserializer instance directly to the consumer factory instead of via a property. You can use the factory's constructor or setters, as shown in the other answer.
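To illustrate that suggestion with the Avro setup from the question, here is a minimal sketch of a consumer factory that wires the deserializer *instances* in directly. The class name is only illustrative, the broker address and schema registry URL are copied from the question, and `ExampleEvent` is the class generated from event.avsc; treat all of these as assumptions to adapt.

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
// import your generated ExampleEvent (package as generated from event.avsc)

@Configuration
public class HelloKafkaConsumerConfig { // illustrative class name

    @Bean
    public DefaultKafkaConsumerFactory<String, ExampleEvent> helloConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:59092"); // assumed broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-topic:HelloKafkaListener");

        // Configure the Avro delegate in application code so it is loaded by the
        // same class loader as the listener and ExampleEvent.
        Map<String, Object> avroConfig = new HashMap<>();
        avroConfig.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://kafka-schema:80");
        avroConfig.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
        KafkaAvroDeserializer avroDeserializer = new KafkaAvroDeserializer();
        avroDeserializer.configure(avroConfig, false); // false = value deserializer

        @SuppressWarnings("unchecked")
        Deserializer<ExampleEvent> valueDelegate =
                (Deserializer<ExampleEvent>) (Deserializer<?>) avroDeserializer;

        // Pass deserializer instances; the final `false` tells Spring not to re-configure them.
        return new DefaultKafkaConsumerFactory<>(
                props,
                new ErrorHandlingDeserializer<>(new StringDeserializer()),
                new ErrorHandlingDeserializer<>(valueDelegate),
                false);
    }
}
```

The `helloKafkaListenerContainerFactory` bean from the question can then point at this consumer factory unchanged.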