Kafka message compression not working as expected

Question
My project uses Apache Kafka for sending messages. Over time, the payload has grown and now exceeds the default size of 1 MB. We considered compressing the message on the producer side, but it is not working as expected.

Here is the producer config:
```yaml
spring:
  kafka:
    consumer:
      group-id: group_name
      bootstrap-servers: broker-server:9343
    producer:
      bootstrap-servers: ${spring.kafka.consumer.bootstrap-servers}
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      compression-type: gzip
```
With the above configuration, if I send a message larger than 1 MB, I get the following error:

> org.springframework.kafka.KafkaException: Send failed; nested exception is org.apache.kafka.common.errors.RecordTooLargeException: The message is 1909152 bytes when serialized which is larger than 1048576, which is the value of the max.request.size configuration

Is there any other configuration required to enable compression?
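For context on why compression alone does not avoid this error: as far as I understand, the Kafka producer performs the `RecordTooLargeException` check against the serialized, *uncompressed* record size, before compression is applied, so `compression-type: gzip` by itself cannot get a 1.9 MB payload under the 1 MB `max.request.size` limit. A minimal stdlib-only sketch (no Kafka dependency; the repetitive payload is synthetic) showing that gzip does shrink such a payload on the wire, even though the producer's size check ignores the compressed size:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.zip.GZIPOutputStream;

public class GzipSizeDemo {
    // Return the gzip-compressed size of the given payload in bytes.
    static int gzipSize(byte[] payload) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(payload);
            }
            return bos.size();
        } catch (IOException e) {
            // Cannot happen with in-memory streams.
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // Synthetic ~1.9 MB payload, mimicking the size in the error message.
        byte[] payload = new byte[1_909_152];
        String pattern = "{\"field\":\"value\"}";
        for (int i = 0; i < payload.length; i++) {
            payload[i] = (byte) pattern.charAt(i % pattern.length());
        }
        System.out.println("uncompressed: " + payload.length + " bytes");
        System.out.println("gzip:         " + gzipSize(payload) + " bytes");
        // The producer's max.request.size check uses the uncompressed size,
        // so the send is rejected even when the compressed size would fit.
    }
}
```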
Answer 1
Score: 3
The `max.request.size` configuration will help you in this case. It specifies the maximum size of a producer request. By default it is set to 1 MB, which is why you are seeing the error.

You can add the following property to your properties file (Spring Boot does not expose this setting as a dedicated `spring.kafka.producer` key, so it goes through the producer `properties` map):

```yaml
spring:
  kafka:
    producer:
      properties:
        max.request.size: 2097152
```

Note that 2 MB = 2097152 bytes (you can change it according to your requirements).
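One caveat beyond this answer: the broker enforces its own per-message limit, `message.max.bytes` (roughly 1 MB by default), which can also be overridden per topic via `max.message.bytes`. Because the broker checks the compressed batch size, gzip may already keep you under that limit; if not, the broker- or topic-side setting has to be raised as well. A sketch with illustrative values (the topic name `my-topic` is hypothetical):

```
# broker config (server.properties)
message.max.bytes=2097152

# or per topic, using the kafka-configs tool
kafka-configs.sh --bootstrap-server broker-server:9343 \
  --entity-type topics --entity-name my-topic \
  --alter --add-config max.message.bytes=2097152
```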