
Kafka: fixing errors when sending and receiving large messages

When a Kafka message exceeds a certain size, the producer reports an error like the following:

The message is 2044510 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

Configure the following to raise the maximum upload size and resolve the error:

kafka:
  bootstrap-servers: xxx
  # producer
  producer:
    # key/value serialization
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: org.apache.kafka.common.serialization.StringSerializer
    # number of retries when message delivery fails
    retries: 1
    # maximum batch size per send
    batch-size: 20000
    # 32 MB batch buffer
    buffer-memory: 33554432
    properties:
      max:
        request:
          size: 20971520
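The same limit can also be raised when using the plain Java client instead of Spring Boot. The sketch below is illustrative only and is not from the article: the broker address, topic name, and the largePayload() helper are placeholder assumptions, while the numeric values mirror the YAML above.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class LargeMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // placeholder broker address, replace with your own cluster
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // raise the client-side limit so a ~2 MB message is no longer rejected locally
        props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 20971520); // 20 MB, same as the YAML above
        props.put(ProducerConfig.RETRIES_CONFIG, 1);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 20000);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432L);   // 32 MB buffer

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "big-topic" is a placeholder topic name
            producer.send(new ProducerRecord<>("big-topic", "key", largePayload()));
            producer.flush();
        }
    }

    // hypothetical helper that builds a payload larger than the 1 MB default limit
    private static String largePayload() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 2_000_000; i++) sb.append('x');
        return sb.toString();
    }
}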

When a message exceeds a certain size, the Kafka broker on the receiving side reports an error like the following:

RecordTooLargeException: The request included a message larger than the max message size the server will accept.
Kafka configuration

1. Add to server.properties:
message.max.bytes=5242880

The number of message bytes each replica attempts to fetch per partition; it must be greater than or equal to message.max.bytes:

replica.fetch.max.bytes=6291456

The existing socket.send.buffer.bytes, socket.receive.buffer.bytes, and socket.request.max.bytes should also be changed to the same size as message.max.bytes.
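For reference, a minimal sketch of what step 1's additions to server.properties can look like as a whole, using the values above and following the advice of matching the socket settings to message.max.bytes:

# largest message (record batch) the broker will accept
message.max.bytes=5242880
# replica fetch size, must be >= message.max.bytes
replica.fetch.max.bytes=6291456
# socket settings aligned with message.max.bytes, as recommended above
socket.send.buffer.bytes=5242880
socket.receive.buffer.bytes=5242880
socket.request.max.bytes=5242880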

2. Add to producer.properties:

The maximum size of a request, in bytes; it must not exceed message.max.bytes:

max.request.size = 5242880

3. Add to consumer.properties:

The number of message bytes fetched per topic partition in each fetch request; it must be greater than or equal to message.max.bytes:

fetch.message.max.bytes=6291456
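fetch.message.max.bytes applies to the old consumer; with the newer Java consumer client the corresponding settings are max.partition.fetch.bytes and fetch.max.bytes. A minimal sketch, assuming a placeholder broker address, group id, and topic name:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LargeMessageConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "large-message-group");     // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // per-partition fetch limit; keep it >= the broker's message.max.bytes (5242880 above)
        props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 6291456);
        // overall fetch limit per request
        props.put(ConsumerConfig.FETCH_MAX_BYTES_CONFIG, 6291456);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("big-topic")); // placeholder topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("received " + record.serializedValueSize() + " bytes");
            }
        }
    }
}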

4. Restart Kafka

Stop Kafka:

sh kafka-server-stop.sh

Start Kafka:

nohup sh kafka-server-start.sh ../config/server.properties &
