
Spring Cloud DataFlow - cannot convert message from Kafka

Stack Overflow user
Asked on 2017-07-17 14:54:54
1 answer · 1.1K views · 0 followers · 0 votes

Using Spring Cloud DataFlow version 1.2.2 with the following configuration:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.type=kafka
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=<MY_BROKER>
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.zkNodes=<MY_ZK>

I am trying to create a stream that reads from a specific topic and pipes it into a log sink, like this:

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' > bridge | log" --deploy

Looking at the log file, I can see the following error:

2017-07-17 09:44:01,700  INFO -kafka-listener-1 log-sink:202 - [B@79d0a6b6
2017-07-17 09:44:01,700 ERROR -kafka-listener-1 o.s.c.s.b.k.KafkaMessageChannelBinder:283 - Could not convert message: 7B226D657472696354696D657374616D70223A313530303233373037302C226D65747269634E616D65223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F313136222C224074696D657374616D70223A22323031372D30372D31365432303A33313A32352E3438325A222C22706F7274223A33363133302C226D657472696356616C7565223A302C224076657273696F6E223A2231222C22686F7374223A223137322E32362E312E313135222C226D657373616765223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F31313620302031353030323337303730227D
java.lang.StringIndexOutOfBoundsException: String index out of range: 380
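Note that the long hex string in the error is just the raw message payload printed byte-by-byte; decoding it yields ordinary JSON, so the binder received the bytes but failed during conversion, not because the data was malformed. A minimal sketch of the round trip (using a shortened stand-in payload for illustration, not the full string from the log):

```python
import json

# A shortened stand-in for the payload: the binder logs raw bytes as hex.
raw = b'{"metricTimestamp":1500237070,"metricValue":0}'
hex_form = raw.hex().upper()          # how the bytes appear in the error log
print(hex_form[:20])                  # 7B226D65747269635469...

# Reversing the encoding recovers the JSON document.
decoded = json.loads(bytes.fromhex(hex_form))
print(decoded["metricTimestamp"])     # 1500237070
```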

I also tried setting some consumer/producer properties for the Kafka source:

stream create --name metricsStream --definition ":metrics --spring.kafka.consumer.valueDerserializer=org.apache.kafka.common.serialization.StringDeserializer --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' --spring.cloud.stream.bindings.input.consumer.headerMode=raw --spring.cloud.stream.bindings.output.producer.headerMode=raw  --outputType='text/plain;charset=UTF-8' > bridge | log" --deploy

but I got the same result.

Here are the consumer details printed by Spring Cloud DataFlow:

2017-07-17 09:43:57,267  INFO main o.a.k.c.c.ConsumerConfig:180 - ConsumerConfig values:
    auto.commit.interval.ms = 100
    auto.offset.reset = earliest
    bootstrap.servers = [172.26.1.63:9092]
    check.crcs = true
    client.id = consumer-2
    connections.max.idle.ms = 540000
    enable.auto.commit = false
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = metrics_KafkaToHdfs_5
    heartbeat.interval.ms = 3000
    interceptor.classes = null
    key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 500
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.ms = 50
    request.timeout.ms = 305000
    retry.backoff.ms = 100
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer

I found a similar question, but it has no working answer: what is the property to accept binary json message in spring-cloud-stream kafka binder

My Kafka metrics topic contains JSON lines. How should I configure Spring Cloud DataFlow so that the messages come through as JSON (or at least as a JSON-like string)?


1 Answer

Stack Overflow user

Accepted answer

Answered on 2017-07-17 15:09:10

Have you tried configuring the input content type?

spring.cloud.stream.bindings.input.content-type=application/json

Or, with the Spring Cloud Dataflow prefix:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.bindings.input.content-type=application/json
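If you would rather scope the setting to this one stream instead of making it a server-wide default, the same binding property can also be passed in the stream definition itself. A sketch, assuming your original :metrics source and log sink (untested against your deployment):

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.input.content-type=application/json > bridge | log" --deploy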
3 votes
Original page content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/45137749
