How to configure Kafka environment from different frameworks of Js and Go to ship JSON
Question
I am building a couple of micro-services, along with a logging micro-service.
The logging micro-service will run after the other micro-services have sent their logs to Kafka. As a service, it has to consume all the log data sent by the JS-based micro-services, and I have to receive that JSON in Go.
Is there any way to do this without using a parser? (Just like gRPC converts the whole payload to binary for faster transport, and it is understandable in every environment.)
I know very little about how different environments work together when using a message broker.
Answer 1

Score: 0
You can use schemas (Protobuf, Avro, JSON Schema) so the data is converted to binary. To get the full benefit, though, you will need a schema registry; otherwise the schema is embedded in every message.
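As a concrete sketch of what such a schema might look like, here is a hypothetical Protobuf definition for a log record (the message and field names are illustrative, not from the question):

```protobuf
// log_entry.proto -- illustrative schema for a log record
syntax = "proto3";

message LogEntry {
  string service = 1;  // name of the producing micro-service
  string level   = 2;  // e.g. "info", "error"
  string message = 3;  // log line text
  int64  ts_ms   = 4;  // Unix timestamp in milliseconds
}
```

Code generated from the same `.proto` file in both JS and Go reads and writes the identical binary wire format, which is what makes the cross-language part work.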
Answer 2

Score: 0
I'm not sure I fully understand what you're asking.
Kafka stores bytes, and there are clients for both JS and Go. Both languages/environments need serialization and deserialization libraries available. Regarding Go, you don't need a struct to read JSON.
> like gRPC changes the whole data to binary for faster transportation and is understandable for every environment
You can use Protobuf in Kafka topics.
Alternatively, you can use tools like Elasticsearch or Splunk to consume any Kafka event and index the fields after deserializing those records, without needing any Go consumer.
But yes, a parser will be needed somewhere to index your data into a searchable format.