Run Kafka Broker on Databricks

Question

I would like to install a Kafka broker directly on Databricks.
The purpose is only to demo a local Kafka use case using Spark Streaming.

Here is how I proceeded:

wget https://packages.confluent.io/archive/7.4/confluent-7.4.0.tar.gz
tar -xvf confluent-7.4.0.tar.gz
./confluent-7.4.0/bin/zookeeper-server-start ./etc/kafka/zookeeper.properties &
./confluent-7.4.0/bin/kafka-server-start ./etc/kafka/server.properties &

That last command, however, throws a bizarre error:

./confluent-7.4.0/bin/kafka-server-start ./etc/kafka/server.properties &

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/databricks/jars/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-slf4j-impl--org.apache.logging.log4j__log4j-slf4j-impl__2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/databricks/driver/confluent-7.4.0/share/java/kafka/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
[2023-07-06 12:48:45,041] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2023-07-06 12:48:45,629] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Ljava/lang/Object;
at kafka.Kafka$.getPropsFromArgs(Kafka.scala:42) ~[kafka_2.13-7.4.0-ce.jar:?]
at kafka.Kafka$.main(Kafka.scala:91) ~[kafka_2.13-7.4.0-ce.jar:?]
at kafka.Kafka.main(Kafka.scala) ~[kafka_2.13-7.4.0-ce.jar:?]
Exception in thread "main" java.lang.NoSuchMethodError: scala.Option.orNull(Lscala/$less$colon$less;)Ljava/lang/Object;
at kafka.utils.Exit$.exit(Exit.scala:28)
at kafka.Kafka$.main(Kafka.scala:127)
at kafka.Kafka.main(Kafka.scala)

Any idea what could trigger this?
Thanks
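The NoSuchMethodError on scala.Predef$.refArrayOps is the classic signature of a Scala binary-version mismatch: the broker jar in the trace (kafka_2.13-7.4.0-ce.jar) was built against Scala 2.13, while the Databricks runtime puts a Scala 2.12 scala-library on the classpath. A minimal sketch of the check, assuming a hypothetical jar name (on a real driver you would list /databricks/jars instead):

```shell
# Hypothetical jar name for illustration; on a real Databricks driver obtain it with:
#   ls /databricks/jars | grep -o 'scala-library-[0-9.]*jar'
JAR="scala-library-2.12.14.jar"

# Extract the Scala binary version (major.minor) the runtime provides
RUNTIME_SCALA=$(echo "$JAR" | sed -E 's/scala-library-([0-9]+\.[0-9]+)\..*/\1/')

# The broker jar in the stack trace, kafka_2.13-7.4.0-ce.jar, is built for 2.13
BROKER_SCALA="2.13"

if [ "$RUNTIME_SCALA" != "$BROKER_SCALA" ]; then
  echo "Scala mismatch: runtime=$RUNTIME_SCALA broker=$BROKER_SCALA"
fi
```

If the two versions differ, the broker's Scala classes resolve against the wrong scala-library at runtime, which is exactly what the stack trace shows.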

Answer 1

Score: 1

The error is saying you're using an incorrect Scala version for the broker.

Try downloading a 2.12 build, if one is available. You also don't need Confluent Platform for this.
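A sketch of what that could look like with plain Apache Kafka instead of Confluent Platform. The version number below is an assumption; check https://archive.apache.org/dist/kafka/ for which Scala 2.12 builds actually exist:

```shell
# Assumed versions; verify availability at https://archive.apache.org/dist/kafka/
SCALA_VERSION=2.12
KAFKA_VERSION=2.8.2
TARBALL="kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"
URL="https://archive.apache.org/dist/kafka/${KAFKA_VERSION}/${TARBALL}"
echo "$URL"

# On the Databricks driver you would then run:
#   wget "$URL" && tar -xzf "$TARBALL"
#   ./kafka_2.12-2.8.2/bin/zookeeper-server-start.sh \
#       ./kafka_2.12-2.8.2/config/zookeeper.properties &
#   ./kafka_2.12-2.8.2/bin/kafka-server-start.sh \
#       ./kafka_2.12-2.8.2/config/server.properties &
```

Note that Apache Kafka tarballs embed the Scala binary version in the name, so picking the kafka_2.12-* artifact is what keeps the broker compatible with the runtime's Scala.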

Answer 2

Score: 0

@OneCricketeer's answer led me to double-check Scala / Kafka version compatibility.

Indeed: "Scala 2.12 support has been deprecated since Apache Kafka 3.0 and will be removed in Apache Kafka 4.0." And Databricks does NOT support Scala 2.13 (yet). So I stuck with Kafka 2.2.2, as it is the last version to support 2.12 by default:

https://index.scala-lang.org/apache/kafka/artifacts/kafka

I don't need Confluent at all, but just in case: I used the 5.2 release to get Kafka 2.2, and got it to work.

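For reference, a hedged sketch of that Confluent 5.2 setup. The exact patch version and tarball name are assumptions (Confluent 5.x tarballs embedded the Scala version in the file name); verify against the listing at https://packages.confluent.io/archive/5.2/:

```shell
# Assumed version numbers; check https://packages.confluent.io/archive/5.2/
CONFLUENT_VERSION=5.2.1
SCALA_VERSION=2.12
TARBALL="confluent-${CONFLUENT_VERSION}-${SCALA_VERSION}.tar.gz"
echo "$TARBALL"

# On the Databricks driver you would then run:
#   wget "https://packages.confluent.io/archive/5.2/${TARBALL}"
#   tar -xzf "$TARBALL"
#   ./confluent-5.2.1/bin/zookeeper-server-start \
#       ./confluent-5.2.1/etc/kafka/zookeeper.properties &
#   ./confluent-5.2.1/bin/kafka-server-start \
#       ./confluent-5.2.1/etc/kafka/server.properties &
```

The point is the same as with plain Kafka: pick the archive whose Scala version matches the runtime's, here 2.12.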

huangapple
  • Posted on 2023-07-06 21:15:21
  • Please keep this link when reposting: https://go.coder-hub.com/76629250.html