SparkSession initialization throws ExceptionInInitializerError

Question

I'm trying to run a simple Spark Structured Streaming job, but I get an error when calling `getOrCreate()` on `SparkSession`...

I create the `SparkSession` like this:
```java
SparkSession spark = SparkSession
                .builder()
                .appName("CountryCount")
                .master("local[*]")
                .getOrCreate();
```

Using this pom.xml:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <artifactId>spark-streaming</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <spark.version>3.0.0</spark.version>
        <mvn-shade.version>3.2.4</mvn-shade.version>
        <slf4j.version>1.7.30</slf4j.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>${mvn-shade.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-10_2.12</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${mvn-shade.version}</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer
                                        implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
```

However, I get the following exception:

```
Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:442)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
	at JobCountryCount.createJob(JobCountryCount.java:43)
	at JobCountryCount.<init>(JobCountryCount.java:27)
	at JobCountryCount.main(JobCountryCount.java:21)
Caused by: java.lang.NullPointerException
	at org.apache.commons.lang3.SystemUtils.isJavaVersionAtLeast(SystemUtils.java:1654)
	at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
	... 14 more
```

Thank you in advance!



Answer 1

Score: 2

It looks like the version of the Apache commons-lang3 library on your classpath is older than 3.8, which does not recognize JDK 11. See https://issues.apache.org/jira/browse/LANG-1384.

Since Apache Spark 3.0.0 uses commons-lang3 3.9, my hunch is that your environment may also have an old Spark (or Hadoop) version. You can print `classOf[org.apache.commons.lang3.SystemUtils].getResource("SystemUtils.class")` in your code; it will tell you where the class comes from.
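
The expression above is Scala; since this job is written in Java, here is a minimal sketch of the same check in Java (the class name `WhereIsSystemUtils` is just for illustration). It relies only on the standard `Class.getResource` lookup:

```java
import org.apache.commons.lang3.SystemUtils;

public class WhereIsSystemUtils {
    public static void main(String[] args) {
        // A relative resource name is resolved against the class's own
        // package, so this prints the location SystemUtils was loaded from,
        // e.g. jar:file:/.../commons-lang3-3.9.jar!/org/apache/commons/lang3/SystemUtils.class
        System.out.println(SystemUtils.class.getResource("SystemUtils.class"));
    }
}
```

If the printed path points at a commons-lang3 JAR older than 3.8, running `mvn dependency:tree` can show which dependency is pulling it in.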

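If an outdated commons-lang3 does turn out to be on the classpath, one possible remedy (my own sketch, not part of the answer above) is to declare a JDK 11-aware version directly in the pom; in Maven, a direct dependency takes precedence over any transitive one:

```xml
<!-- Hypothetical fix: pin commons-lang3 to a release that recognizes
     JDK 11 (3.8 or newer); Spark 3.0.0 itself is built against 3.9. -->
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-lang3</artifactId>
    <version>3.9</version>
</dependency>
```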