NoSuchFieldException while creating a Spark session using builder

Question

I am trying to create a Spark session using:

sparkSession = SparkSession.builder().appName(appName).master("local")
                .config("hive.metastore.uris", thriftURL).enableHiveSupport().getOrCreate();

But it fails with a NoSuchFieldException as follows:

2020-10-27 20:51:26.963  WARN 11206 --- [  restartedMain] org.apache.hadoop.util.NativeCodeLoader  : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-10-27 20:51:27.053  INFO 11206 --- [  restartedMain] org.apache.spark.SparkContext            : Submitted application: HdfsPoc
2020-10-27 20:51:27.328 ERROR 11206 --- [  restartedMain] org.apache.spark.SparkContext            : Error initializing SparkContext.

java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
	at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.network.server.TransportServer.init(TransportServer.java:95) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:74) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:118) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:457) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:456) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2231) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:na]
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2223) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:56) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:246) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:432) ~[spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516) [spark-core_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918) [spark-sql_2.11-2.2.1.jar:2.2.1]
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910) [spark-sql_2.11-2.2.1.jar:2.2.1]
	at scala.Option.getOrElse(Option.scala:121) [scala-library-2.11.8.jar:na]
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910) [spark-sql_2.11-2.2.1.jar:2.2.1]
	at com.csg.ipro.util.SparkCommonUtility.createSparkSessionWithHive(SparkCommonUtility.java:54) [classes/:na]
	at com.csg.ipro.util.SparkCommonUtility.getSparkSessionWithHive(SparkCommonUtility.java:28) [classes/:na]
	at com.csg.ipro.model.FileWatcher.createSession(FileWatcher.java:84) [classes/:na]
	at com.csg.ipro.service.WatcherService.startMonitoring(WatcherService.java:43) [classes/:na]
	at com.csg.ipro.IproStreamApplication.main(IproStreamApplication.java:21) [classes/:na]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_231]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_231]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_231]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_231]
	at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49) [spring-boot-devtools-2.3.4.RELEASE.jar:2.3.4.RELEASE]
Caused by: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
	at java.lang.Class.getDeclaredField(Class.java:2070) ~[na:1.8.0_231]
	at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:127) ~[spark-network-common_2.11-2.2.1.jar:2.2.1]
	... 31 common frames omitted

Following are the Spark and Hadoop dependencies added in the pom.xml file:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.1</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.2.1</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.10.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs-client</artifactId>
    <version>2.10.0</version>
    <scope>provided</scope>
</dependency>

From what I read while trying to find a solution, the dependency versions do not match. But I tried a lot of permutations, and none of them worked for me. How can I resolve this error?
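(Editor's note: a small diagnostic sketch, not part of the original post. It assumes Netty's own `io.netty.util.Version` registry is on the classpath and simply prints which Netty artifacts and versions Maven actually resolved, which is the usual way to confirm the kind of version mismatch described above.)

```java
import io.netty.util.Version;
import java.util.Map;

// Prints every Netty artifact that Netty can identify on the runtime classpath,
// together with its version. A version different from the one the Spark 2.2.1
// artifacts expect is the typical cause of the NoSuchFieldException above.
public class NettyVersionCheck {
    public static void main(String[] args) {
        for (Map.Entry<String, Version> entry : Version.identify().entrySet()) {
            System.out.println(entry.getKey() + " -> " + entry.getValue().artifactVersion());
        }
    }
}
```

Running `mvn dependency:tree -Dincludes=io.netty` shows the same information from the build side.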

Answer 1

Score: 1

This is because the wrong version of Netty is getting pulled in. The following solved the issue for me:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.0</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.17.Final</version>
</dependency>
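For context on why the Netty version matters: the stack trace shows Spark 2.2.x's NettyUtils reflectively reading the private DEFAULT_TINY_CACHE_SIZE field from PooledByteBufAllocator, and that field exists only in some Netty releases. The sketch below (class name is made up for illustration) reproduces that lookup in isolation:

```java
import io.netty.buffer.PooledByteBufAllocator;
import java.lang.reflect.Field;

// Reproduces the reflective lookup from the stack trace
// (NettyUtils.getPrivateStaticField). With a Netty release that still defines
// the field (e.g. the 4.1.17.Final pinned above) this prints its value; with a
// release that has dropped it, getDeclaredField throws the same
// "NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE" seen when starting the SparkContext.
public class NettyFieldCheck {
    public static void main(String[] args) throws Exception {
        Field field = PooledByteBufAllocator.class.getDeclaredField("DEFAULT_TINY_CACHE_SIZE");
        field.setAccessible(true);
        System.out.println("DEFAULT_TINY_CACHE_SIZE = " + field.get(null));
    }
}
```

So the fix above works by excluding whichever netty-all the other dependencies drag in and pinning a version that still carries the field Spark expects.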
