Adding hive and hadoop jars to IntelliJ project
Question
I've recently been trying to run some unit tests locally (Spark-Scala project), and encountered the following error:
ClassNotFound java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hive.ql.exec.Utilities when creating Hive client using classpath:
Please make sure that jars for your version of hive and hadoop are included in the paths passed to spark.sql.hive.metastore.jars.
Where in an IntelliJ project is one supposed to include these jars? For example, hive works fine from the terminal, but IntelliJ seems to require some additional configuration?
Thanks!
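For context, a minimal sketch of the kind of local test setup that exercises this code path (illustrative only, not the asker's actual code; assumes Spark's built-in Hive support):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sketch: a local, Hive-enabled SparkSession — the code
// path that needs hive/hadoop jars on the test classpath.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("local-unit-test")
  // "builtin" (the default) uses the Hive jars bundled with Spark
  .config("spark.sql.hive.metastore.jars", "builtin")
  .enableHiveSupport()
  .getOrCreate()
```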
Answer 1
Score: 1
As you're using Maven, it's necessary to add the dependency for hive-exec in the pom.xml:
<!-- https://mvnrepository.com/artifact/org.apache.hive/hive-exec -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>3.1.3</version>
</dependency>
After adding it, reload the project via the Maven tool window.
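If the project were built with sbt instead of Maven (common for Spark-Scala projects), the equivalent dependency line, assuming the same version, would be:

```scala
// build.sbt: equivalent of the Maven hive-exec dependency above
libraryDependencies += "org.apache.hive" % "hive-exec" % "3.1.3"
```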