HBase client - java.lang.ClassNotFoundException: org.apache.hadoop.crypto.key.KeyProviderTokenIssuer
Question
I'm trying to run a legacy project that connects to HBase.

It has (among other dependencies):

```xml
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.0-cdh5.7.2</version>
</dependency>
```

When the application starts and reaches the following code inside the `createConnection` method of the `org.apache.hadoop.hbase.client.ConnectionFactory` class:
```java
try {
    ....
    return (Connection) constructor.newInstance(conf, managed, pool, user);
} catch (Exception e) {
    throw new IOException(e);
}
```
an exception is thrown and caught:

```
java.lang.NoClassDefFoundError: org/apache/hadoop/crypto/key/KeyProviderTokenIssuer
```

So I searched Google for the `KeyProviderTokenIssuer` class but couldn't find where it is supposed to come from.

Why is the system trying to use this class, and where should I get it from? The `crypto` package is not part of the `hbase-client` dependency, and I don't see anything like it on https://mvnrepository.com/

Is it possible that there is some library mismatch here?

I'm running on Windows. Could that be related?
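To check whether the class is visible on my classpath at all, and which jars my Hadoop classes actually come from, a minimal diagnostic sketch (the first class name is taken from the stack trace; `Configuration` is just a well-known Hadoop class for comparison):

```java
import java.security.CodeSource;

// Minimal diagnostic: report which jar (if any) each class is loaded from.
public class WhichJar {
    public static void main(String[] args) {
        String[] names = {
            "org.apache.hadoop.crypto.key.KeyProviderTokenIssuer", // from the stack trace
            "org.apache.hadoop.conf.Configuration"                 // for comparison
        };
        for (String name : names) {
            try {
                Class<?> c = Class.forName(name);
                CodeSource src = c.getProtectionDomain().getCodeSource();
                // getCodeSource() can be null for classes loaded by the bootstrap loader
                System.out.println(name + " -> " + (src == null ? "bootstrap" : src.getLocation()));
            } catch (ClassNotFoundException e) {
                System.out.println(name + " -> NOT on the classpath");
            }
        }
    }
}
```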
Answer 1

Score: 2
I was able to overcome this issue after performing several steps:

1. Following this post, I downloaded the file `hadoop-common-2.2.0-bin-master.zip` (the archive that ships `winutils.exe` and the other native binaries Hadoop expects on Windows) and fully extracted it into the folder `C:\Program Files\apache\hadoop\bin`.

2. I added a `HADOOP_HOME` variable to the system variables, pointing it to `C:\Program Files\apache\hadoop`.

3. I added the value `%HADOOP_HOME%\bin` to the `PATH` variable.

4. Since my Hadoop is version 2.6.0, I checked and made sure that all Hadoop-related dependencies are at that version (see the `dependencyManagement` sketch after this list).

5. I ran `mvn dependency:tree` and found that one of the dependency jars was bringing in `org.apache.hadoop:hadoop-hdfs-client:jar:3.2.0`, so I excluded it from that dependency. That also explains the original error: the 3.2.0 client references `KeyProviderTokenIssuer`, which ships with Hadoop 3.x's `hadoop-common` but not with the 2.6.0 jars.

   ```xml
   <dependency>
       <groupId>com.example</groupId>
       <artifactId>bla</artifactId>
       <version>1.0.1</version>
       <exclusions>
           <exclusion>
               <groupId>org.apache.hadoop</groupId>
               <artifactId>hadoop-hdfs-client</artifactId>
           </exclusion>
       </exclusions>
   </dependency>
   ```
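For step 4, one way to keep every Hadoop artifact on a single version is a `dependencyManagement` section in the POM. A minimal sketch with an illustrative artifact list; match the entries and the 2.6.0 version to whatever `mvn dependency:tree` reports for your build:

```xml
<!-- Sketch: pin all Hadoop artifacts to one version so no transitive
     dependency can drag in a mismatched 3.x jar. The artifact list and
     version here are illustrative; adjust them to your cluster. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.6.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Rerunning `mvn dependency:tree -Dincludes=org.apache.hadoop` afterwards makes it easy to confirm that only the intended versions remain.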
Some URLs that helped me: