Could not run jar file in Hadoop 3.1.3

Question

I tried this command in Command Prompt (run as administrator):

```plaintext
hadoop jar C:\Users\tejashri\Desktop\Hadoopproject\WordCount.jar WordcountDemo.WordCount /work /out
```

But I got the following error message and my application was stopped:

```plaintext
2020-04-04 23:53:27,918 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2020-04-04 23:53:28,881 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2020-04-04 23:53:28,951 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /tmp/hadoop-yarn/staging/tejashri/.staging/job_1586024027199_0006
2020-04-04 23:53:29,162 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-04 23:53:29,396 INFO input.FileInputFormat: Total input files to process : 1
2020-04-04 23:53:29,570 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-04 23:53:29,762 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-04 23:53:29,802 INFO mapreduce.JobSubmitter: number of splits:1
2020-04-04 23:53:30,059 INFO sasl.SaslDataTransferClient: SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-04 23:53:30,156 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1586024027199_0006
2020-04-04 23:53:30,156 INFO mapreduce.JobSubmitter: Executing with tokens: []
2020-04-04 23:53:30,504 INFO conf.Configuration: resource-types.xml not found
2020-04-04 23:53:30,507 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2020-04-04 23:53:30,586 INFO impl.YarnClientImpl: Submitted application application_1586024027199_0006
2020-04-04 23:53:30,638 INFO mapreduce.Job: The url to track the job: http://LAPTOP-2UBC7TG1:8088/proxy/application_1586024027199_0006/
2020-04-04 23:53:30,640 INFO mapreduce.Job: Running job: job_1586024027199_0006
2020-04-04 23:53:35,676 INFO mapreduce.Job: Job job_1586024027199_0006 running in uber mode : false
2020-04-04 23:53:35,679 INFO mapreduce.Job:  map 0% reduce 0%
2020-04-04 23:53:35,698 INFO mapreduce.Job: Job job_1586024027199_0006 failed with state FAILED due to: Application application_1586024027199_0006 failed 2 times due to AM Container for appattempt_1586024027199_0006_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2020-04-04 23:53:34.955]Exception from container-launch.
Container id: container_1586024027199_0006_02_000001
Exit code: 1
Shell output:         1 file(s) moved.
"Setting up env variables"
"Setting up job resources"
"Copying debugging information"

C:\hadoop\hdfstmp\nm-local-dir\usercache\tejashri\appcache\application_1586024027199_0006\container_1586024027199_0006_02_000001>rem Creating copy of launch script

C:\hadoop\hdfstmp\nm-local-dir\usercache\tejashri\appcache\application_1586024027199_0006\container_1586024027199_0006_02_000001>copy "launch_container.cmd" "C:/hadoop/logs/userlogs/application_1586024027199_0006/container_1586024027199_0006_02_000001/launch_container.cmd"
        1 file(s) copied.

C:\hadoop\hdfstmp\nm-local-dir\usercache\tejashri\appcache\application_1586024027199_0006\container_1586024027199_0006_02_000001>rem Determining directory contents

C:\hadoop\hdfstmp\nm-local-dir\usercache\tejashri\appcache\application_1586024027199_0006\container_1586024027199_0006_02_000001>dir  1>>"C:/hadoop/logs/userlogs/application_1586024027199_0006/container_1586024027199_0006_02_000001/directory.info"
"Launching container"

[2020-04-04 23:53:34.959]Container exited with a non-zero exit code 1. Last 4096 bytes of stderr :
'"C:\Program Files\Java\jdk1.8.0_171"' is not recognized as an internal or external command,
operable program or batch file.

[2020-04-04 23:53:34.960]Container exited with a non-zero exit code 1. Last 4096 bytes of stderr :
'"C:\Program Files\Java\jdk1.8.0_171"' is not recognized as an internal or external command,
operable program or batch file.

For more detailed output, check the application tracking page: http://LAPTOP-2UBC7TG1:8088/cluster/app/application_1586024027199_0006 Then click on links to logs of each attempt.
. Failing the application.
2020-04-04 23:53:35,743 INFO mapreduce.Job: Counters: 0
```
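As an aside, the `WARN mapreduce.JobResourceUploader` line in the log is unrelated to the failure, but it can be remedied exactly as the message suggests: by implementing the `Tool` interface and launching through `ToolRunner`. A minimal driver sketch is below; the mapper/reducer wiring is omitted, and the class name simply mirrors the one used on the command line above:

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCount extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() carries any generic options (-D, -files, ...) that
        // ToolRunner parsed off the command line before calling run().
        Job job = Job.getInstance(getConf(), "word count");
        job.setJarByClass(WordCount.class);
        // setMapperClass/setReducerClass and output key/value types go here.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner handles generic option parsing, which silences the WARN.
        System.exit(ToolRunner.run(new WordCount(), args));
    }
}
```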



# Answer 1
**Score**: 1

`'"C:\Program Files\Java\jdk1.8.0_171"' is not recognized as an internal or external command,
operable program or batch file.`

The `JAVA_HOME` variable is not properly set in `hadoop-env.cmd`.

Also, the container launch script mishandles the space in `C:\Program Files`, so move the JDK installation to a folder whose path contains no whitespace (say, `C:\Java\jdk1.8.0_171`).

Update the `JAVA_HOME` and `PATH` environment variables accordingly.

Then add this line to `hadoop-env.cmd`:

```plaintext
set JAVA_HOME=C:\Java\jdk1.8.0_171
```

Restart the Hadoop daemons and run the job again.
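The restart-and-retry steps can be run from an elevated Command Prompt roughly as follows; this sketch assumes a standard Hadoop 3.x Windows layout with `%HADOOP_HOME%\sbin` and `%HADOOP_HOME%\bin` on `PATH`:

```shell
:: Confirm JAVA_HOME now resolves to the whitespace-free path
echo %JAVA_HOME%
"%JAVA_HOME%\bin\java" -version

:: Restart the Hadoop daemons using the stock sbin scripts
stop-yarn.cmd
stop-dfs.cmd
start-dfs.cmd
start-yarn.cmd

:: Re-run the job
hadoop jar C:\Users\tejashri\Desktop\Hadoopproject\WordCount.jar WordcountDemo.WordCount /work /out
```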




huangapple
  • Published 2020-04-05 01:35:47
  • Please keep this link when reposting: https://go.coder-hub.com/61032119.html