Trying to open spark after installation and getting an error: Unable to find any JVMs matching version "1.8"
Question
Description:
I installed Spark on my MacBook using Homebrew, following the guide at https://www.tutorialkart.com/apache-spark/how-to-install-spark-on-mac-os/.
The step-by-step process involved installing Java, then Scala, and finally Spark. Java and Scala installed successfully, and so did Spark.
When I tried to verify the Spark installation with the command below, I ran into an error.
Input command: spark-shell
Expected behavior: Spark starts in the terminal
Actual behavior: I get the following error:
Unable to find any JVMs matching version "1.8".
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/2.4.5/libexec/jars/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
at scala.Option.map(Option.scala:146)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3756)
at java.base/java.lang.String.substring(String.java:1902)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
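The Caused by line at the bottom is the telling part. A plausible reading (an assumption about Hadoop internals, not verified against the exact source line): Hadoop 2.x's Shell class effectively takes the first three characters of the java.version system property, which works for a string like "1.8.0_292" but throws exactly begin 0, end 3, length 2 when a newer JVM reports just "10". A minimal sketch of that failure mode:

```shell
# Sketch (assumption): Hadoop 2.x parses java.version with substring(0, 3).
# "1.8.0_292" yields "1.8"; a two-character string like "10" cannot supply
# three characters, so Java's String.substring throws.
parse_major() {
  v="$1"
  if [ "${#v}" -lt 3 ]; then
    echo "StringIndexOutOfBoundsException: begin 0, end 3, length ${#v}"
  else
    echo "$v" | cut -c1-3
  fi
}

parse_major "1.8.0_292"   # prints 1.8
parse_major "10"          # prints the exception from the trace above
```

If this reading is right, spark-shell is picking up a Java 10+ JVM rather than Java 8, which matches the Unable to find any JVMs matching version "1.8" message at the top of the output.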
What I tried:
I tried to change JAVA_HOME with the following command:
export JAVA_HOME=/usr/local/opt/java
The previous JAVA_HOME path was /opt/anaconda3. I can see that JAVA_HOME has changed to usr/local/opt/java.
But I am still getting the error. I appreciate your answers/feedback. Thanks!
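One thing worth checking: the JAVA_HOME value reported above, usr/local/opt/java, is missing its leading slash, and a relative JAVA_HOME will not resolve. A hypothetical sanity check (check_java_home is my own name for illustration, not a real tool):

```shell
# Hypothetical check: JAVA_HOME must be an absolute path to be usable.
check_java_home() {
  case "$1" in
    /*) echo "ok: $1" ;;
    *)  echo "error: '$1' is not an absolute path" ;;
  esac
}

check_java_home "usr/local/opt/java"    # the value reported in the question
check_java_home "/usr/local/opt/java"
```

On macOS you can also list the installed JVMs with /usr/libexec/java_home -V and print the home of a specific version with /usr/libexec/java_home -v 1.8, which is exactly the lookup behind the "Unable to find any JVMs matching version \"1.8\"" message.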
Answer 1
Score: 4
Follow these steps for macOS.
Step 1. Install Java 8, since Spark 2.2 onwards requires Java 8. See the Spark documentation for details.
brew install openjdk@8
Then update the Java path:
export JAVA_HOME=/usr/local/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home
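The export above assumes the default Homebrew prefix /usr/local (Intel Macs); on Apple Silicon, Homebrew installs under /opt/homebrew, so the path differs. A small helper to build the path for either prefix (java8_home is a sketch for illustration, not part of any tool; brew --prefix prints the prefix for your machine):

```shell
# Build the expected openjdk@8 JAVA_HOME for a given Homebrew prefix.
# Defaults to /usr/local (Intel Macs); pass /opt/homebrew on Apple Silicon.
java8_home() {
  printf '%s/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home\n' "${1:-/usr/local}"
}

java8_home                 # /usr/local/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home
java8_home /opt/homebrew   # /opt/homebrew/opt/openjdk@8/libexec/openjdk.jdk/Contents/Home
```

After exporting JAVA_HOME, java -version should report 1.8.x, and spark-shell should then start without the version error.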
Answer 2
Score: 2
I was searching across the web and saw instructions to install PySpark.
I ran this command in the terminal:
pip install pyspark
After I installed PySpark, both spark and pyspark are running.
I'm not sure exactly what happened, but I am able to run Spark now.
Thanks @Elliott for interacting and giving some directions!
Comments