Spark 3.2 with Scala 2.12 version
Question
Some time ago, the Spark site had a Spark version 3.2 that comes with Scala 2.12. This is the version I am trying to find. If it is on one of the sites below, I cannot find it there.
Specifically looking for Spark 3.2 with a Scala 2.12 version.
Is it still available?
Link to a more thorough archive of Spark (specifically 3.2.4, but other versions reside in the folder above this link):
Answer 1
Score: 3
As detailed on the downloads page:
> Note that Spark 3 is pre-built with Scala 2.12 in general and Spark 3.2+ provides additional pre-built distribution with Scala 2.13.
Also, when you click to select which version you want to download, you can see that there are two options for Spark 3.2: one says it comes with Scala 2.13, and the other doesn't specify anything because Scala 2.12 is the default.
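If you want to confirm which Scala version a downloaded distribution was built with, a quick check (just a sketch, not specific to any one 3.2.x patch release) is to start its spark-shell and print the Scala library version:

```scala
// Run inside the spark-shell of the downloaded distribution.
// Prints the version of the scala-library on the classpath,
// e.g. "version 2.12.x" for the default builds of Spark 3.2.
scala.util.Properties.versionString
```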
You can find any published release in Maven Central. If that is not enough, you can build Spark with the settings you need.
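For example, a minimal sbt sketch that pulls the Scala 2.12 artifacts of Spark 3.2 from Maven Central; the 3.2.4 version number and the 2.12.15 Scala patch level here are assumptions, adjust them to whatever release you need:

```scala
// build.sbt -- minimal sketch, assuming Spark 3.2.4 on Scala 2.12
scalaVersion := "2.12.15"

// The %% operator appends the Scala binary suffix, so these resolve to the
// spark-core_2.12 and spark-sql_2.12 artifacts published on Maven Central.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.4" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.2.4" % "provided"
)
```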