How can I tell when Foundry has updated Spark?
Question
There are very useful Spark features that are not yet available in the Spark version provided automatically by Foundry.
Spark upgrades don't seem to be announced on the Foundry Releases page.
There doesn't seem to be a way to view information about the latest Foundry libraries, e.g. transforms, and see what the pyspark dependency is.
The only way to find out seems to be trying to upgrade a repo:
- Use the Code Repositories Upgrade function to generate a Pull Request
- Check the pyspark version in transforms-python/conda-versions.run.linux-64.lock (a scripted version of this check is sketched after this list)
This has the side-effect of creating a meaningless Pull Request, so it isn't a good option.
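Once you have the lock file locally (pulled from the PR diff or a checkout), the version check itself can be scripted. Below is a minimal sketch, assuming the file uses conda's explicit URL-per-line lock format (entries ending in /<package>-<version>-<build>.conda or .tar.bz2); the helper name pyspark_version is hypothetical.

```python
import re


def pyspark_version(lock_path):
    """Return the pyspark version pinned in a conda lock file, or None.

    Assumes conda's explicit lock format, where each entry is a package
    URL such as:
    https://conda.anaconda.org/conda-forge/noarch/pyspark-3.4.1-pyhd8ed1ab_0.conda
    """
    with open(lock_path) as f:
        for line in f:
            match = re.search(r"/pyspark-(\d+(?:\.\d+)*)-", line)
            if match:
                return match.group(1)
    return None


if __name__ == "__main__":
    # Path taken from the question; adjust to wherever you saved the file.
    print(pyspark_version("transforms-python/conda-versions.run.linux-64.lock"))
```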
Answer 1

Score: 1
Instead of creating a dummy PR, you could create a new repo and check conda-versions.run.linux-64.lock, then delete that repo.
This isn't much better than what you already do, but at least it avoids the extra PR.
Comments