Is it possible to install pyRFC onto a Databricks Spark cluster?

Question

There is a PyPI package for pyRFC, but like other C-Python libraries it has a lot of dependencies and requires setting environment variables, etc.

Is it possible to install a C-Python library like pyRFC onto a Databricks cluster? If so, how would you go about including the SDK dependencies?

Perhaps someone has already tried with the Java version?

Answer 1

Score: 2

Yes, it's possible. It's usually done by attaching a cluster init script to the cluster. The task of the init script is to set up all necessary dependencies, compile libraries, install packages, etc. on all cluster nodes. Usually, people download their packages, put them on DBFS, and then access them from inside the init script via the /dbfs mount.

The script could look like this (just an example):

#!/bin/bash

# Unpack SAP SDK into some location
tar zxvf /dbfs/FileStore/SAP-SDK.tar.gz

# Install the pyRFC Python package
pip install pyrfc
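
Since the question specifically asks about the SDK dependencies and environment variables: pyRFC needs the SAP NW RFC SDK available on every node, with SAPNWRFC_HOME pointing at the SDK root and the SDK's lib directory resolvable by the dynamic linker. A slightly fuller sketch of such an init script is below. The DBFS path, the /usr/local/sap/nwrfcsdk install location, and the archive layout are assumptions to adapt to your own upload, and whether /etc/environment is picked up depends on how your processes are launched.

#!/bin/bash
set -e

# NOTE: the paths below are assumptions -- adjust them to where you
# uploaded the SDK archive and where you want the SDK to live on each node.
SDK_ARCHIVE=/dbfs/FileStore/SAP-SDK.tar.gz   # hypothetical DBFS location
SDK_HOME=/usr/local/sap/nwrfcsdk             # conventional SDK directory

# Unpack the SAP NW RFC SDK onto the node (assumes the archive contains
# a top-level nwrfcsdk/ directory)
mkdir -p /usr/local/sap
tar zxvf "$SDK_ARCHIVE" -C /usr/local/sap

# pyRFC expects SAPNWRFC_HOME to point at the SDK root, and the SDK's
# shared libraries must be visible to the dynamic linker.
echo "SAPNWRFC_HOME=$SDK_HOME" >> /etc/environment
echo "$SDK_HOME/lib" > /etc/ld.so.conf.d/nwrfcsdk.conf
ldconfig

# Install the Python bindings; SAPNWRFC_HOME must be set for the install
export SAPNWRFC_HOME=$SDK_HOME
pip install pyrfc

Alternatively, the same environment variables can be set in the cluster's environment variable configuration (under the cluster's advanced options) rather than written to /etc/environment from the script.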
