Databricks dbutils.fs.ls() error Py4JSecurityException: Constructor public com.databricks.backend.daemon.dbutils.FSUtilsParallel is not whitelisted

Question

When executing dbutils.fs.ls() to list files in Databricks, I get the error:

py4j.security.Py4JSecurityException: Constructor public com.databricks.backend.daemon.dbutils.FSUtilsParallel(org.apache.spark.SparkContext) is not whitelisted.

I also get the error when I try dbutils.fs.ls().

Unfortunately, setting spark.databricks.pyspark.enablePy4JSecurity to false also failed.

I have done some research; can someone advise whether any of the following will fix the issue:

  • Enabling credential passthrough for standard and high-concurrency clusters.

  • Configuring credential passthrough and initializing storage resources in ADLS accounts.

  • Accessing ADLS resources directly when credential passthrough is enabled.

  • Accessing ADLS resources through a mount point when credential passthrough is enabled.
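For context, the mount-point approach in the last bullet would look roughly like the sketch below. This is a hedged sketch, not a verified fix: the container, storage account, and mount point are placeholders, and `spark`/`dbutils` only exist inside a Databricks notebook.

```python
# Hedged sketch of mounting ADLS Gen2 with credential passthrough.
# The helper below is pure Python and runs anywhere; the mount call
# itself only works inside a Databricks notebook.

def passthrough_mount_configs(token_provider_class: str) -> dict:
    """Build the extra_configs dict for an ADLS Gen2 passthrough mount."""
    return {
        "fs.azure.account.auth.type": "CustomAccessToken",
        "fs.azure.account.custom.token.provider.class": token_provider_class,
    }

# Inside a Databricks notebook (requires spark and dbutils):
# configs = passthrough_mount_configs(
#     spark.conf.get("spark.databricks.passthrough.adls.gen2.tokenProviderClassName"))
# dbutils.fs.mount(
#     source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
#     mount_point="/mnt/passthrough",   # placeholder mount point
#     extra_configs=configs)
# display(dbutils.fs.ls("/mnt/passthrough"))
```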

Thanks

Answer 1

Score: 1

As per the Microsoft documentation:

> The error you are facing occurs because, when using a High Concurrency cluster with credential passthrough enabled, this problem arises in various library operations.
> It is raised when you call a method that Azure Databricks has not expressly designated as safe for Azure Data Lake Storage credential-passthrough clusters.

To work around this, you can:

  • Use a different cluster mode, if that is an option in your situation.
  • Set spark.databricks.pyspark.enableProcessIsolation to false; for this, your cluster needs to run in No Isolation Shared access mode.
  • In your case, use a Standard cluster as a workaround.
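Note that a setting like this cannot be changed from a running notebook with spark.conf.set(); it has to go into the cluster's Spark config (cluster settings > Advanced Options > Spark) before the cluster starts. A hedged sketch of the relevant line, applicable only on a cluster using the No Isolation Shared access mode:

```
spark.databricks.pyspark.enableProcessIsolation false
```

After editing the Spark config, restart the cluster and retry dbutils.fs.ls().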


huangapple
  • Published 2023-05-22 17:07:34
  • When reposting, please keep the original link: https://go.coder-hub.com/76304570.html