Azure ML and Data exfiltration prevention

Question


I'm trying to understand how data exfiltration prevention (DEP) works in Azure ML.

Microsoft's recommended architecture states that I must use a service endpoint along with a service endpoint policy to prevent ML compute subnets from reaching non-whitelisted storage accounts (https://learn.microsoft.com/en-us/azure/machine-learning/how-to-network-isolation-planning#recommended-architecture-with-data-exfiltration-prevention).
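For context, the service endpoint policy from that recommended architecture can be sketched in Bicep roughly as follows. This is only a sketch; the resource names and the parameter are placeholders, not taken from the Microsoft docs:

```bicep
// Sketch of a service endpoint policy that only allows traffic to a
// whitelisted storage account. Names and the parameter are hypothetical.
param allowedStorageAccountId string // resource ID of the approved storage account

resource sepPolicy 'Microsoft.Network/serviceEndpointPolicies@2023-04-01' = {
  name: 'ml-storage-sep-policy'
  location: resourceGroup().location
  properties: {
    serviceEndpointPolicyDefinitions: [
      {
        name: 'allow-approved-storage-only'
        properties: {
          service: 'Microsoft.Storage'
          // Compute in subnets this policy is attached to can only reach
          // the storage accounts listed here over the service endpoint.
          serviceResources: [
            allowedStorageAccountId
          ]
        }
      }
    ]
  }
}
```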

Some other examples I found on the web don't use service endpoints and instead prefer private endpoints for storage accounts. Does using PEs alone prevent data exfiltration? I'm not sure, because from what I've seen so far, it's possible to add any storage account as a datastore through the ML workspace as long as you have the appropriate access rights for the storage account.

So I'm a bit confused and would appreciate if someone could shed some light on this.

Answer 1

Score: 0


Using private endpoints alone may not prevent data exfiltration, but it does reduce the attack surface and the likelihood of exfiltration. It is recommended to use a combination of Azure Virtual Network, Azure Private Link, and Azure Policy to secure your Azure Machine Learning resources.
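As a rough sketch of how the service-endpoint part of that combination is wired up, a service endpoint policy is attached to the compute subnet alongside a Microsoft.Storage service endpoint, so outbound storage traffic is filtered to the whitelisted accounts. Names, address ranges, and the policy ID below are placeholders:

```bicep
// Sketch only: attaches an existing service endpoint policy to the ML
// compute subnet. Names, address ranges, and the policy ID are placeholders.
param sepPolicyId string // resource ID of an existing service endpoint policy

resource vnet 'Microsoft.Network/virtualNetworks@2023-04-01' = {
  name: 'ml-vnet'
  location: resourceGroup().location
  properties: {
    addressSpace: {
      addressPrefixes: [ '10.0.0.0/16' ]
    }
    subnets: [
      {
        name: 'ml-compute-subnet'
        properties: {
          addressPrefix: '10.0.1.0/24'
          // Route storage traffic over the service endpoint...
          serviceEndpoints: [
            { service: 'Microsoft.Storage' }
          ]
          // ...and filter it down to the whitelisted accounts.
          serviceEndpointPolicies: [
            { id: sepPolicyId }
          ]
        }
      }
    ]
  }
}
```

Without the attached policy, the service endpoint alone routes traffic to Azure Storage but does not restrict which storage accounts are reachable, which is why the policy is the piece that actually blocks exfiltration to non-whitelisted accounts.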

huangapple
  • Posted on 2023-02-26 21:27:18
  • Please keep this link when reposting: https://go.coder-hub.com/75572304.html