Referencing a data flow definition from a storage linked service in the Azure Python SDK for ADF
Question
I am using the Azure Python SDK for ADF, and I am looking to create a data flow dynamically.
I have data flow definitions in JSON format available on a linked storage account.
Is there a way I can refer to a definition from the storage account and create a data flow?
Something like the following:

adf_client.data_flows.create_or_update(location='path_to_dataflow_json')
Answer 1

Score: 1
- No, you cannot refer to the path of the JSON file directly. The definition has to be given in order for you to create or update the data flow.

- When I install the required packages and use Python's help on adf_client.data_flows.create_or_update, you can see what arguments it accepts:
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credentials = ClientSecretCredential(client_id='<client_id>', client_secret='<client_secret>', tenant_id='<tenant_id>')
adf_client = DataFactoryManagementClient(credentials, subscription_id)
help(adf_client.data_flows.create_or_update)
- As you can see in the help output above, the data flow definition requires an azure.mgmt.datafactory.models.DataFlowResource, but there is no parameter that accepts a path to the definition file.
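Since the SDK only accepts an in-memory model, one workaround is to download the JSON definition from the storage account yourself, deserialize it into a DataFlowResource, and pass that object to create_or_update. Below is a minimal sketch of that approach; the connection string, container name, blob name, resource group, factory name, and data flow name are placeholders, and it assumes the stored JSON follows the ARM resource shape (a top-level "properties" object containing the data flow definition):

import json

from azure.storage.blob import BlobClient
from azure.mgmt.datafactory.models import DataFlowResource

# Download the data flow definition from the linked storage account.
# The connection string, container name and blob name are placeholders.
blob = BlobClient.from_connection_string(
    conn_str='<storage_connection_string>',
    container_name='dataflows',
    blob_name='my_dataflow.json',
)
definition = json.loads(blob.download_blob().readall())

# Deserialize into the model the SDK expects. from_dict() is inherited
# from the SDK's base Model class and assumes the JSON uses the ARM
# resource shape (a top-level "properties" object).
data_flow = DataFlowResource.from_dict(definition)

# Create or update the data flow in the factory.
adf_client.data_flows.create_or_update(
    resource_group_name='<resource_group>',
    factory_name='<factory_name>',
    data_flow_name='my_dataflow',
    data_flow=data_flow,
)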
Comments