How to auto import new DAGs in Azure Data Factory Managed Airflow?
Question
Once we have imported files from a container in Storage, it would make sense to auto-import whenever a file is modified or removed, or when a new file appears in the container.
But currently this is a manual task, so if I modify a DAG in the container I have to go to Managed Airflow and run "Import files" again and again.
Is there a way to set up an automatic refresh/sync, or an API that can be used to build an equivalent solution?
Answer 1
Score: 3
You can use the import DAG REST API for automation:

POST https://management.azure.com/subscriptions/

Request body:

{
  "IntegrationRuntimeName": "Airflow1",
  "LinkedServiceName": "AzureBlobStorage1",
  "StorageFolderPath": "airflow/",
  "CopyFolderStructure": true,
  "Overwrite": true
}
Answer 2
Score: 1
To add to Abhishek Narain's correct answer, the import DAG REST API follows the standard HTTP 202 pattern, so the response headers contain a "Location" URI that can be polled with GET requests for the status of the import.
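A minimal polling sketch in Python, assuming the Location URI comes from the 202 response of the import call above; the retry interval, timeout, and the "200 means finished" check follow the generic Azure Resource Manager async pattern rather than anything documented specifically for Managed Airflow.

```python
# Poll the Location URI returned by the 202 response until the import finishes.
# Assumption: 202 = still running, 200 = completed (standard ARM async pattern).
import time
import requests
from azure.identity import DefaultAzureCredential


def wait_for_import(location_url: str, timeout_s: int = 300, interval_s: int = 10) -> dict:
    credential = DefaultAzureCredential()
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        token = credential.get_token("https://management.azure.com/.default")
        poll = requests.get(
            location_url,
            headers={"Authorization": f"Bearer {token.token}"},
            timeout=30,
        )
        if poll.status_code == 200:
            # Operation finished; return whatever status body the service sends.
            return poll.json() if poll.content else {}
        poll.raise_for_status()  # raise on 4xx/5xx; 202 falls through and retries
        time.sleep(interval_s)
    raise TimeoutError("DAG import did not complete within the timeout")


# Usage, where 'response' is the 202 response from the POST in the previous answer:
# result = wait_for_import(response.headers["Location"])
```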
Comments