Azure Data Factory Trigger json file: Add all pipelines in pipelines property
Question
Is there a value that can be entered that will apply the trigger to all pipelines in the branch? Currently I have to apply the trigger to each pipeline manually, and there will be more pipelines in the future; if there were a value that applies to all of them, this would automate the process. I have not been able to find anything about it in the Microsoft documentation.
I have added the current configuration below, with no pipelines attached yet.
{
    "name": "DailyTrigger",
    "properties": {
        "description": "This trigger will execute once a day at 8am.",
        "annotations": [ ],
        "runtimeState": "Started",
        "pipelines": [ ],
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2023-04-04T07:55:00",
                "timeZone": "GMT Standard Time",
                "schedule": {
                    "minutes": [
                        0
                    ],
                    "hours": [
                        8
                    ]
                }
            }
        }
    }
}
Nothing I have tried so far has worked, and I have not been able to find a wildcard character or anything similar that would do this.
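For reference, attaching a single pipeline manually adds one reference entry per pipeline to the pipelines array; a minimal example (the pipeline name here is just a placeholder) looks like this:
"pipelines": [
    {
        "pipelineReference": {
            "referenceName": "MyPipeline1",
            "type": "PipelineReference"
        },
        "parameters": {}
    }
]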
Answer 1
Score: 0
- The following is a trigger that I have with no pipelines attached.
You can achieve this requirement using the REST API. I ran the following Python code in an Azure Databricks notebook.
# install msal using !pip install msal for getting a bearer token
import msal
import requests
import json

tenant_id = '<tenant_id>'
client_id = '<client_id>'
client_secret = '<client_secret>'

app = msal.ConfidentialClientApplication(client_id=client_id, authority=f"https://login.microsoftonline.com/{tenant_id}", client_credential=client_secret)

# Get the token for the Azure management scope
scopes = ["https://management.azure.com/.default"]
result = app.acquire_token_for_client(scopes=scopes)
print(result)

headers = {'Authorization': 'Bearer ' + result['access_token'], 'Content-Type': 'application/json'}

# Set the endpoint URL for the trigger you want to update (replace the placeholders with your own values)
trigger_url = 'https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/triggers/{triggerName}?api-version=2018-06-01'

# Get the current trigger definition
trigger = requests.get(trigger_url, headers=headers).json()

# Attach the required pipelines to the trigger definition
trigger['properties']['pipelines'] = [{
    "parameters": {},
    "pipelineReference": {"referenceName": "pipeline1", "type": "PipelineReference"},
}, {
    "parameters": {},
    "pipelineReference": {"referenceName": "pipeline2", "type": "PipelineReference"},
}]

# Send a PUT request to update the trigger with the new pipeline references
response = requests.put(trigger_url, headers=headers, data=json.dumps(trigger)).json()
print(response)
- After executing the above code, I was able to attach the trigger to the required pipelines.
- If you want to further automate this process, use another REST API, https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelines?api-version=2018-06-01, which returns the details of all pipelines (GET method).
- If you want to start the trigger, you can use the API https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/triggers/{triggerName}/start?api-version=2018-06-01 (POST method). A short sketch combining these calls follows after this list.
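As a rough sketch of that automation, building on the variables from the code above and assuming the list-pipelines call returns the standard ARM response shape (a value array of pipeline objects, each with a name), you could attach every pipeline in the factory to the trigger and then start it:
# List every pipeline in the factory (GET) - assumes the standard ARM list response with a 'value' array
pipelines_url = 'https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelines?api-version=2018-06-01'
all_pipelines = requests.get(pipelines_url, headers=headers).json().get('value', [])

# Build one pipelineReference entry per pipeline and attach them all to the trigger
trigger = requests.get(trigger_url, headers=headers).json()
trigger['properties']['pipelines'] = [{
    "parameters": {},
    "pipelineReference": {"referenceName": p['name'], "type": "PipelineReference"},
} for p in all_pipelines]
requests.put(trigger_url, headers=headers, data=json.dumps(trigger))

# Start the trigger (POST)
start_url = 'https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/triggers/{triggerName}/start?api-version=2018-06-01'
requests.post(start_url, headers=headers)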
NOTE: Create a service principal first and add a role assignment for that app on your Azure Data Factory. I assigned it the Contributor role.