Set up e-mail notification on failed runs with DBX deployment

Question

I am deploying workflows to Databricks using DBX. Here I want to add a step that sends an e-mail to email123@email.com whenever the workflow fails. The outline of my deployment.yml file is as follows:
deployments:
  - name: my_workflow
    schedule:
      quartz_cron_expression: "0 0 5 * * ?"
      timezone_id: "Europe/Berlin"
    format: MULTI_TASK
    job_clusters:
      - job_cluster_key: "basic-job-cluster"
        <<: *base-job-cluster
    tasks:
      - task_key: "my-task"
        job_cluster_key: "basic-job-cluster"
        spark_python_task:
          python_file: "file://my_file.py"
    << Insert notification code here >>
I have not been able to find documentation about this, so if you can point me to it I will also be happy.
Answer 1

Score: 0

I found the solution in the DBX documentation. For others looking for this, add the following:
email_notifications:
  on_failure: ["user@email.com"]
Link to documentation: https://dbx.readthedocs.io/en/latest/reference/deployment/
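Since a mistyped recipient address would go unnoticed until a run actually fails, it can be worth sanity-checking the notification block before deploying. Below is a minimal sketch: the dict mirrors what the YAML above looks like once parsed, and the regex is a deliberately loose, hypothetical check (not a full RFC 5322 validator).

```python
import re

# The email_notifications block as it looks after the deployment YAML
# is parsed (hypothetical recipients; replace with your own).
email_notifications = {
    "on_failure": ["user@email.com"],
}

def find_bad_recipients(notifications: dict) -> list:
    """Return "event: address" strings for entries that do not look like e-mails."""
    pattern = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    bad = []
    for event, recipients in notifications.items():
        for addr in recipients:
            if not pattern.match(addr):
                bad.append("%s: %s" % (event, addr))
    return bad

print(find_bad_recipients(email_notifications))  # -> []
```

An empty list means every configured address at least looks deliverable; anything returned points at the event key and the offending entry.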
Answer 2

Score: 0

Update your deployment.yml file as follows:
deployments:
  - name: my_workflow
    email_notifications:
      on_start: ["user@email.com"]
      on_success: ["user@email.com"]
      on_failure: ["user@email.com"]
    schedule:
      quartz_cron_expression: "0 0 5 * * ?"
      timezone_id: "Europe/Berlin"
    format: MULTI_TASK
    job_clusters:
      - job_cluster_key: "basic-job-cluster"
        <<: *base-job-cluster
    tasks:
      - task_key: "my-task"
        job_cluster_key: "basic-job-cluster"
        spark_python_task:
          python_file: "file://my_file.py"
Comments