How to pass argument values to the databricks_global_init_script resource

Question

I am trying to call datadog-install-driver-workers.sh using the Terraform resource databricks_global_init_script. This script requires two input values, DD_API_KEY and DD_ENV. How do I pass these values along with the source script path?

resource "databricks_global_init_script" "init1" {
  source = "${path.module}/datadog-install-driver-workers.sh"
  name   = "my init script"
}


Answer 1

Score: 1

I would go a different route - instead of requiring each job and cluster to specify the necessary values, you can use the templatefile function to substitute the necessary values into the script, like this:

locals {
  script_path = "${path.module}/datadog-install-driver-workers.sh"
  params = {
    DD_ENV     = "dev"
    DD_API_KEY = "aaaaa"
  }
}

resource "databricks_global_init_script" "init" {
  name           = "datadog script"
  content_base64 = base64encode(templatefile(local.script_path, local.params))
}

with the script template as follows:

#!/bin/bash
#

DD_ENV="${DD_ENV}"
DD_API_KEY="${DD_API_KEY}"

echo "Some code that outputs $${DD_ENV}"

This will generate the script correctly, with the parameter values substituted in.
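For illustration, this is roughly what the rendered script looks like after templatefile substitutes the example params above (DD_ENV = "dev", DD_API_KEY = "aaaaa"); note that the escaped $${DD_ENV} survives as a runtime shell expansion:

```shell
#!/bin/bash
# Hypothetical rendered output of templatefile() for the template above.
# Terraform has already replaced ${DD_ENV} and ${DD_API_KEY} with the
# values from local.params; the escaped $${DD_ENV} was emitted as a
# plain shell expansion that runs when the init script executes.
DD_ENV="dev"
DD_API_KEY="aaaaa"

echo "Some code that outputs ${DD_ENV}"
```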

The only thing you need to take into account is that you may need to escape shell variable substitutions that use the same syntax as Terraform: ${var} becomes $${var} - see the documentation.


Answer 2

Score: 0

Init scripts do not take any parameters. Your code is fine as is.

The Datadog docs mention that DD_API_KEY and DD_ENV are expected to be set as environment variables. Those can be defined during cluster creation. In the UI, this is done under Advanced Options -> Spark -> Environment variables (docs). Both the Cluster API and the Terraform databricks_cluster resource also support a spark_env_vars parameter that you can use. For example, with the Cluster API this would be the relevant payload:

{
  "cluster_name": "my-cluster",
  "spark_env_vars": {
    "DD_API_KEY": "blahblah",
    "DD_ENV": "blah"
  },
  [...other attributes...]
}
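The same environment variables can be set from Terraform as well; here is a minimal sketch using the databricks_cluster resource (the cluster name, Spark version, and node type below are placeholders):

```hcl
resource "databricks_cluster" "my_cluster" {
  cluster_name  = "my-cluster"
  spark_version = "13.3.x-scala2.12" # placeholder runtime version
  node_type_id  = "i3.xlarge"        # placeholder node type
  num_workers   = 1

  # Environment variables that the Datadog init script reads at startup
  spark_env_vars = {
    DD_API_KEY = "blahblah"
    DD_ENV     = "blah"
  }
}
```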

Note that you can use cluster policies to enforce that specific environment variables are always defined.
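As a sketch, a cluster policy that pins these variables could look like this (assuming the standard cluster-policy JSON format with dotted attribute paths; the values are placeholders):

```json
{
  "spark_env_vars.DD_API_KEY": {
    "type": "fixed",
    "value": "blahblah"
  },
  "spark_env_vars.DD_ENV": {
    "type": "fixed",
    "value": "blah"
  }
}
```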


huangapple
  • Posted on 2023-05-14 04:32:49
  • Please retain this link when republishing: https://go.coder-hub.com/76244758.html