Implement S3 bucket depending on stage

Question

I want to use a variable S3 bucket name that depends on the deployment stage in my Python Lambda.
This is my code:

    import boto3
    from calendar import monthrange

    def count_objects(year, last_month):
        # Count the objects under the given year/month prefix
        amount_s3objects = 0
        response = boto3.client('s3').list_objects_v2(
            Bucket='my-bucket_name',
            Prefix=f'xxx/xxx/year={year}/month={last_month}')
        for obj in response['Contents']:
            amount_s3objects += 1
        # Number of days in that month
        days_of_last_month = monthrange(year, last_month)[1]
        return amount_s3objects, days_of_last_month
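
Note that list_objects_v2 returns at most 1,000 keys per call, so the loop above only counts the first page. A paginator-based variant would count everything; this is just a sketch mirroring the code above (count_objects_paginated and the prefix layout are illustrative):

    import boto3
    from calendar import monthrange

    def count_objects_paginated(year, last_month):
        # Paginate so prefixes with more than 1,000 objects are counted fully
        paginator = boto3.client('s3').get_paginator('list_objects_v2')
        pages = paginator.paginate(
            Bucket='my-bucket_name',
            Prefix=f'xxx/xxx/year={year}/month={last_month}')
        amount_s3objects = sum(len(page.get('Contents', [])) for page in pages)
        return amount_s3objects, monthrange(year, last_month)[1]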

Now, when I deploy to dev the bucket should be "my-bucket_name-test", and when deploying to prod it should be "my-bucket_name-prod".
I've already tried to do it with an event, but I don't know which event should trigger it, because the function only counts the objects under the S3 prefix.
This is my serverless.yaml, if that helps:

    service: name
    frameworkVersion: '3'
    configValidationMode: error

    custom:
      pythonRequirements:
        dockerizePip: true
      stage: ${opt:stage, self:provider.stage}
      profile:
        dev: dev
        prod: prod
      deploymentBucket:
        dev: deployment-bucket-test
        prod: deployment-bucket
      serverlessIfElse:
        - If: '"${self:custom.stage}" == "prod"'
          Set:
            provider.s3.athena-result-bucket.name: athena-results
            provider.s3.my-bucket_name.name: my-bucket_name-prod

    provider:
      name: aws
      runtime: python3.9
      region: eu-central-1
      stage: dev
      profile: ${self:custom.profile.${self:custom.stage}}
      deploymentMethod: direct
      deploymentBucket:
        name: ${self:custom.deploymentBucket.${self:custom.stage}}
      s3:
        athena-result-bucket:
          name: test-athena-results
        my-bucket_name-bucket:
          name: my-bucket_name-test
      # glue:
      #   reporting-databse:

    functions:
      lambda_handler:
        handler: handler.lambda_handler
        timeout: 300
        events:
          - eventBridge:
              schedule: cron(0 0 ? * 4#1 *)
          - s3:
              bucket: athena-result-bucket
              event: s3:ObjectCreated:*
              # rules:
              #   - prefix: "D_CSV/"
              #   - suffix: ".7z"
          - s3:
              bucket: my-bucket_name
              event: s3:ObjectCreated:*
              # rules:
              #   - prefix: "D_CSV/"
              #   - suffix: ".7z"

    package:
      exclude:
        - node_modules/**

    plugins:
      - serverless-python-requirements
      - serverless-plugin-ifelse
      - serverless-better-credentials

Answer 1

Score: 2

This will resolve your Lambda concern:

    import os

    def lambda_handler(event, context):
        # Read the stage-dependent bucket name from the environment
        bucket_name = os.environ.get('BUCKET_NAME')
        # Your other code here...
        # Call count_objects with the bucket_name variable:
        result = count_objects(bucket_name, year, last_month)
        # Rest of your code...
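
This assumes count_objects is changed to take the bucket name as its first parameter; a minimal sketch based on the question's function:

    import boto3
    from calendar import monthrange

    def count_objects(bucket_name, year, last_month):
        # Same logic as in the question, but the bucket comes from the caller
        response = boto3.client('s3').list_objects_v2(
            Bucket=bucket_name,
            Prefix=f'xxx/xxx/year={year}/month={last_month}')
        return len(response.get('Contents', [])), monthrange(year, last_month)[1]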

Use serverless-dotenv-plugin to work with environment variables.
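
Alternatively, you can define the variable directly in serverless.yaml without the plugin; a sketch (the bucket-name pattern here is illustrative):

    provider:
      environment:
        # Available in the Lambda as os.environ['BUCKET_NAME']
        BUCKET_NAME: my-bucket_name-${opt:stage, 'dev'}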

Now regarding how to handle the stage dynamically: when deploying the code, add the --stage parameter to the CLI command:

    sls deploy --stage dev
    sls deploy --stage prod

Also, you would name your bucket like this:

    BucketName: yourbucketname-${opt:stage, self:custom.defaultStage}
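
For the self:custom.defaultStage fallback to resolve, it needs to be defined under custom; a minimal sketch:

    custom:
      defaultStage: dev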
