Implement s3 bucket depending on stage

Question


I want to implement a variable S3 bucket which depends on the deployment stage in my Python Lambda.
This is my code:

import boto3
from calendar import monthrange

def count_objects(year, last_month):
    response = boto3.client('s3').list_objects_v2(
        Bucket='my-bucket_name',
        Prefix=f'xxx/xxx/year={year}/month={last_month}'
    )
    # 'Contents' is absent from the response when the prefix matches nothing
    amount_s3objects = len(response.get('Contents', []))
    days_of_last_month = monthrange(year, last_month)[1]
    return amount_s3objects, days_of_last_month

Now, when I deploy to dev, the bucket should be "my-bucket_name-test", and when deploying to prod, the bucket should be "my-bucket_name-prod".
I've already tried to do it with an event, but I don't know which event should trigger it, because the function only counts the objects under the S3 prefix.
This is my serverless.yaml, if that helps:

service: name

frameworkVersion: '3'
configValidationMode: error

custom:
  pythonRequirements:
    dockerizePip: true
  stage: ${opt:stage, self:provider.stage}
  profile:
    dev: dev
    prod: prod 
  deploymentBucket:
    dev: deployment-bucket-test
    prod: deployment-bucket
  serverlessIfElse:
    - If: '"${self:custom.stage}" == "prod"'
      Set:
        provider.s3.athena-result-bucket.name: athena-results
        provider.s3.my-bucket_name.name: my-bucket_name-prod

provider:
  name: aws
  runtime: python3.9
  region: eu-central-1
  stage: dev
  profile: ${self:custom.profile.${self:custom.stage}}
  deploymentMethod: direct
  deploymentBucket:
    name: ${self:custom.deploymentBucket.${self:custom.stage}}
  s3:
    athena-result-bucket:
      name: test-athena-results
    my-bucket_name-bucket:   
      name: my-bucket_name-test

  # glue:
  #   reporting-databse:

functions:
  lambda_handler:
    handler: handler.lambda_handler
    timeout: 300
    events:
      - eventBridge:
          schedule: cron(0 0 ? * 4#1 *)
      - s3:
          bucket: athena-result-bucket
          event: s3:ObjectCreated:*
          # rules:
          #   - prefix: "D_CSV/"
          #   - suffix: ".7z"
      - s3:
          bucket: my-bucket_name
          event: s3:ObjectCreated:*
          # rules:
          #   - prefix: "D_CSV/"
          #   - suffix: ".7z"
          
package:
  exclude:
    - node_modules/**

plugins:
  - serverless-python-requirements
  - serverless-plugin-ifelse
  - serverless-better-credentials



Answer 1

Score: 2


The following will resolve your Lambda concern:

import os

def lambda_handler(event, context):
    bucket_name = os.environ.get('BUCKET_NAME')
    # Your other code here...
    # Call count_objects function with the bucket_name variable:
    result = count_objects(bucket_name, year, last_month)
    # Rest of your code...
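For that call to work, `count_objects` would also have to take the bucket name as a parameter instead of hard-coding it. Here is a minimal sketch of the adapted function; the injected `s3_client` parameter is my addition for testability, not part of the original code:

```python
from calendar import monthrange

def count_objects(s3_client, bucket_name, year, last_month):
    # List the objects under the month prefix in the stage-specific bucket
    response = s3_client.list_objects_v2(
        Bucket=bucket_name,
        Prefix=f'xxx/xxx/year={year}/month={last_month}'
    )
    # 'Contents' is absent from the response when the prefix matches nothing
    amount_s3objects = len(response.get('Contents', []))
    # monthrange returns (weekday of the 1st, number of days in the month)
    days_of_last_month = monthrange(year, last_month)[1]
    return amount_s3objects, days_of_last_month
```

In the handler this would be called as `count_objects(boto3.client('s3'), bucket_name, year, last_month)`. Note that `list_objects_v2` returns at most 1,000 keys per call, so a paginator would be needed if the prefix can hold more objects than that.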

Use serverless-dotenv-plugin to work with environment variables.
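With that plugin, the value can come from per-stage dotenv files. A sketch, assuming the plugin is configured to resolve the file for the active stage (the file names and values here are illustrative, and naming conventions vary by plugin version, so check its documentation):

```
# .env.dev
BUCKET_NAME=my-bucket_name-test

# .env.prod
BUCKET_NAME=my-bucket_name-prod
```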

Now regarding how to handle the stage dynamically: when deploying the code, add the --stage parameter to the CLI command:

sls deploy --stage dev
sls deploy --stage prod

Also you would name your bucket like this:

BucketName: yourbucketname-${opt:stage, self:custom.defaultStage}
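Alternatively, the stage-to-bucket mapping can live directly in serverless.yaml and be exposed to the Lambda through `provider.environment`. A sketch; the `bucketName` map and `defaultStage` key are illustrative additions, not part of the original config:

```yaml
custom:
  defaultStage: dev
  bucketName:
    dev: my-bucket_name-test
    prod: my-bucket_name-prod

provider:
  environment:
    # Available in the Lambda as os.environ['BUCKET_NAME']
    BUCKET_NAME: ${self:custom.bucketName.${opt:stage, self:custom.defaultStage}}
```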

huangapple
  • Posted on July 24, 2023 at 17:23:21
  • Original link: https://go.coder-hub.com/76753061.html