How to export multiple CloudWatch logs into one text file in an S3 bucket

Question

I'm developing a Lambda function that is supposed to send the messages from multiple CloudWatch logs to one single text file stored in S3.
Here's the Python code I've written:

import boto3, json

client = boto3.client('logs')
s3client = boto3.client('s3')

def lambda_handler(event, context):
    stream_response = client.describe_log_streams(
        logGroupName = '/LogGroup',
        orderBy = 'LastEventTime'
    )
    
    for log_stream in stream_response['logStreams']:
        latestlogStreamName = log_stream['logStreamName']
        
        print(latestlogStreamName)
        
        response = client.get_log_events(
            logGroupName = '/LogGroup',
            logStreamName = latestlogStreamName
        )
        
        print(json.dumps(response, indent = 4))
        
        for log_events in response['events']:
            log_messages = log_events['message']
            #print(log_messages)
            
            s3client.put_object(
                Body = log_messages,
                Bucket = 'S3_Bucket',
                Key = 'Bucket_Object/TextFile.txt'
            )

It does work, but when I go to S3 and check that file in the bucket, I only see the latest message. I then noticed that the file has older versions, each containing a separate message.

Is there a way I can export all the messages into one single text file in S3?
How should I export the messages properly?


Answer 1

Score: 1


s3client.put_object overwrites the object. S3 doesn't allow appending to existing objects, so you need to collect all the logs you query from CloudWatch first and then create a single new file from them.

I would also suggest disabling versioning, unless you have a business/legal requirement to keep it enabled.

huangapple
  • Published 2023-02-27 13:40:02
  • Please keep this link when reposting: https://go.coder-hub.com/75577071.html