To sync AWS Batch job logs to CloudWatch, you can trigger a Lambda function when a job finishes (for example, via an EventBridge rule that matches Batch job state-change events) and have that function forward the job's output to CloudWatch Logs. A simple Lambda function might look like this:
import base64
import json
import time

import boto3

def lambda_handler(event, context):
    print("Received event: " + json.dumps(event, indent=2))

    # Assumed event shape: the trigger delivers the job's base64-encoded
    # output under 'outputPayloads', plus the job name and ID.
    output = event['outputPayloads'][0]['data']
    log_stream_name = event['jobName'] + '/' + event['jobId']

    client = boto3.client('logs')

    # Creating a stream that already exists raises an error, so ignore it.
    try:
        client.create_log_stream(
            logGroupName='/aws/batch/job',
            logStreamName=log_stream_name)
    except client.exceptions.ResourceAlreadyExistsException:
        pass

    response = client.put_log_events(
        logGroupName='/aws/batch/job',
        logStreamName=log_stream_name,
        logEvents=[
            {
                'timestamp': int(round(time.time() * 1000)),
                'message': str(base64.b64decode(output), 'utf-8')
            }
        ]
    )
    return response
The code above handles the job-completion event, decodes the job's output, creates a CloudWatch log stream named after the job, and pushes the output to it.
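To invoke the Lambda on job completion, you can attach it as the target of an EventBridge rule that matches Batch job state changes. A minimal sketch of the rule's event pattern, with placeholder rule and target names (the boto3 calls are shown but not executed here because they require AWS credentials):

```python
import json

# Event pattern matching finished AWS Batch jobs. Attaching the Lambda
# above as this rule's target delivers the completion event to it.
event_pattern = {
    "source": ["aws.batch"],
    "detail-type": ["Batch Job State Change"],
    "detail": {"status": ["SUCCEEDED", "FAILED"]},
}

# Wiring it up with boto3 would look roughly like (not executed here;
# "batch-job-complete" and lambda_arn are placeholders):
#   events = boto3.client("events")
#   events.put_rule(Name="batch-job-complete",
#                   EventPattern=json.dumps(event_pattern))
#   events.put_targets(Rule="batch-job-complete",
#                      Targets=[{"Id": "1", "Arn": lambda_arn}])
print(json.dumps(event_pattern))
```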
You also need to configure logging in the AWS Batch job definition. Adding the following snippet sets the job definition's log driver to awslogs, the CloudWatch Logs driver:
"logConfiguration": {
"logDriver": "awslogs",
"options": {
"awslogs-group": "/aws/batch/job",
"awslogs-region": "",
"awslogs-stream-prefix": "log-stream"
}
}
In the example above, logs from all AWS Batch jobs are stored in the /aws/batch/job log group with the stream prefix "log-stream"; set awslogs-region to the region where the log group lives.
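For reference, this is roughly how the log configuration above fits into a job definition registered with boto3. The job-definition name, image, and resource sizes are placeholders, and the actual API call is commented out since it needs AWS credentials:

```python
# Sketch of a Batch job definition carrying the awslogs configuration
# from the snippet above. All names and sizes are illustrative.
job_def = {
    "jobDefinitionName": "my-job-def",
    "type": "container",
    "containerProperties": {
        "image": "amazonlinux",
        "vcpus": 1,
        "memory": 512,
        "command": ["echo", "hello"],
        "logConfiguration": {
            "logDriver": "awslogs",
            "options": {
                "awslogs-group": "/aws/batch/job",
                "awslogs-region": "us-east-1",  # placeholder region
                "awslogs-stream-prefix": "log-stream",
            },
        },
    },
}

# The actual registration call (not executed here):
#   batch = boto3.client("batch")
#   batch.register_job_definition(**job_def)
```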
Once this is set up, you can track Batch job execution as it happens, with the output logs pushed to CloudWatch.
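To confirm the logs are arriving, you can list the streams under the configured log group. A hedged sketch (the describe call is commented out because it needs AWS credentials; stream names for Batch container jobs start with the configured prefix):

```python
# Parameters for listing the Batch job log streams created above.
params = {
    "logGroupName": "/aws/batch/job",
    "logStreamNamePrefix": "log-stream",
}

# Not executed here:
#   logs = boto3.client("logs")
#   for stream in logs.describe_log_streams(**params)["logStreams"]:
#       print(stream["logStreamName"])
```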