To solve this, consider using VPC Flow Logs, a feature that records metadata about every traffic flow that passes through your VPC. When delivered to a CloudWatch Logs log group, each flow log record includes the source IP address, destination IP address, source port, destination port, protocol, the number of packets and bytes transferred, and the start and end timestamps of the flow.
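Flow logs first have to be enabled on the VPC and pointed at a log group. The following is a minimal sketch of that prerequisite step, assuming the /aws/vpc/flow-logs/[VPC_ID] naming used later, a VPC_ID environment variable, and a hypothetical FLOW_LOGS_ROLE_ARN variable holding the IAM role that allows the flow log to write to CloudWatch Logs:

import boto3
import os

ec2 = boto3.client('ec2')
logs = boto3.client('logs')

vpc_id = os.environ['VPC_ID']
log_group_name = '/aws/vpc/flow-logs/' + vpc_id  # assumed naming convention

# Create the destination log group, ignoring the error if it already exists
try:
    logs.create_log_group(logGroupName=log_group_name)
except logs.exceptions.ResourceAlreadyExistsException:
    pass

# Enable flow logs on the VPC, capturing both accepted and rejected traffic
ec2.create_flow_logs(
    ResourceIds=[vpc_id],
    ResourceType='VPC',
    TrafficType='ALL',
    LogDestinationType='cloud-watch-logs',
    LogGroupName=log_group_name,
    DeliverLogsPermissionArn=os.environ['FLOW_LOGS_ROLE_ARN'],  # hypothetical env var
)

Once records are being delivered to that log group, they can be queried with CloudWatch Logs Insights. Here is a code example that queries the VPC Flow Logs from a Lambda function: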
import boto3
import json
import logging
import os
import time

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    client = boto3.client('logs')

    # Log group that receives the VPC Flow Logs
    log_group_name = '/aws/vpc/flow-logs/' + os.environ['VPC_ID']

    # CloudWatch Logs Insights query: pull the per-flow fields and keep
    # only records that actually carry a source port
    query_string = (
        'fields @timestamp, srcAddr, dstAddr, srcPort, dstPort, protocol, bytes '
        '| filter ispresent(srcPort) '
        '| sort @timestamp desc'
    )

    # Query window: the last hour (epoch seconds)
    end_query = int(time.time())
    start_query = end_query - 3600

    # Start the Insights query against the flow-log group
    response = client.start_query(
        logGroupName=log_group_name,
        startTime=start_query,
        endTime=end_query,
        queryString=query_string,
        limit=100
    )
    query_id = response['queryId']
    logger.info(f'VPC Flow Logs query initiated with Query ID: {query_id}')

    # Insights queries run asynchronously: poll until the query finishes
    results = client.get_query_results(queryId=query_id)
    while results['status'] in ('Scheduled', 'Running'):
        time.sleep(1)
        results = client.get_query_results(queryId=query_id)

    # Each result row is a list of {'field': ..., 'value': ...} pairs
    for record in results['results']:
        logger.info(json.dumps({item['field']: item['value'] for item in record}))
In the code above, we point the query at the /aws/vpc/flow-logs/[VPC_ID] log group that receives the flow logs, and use a CloudWatch Logs Insights query that filters on ispresent(srcPort) to retrieve every flow record that carries a source port, whatever that port is. Because Insights queries run asynchronously, we poll get_query_results() until the query completes and then log each returned record.
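If the goal is to see how much traffic each port carries rather than list individual flow records, the queryString can be swapped for an aggregation. This is just a sketch of an alternative Insights query; the field names (bytes, dstPort, protocol) rely on CloudWatch Logs Insights' automatic field discovery for VPC flow logs delivered to CloudWatch:

# Aggregate total bytes per destination port and protocol over the query window
query_string = (
    'stats sum(bytes) as totalBytes by dstPort, protocol '
    '| sort totalBytes desc '
    '| limit 20'
)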