Amazon S3 Transfer Acceleration can improve upload speed. It routes uploads through Amazon CloudFront's globally distributed edge locations to maximize transfer throughput, which is especially useful for long-distance transfers and large file uploads.
Code example:
import boto3
from botocore.config import Config

# Acceleration is requested by sending the upload to the accelerate endpoint;
# it is not a valid ExtraArgs parameter of upload_file
s3_client = boto3.client('s3', config=Config(s3={'use_accelerate_endpoint': True}))

file_path = 'path/to/local/file'
s3_client.upload_file(
    file_path,
    'bucket_name',
    'object_key',
    ExtraArgs={
        'ACL': 'public-read',
        'ServerSideEncryption': 'AES256',
        'StorageClass': 'STANDARD_IA',
    }
)
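Transfer Acceleration also has to be turned on for the bucket itself (a one-time setting) before the accelerate endpoint will accept requests. A minimal sketch, assuming the same 'bucket_name' placeholder and permissions to change bucket settings:
import boto3

s3_client = boto3.client('s3')

# One-time bucket setting: enable Transfer Acceleration for 'bucket_name'
s3_client.put_bucket_accelerate_configuration(
    Bucket='bucket_name',
    AccelerateConfiguration={'Status': 'Enabled'}
)

# Confirm the current status ('Enabled' or 'Suspended')
status = s3_client.get_bucket_accelerate_configuration(Bucket='bucket_name')
print(status.get('Status'))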
Splitting a large file into multiple parts and uploading them separately can shorten upload time and increase throughput. Compressing the file beforehand also reduces its size, which speeds up the upload further (see the compression sketch after the example below).
Code example:
import boto3
import os

s3_client = boto3.client('s3')

file_path = 'path/to/local/file'
file_size = os.path.getsize(file_path)
part_size = 5 * 1024 * 1024  # 5 MiB, the minimum size for every part except the last

if file_size > part_size:
    # Start a multipart upload and send the file in 5 MiB chunks
    mpu = s3_client.create_multipart_upload(Bucket='bucket_name', Key='object_key')
    parts = []
    part_number = 1
    with open(file_path, 'rb') as f:
        file_part = f.read(part_size)
        while file_part:
            response = s3_client.upload_part(Bucket='bucket_name', Key='object_key',
                                             UploadId=mpu['UploadId'], PartNumber=part_number,
                                             Body=file_part)
            parts.append({'ETag': response['ETag'], 'PartNumber': part_number})
            part_number += 1
            file_part = f.read(part_size)
    # Ask S3 to assemble the uploaded parts into a single object
    s3_client.complete_multipart_upload(Bucket='bucket_name', Key='object_key',
                                        UploadId=mpu['UploadId'],
                                        MultipartUpload={'Parts': parts})
else:
    # Small files can be uploaded in a single request
    s3_client.upload_file(file_path, 'bucket_name', 'object_key')
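For the compression step mentioned above, the file can be gzip-compressed locally before uploading. This is a minimal sketch using Python's standard gzip and shutil modules; the compressed_path variable and the '.gz' object key are illustrative placeholders, not names from the original example:
import gzip
import shutil

import boto3

file_path = 'path/to/local/file'
compressed_path = file_path + '.gz'  # illustrative name for the compressed copy

# Compress the file locally so that less data is sent over the network
with open(file_path, 'rb') as src, gzip.open(compressed_path, 'wb') as dst:
    shutil.copyfileobj(src, dst)

# Upload the compressed copy instead of the original
s3_client = boto3.client('s3')
s3_client.upload_file(compressed_path, 'bucket_name', 'object_key.gz')
As an alternative to the hand-rolled loop above, boto3's upload_file also switches to multipart uploads automatically once a file exceeds the threshold configured in boto3.s3.transfer.TransferConfig, which is usually simpler for routine uploads.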