In your Airflow DAG code, you can obtain a boto3 S3 client from your AWS connection through an AwsBaseHook:
from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

hook = AwsBaseHook(aws_conn_id='aws_default')
# Returns a boto3 client for the requested service
s3_client = hook.get_client_type(client_type='s3')
Then try adding the following code to your Python file:
from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

# Optional: inspect the credentials resolved from the connection
credential_hook = AwsBaseHook(aws_conn_id='aws_default')
credentials = credential_hook.get_credentials()

s3_hook = S3Hook(aws_conn_id='aws_default')
s3_client = s3_hook.get_conn()  # underlying boto3 S3 client

s3_client.download_file(
    Bucket='??',
    Key='??',
    Filename='??'
)
Note that you need to replace each ?? with your S3 bucket name, object key, and local filename. Also, before using this code, make sure the AWS_DEFAULT_REGION, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY environment variables are set, or that the aws_default connection is configured in Airflow.
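As one way to provide those variables, here is a minimal shell sketch; the key values and region are placeholders, not real credentials:

```shell
# Placeholder values -- substitute your own credentials and region
export AWS_ACCESS_KEY_ID='AKIAEXAMPLE'
export AWS_SECRET_ACCESS_KEY='your-secret-key'
export AWS_DEFAULT_REGION='us-east-1'

# Alternatively, Airflow can read the whole connection from an
# AIRFLOW_CONN_<CONN_ID> environment variable as a URI
# (URL-encode the secret key if it contains / or + characters):
export AIRFLOW_CONN_AWS_DEFAULT='aws://AKIAEXAMPLE:url-encoded-secret@'
```

With the AIRFLOW_CONN_AWS_DEFAULT variant, the hooks above resolve the credentials from the URI and no separate connection entry is needed in the Airflow UI.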