BigQuery load and transfer job file limits. BigQuery load and transfer jobs are subject to limits on both the size and the number of input files. To work around these limits, you can split large files into smaller pieces, or stage the data in Cloud Storage and load it from there with a single wildcard URI.
Code examples:
Using Cloud Storage as an intermediary:
from google.cloud import bigquery

client = bigquery.Client()

# Configure the load job: newline-delimited JSON, overwrite the table,
# and create it if it does not exist yet.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    create_disposition=bigquery.CreateDisposition.CREATE_IF_NEEDED,
)

# Fully qualified destination table.
table_id = 'my_project.my_dataset.my_table'

# Wildcard URI: a single load job reads every matching file.
uri = 'gs://my_bucket/my_file_*.json'

# Load the data from Cloud Storage.
load_job = client.load_table_from_uri(
    uri,
    table_id,
    job_config=job_config,
)

# Wait for the job to complete; result() raises on error.
load_job.result()
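Loading many files through one wildcard URI consumes a single load job, which matters because BigQuery limits how many load jobs a table can receive per day. If the source files exist only on a local machine, they must first be staged in Cloud Storage before the wildcard load can pick them up. Below is a minimal sketch using the google-cloud-storage client; the bucket name and file names are placeholder assumptions chosen to match the wildcard above:
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket('my_bucket')

# Upload each local chunk; the object names match the
# gs://my_bucket/my_file_*.json wildcard used by the load job above.
for i in range(10):
    filename = 'my_file_{0}.json'.format(i)
    blob = bucket.blob(filename)
    blob.upload_from_filename(filename)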
Splitting files:
from google.cloud import bigquery

client = bigquery.Client()

# Configure the load job. Use WRITE_APPEND here: with WRITE_TRUNCATE,
# each job in the loop would overwrite the rows loaded by the previous
# one. Truncate or recreate the table beforehand if you need a clean load.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    create_disposition=bigquery.CreateDisposition.CREATE_IF_NEEDED,
)

# Fully qualified destination table.
table_id = 'my_project.my_dataset.my_table'

# Start one load job per file.
load_jobs = []
for i in range(10):
    filename = 'my_file_{0}.json'.format(i)
    with open(filename, 'rb') as source_file:
        load_job = client.load_table_from_file(
            source_file,
            table_id,
            job_config=job_config,
        )
    load_jobs.append(load_job)

# Wait for all jobs to complete; result() raises on error.
for load_job in load_jobs:
    load_job.result()
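The loop above assumes the data has already been split into my_file_0.json through my_file_9.json. As a hypothetical helper (not part of the original example), a large newline-delimited JSON file can be split safely on line boundaries, since each line is a complete record; the chunk size and naming scheme below are assumptions:
def split_ndjson(path, lines_per_chunk=100000):
    """Split a newline-delimited JSON file into numbered chunks.

    Hypothetical helper: lines_per_chunk and the my_file_N.json
    naming scheme are assumptions, not from the original example.
    """
    chunk_index = 0
    out = None
    with open(path, 'rb') as source:
        for line_number, line in enumerate(source):
            # Start a new chunk every lines_per_chunk lines.
            if line_number % lines_per_chunk == 0:
                if out is not None:
                    out.close()
                out = open('my_file_{0}.json'.format(chunk_index), 'wb')
                chunk_index += 1
            out.write(line)
    if out is not None:
        out.close()

split_ndjson('my_large_file.json')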