"403 Exceeded rate limits: too many table update operations for this table" when uploading just 2 rows every hour #375
Comments
Is this specific to …
I'm only using pandas-gbq to push rows to a BigQuery table.
Can you share more about your workload? Is the schema changing? How many …
You'll have the best results if you can minimize the number of calls to …
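The batching advice above can be sketched as follows. The DataFrames, table name, and project ID are all hypothetical, and the actual `to_gbq` call is commented out since it needs BigQuery credentials; the point is one load job per hour instead of several, since each load job counts against the per-table update quota:

```python
import pandas as pd

# Hypothetical per-source batches that might otherwise each trigger
# their own to_gbq() call -- every call is a separate load job and
# counts against the per-table update quota.
batch_a = pd.DataFrame({"ts": ["2021-01-01T00:00"], "value": [1]})
batch_b = pd.DataFrame({"ts": ["2021-01-01T00:30"], "value": [2]})

# Concatenate locally, then issue a single load job for the whole batch.
combined = pd.concat([batch_a, batch_b], ignore_index=True)

# One call instead of two (requires credentials; names are placeholders):
# import pandas_gbq
# pandas_gbq.to_gbq(combined, "my_dataset.my_table",
#                   project_id="my-project", if_exists="append")
print(len(combined))
```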
I don't recall which of my pipelines this was. I believe I moved on to Google's Datastore because of the rate limit errors.
Thanks for getting back to us. There is a feature request open at #300 to use the streaming API for writes, which avoids some rate limits with load jobs (but at the expense of some complexity around writing to new/recreated tables). This hasn't been implemented in pandas-gbq yet, but is available in …
I'll close this issue as it isn't reproducible, but feel free to follow #300 or open a new issue if you have a reproducible example.
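For reference, the streaming-write path discussed in #300 can be reached today through the google-cloud-bigquery client. A rough sketch, with placeholder table and project names and the network call commented out since it needs credentials:

```python
import pandas as pd
# from google.cloud import bigquery  # streaming path discussed in #300

df = pd.DataFrame({"ts": ["2021-01-01T00:00", "2021-01-01T01:00"],
                   "value": [1, 2]})

# Streaming inserts take plain dicts instead of running a load job,
# which sidesteps the per-table update quota (at the cost of the
# caveats around new/recreated tables noted above).
rows = df.to_dict(orient="records")

# client = bigquery.Client(project="my-project")           # placeholder
# errors = client.insert_rows_json("my_dataset.my_table", rows)
# An empty `errors` list means all rows were accepted.
print(len(rows))
```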
Error message:
pandas_gbq.gbq.GenericGBQException: Reason: 403 Exceeded rate limits: too many table update operations for this table. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
I am uploading a very small number of rows (fewer than 20) to a BigQuery table every hour through a cloud function running on a schedule, and I'm receiving the error quite frequently. There are no other scripts writing to that table.
I get the error after a log that says:
2 out of 2 rows loaded.
1 out of 1 rows loaded.
Both those logs are for a single upload command.