
Commit c89f751

yoshi-automation authored and parthea committed
chore: upgrade gapic-generator-python to 0.46.3 (#196)
* changes without context

  autosynth cannot find the source of changes triggered by earlier changes in
  this repository, or by version upgrades to tools such as linters.

* chore: upgrade to gapic-generator-python 0.44.0
  chore: add GAPIC metadata
  feat: add support for self-signed JWT

  PiperOrigin-RevId: 370525906
  Source-Author: Google APIs <[email protected]>
  Source-Date: Mon Apr 26 13:12:26 2021 -0700
  Source-Repo: googleapis/googleapis
  Source-Sha: 60e129d0672a1be2c70b41bf76aadc7ad1b1ca0f
  Source-Link: googleapis/googleapis@60e129d

* chore: revert to gapic-generator-python 0.43.3

  PiperOrigin-RevId: 371362703
  Source-Author: Google APIs <[email protected]>
  Source-Date: Fri Apr 30 10:44:40 2021 -0700
  Source-Repo: googleapis/googleapis
  Source-Sha: 5a04154e7c7c0e98e0e4085f6e2c67bd5bff6ff8
  Source-Link: googleapis/googleapis@5a04154

* fix: add async client to %name_%version/__init__.py
  chore: add autogenerated snippets
  chore: remove auth, policy, and options from the reserved names list
  feat: support self-signed JWT flow for service accounts
  chore: enable GAPIC metadata generation
  chore: sort subpackages in %namespace/%name/__init__.py

  PiperOrigin-RevId: 372197450
  Source-Author: Google APIs <[email protected]>
  Source-Date: Wed May 5 13:39:02 2021 -0700
  Source-Repo: googleapis/googleapis
  Source-Sha: 83a7e1c8c2f7421ded45ed323eb1fda99ef5ea46
  Source-Link: googleapis/googleapis@83a7e1c

* chore: upgrade gapic-generator-python to 0.46.1

  PiperOrigin-RevId: 373400747
  Source-Author: Google APIs <[email protected]>
  Source-Date: Wed May 12 10:34:35 2021 -0700
  Source-Repo: googleapis/googleapis
  Source-Sha: 162641cfe5573c648df679a6dd30385650a08704
  Source-Link: googleapis/googleapis@162641c

* chore: upgrade gapic-generator-python to 0.46.3

  PiperOrigin-RevId: 373649163
  Source-Author: Google APIs <[email protected]>
  Source-Date: Thu May 13 13:40:36 2021 -0700
  Source-Repo: googleapis/googleapis
  Source-Sha: 7e1b14e6c7a9ab96d2db7e4a131981f162446d34
  Source-Link: googleapis/googleapis@7e1b14e
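The self-signed JWT flow noted in the commit message lets a service account assert its own identity by signing a JWT directly, skipping the OAuth2 token-exchange round trip. As a rough illustration of the token shape only: real clients sign with the service account's RSA private key (RS256) via google-auth, whereas this stdlib sketch uses HS256 with a dummy key, and all names and values are made up.

```python
import base64
import hashlib
import hmac
import json
import time


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT segments require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


def make_self_signed_jwt(sa_email: str, audience: str, key: bytes) -> str:
    """Sketch of a self-signed JWT: issuer and subject are both the
    service account, and the audience names the target API host rather
    than an OAuth scope. HS256 here is only a stdlib stand-in for RS256."""
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    claims = {
        "iss": sa_email,   # the service account asserts its own identity
        "sub": sa_email,
        "aud": audience,   # target API instead of a token-exchange scope
        "iat": now,
        "exp": now + 3600,
    }
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(claims).encode())
    )
    sig = hmac.new(key, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)


token = make_self_signed_jwt(
    "sa@my-project.iam.gserviceaccount.com",
    "https://bigquerystorage.googleapis.com/",
    b"demo-key",
)
print(token.count("."))  # a JWT has three dot-separated segments
```

Because no authorization server is contacted, the flow saves a network round trip per token; the API verifies the signature against the service account's registered public key.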
1 parent: a1e3788 · commit: c89f751


52 files changed: +1775 additions, -1081 deletions

packages/google-cloud-bigquery-storage/.coveragerc

Lines changed: 0 additions & 1 deletion
@@ -2,7 +2,6 @@
 branch = True

 [report]
-fail_under = 100
 show_missing = True
 omit =
   google/cloud/bigquery_storage/__init__.py

packages/google-cloud-bigquery-storage/.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion
@@ -26,6 +26,6 @@ repos:
     hooks:
       - id: black
   - repo: https://gitlab.com/pycqa/flake8
-    rev: 3.9.0
+    rev: 3.9.2
     hooks:
       - id: flake8

packages/google-cloud-bigquery-storage/CONTRIBUTING.rst

Lines changed: 1 addition & 15 deletions
@@ -160,21 +160,7 @@ Running System Tests
   auth settings and change some configuration in your project to
   run all the tests.

-- System tests will be run against an actual project and
-  so you'll need to provide some environment variables to facilitate
-  authentication to your project:
-
-  - ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
-    Such a file can be downloaded directly from the developer's console by clicking
-    "Generate new JSON key". See private key
-    `docs <https://cloud.google.com/storage/docs/authentication#generating-a-private-key>`__
-    for more details.
-
-- Once you have downloaded your json keys, set the environment variable
-  ``GOOGLE_APPLICATION_CREDENTIALS`` to the absolute path of the json file::
-
-    $ export GOOGLE_APPLICATION_CREDENTIALS="/Users/<your_username>/path/to/app_credentials.json"
-
+- System tests will be run against an actual project. You should use local credentials from gcloud when possible. See `Best practices for application authentication <https://cloud.google.com/docs/authentication/best-practices-applications#local_development_and_testing_with_the>`__. Some tests require a service account. For those tests see `Authenticating as a service account <https://cloud.google.com/docs/authentication/production>`__.

 *************
 Test Coverage

packages/google-cloud-bigquery-storage/google/cloud/bigquery_storage/__init__.py

Lines changed: 6 additions & 6 deletions
@@ -1,5 +1,4 @@
 # -*- coding: utf-8 -*-
-
 # Copyright 2020 Google LLC
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,6 +15,7 @@
 #

 from google.cloud.bigquery_storage_v1 import BigQueryReadClient
+
 from google.cloud.bigquery_storage_v1 import gapic_types as types
 from google.cloud.bigquery_storage_v1 import __version__
 from google.cloud.bigquery_storage_v1.types.arrow import ArrowRecordBatch
@@ -30,27 +30,27 @@
 from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamResponse
 from google.cloud.bigquery_storage_v1.types.storage import StreamStats
 from google.cloud.bigquery_storage_v1.types.storage import ThrottleState
-from google.cloud.bigquery_storage_v1.types.stream import DataFormat
 from google.cloud.bigquery_storage_v1.types.stream import ReadSession
 from google.cloud.bigquery_storage_v1.types.stream import ReadStream
+from google.cloud.bigquery_storage_v1.types.stream import DataFormat

 __all__ = (
+    "BigQueryReadClient",
     "__version__",
     "types",
     "ArrowRecordBatch",
     "ArrowSchema",
     "ArrowSerializationOptions",
     "AvroRows",
     "AvroSchema",
-    "BigQueryReadClient",
     "CreateReadSessionRequest",
-    "DataFormat",
     "ReadRowsRequest",
     "ReadRowsResponse",
-    "ReadSession",
-    "ReadStream",
     "SplitReadStreamRequest",
     "SplitReadStreamResponse",
     "StreamStats",
     "ThrottleState",
+    "ReadSession",
+    "ReadStream",
+    "DataFormat",
 )
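The ``__all__`` reshuffle above changes only ordering, not the exported surface: ``__all__`` is consulted by ``from package import *`` and its order is cosmetic. A quick stdlib check (the throwaway module and its attributes are invented for illustration):

```python
import types

# Build a throwaway module whose __all__ lists names in arbitrary order.
mod = types.ModuleType("demo_pkg")
mod.BigQueryReadClient = object()
mod.DataFormat = object()
mod.__all__ = ("DataFormat", "BigQueryReadClient")  # order is irrelevant

# 'from demo_pkg import *' copies exactly the names listed in __all__;
# reordering the tuple changes nothing observable to importers.
exported = {name: getattr(mod, name) for name in mod.__all__}
print(sorted(exported))  # ['BigQueryReadClient', 'DataFormat']
```

The substantive change in this hunk is that ``BigQueryReadClient`` was already importable; only its position in the tuple moved.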
gapic_metadata.json (new file; full path not shown in this view)

Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
+{
+  "comment": "This file maps proto services/RPCs to the corresponding library clients/methods",
+  "language": "python",
+  "libraryPackage": "google.cloud.bigquery_storage_v1",
+  "protoPackage": "google.cloud.bigquery.storage.v1",
+  "schema": "1.0",
+  "services": {
+    "BigQueryRead": {
+      "clients": {
+        "grpc": {
+          "libraryClient": "BigQueryReadClient",
+          "rpcs": {
+            "CreateReadSession": {
+              "methods": [
+                "create_read_session"
+              ]
+            },
+            "ReadRows": {
+              "methods": [
+                "read_rows"
+              ]
+            },
+            "SplitReadStream": {
+              "methods": [
+                "split_read_stream"
+              ]
+            }
+          }
+        },
+        "grpc-async": {
+          "libraryClient": "BigQueryReadAsyncClient",
+          "rpcs": {
+            "CreateReadSession": {
+              "methods": [
+                "create_read_session"
+              ]
+            },
+            "ReadRows": {
+              "methods": [
+                "read_rows"
+              ]
+            },
+            "SplitReadStream": {
+              "methods": [
+                "split_read_stream"
+              ]
+            }
+          }
+        }
+      }
+    }
+  }
+}
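This metadata file gives tooling a machine-readable map from proto RPC names to the generated client methods. A sketch of how a tool might consume it, stdlib only, with the JSON abbreviated from the diff above:

```python
import json

# Abbreviated copy of the gapic_metadata.json content shown in the diff.
GAPIC_METADATA = """
{
  "libraryPackage": "google.cloud.bigquery_storage_v1",
  "services": {
    "BigQueryRead": {
      "clients": {
        "grpc": {
          "libraryClient": "BigQueryReadClient",
          "rpcs": {
            "CreateReadSession": {"methods": ["create_read_session"]},
            "ReadRows": {"methods": ["read_rows"]},
            "SplitReadStream": {"methods": ["split_read_stream"]}
          }
        }
      }
    }
  }
}
"""


def methods_for_rpc(metadata: dict, service: str, rpc: str) -> list:
    """Resolve a proto RPC name to generated method names across transports."""
    names = []
    for client in metadata["services"][service]["clients"].values():
        names.extend(client["rpcs"].get(rpc, {}).get("methods", []))
    return names


meta = json.loads(GAPIC_METADATA)
print(methods_for_rpc(meta, "BigQueryRead", "ReadRows"))  # ['read_rows']
```

The real file also lists a `grpc-async` client block, so a lookup like this can surface both the sync and async method for the same RPC.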

packages/google-cloud-bigquery-storage/google/cloud/bigquery_storage_v1/services/__init__.py

Lines changed: 0 additions & 1 deletion
@@ -1,5 +1,4 @@
 # -*- coding: utf-8 -*-
-
 # Copyright 2020 Google LLC
 #
 # Licensed under the Apache License, Version 2.0 (the "License");

packages/google-cloud-bigquery-storage/google/cloud/bigquery_storage_v1/services/big_query_read/__init__.py

Lines changed: 0 additions & 2 deletions
@@ -1,5 +1,4 @@
 # -*- coding: utf-8 -*-
-
 # Copyright 2020 Google LLC
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +13,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-
 from .client import BigQueryReadClient
 from .async_client import BigQueryReadAsyncClient

packages/google-cloud-bigquery-storage/google/cloud/bigquery_storage_v1/services/big_query_read/async_client.py

Lines changed: 16 additions & 26 deletions
@@ -1,5 +1,4 @@
 # -*- coding: utf-8 -*-
-
 # Copyright 2020 Google LLC
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,26 +13,24 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
-
 from collections import OrderedDict
 import functools
 import re
 from typing import Dict, AsyncIterable, Awaitable, Sequence, Tuple, Type, Union
 import pkg_resources

 import google.api_core.client_options as ClientOptions  # type: ignore
-from google.api_core import exceptions  # type: ignore
+from google.api_core import exceptions as core_exceptions  # type: ignore
 from google.api_core import gapic_v1  # type: ignore
 from google.api_core import retry as retries  # type: ignore
-from google.auth import credentials  # type: ignore
+from google.auth import credentials as ga_credentials  # type: ignore
 from google.oauth2 import service_account  # type: ignore

 from google.cloud.bigquery_storage_v1.types import arrow
 from google.cloud.bigquery_storage_v1.types import avro
 from google.cloud.bigquery_storage_v1.types import storage
 from google.cloud.bigquery_storage_v1.types import stream
-from google.protobuf import timestamp_pb2 as timestamp  # type: ignore
-
+from google.protobuf import timestamp_pb2  # type: ignore
 from .transports.base import BigQueryReadTransport, DEFAULT_CLIENT_INFO
 from .transports.grpc_asyncio import BigQueryReadGrpcAsyncIOTransport
 from .client import BigQueryReadClient
@@ -55,35 +52,31 @@ class BigQueryReadAsyncClient:
     parse_read_stream_path = staticmethod(BigQueryReadClient.parse_read_stream_path)
     table_path = staticmethod(BigQueryReadClient.table_path)
     parse_table_path = staticmethod(BigQueryReadClient.parse_table_path)
-
     common_billing_account_path = staticmethod(
         BigQueryReadClient.common_billing_account_path
     )
     parse_common_billing_account_path = staticmethod(
         BigQueryReadClient.parse_common_billing_account_path
     )
-
     common_folder_path = staticmethod(BigQueryReadClient.common_folder_path)
     parse_common_folder_path = staticmethod(BigQueryReadClient.parse_common_folder_path)
-
     common_organization_path = staticmethod(BigQueryReadClient.common_organization_path)
     parse_common_organization_path = staticmethod(
         BigQueryReadClient.parse_common_organization_path
     )
-
     common_project_path = staticmethod(BigQueryReadClient.common_project_path)
     parse_common_project_path = staticmethod(
         BigQueryReadClient.parse_common_project_path
     )
-
     common_location_path = staticmethod(BigQueryReadClient.common_location_path)
     parse_common_location_path = staticmethod(
         BigQueryReadClient.parse_common_location_path
     )

     @classmethod
     def from_service_account_info(cls, info: dict, *args, **kwargs):
-        """Creates an instance of this client using the provided credentials info.
+        """Creates an instance of this client using the provided credentials
+            info.

         Args:
             info (dict): The service account private key info.
@@ -98,7 +91,7 @@ def from_service_account_info(cls, info: dict, *args, **kwargs):
     @classmethod
     def from_service_account_file(cls, filename: str, *args, **kwargs):
         """Creates an instance of this client using the provided credentials
-        file.
+            file.

         Args:
             filename (str): The path to the service account private key json
@@ -115,7 +108,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs):

     @property
     def transport(self) -> BigQueryReadTransport:
-        """Return the transport used by the client instance.
+        """Returns the transport used by the client instance.

         Returns:
             BigQueryReadTransport: The transport used by the client instance.
@@ -129,12 +122,12 @@ def transport(self) -> BigQueryReadTransport:
     def __init__(
         self,
         *,
-        credentials: credentials.Credentials = None,
+        credentials: ga_credentials.Credentials = None,
         transport: Union[str, BigQueryReadTransport] = "grpc_asyncio",
         client_options: ClientOptions = None,
         client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO,
     ) -> None:
-        """Instantiate the big query read client.
+        """Instantiates the big query read client.

         Args:
             credentials (Optional[google.auth.credentials.Credentials]): The
@@ -166,7 +159,6 @@ def __init__(
             google.auth.exceptions.MutualTlsChannelError: If mutual TLS transport
                 creation failed for any reason.
         """
-
         self._client = BigQueryReadClient(
             credentials=credentials,
             transport=transport,
@@ -244,7 +236,6 @@ async def create_read_session(
                 This corresponds to the ``max_stream_count`` field
                 on the ``request`` instance; if ``request`` is provided, this
                 should not be set.
-
             retry (google.api_core.retry.Retry): Designation of what errors, if any,
                 should be retried.
             timeout (float): The timeout for this request.
@@ -269,7 +260,6 @@

         # If we have keyword arguments corresponding to fields on the
         # request, apply these.
-
         if parent is not None:
             request.parent = parent
         if read_session is not None:
@@ -286,7 +276,8 @@
                 maximum=60.0,
                 multiplier=1.3,
                 predicate=retries.if_exception_type(
-                    exceptions.DeadlineExceeded, exceptions.ServiceUnavailable,
+                    core_exceptions.DeadlineExceeded,
+                    core_exceptions.ServiceUnavailable,
                 ),
                 deadline=600.0,
             ),
@@ -345,7 +336,6 @@ def read_rows(
                 This corresponds to the ``offset`` field
                 on the ``request`` instance; if ``request`` is provided, this
                 should not be set.
-
             retry (google.api_core.retry.Retry): Designation of what errors, if any,
                 should be retried.
             timeout (float): The timeout for this request.
@@ -372,7 +362,6 @@

         # If we have keyword arguments corresponding to fields on the
         # request, apply these.
-
         if read_stream is not None:
             request.read_stream = read_stream
         if offset is not None:
@@ -386,7 +375,9 @@
                 initial=0.1,
                 maximum=60.0,
                 multiplier=1.3,
-                predicate=retries.if_exception_type(exceptions.ServiceUnavailable,),
+                predicate=retries.if_exception_type(
+                    core_exceptions.ServiceUnavailable,
+                ),
                 deadline=86400.0,
             ),
             default_timeout=86400.0,
@@ -433,7 +424,6 @@ async def split_read_stream(
             request (:class:`google.cloud.bigquery_storage_v1.types.SplitReadStreamRequest`):
                 The request object. Request message for
                 `SplitReadStream`.
-
             retry (google.api_core.retry.Retry): Designation of what errors, if any,
                 should be retried.
             timeout (float): The timeout for this request.
@@ -445,7 +435,6 @@ async def split_read_stream(
                 Response message for SplitReadStream.
         """
         # Create or coerce a protobuf request object.
-
         request = storage.SplitReadStreamRequest(request)

         # Wrap the RPC method; this adds retry and timeout information,
@@ -457,7 +446,8 @@
                 maximum=60.0,
                 multiplier=1.3,
                 predicate=retries.if_exception_type(
-                    exceptions.DeadlineExceeded, exceptions.ServiceUnavailable,
+                    core_exceptions.DeadlineExceeded,
+                    core_exceptions.ServiceUnavailable,
                ),
                 deadline=600.0,
             ),
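The retry hunks above are mechanical renames (``exceptions`` to ``core_exceptions``), but the pattern they configure is worth spelling out: api_core retries a call only when the raised exception matches a predicate built by ``if_exception_type``, with exponential backoff between attempts. A stdlib approximation of that behavior, where the exception class and helper names are stand-ins for the real api_core ones:

```python
class ServiceUnavailable(Exception):
    """Stand-in for google.api_core.exceptions.ServiceUnavailable."""


def if_exception_type(*types):
    """Predicate factory: retry only when the error is one of the listed types."""
    return lambda exc: isinstance(exc, types)


def retry(func, predicate, initial=0.1, maximum=60.0, multiplier=1.3, attempts=5):
    """Call func, retrying predicate-matching errors with capped exponential backoff."""
    delay = initial
    for attempt in range(attempts):
        try:
            return func()
        except Exception as exc:
            if not predicate(exc) or attempt == attempts - 1:
                raise  # non-retryable error, or retry budget exhausted
            # Real clients sleep for ~delay (plus jitter) here; omitted for the demo.
            delay = min(delay * multiplier, maximum)


calls = {"n": 0}

def flaky():
    """Fails twice with a retryable error, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ServiceUnavailable("try again")
    return "ok"

result = retry(flaky, if_exception_type(ServiceUnavailable))
print(result, calls["n"])  # succeeds on the third call
```

This mirrors why the generated code lists only ``DeadlineExceeded`` and ``ServiceUnavailable`` in the predicate: those are transient, while other statuses (e.g. permission errors) should surface immediately.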
