Closed
Commits
28 commits
b596c49
Add configuration.query.userDefinedFunctionResources to bigquery.jobs.
takbab Jul 16, 2016
39fee52
wrap to < 79 characters for PEP8
takbab Jul 16, 2016
b602d18
pylint 1.6.3 complains about not calling .
tseaver Jul 18, 2016
cdf2e37
Bump minimum pylint version to 1.6.3.
tseaver Jul 18, 2016
df47b57
Merge pull request #1990 from tseaver/pylint-1.6.3-compat
tseaver Jul 18, 2016
21ae0c9
Adding BigQuery support for legacy SQL.
dhermes Jul 18, 2016
eacccab
Fix branch coverage miss in 'gcloud.streaming.transfer'.
tseaver Jul 18, 2016
e653b95
Merge pull request #1994 from tseaver/1914-fix-branch-miss-coveralls-…
tseaver Jul 18, 2016
d171791
Merge pull request #1992 from dhermes/fix-1607
dhermes Jul 18, 2016
d3dcfcc
changed attribute name. pylint wants a max of 30 chars.
Jul 19, 2016
a54d078
PEP257 Clean-up of docstrings.
dhermes Jul 18, 2016
8ee5e03
Removing catching-non-exception from pylint disable.
dhermes Jul 19, 2016
6bd2442
Making HappyBase enabled/disabled checks into no-ops.
dhermes Jul 19, 2016
699c027
Making HappyBase compact_table() a no-op.
dhermes Jul 19, 2016
0cb6e60
Fixing language about HappyBase table compaction.
dhermes Jul 19, 2016
24c4358
Merge pull request #1997 from dhermes/fix-1989
dhermes Jul 19, 2016
08aeba1
Emulating HappyBase in counter_set.
dhermes Jul 19, 2016
22dd5e8
Updating docs for differences of HappyBase.
dhermes Jul 19, 2016
8f3469c
Merge pull request #1996 from dhermes/partial-revert-1974
dhermes Jul 19, 2016
e0addef
Merge pull request #2000 from dhermes/counter_set-happybase-just-put
dhermes Jul 19, 2016
a50936d
Fix datastore __init__ docstring.
daspecster Jul 20, 2016
b213cb8
Merge pull request #2005 from daspecster/fix-datastore-init-doc
daspecster Jul 20, 2016
9374f5b
Add configuration.query.userDefinedFunctionResources to bigquery.jobs.
takbab Jul 16, 2016
516899c
wrap to < 79 characters for PEP8
takbab Jul 16, 2016
4075bb8
changed attribute name. pylint wants a max of 30 chars.
Jul 19, 2016
34485f6
corrected.
Jul 20, 2016
dabf186
Merge remote-tracking branch 'origin/feature-bigquery-udf' into featu…
takbab Jul 20, 2016
af61528
corrected.
takbab Jul 20, 2016
30 changes: 15 additions & 15 deletions gcloud/_helpers.py
@@ -26,18 +26,17 @@
from threading import local as Local

from google.protobuf import timestamp_pb2
try:
from google.appengine.api import app_identity
except ImportError:
app_identity = None
import six
from six.moves.http_client import HTTPConnection
from six.moves import configparser

from gcloud.environment_vars import PROJECT
from gcloud.environment_vars import CREDENTIALS

try:
from google.appengine.api import app_identity
except ImportError:
app_identity = None


_NOW = datetime.datetime.utcnow # To be replaced by tests.
_RFC3339_MICROS = '%Y-%m-%dT%H:%M:%S.%fZ'
@@ -77,15 +76,17 @@ def push(self, resource):
def pop(self):
"""Pop a resource from our stack.

:raises: IndexError if the stack is empty.
:rtype: object
:returns: the top-most resource, after removing it.
:raises IndexError: if the stack is empty.
"""
return self._stack.pop()

@property
def top(self):
"""Get the top-most resource

:rtype: object
:returns: the top-most item, or None if the stack is empty.
"""
if len(self._stack) > 0:
@@ -141,8 +142,7 @@ def _ensure_tuple_or_list(arg_name, tuple_or_list):

:rtype: list of str
:returns: The ``tuple_or_list`` passed in cast to a ``list``.
:raises: class:`TypeError` if the ``tuple_or_list`` is not a tuple or
list.
:raises TypeError: if the ``tuple_or_list`` is not a tuple or list.
"""
if not isinstance(tuple_or_list, (tuple, list)):
raise TypeError('Expected %s to be a tuple or list. '
@@ -392,6 +392,8 @@ def _rfc3339_nanos_to_datetime(dt_str):

:rtype: :class:`datetime.datetime`
:returns: The datetime object created from the string.
:raises ValueError: If the timestamp does not match the RFC 3339
regular expression.
"""
with_nanos = _RFC3339_NANOS.match(dt_str)
if with_nanos is None:
@@ -439,8 +441,7 @@ def _to_bytes(value, encoding='ascii'):
:rtype: str / bytes
:returns: The original value converted to bytes (if unicode) or as passed
in if it started out as bytes.
:raises: :class:`TypeError <exceptions.TypeError>` if the value
could not be converted to bytes.
:raises TypeError: if the value could not be converted to bytes.
"""
result = (value.encode(encoding)
if isinstance(value, six.text_type) else value)
@@ -460,8 +461,7 @@ def _bytes_to_unicode(value):
:returns: The original value converted to unicode (if bytes) or as passed
in if it started out as unicode.

:raises: :class:`ValueError` if the value could not be converted to
unicode.
:raises ValueError: if the value could not be converted to unicode.
"""
result = (value.decode('utf-8')
if isinstance(value, six.binary_type) else value)
@@ -522,9 +522,9 @@ def _name_from_project_path(path, project, template):

:rtype: str
:returns: Name parsed from ``path``.
:raises: :class:`ValueError` if the ``path`` is ill-formed or if
the project from the ``path`` does not agree with the
``project`` passed in.
:raises ValueError: if the ``path`` is ill-formed or if the project from
the ``path`` does not agree with the ``project``
passed in.
"""
if isinstance(template, str):
template = re.compile(template)
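The `_helpers.py` hunks above standardize the docstrings on Sphinx's `:raises ExceptionType: description` field form (replacing the older `:raises: :class:`...`` spelling) and place `:rtype:`/`:returns:` before `:raises:`. A minimal sketch of the resulting style, using a hypothetical Python 3 helper rather than the library's `six`-based one:

```python
def to_bytes_sketch(value, encoding='ascii'):
    """Convert a string value to bytes, if necessary.

    :type value: str or bytes
    :param value: The value to be converted.

    :type encoding: str
    :param encoding: The encoding used to convert ``str`` to ``bytes``.

    :rtype: bytes
    :returns: The value encoded with ``encoding`` (if ``str``) or as
              passed in (if it started out as ``bytes``).
    :raises TypeError: if the value could not be converted to bytes.
    """
    if isinstance(value, bytes):
        return value
    if isinstance(value, str):
        return value.encode(encoding)
    raise TypeError('%r could not be converted to bytes' % (value,))
```

The named-exception form lets Sphinx render the exception type as a field label instead of leaving it buried in the description text.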
3 changes: 3 additions & 0 deletions gcloud/bigquery/dataset.py
@@ -432,6 +432,9 @@ def exists(self, client=None):
:type client: :class:`gcloud.bigquery.client.Client` or ``NoneType``
:param client: the client to use. If not passed, falls back to the
``client`` stored on the current dataset.

:rtype: bool
:returns: Boolean indicating existence of the dataset.
"""
client = self._require_client(client)

23 changes: 23 additions & 0 deletions gcloud/bigquery/job.py
@@ -319,6 +319,9 @@ def exists(self, client=None):
:type client: :class:`gcloud.bigquery.client.Client` or ``NoneType``
:param client: the client to use. If not passed, falls back to the
``client`` stored on the current dataset.

:rtype: bool
:returns: Boolean indicating existence of the job.
"""
client = self._require_client(client)

@@ -869,6 +872,8 @@ class _AsyncQueryConfiguration(object):
_flatten_results = None
_priority = None
_use_query_cache = None
_use_legacy_sql = None
_udf_resources = None
_write_disposition = None


@@ -927,6 +932,18 @@ def __init__(self, name, query, client):
https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.query.useQueryCache
"""

use_legacy_sql = _TypedProperty('use_legacy_sql', bool)
"""See:
https://cloud.google.com/bigquery/docs/\
reference/v2/jobs#configuration.query.useLegacySql
"""

udf_resources = _TypedProperty(
'udf_resources', list)
"""See:
https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.query.userDefinedFunctionResources
"""

write_disposition = WriteDisposition('write_disposition')
"""See:
https://cloud.google.com/bigquery/docs/reference/v2/jobs#configuration.query.writeDisposition
@@ -965,6 +982,12 @@ def _populate_config_resource(self, configuration):
configuration['priority'] = self.priority
if self.use_query_cache is not None:
configuration['useQueryCache'] = self.use_query_cache
if self.use_legacy_sql is not None:
configuration['useLegacySql'] = self.use_legacy_sql
if self.udf_resources is not None:
configuration['userDefinedFunctionResources'] = (
self.udf_resources
)
if self.write_disposition is not None:
configuration['writeDisposition'] = self.write_disposition

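The `_populate_config_resource` additions follow the file's existing pattern: every option defaults to `None`, and only explicitly-set options are copied into the request body under their camelCase API key. A runnable sketch of that mapping (the class name and bucket URI are hypothetical):

```python
class _QueryJobSketch(object):
    """Sketch of the option-to-API-resource mapping in QueryJob."""

    # Unset options default to None, as in _AsyncQueryConfiguration.
    use_query_cache = None
    use_legacy_sql = None
    udf_resources = None

    def _populate_config_resource(self, configuration):
        """Copy only explicitly-set options, renamed to camelCase keys."""
        if self.use_query_cache is not None:
            configuration['useQueryCache'] = self.use_query_cache
        if self.use_legacy_sql is not None:
            configuration['useLegacySql'] = self.use_legacy_sql
        if self.udf_resources is not None:
            configuration['userDefinedFunctionResources'] = (
                self.udf_resources)


job = _QueryJobSketch()
job.use_legacy_sql = True
job.udf_resources = [{'resourceUri': 'gs://bucket/functions.js'}]
config = {}
job._populate_config_resource(config)
# 'useQueryCache' stays absent because use_query_cache was never set.
```

The `is not None` guards matter: they let `False` (a meaningful value for `useLegacySql`) still be sent, while genuinely unset options are omitted from the request.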
10 changes: 10 additions & 0 deletions gcloud/bigquery/query.py
@@ -34,6 +34,7 @@ class _SyncQueryConfiguration(object):
_timeout_ms = None
_preserve_nulls = None
_use_query_cache = None
_use_legacy_sql = None


class QueryResults(object):
@@ -233,6 +234,12 @@ def schema(self):
https://cloud.google.com/bigquery/docs/reference/v2/jobs/query#useQueryCache
"""

use_legacy_sql = _TypedProperty('use_legacy_sql', bool)
"""See:
https://cloud.google.com/bigquery/docs/\
reference/v2/jobs/query#useLegacySql
"""

def _set_properties(self, api_response):
"""Update properties from resource in body of ``api_response``

@@ -264,6 +271,9 @@ def _build_resource(self):
if self.use_query_cache is not None:
resource['useQueryCache'] = self.use_query_cache

if self.use_legacy_sql is not None:
resource['useLegacySql'] = self.use_legacy_sql

if self.dry_run is not None:
resource['dryRun'] = self.dry_run

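Both the async job and the sync `QueryResults` declare the new option via `_TypedProperty('use_legacy_sql', bool)`. The sketch below shows the assumed behavior of such a descriptor — it rejects assignments of the wrong type — and is not the library's actual implementation:

```python
class _TypedPropertySketch(object):
    """Sketch of a type-checking descriptor like _TypedProperty."""

    def __init__(self, name, property_type):
        self.name = name
        self.property_type = property_type

    def __get__(self, instance, owner):
        if instance is None:
            return self
        # Falls back to the class-level '_<name> = None' default.
        return getattr(instance, '_' + self.name)

    def __set__(self, instance, value):
        if not isinstance(value, self.property_type):
            raise ValueError('%s must be a %s instance'
                             % (self.name, self.property_type.__name__))
        setattr(instance, '_' + self.name, value)


class QueryResultsSketch(object):
    """Carries the new option the way QueryResults / QueryJob do."""

    _use_legacy_sql = None  # unset by default, omitted from the request

    use_legacy_sql = _TypedPropertySketch('use_legacy_sql', bool)
```

This keeps validation at assignment time, so a typo like `query.use_legacy_sql = 'true'` fails immediately rather than producing a malformed API request.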
3 changes: 3 additions & 0 deletions gcloud/bigquery/table.py
@@ -461,6 +461,9 @@ def exists(self, client=None):
:type client: :class:`gcloud.bigquery.client.Client` or ``NoneType``
:param client: the client to use. If not passed, falls back to the
``client`` stored on the current dataset.

:rtype: bool
:returns: Boolean indicating existence of the table.
"""
client = self._require_client(client)

11 changes: 11 additions & 0 deletions gcloud/bigquery/test_job.py
@@ -1219,6 +1219,7 @@ class TestQueryJob(unittest2.TestCase, _Base):
JOB_TYPE = 'query'
QUERY = 'select count(*) from persons'
DESTINATION_TABLE = 'destination_table'
UDF = {"resourceUri": "gs://backet/functions.js", "inlineCode": ""}

def _getTargetClass(self):
from gcloud.bigquery.job import QueryJob
@@ -1248,6 +1249,11 @@ def _verifyBooleanResourceProperties(self, job, config):
config['useQueryCache'])
else:
self.assertTrue(job.use_query_cache is None)
if 'useLegacySql' in config:
self.assertEqual(job.use_legacy_sql,
config['useLegacySql'])
else:
self.assertTrue(job.use_legacy_sql is None)

def _verifyResourceProperties(self, job, resource):
self._verifyReadonlyResourceProperties(job, resource)
@@ -1310,6 +1316,7 @@ def test_ctor(self):
self.assertTrue(job.flatten_results is None)
self.assertTrue(job.priority is None)
self.assertTrue(job.use_query_cache is None)
self.assertTrue(job.use_legacy_sql is None)
self.assertTrue(job.write_disposition is None)

def test_from_api_repr_missing_identity(self):
@@ -1420,6 +1427,8 @@ def test_begin_w_alternate_client(self):
'flattenResults': True,
'priority': 'INTERACTIVE',
'useQueryCache': True,
'useLegacySql': True,
'userDefinedFunctionResources': [self.UDF],
'writeDisposition': 'WRITE_TRUNCATE',
}
RESOURCE['configuration']['query'] = QUERY_CONFIGURATION
@@ -1439,6 +1448,8 @@ def test_begin_w_alternate_client(self):
job.flatten_results = True
job.priority = 'INTERACTIVE'
job.use_query_cache = True
job.use_legacy_sql = True
job.udf_resources = [self.UDF]
job.write_disposition = 'WRITE_TRUNCATE'

job.begin(client=client2)
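The test fixture above exercises `userDefinedFunctionResources` with a `resourceUri` entry. Per the BigQuery jobs reference, each entry in that list carries either inline JavaScript or a Cloud Storage URI; a small sketch of both shapes (bucket and object names hypothetical):

```python
# A UDF supplied as inline JavaScript code.
inline_udf = {
    'inlineCode': 'function double_x(row) { return row.x * 2; }',
}

# A UDF loaded from a Cloud Storage object.
gcs_udf = {
    'resourceUri': 'gs://my-bucket/functions.js',
}

# The query configuration takes a list of such resources, which is
# what job.udf_resources is set to in the test above.
udf_resources = [inline_udf, gcs_udf]
```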
3 changes: 3 additions & 0 deletions gcloud/bigquery/test_query.py
@@ -136,6 +136,7 @@ def test_ctor(self):
self.assertTrue(query.max_results is None)
self.assertTrue(query.preserve_nulls is None)
self.assertTrue(query.use_query_cache is None)
self.assertTrue(query.use_legacy_sql is None)

def test_job_wo_jobid(self):
client = _Client(self.PROJECT)
@@ -206,6 +207,7 @@ def test_run_w_alternate_client(self):
query.preserve_nulls = True
query.timeout_ms = 20000
query.use_query_cache = False
query.use_legacy_sql = True
query.dry_run = True

query.run(client=client2)
@@ -226,6 +228,7 @@ def test_run_w_alternate_client(self):
'preserveNulls': True,
'timeoutMs': 20000,
'useQueryCache': False,
'useLegacySql': True,
}
self.assertEqual(req['data'], SENT)
self._verifyResourceProperties(query, RESOURCE)
27 changes: 13 additions & 14 deletions gcloud/bigtable/happybase/__init__.py
@@ -21,26 +21,25 @@
-------------------------

Some concepts from HBase/Thrift do not map directly to the Cloud
Bigtable API. As a result, the following instance methods and functions
could not be implemented:
Bigtable API. As a result

* :meth:`Table.regions() <gcloud.bigtable.happybase.table.Table.regions>`
could not be implemented since tables in Cloud Bigtable do not expose
internal storage details
* :meth:`Connection.enable_table() \
<gcloud.bigtable.happybase.connection.Connection.enable_table>` - no
concept of enabled/disabled
<gcloud.bigtable.happybase.connection.Connection.enable_table>`
does nothing since Cloud Bigtable has no concept of enabled/disabled
* :meth:`Connection.disable_table() \
<gcloud.bigtable.happybase.connection.Connection.disable_table>` - no
concept of enabled/disabled
<gcloud.bigtable.happybase.connection.Connection.disable_table>`
does nothing since Cloud Bigtable has no concept of enabled/disabled
* :meth:`Connection.is_table_enabled() \
<gcloud.bigtable.happybase.connection.Connection.is_table_enabled>`
- no concept of enabled/disabled
always returns :data:`True` since Cloud Bigtable has no concept of
enabled/disabled
* :meth:`Connection.compact_table() \
<gcloud.bigtable.happybase.connection.Connection.compact_table>` -
table storage is opaque to user
* :meth:`Table.regions() <gcloud.bigtable.happybase.table.Table.regions>`
- tables in Cloud Bigtable do not expose internal storage details
* :meth:`Table.counter_set() \
<gcloud.bigtable.happybase.table.Table.counter_set>` - method can't
be atomic, so we disable it
<gcloud.bigtable.happybase.connection.Connection.compact_table>`
does nothing since Cloud Bigtable handles table compactions automatically
and does not expose an API for it
* The ``__version__`` value for the HappyBase package is :data:`None`.
However, it's worth noting this implementation was based off HappyBase
0.9.
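The doc rewrite above converts the enabled/disabled checks and `compact_table()` from unimplemented methods into no-ops. A sketch of what that surface looks like (class and method bodies are illustrative, not the library's code):

```python
class ConnectionSketch(object):
    """Sketch of the no-op HappyBase-compatibility methods."""

    def enable_table(self, name):
        """No-op: Cloud Bigtable has no enabled/disabled table state."""

    def disable_table(self, name):
        """No-op: Cloud Bigtable has no enabled/disabled table state."""

    def is_table_enabled(self, name):
        """Always True, for the same reason."""
        return True

    def compact_table(self, name, major=False):
        """No-op: Cloud Bigtable compacts tables automatically."""


conn = ConnectionSketch()
conn.compact_table('my-table')  # succeeds silently instead of raising
```

The design choice is compatibility over strictness: HappyBase code that calls these methods keeps running unchanged, rather than crashing on a `NotImplementedError` for operations Cloud Bigtable handles itself.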