File support #145

Merged
merged 29 commits into master from file-support on Oct 14, 2020

Commits (29)

68585b3  add file utils (leahein, Jul 12, 2020)
aabd2de  initial add for file support (leahein, Jul 12, 2020)
f6ffdd3  Implement file handling in aiohttp.py (quasimik, Jul 16, 2020)
e637983  Fix JSON serialization, remove comments, conform to double quotes (quasimik, Jul 31, 2020)
9e86b03  Fix more JSON serialization (quasimik, Jul 31, 2020)
c2e38dc  Cleanup (quasimik, Jul 31, 2020)
e6ae391  FF, merge (quasimik, Jul 31, 2020)
8839141  Blackened (quasimik, Jul 31, 2020)
194d645  fix: safe check if parameters are none on aiohttp (KingDarBoja, Sep 19, 2020)
e02399a  Merge branch 'master' into file-support (KingDarBoja, Sep 19, 2020)
2a020ad  chore: generate docs with sphinx (#117) (KingDarBoja, Sep 20, 2020)
25c243a  GitHub Actions: do the tests for each push (leszekhanusz, Sep 27, 2020)
40f2aaf  Tests: add pypy3 tests again (leszekhanusz, Sep 27, 2020)
57bb4aa  GitHub Actions: try to send coverage to coveralls.io (leszekhanusz, Sep 27, 2020)
c2f1840  GitHub actions: try to send coverage to coveralls.io (2) (leszekhanusz, Sep 27, 2020)
1ba67a7  GitHub Actions: migrating from coveralls to codecov (leszekhanusz, Sep 27, 2020)
e72bd6b  GitHub Actions: fix typo (leszekhanusz, Sep 27, 2020)
9e6dc7e  README.md fix badges, add link to doc and leave only basic example (#… (leszekhanusz, Sep 27, 2020)
53c7a32  Single-sourcing the version in a __version__.py file (#142) (leszekhanusz, Sep 27, 2020)
4d11c89  Bump version number (leszekhanusz, Sep 27, 2020)
eec9220  Only upload files if the upload_files flag is True (leszekhanusz, Oct 3, 2020)
f647803  Adding tests for the file upload functionality (leszekhanusz, Oct 3, 2020)
62c6a58  Add docs (leszekhanusz, Oct 3, 2020)
cda8263  Merge branch 'master' into file-support (leszekhanusz, Oct 6, 2020)
27a01c7  Merge branch 'master' into file-support (leszekhanusz, Oct 6, 2020)
e488612  fix file upload tests on windows and add a binary file upload test (leszekhanusz, Oct 10, 2020)
e92eee2  fix mypy (leszekhanusz, Oct 10, 2020)
19e4e25  Merge branch 'master' into file-support (leszekhanusz, Oct 14, 2020)
a96ed24  Merge branch 'master' into file-support (leszekhanusz, Oct 14, 2020)

1 change: 1 addition & 0 deletions README.md
@@ -38,6 +38,7 @@ The main features of GQL are:
* Possibility to [validate the queries locally](https://gql.readthedocs.io/en/latest/usage/validation.html) using a GraphQL schema provided locally or fetched from the backend using an introspection query
* Supports GraphQL queries, mutations and subscriptions
* Supports [sync or async usage](https://gql.readthedocs.io/en/latest/async/index.html), [allowing concurrent requests](https://gql.readthedocs.io/en/latest/advanced/async_advanced_usage.html#async-advanced-usage)
* Supports [File uploads](https://gql.readthedocs.io/en/latest/usage/file_upload.html)

## Installation

2 changes: 2 additions & 0 deletions docs/transports/aiohttp.rst
@@ -1,3 +1,5 @@
.. _aiohttp_transport:

AIOHTTPTransport
================

69 changes: 69 additions & 0 deletions docs/usage/file_upload.rst
@@ -0,0 +1,69 @@
File uploads
============

GQL supports file uploads with the :ref:`aiohttp transport <aiohttp_transport>`
using the `GraphQL multipart request spec`_.

.. _GraphQL multipart request spec: https://github.com/jaydenseric/graphql-multipart-request-spec

Single File
-----------

In order to upload a single file, you need to:

* set the file as a variable value in the mutation
* provide the opened file to the `variable_values` argument of `execute`
* set the `upload_files` argument to True

.. code-block:: python

transport = AIOHTTPTransport(url='YOUR_URL')

client = Client(transport=transport)

query = gql('''
mutation($file: Upload!) {
singleUpload(file: $file) {
id
}
}
''')

with open("YOUR_FILE_PATH", "rb") as f:

params = {"file": f}

result = client.execute(
query, variable_values=params, upload_files=True
)

File list
---------

It is also possible to upload multiple files using a list.

.. code-block:: python

transport = AIOHTTPTransport(url='YOUR_URL')

client = Client(transport=transport)

query = gql('''
mutation($files: [Upload!]!) {
multipleUpload(files: $files) {
id
}
}
''')

f1 = open("YOUR_FILE_PATH_1", "rb")
f2 = open("YOUR_FILE_PATH_2", "rb")

params = {"files": [f1, f2]}

result = client.execute(
query, variable_values=params, upload_files=True
)

f1.close()
f2.close()
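
A minimal sketch of the same multiple-file upload, not part of the documented example, written with contextlib.ExitStack so that both files are closed even if execute() raises; it assumes the client and query objects from the example above, and the file paths are placeholders:

    from contextlib import ExitStack

    with ExitStack() as stack:
        # Register both files so ExitStack closes them on exit
        f1 = stack.enter_context(open("YOUR_FILE_PATH_1", "rb"))
        f2 = stack.enter_context(open("YOUR_FILE_PATH_2", "rb"))

        result = client.execute(
            query, variable_values={"files": [f1, f2]}, upload_files=True
        )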
1 change: 1 addition & 0 deletions docs/usage/index.rst
@@ -9,3 +9,4 @@ Usage
subscriptions
variables
headers
file_upload
71 changes: 61 additions & 10 deletions gql/transport/aiohttp.py
@@ -1,3 +1,5 @@
import json
import logging
from ssl import SSLContext
from typing import Any, AsyncGenerator, Dict, Optional, Union

@@ -8,6 +10,7 @@
from aiohttp.typedefs import LooseCookies, LooseHeaders
from graphql import DocumentNode, ExecutionResult, print_ast

from ..utils import extract_files
from .async_transport import AsyncTransport
from .exceptions import (
TransportAlreadyConnected,
@@ -16,6 +19,8 @@
TransportServerError,
)

log = logging.getLogger(__name__)


class AIOHTTPTransport(AsyncTransport):
""":ref:`Async Transport <async_transports>` to execute GraphQL queries
@@ -32,7 +37,7 @@ def __init__(
auth: Optional[BasicAuth] = None,
ssl: Union[SSLContext, bool, Fingerprint] = False,
timeout: Optional[int] = None,
client_session_args: Optional[Dict[str, Any]] = None,
) -> None:
"""Initialize the transport with the given aiohttp parameters.

@@ -54,7 +59,6 @@ def __init__(
self.ssl: Union[SSLContext, bool, Fingerprint] = ssl
self.timeout: Optional[int] = timeout
self.client_session_args = client_session_args

self.session: Optional[aiohttp.ClientSession] = None

async def connect(self) -> None:
@@ -81,7 +85,8 @@ async def connect(self) -> None:
)

# Adding custom parameters passed from init
if self.client_session_args:
client_session_args.update(self.client_session_args) # type: ignore

self.session = aiohttp.ClientSession(**client_session_args)

@@ -104,7 +109,8 @@ async def execute(
document: DocumentNode,
variable_values: Optional[Dict[str, str]] = None,
operation_name: Optional[str] = None,
extra_args: Optional[Dict[str, Any]] = None,
upload_files: bool = False,
) -> ExecutionResult:
"""Execute the provided document AST against the configured remote server
using the current session.
@@ -118,25 +124,70 @@ async def execute(
:param variable_values: An optional Dict of variable values
:param operation_name: An optional Operation name for the request
:param extra_args: additional arguments to send to the aiohttp post method
:param upload_files: Set to True if you want to put files in the variable values
:returns: an ExecutionResult object.
"""

query_str = print_ast(document)

payload: Dict[str, Any] = {
"query": query_str,
}

if operation_name:
payload["operationName"] = operation_name

if upload_files:

# If the upload_files flag is set, then we need variable_values
assert variable_values is not None

# If we upload files, we will extract the files present in the
# variable_values dict and replace them by null values
nulled_variable_values, files = extract_files(variable_values)

# Save the nulled variable values in the payload
payload["variables"] = nulled_variable_values

# Prepare aiohttp to send multipart-encoded data
data = aiohttp.FormData()

# Generate the file map
# path is nested in a list because the spec allows multiple pointers
# to the same file. But we don't support that.
# Will generate something like {"0": ["variables.file"]}
file_map = {str(i): [path] for i, path in enumerate(files)}

# Enumerate the file streams
# Will generate something like {'0': <_io.BufferedReader ...>}
file_streams = {str(i): files[path] for i, path in enumerate(files)}

# Add the payload to the operations field
operations_str = json.dumps(payload)
log.debug("operations %s", operations_str)
data.add_field(
"operations", operations_str, content_type="application/json"
)

# Add the file map field
file_map_str = json.dumps(file_map)
log.debug("file_map %s", file_map_str)
data.add_field("map", file_map_str, content_type="application/json")

# Add the extracted files as remaining fields
data.add_fields(*file_streams.items())

post_args: Dict[str, Any] = {"data": data}

else:
if variable_values:
payload["variables"] = variable_values

post_args = {"json": payload}

# Pass post_args to aiohttp post method
if extra_args:
post_args.update(extra_args)

if self.session is None:
raise TransportClosed("Transport is not connected")
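For reference, a minimal standalone sketch of the multipart fields that the upload branch above assembles, following the GraphQL multipart request spec; the query string and the file name example.png are hypothetical:

    import json

    import aiohttp

    query_str = "mutation($file: Upload!) { singleUpload(file: $file) { id } }"

    # The file variable is nulled in the operations payload and referenced by the map
    payload = {"query": query_str, "variables": {"file": None}}
    file_map = {"0": ["variables.file"]}

    data = aiohttp.FormData()
    data.add_field("operations", json.dumps(payload), content_type="application/json")
    data.add_field("map", json.dumps(file_map), content_type="application/json")
    data.add_field("0", open("example.png", "rb"))  # hypothetical file path

The request body then carries three parts named operations, map and 0, and the transport above hands this FormData to aiohttp through post_args.
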
43 changes: 43 additions & 0 deletions gql/utils.py
@@ -1,5 +1,8 @@
"""Utilities to manipulate several python objects."""

import io
from typing import Any, Dict, Tuple


# From this response in Stackoverflow
# http://stackoverflow.com/a/19053800/1072990
@@ -8,3 +11,43 @@ def to_camel_case(snake_str):
# We capitalize the first letter of each component except the first one
# with the 'title' method and join them together.
return components[0] + "".join(x.title() if x else "_" for x in components[1:])


def is_file_like(value: Any) -> bool:
"""Check if a value represents a file like object"""
return isinstance(value, io.IOBase)


def extract_files(variables: Dict) -> Tuple[Dict, Dict]:
files = {}

def recurse_extract(path, obj):
"""
recursively traverse obj, doing a deepcopy, but
replacing any file-like objects with nulls and
shunting the originals off to the side.
"""
nonlocal files
if isinstance(obj, list):
nulled_obj = []
for key, value in enumerate(obj):
value = recurse_extract(f"{path}.{key}", value)
nulled_obj.append(value)
return nulled_obj
elif isinstance(obj, dict):
nulled_obj = {}
for key, value in obj.items():
value = recurse_extract(f"{path}.{key}", value)
nulled_obj[key] = value
return nulled_obj
elif is_file_like(obj):
# extract obj from its parent and put it into files instead.
files[path] = obj
return None
else:
# base case: pass through unchanged
return obj

nulled_variables = recurse_extract("variables", variables)

return nulled_variables, files
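
A short sketch of what extract_files returns for nested variable values; the variables dict is hypothetical and io.BytesIO stands in for an opened file:

    import io

    from gql.utils import extract_files

    f = io.BytesIO(b"hello")  # any io.IOBase instance counts as file-like
    variables = {"files": [f], "tag": "demo"}

    nulled, files = extract_files(variables)
    # nulled == {"files": [None], "tag": "demo"}
    # files  == {"variables.files.0": f}
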
31 changes: 31 additions & 0 deletions tests/conftest.py
@@ -4,8 +4,10 @@
import os
import pathlib
import ssl
import tempfile
import types
from concurrent.futures import ThreadPoolExecutor
from typing import Union

import pytest
import websockets
@@ -187,6 +189,35 @@ async def send_connection_ack(ws):
await ws.send('{"event":"phx_reply", "payload": {"status": "ok"}, "ref": 1}')


class TemporaryFile:
"""Class used to generate temporary files for the tests"""

def __init__(self, content: Union[str, bytearray]):

mode = "w" if isinstance(content, str) else "wb"

# We need to set newline to "" so that line endings are not
# converted to "\r\n" on Windows
newline = "" if isinstance(content, str) else None

self.file = tempfile.NamedTemporaryFile(
mode=mode, newline=newline, delete=False
)

with self.file as f:
f.write(content)

@property
def filename(self):
return self.file.name

def __enter__(self):
return self

def __exit__(self, type, value, traceback):
os.unlink(self.filename)


def get_server_handler(request):
"""Get the server handler.

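
A brief sketch, not part of this PR, of how the TemporaryFile helper above can be used in a test; the test name and content are hypothetical:

    def test_temporary_file_roundtrip():
        # The helper writes the content, closes the file, and unlinks it on exit
        with TemporaryFile("hello") as test_file:
            with open(test_file.filename, newline="") as f:
                assert f.read() == "hello"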