ENH: add support for reading .tar archives #44787

Merged
38 commits, merged on May 7, 2022

Commits (changes shown below are from 30 commits)
c1823ef
Add reproduction test for .tar.gz archives
Skn0tt Dec 6, 2021
9a85cba
add support for .tar archives
Skn0tt Dec 6, 2021
e673061
update doc comments
Skn0tt Dec 6, 2021
a0d6386
fix: pep8 errors
Skn0tt Dec 6, 2021
6a8edef
refactor: flip _compression_to_extension around to support multiple e…
Skn0tt Dec 7, 2021
d4e40c9
refactor: detect tar files using existing extension mapping
Skn0tt Dec 7, 2021
5f22df7
feat: add support for writing tar files
Skn0tt Dec 7, 2021
c6573ef
feat: assure it respects .gz endings
Skn0tt Dec 15, 2021
f3b6ed5
Merge branch 'master' into read-tar-archives
Skn0tt Dec 15, 2021
a4ac382
feat: add "tar" entry to compressionoptions
Skn0tt Dec 15, 2021
e66826b
chore: add whatsnew entry
Skn0tt Dec 15, 2021
941be37
fix: test_compression_size_fh
Skn0tt Dec 15, 2021
e3369aa
Merge branch 'master' into read-tar-archives
Skn0tt Jan 4, 2022
0468e5f
add tarfile to shared compression docs
Skn0tt Jan 4, 2022
2531ee0
fix formatting
Skn0tt Jan 4, 2022
57eba0a
pass through "mode" via compression args
Skn0tt Jan 4, 2022
38f7d54
fix pickle test
Skn0tt Jan 4, 2022
887fd10
add class comment
Skn0tt Jan 4, 2022
fc2e7f0
Merge remote-tracking branch 'origin/main' into read-tar-archives
Skn0tt Apr 9, 2022
669d942
sort imports
Skn0tt Apr 9, 2022
7d7d3c6
add _compression_to_extension back for backwards compatibility
Skn0tt Apr 9, 2022
8b8b8ac
fix some type warnings
Skn0tt Apr 9, 2022
dd356f6
fix: formatting
Skn0tt Apr 9, 2022
514014a
fix: mypy complaints
Skn0tt Apr 9, 2022
38971c7
fix: more tests
Skn0tt Apr 9, 2022
e35d361
fix: some error with xml
Skn0tt Apr 9, 2022
c5088fc
fix: interpreted text role
Skn0tt Apr 9, 2022
f6c5173
move to v1.5 whatsnw
Skn0tt Apr 9, 2022
9a4fa07
add versionadded note
Skn0tt Apr 11, 2022
0c31aa8
don't leave blank lines
Skn0tt Apr 11, 2022
086c598
add tests for zero files / multiple files
Skn0tt Apr 13, 2022
861faf0
move _compression_to_extension to tests
Skn0tt Apr 13, 2022
9458ecb
revert added "mode" argument
Skn0tt Apr 13, 2022
d20f315
add test to ensure that `compression.mode` works
Skn0tt Apr 13, 2022
1066f1b
Merge branch 'main' into read-tar-archives
Skn0tt Apr 25, 2022
6b0e1e6
Merge branch 'main' into read-tar-archives
Skn0tt May 5, 2022
0d9ed18
compare strings, not bytes
Skn0tt May 5, 2022
37370c2
replace carriage returns
Skn0tt May 6, 2022
25 changes: 25 additions & 0 deletions doc/source/whatsnew/v1.5.0.rst
@@ -75,6 +75,31 @@ as seen in the following example.
1 2021-01-02 08:00:00 4
2 2021-01-02 16:00:00 5

.. _whatsnew_150.enhancements.tar:

Reading directly from TAR archives
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I/O methods like :func:`read_csv` or :meth:`DataFrame.to_json` now allow reading from and
writing to TAR archives directly (:issue:`44787`).

.. code-block:: python

df = pd.read_csv("./movement.tar.gz")
# ...
df.to_csv("./out.tar.gz")

This supports ``.tar``, ``.tar.gz``, ``.tar.bz2`` and ``.tar.xz`` archives.
The compression method is inferred from the filename.
If the compression method cannot be inferred, use the ``compression`` argument:

.. code-block:: python

df = pd.read_csv(some_file_obj, compression={"method": "tar", "mode": "r:gz"}) # noqa F821

(``mode`` being one of ``tarfile.open``'s modes: https://docs.python.org/3/library/tarfile.html#tarfile.open)
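
A write-side sketch of the same option (``archive_name``, which names the file placed
inside the archive, is taken from the implementation in this PR rather than from the
prose above, so treat it as an assumption):

.. code-block:: python

    df.to_csv("./out.tar.gz", compression={"method": "tar", "mode": "w:gz", "archive_name": "out.csv"})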


.. _whatsnew_150.enhancements.other:

Other enhancements
10 changes: 10 additions & 0 deletions pandas/_testing/_io.py
@@ -3,7 +3,9 @@
import bz2
from functools import wraps
import gzip
import io
import socket
import tarfile
from typing import (
TYPE_CHECKING,
Any,
@@ -398,6 +400,14 @@ def write_to_compressed(compression, path, data, dest="test"):
mode = "w"
args = (dest, data)
method = "writestr"
elif compression == "tar":

Reviewer comment (Member): I think this entire function could be replaced with a call to get_handle (not needed in this PR)

compress_method = tarfile.TarFile
mode = "w"
file = tarfile.TarInfo(name=dest)
bytes = io.BytesIO(data)
file.size = len(data)
args = (file, bytes)
method = "addfile"
elif compression == "gzip":
compress_method = gzip.GzipFile
elif compression == "bz2":
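
The reviewer's suggestion above might look roughly like the sketch below. It is not part
of the PR; the ``archive_name`` key and the branch on archive formats are assumptions
about how get_handle's compression dict behaves.

from pandas.io.common import get_handle

def write_to_compressed_via_get_handle(compression, path, data, dest="test"):
    # Sketch only: delegate all codec handling to get_handle instead of
    # branching on each compression method by hand.
    compression_args = {"method": compression}
    if compression in ("zip", "tar"):
        # archive formats take a member name; plain codecs would reject the kwarg
        compression_args["archive_name"] = dest
    with get_handle(path, "wb", compression=compression_args, is_text=False) as handles:
        handles.handle.write(data)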
2 changes: 1 addition & 1 deletion pandas/_typing.py
@@ -256,7 +256,7 @@ def closed(self) -> bool:
# compression keywords and compression
CompressionDict = Dict[str, Any]
CompressionOptions = Optional[
Union[Literal["infer", "gzip", "bz2", "zip", "xz", "zstd"], CompressionDict]
Union[Literal["infer", "gzip", "bz2", "zip", "xz", "zstd", "tar"], CompressionDict]
]

# types in DataFrameFormatter
2 changes: 2 additions & 0 deletions pandas/conftest.py
@@ -290,6 +290,7 @@ def other_closed(request):
"bz2",
"zip",
"xz",
"tar",
pytest.param("zstd", marks=td.skip_if_no("zstandard")),
]
)
@@ -306,6 +307,7 @@ def compression(request):
"bz2",
"zip",
"xz",
"tar",
pytest.param("zstd", marks=td.skip_if_no("zstandard")),
]
)
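
A hypothetical round-trip test built on the fixtures above (the ``compression`` fixture
and the ``tm`` helpers come from pandas' test suite; the test name and body are a sketch,
not code from this PR):

import pandas as pd
import pandas._testing as tm

def test_roundtrip_compression(compression):
    # With "tar" now in the fixture params, every compression method should
    # survive a to_csv/read_csv round trip.
    df = pd.DataFrame({"a": [1, 2, 3]})
    with tm.ensure_clean() as path:
        df.to_csv(path, compression=compression, index=False)
        result = pd.read_csv(path, compression=compression)
    tm.assert_frame_equal(result, df)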
44 changes: 29 additions & 15 deletions pandas/core/shared_docs.py
@@ -421,29 +421,43 @@
] = """compression : str or dict, default 'infer'
For on-the-fly compression of the output data. If 'infer' and '%s'
path-like, then detect compression from the following extensions: '.gz',
'.bz2', '.zip', '.xz', or '.zst' (otherwise no compression). Set to
``None`` for no compression. Can also be a dict with key ``'method'`` set
to one of {``'zip'``, ``'gzip'``, ``'bz2'``, ``'zstd'``} and other
key-value pairs are forwarded to ``zipfile.ZipFile``, ``gzip.GzipFile``,
``bz2.BZ2File``, or ``zstandard.ZstdDecompressor``, respectively. As an
example, the following could be passed for faster compression and to create
'.bz2', '.zip', '.xz', '.zst', '.tar', '.tar.gz', '.tar.xz' or '.tar.bz2'
(otherwise no compression).
Set to ``None`` for no compression.
Can also be a dict with key ``'method'`` set
to one of {``'zip'``, ``'gzip'``, ``'bz2'``, ``'zstd'``, ``'tar'``} and other
key-value pairs are forwarded to
``zipfile.ZipFile``, ``gzip.GzipFile``,
``bz2.BZ2File``, ``zstandard.ZstdDecompressor`` or
``tarfile.TarFile``, respectively.
As an example, the following could be passed for faster compression and to create
a reproducible gzip archive:
``compression={'method': 'gzip', 'compresslevel': 1, 'mtime': 1}``."""
``compression={'method': 'gzip', 'compresslevel': 1, 'mtime': 1}``.

.. versionadded:: 1.5.0
Added support for `.tar` files."""

_shared_docs[
"decompression_options"
] = """compression : str or dict, default 'infer'
For on-the-fly decompression of on-disk data. If 'infer' and '%s' is
path-like, then detect compression from the following extensions: '.gz',
'.bz2', '.zip', '.xz', or '.zst' (otherwise no compression). If using
'zip', the ZIP file must contain only one data file to be read in. Set to
``None`` for no decompression. Can also be a dict with key ``'method'`` set
to one of {``'zip'``, ``'gzip'``, ``'bz2'``, ``'zstd'``} and other
key-value pairs are forwarded to ``zipfile.ZipFile``, ``gzip.GzipFile``,
``bz2.BZ2File``, or ``zstandard.ZstdDecompressor``, respectively. As an
example, the following could be passed for Zstandard decompression using a
'.bz2', '.zip', '.xz', '.zst', '.tar', '.tar.gz', '.tar.xz' or '.tar.bz2'
(otherwise no compression).
If using 'zip' or 'tar', the archive must contain only one data file to be read in.
Set to ``None`` for no decompression.
Can also be a dict with key ``'method'`` set
to one of {``'zip'``, ``'gzip'``, ``'bz2'``, ``'zstd'``, ``'tar'``} and other
key-value pairs are forwarded to
``zipfile.ZipFile``, ``gzip.GzipFile``,
``bz2.BZ2File``, ``zstandard.ZstdDecompressor`` or
``tarfile.TarFile``, respectively.
As an example, the following could be passed for Zstandard decompression using a
custom compression dictionary:
``compression={'method': 'zstd', 'dict_data': my_compression_dict}``."""
``compression={'method': 'zstd', 'dict_data': my_compression_dict}``.

.. versionadded:: 1.5.0
Added support for `.tar` files."""

_shared_docs[
"replace"
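
To illustrate the single-data-file rule documented above, here is a minimal sketch
(file and member names are illustrative): a gzip-compressed TAR with exactly one member
is built with the standard library and handed to read_csv via the compression dict.

import io
import tarfile

import pandas as pd

buf = io.BytesIO()
payload = b"a,b\n1,2\n"
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    # exactly one member; a second member would make read_csv raise
    # ValueError("Multiple files found in TAR archive ...")
    info = tarfile.TarInfo(name="frame.csv")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))
buf.seek(0)

df = pd.read_csv(buf, compression={"method": "tar", "mode": "r:gz"})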
164 changes: 153 additions & 11 deletions pandas/io/common.py
@@ -10,6 +10,7 @@
from io import (
BufferedIOBase,
BytesIO,
FileIO,
RawIOBase,
StringIO,
TextIOBase,
@@ -19,6 +20,7 @@
import os
from pathlib import Path
import re
import tarfile
from typing import (
IO,
Any,
@@ -450,12 +452,20 @@ def file_path_to_url(path: str) -> str:
return urljoin("file:", pathname2url(path))


_extension_to_compression = {
".tar": "tar",
".tar.gz": "tar",
".tar.bz2": "tar",
".tar.xz": "tar",
".gz": "gzip",
".bz2": "bz2",
".zip": "zip",
".xz": "xz",
".zst": "zstd",
}
_supported_compressions = set(_extension_to_compression.values())
_compression_to_extension = {
"gzip": ".gz",
"bz2": ".bz2",
"zip": ".zip",
"xz": ".xz",
"zstd": ".zst",
value: key for key, value in _extension_to_compression.items()
}


@@ -532,20 +542,18 @@ def infer_compression(
return None

# Infer compression from the filename/URL extension
for compression, extension in _compression_to_extension.items():
for extension, compression in _extension_to_compression.items():
if filepath_or_buffer.lower().endswith(extension):
return compression
return None

# Compression has been specified. Check that it's valid
if compression in _compression_to_extension:
if compression in _supported_compressions:
return compression

# https://github.com/python/mypy/issues/5492
# Unsupported operand types for + ("List[Optional[str]]" and "List[str]")
valid = ["infer", None] + sorted(
_compression_to_extension
) # type: ignore[operator]
valid = ["infer", None] + sorted(_supported_compressions) # type: ignore[operator]
msg = (
f"Unrecognized compression type: {compression}\n"
f"Valid compression types are {valid}"
@@ -682,7 +690,7 @@ def get_handle(
ioargs.encoding,
ioargs.mode,
errors,
ioargs.compression["method"] not in _compression_to_extension,
ioargs.compression["method"] not in _supported_compressions,
)

is_path = isinstance(handle, str)
@@ -753,6 +761,30 @@ def get_handle(
f"Only one file per ZIP: {zip_names}"
)

# TAR Encoding
elif compression == "tar":
if "mode" not in compression_args:
compression_args["mode"] = ioargs.mode
if is_path:
handle = _BytesTarFile.open(name=handle, **compression_args)
else:
handle = _BytesTarFile.open(fileobj=handle, **compression_args)
assert isinstance(handle, _BytesTarFile)
if handle.mode == "r":
handles.append(handle)
files = handle.getnames()
if len(files) == 1:
file = handle.extractfile(files[0])
assert file is not None
handle = file
elif len(files) == 0:
raise ValueError(f"Zero files found in TAR archive {path_or_buf}")
else:
raise ValueError(
"Multiple files found in TAR archive. "
f"Only one file per TAR archive: {files}"
)

# XZ Compression
elif compression == "xz":
handle = get_lzma_file()(handle, ioargs.mode)
Expand Down Expand Up @@ -844,6 +876,116 @@ def get_handle(
)


# error: Definition of "__exit__" in base class "TarFile" is incompatible with
# definition in base class "BytesIO" [misc]
# error: Definition of "__enter__" in base class "TarFile" is incompatible with
# definition in base class "BytesIO" [misc]
# error: Definition of "__enter__" in base class "TarFile" is incompatible with
# definition in base class "BinaryIO" [misc]
# error: Definition of "__enter__" in base class "TarFile" is incompatible with
# definition in base class "IO" [misc]
# error: Definition of "read" in base class "TarFile" is incompatible with
# definition in base class "BytesIO" [misc]
# error: Definition of "read" in base class "TarFile" is incompatible with
# definition in base class "IO" [misc]
class _BytesTarFile(tarfile.TarFile, BytesIO): # type: ignore[misc]
"""
Wrapper for the standard library class TarFile, allowing the returned file-like
handle to accept byte strings via its `write` method.

BytesIO provides the file-like attributes, and TarFile.addfile writes the buffered
byte strings into a member of the archive.
"""

# GH 17778
def __init__(
self,
name: str | bytes | os.PathLike[str] | os.PathLike[bytes],
mode: Literal["r", "a", "w", "x"],
fileobj: FileIO,
archive_name: str | None = None,
**kwargs,
):
self.archive_name = archive_name
self.multiple_write_buffer: BytesIO | None = None
self._closing = False

super().__init__(name=name, mode=mode, fileobj=fileobj, **kwargs)

@classmethod
def open(cls, name=None, mode="r", **kwargs):
mode = mode.replace("b", "")
return super().open(name=name, mode=cls.extend_mode(name, mode), **kwargs)

@classmethod
def extend_mode(
cls, name: FilePath | ReadBuffer[bytes] | WriteBuffer[bytes], mode: str
) -> str:
if mode != "w":
return mode
if isinstance(name, (os.PathLike, str)):
filename = Path(name)
if filename.suffix == ".gz":
return mode + ":gz"
elif filename.suffix == ".xz":
return mode + ":xz"
elif filename.suffix == ".bz2":
return mode + ":bz2"
return mode

def infer_filename(self):
"""
If an explicit archive_name is not given, we still want the file inside the tar
archive not to be named something.tar, because that causes confusion (GH39465).
"""
if isinstance(self.name, (os.PathLike, str)):
# error: Argument 1 to "Path" has
# incompatible type "Union[str, PathLike[str], PathLike[bytes]]";
# expected "Union[str, PathLike[str]]" [arg-type]
filename = Path(self.name) # type: ignore[arg-type]
if filename.suffix == ".tar":
return filename.with_suffix("").name
if filename.suffix in [".tar.gz", ".tar.bz2", ".tar.xz"]:
return filename.with_suffix("").with_suffix("").name
return filename.name
return None

def write(self, data):
# buffer multiple write calls, write on flush
if self.multiple_write_buffer is None:
self.multiple_write_buffer = BytesIO()
self.multiple_write_buffer.write(data)

def flush(self) -> None:
# write to actual handle and close write buffer
if self.multiple_write_buffer is None or self.multiple_write_buffer.closed:
return

# TarFile needs a non-empty string
archive_name = self.archive_name or self.infer_filename() or "tar"
with self.multiple_write_buffer:
value = self.multiple_write_buffer.getvalue()
tarinfo = tarfile.TarInfo(name=archive_name)
tarinfo.size = len(value)
self.addfile(tarinfo, BytesIO(value))

def close(self):
self.flush()
super().close()

@property
def closed(self):
if self.multiple_write_buffer is None:
return False
return self.multiple_write_buffer.closed and super().closed

@closed.setter
def closed(self, value):
if not self._closing and value:
self._closing = True
self.close()
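
A round-trip sketch of the buffered-write behaviour and the member naming implemented
above (the expected member name follows infer_filename, so treat it as an assumption
rather than a documented guarantee):

import tarfile

import pandas as pd

df = pd.DataFrame({"a": [1, 2]})
df.to_csv("frame.tar", index=False)   # writes buffered via _BytesTarFile, flushed on close

with tarfile.open("frame.tar") as tar:
    print(tar.getnames())             # expected: ['frame'] (".tar" suffix dropped)
    member = tar.extractfile("frame")
    print(pd.read_csv(member))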


# error: Definition of "__exit__" in base class "ZipFile" is incompatible with
# definition in base class "BytesIO" [misc]
# error: Definition of "__enter__" in base class "ZipFile" is incompatible with
3 changes: 2 additions & 1 deletion pandas/io/json/_json.py
@@ -81,6 +81,7 @@ def to_json(
default_handler: Callable[[Any], JSONSerializable] | None = None,
lines: bool = False,
compression: CompressionOptions = "infer",
mode: str = "w",
index: bool = True,
indent: int = 0,
storage_options: StorageOptions = None,
@@ -125,7 +126,7 @@ def to_json(
if path_or_buf is not None:
# apply compression and byte/text conversion
with get_handle(
path_or_buf, "w", compression=compression, storage_options=storage_options
path_or_buf, mode, compression=compression, storage_options=storage_options
) as handles:
handles.handle.write(s)
else:
3 changes: 2 additions & 1 deletion pandas/io/pickle.py
@@ -28,6 +28,7 @@ def to_pickle(
obj: Any,
filepath_or_buffer: FilePath | WriteBuffer[bytes],
compression: CompressionOptions = "infer",
mode: str = "wb",
protocol: int = pickle.HIGHEST_PROTOCOL,
storage_options: StorageOptions = None,
) -> None:
@@ -96,7 +97,7 @@ def to_pickle(

with get_handle(
filepath_or_buffer,
"wb",
mode,
compression=compression,
is_text=False,
storage_options=storage_options,