Description
Pandas version checks
- I have checked that this issue has not already been reported.
- I have confirmed this bug exists on the latest version of pandas.
- I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
import pandas as pd

# pyarrow-backed timestamp index with one missing value
datetimes = [None, "2022-01-01T10:00:30", "2022-01-01T10:01:00"]
dt = pd.Index(datetimes, dtype="timestamp[ms][pyarrow]")
offset = pd.Timestamp("2022-01-01 10:00:30")
# step size, taken as a scalar from a pyarrow-backed duration index
unit = pd.Index([pd.Timedelta(30, "s")], dtype="duration[ms][pyarrow]").item()
# %% encode to double[pyarrow]
encoded = (dt - offset) / unit
decoded = (encoded.round().astype(float) * unit + offset).astype(dt.dtype)
# compare original and decoded
pd.testing.assert_index_equal(dt, decoded, exact=True) # ✅
assert ((dt - dt) == 0).all() # ✅
assert ((decoded - decoded) == 0).all() # ✅
assert ((decoded - dt) == 0).all() # ✅
assert ((dt - decoded) == 0).all() # ❌ overflow ?!?!
Issue Description
This one is absolutely baffling to me. Two Index objects, despite satisfying assert_index_equal, raise an exception when one is subtracted from the other. I guess assert_index_equal must be overlooking some internal difference under the hood?
It occurs after encoding a timestamp[ms][pyarrow] index to floating point and decoding it back to timestamp[ms][pyarrow], by subtracting an offset and dividing by some frequency.
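As a sanity check, here is a minimal probe (reusing dt and decoded from the example above) showing that everything reachable through the public API looks identical, which is what makes the asymmetric overflow so surprising:
import pyarrow as pa
# the dtypes match (decoded was explicitly cast back to dt.dtype) ...
assert dt.dtype == decoded.dtype
# ... and so do the underlying pyarrow types:
assert pa.Array.from_pandas(dt).type == pa.Array.from_pandas(decoded).type
# yet `decoded - dt` succeeds while `dt - decoded` overflows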
Even weirder, it makes a difference how we define the unit:
# %% Try with different units
unit_a = pd.Timedelta(30, "s") # <-- this one works!
unit_b = pd.Index([pd.Timedelta(30, "s")], dtype="duration[ms][pyarrow]").item()
assert unit_a == unit_b # ✅
assert hash(unit_a) == hash(unit_b) # ✅
# encode to double[pyarrow]
encoded_a = (dt - offset) / unit_a
encoded_b = (dt - offset) / unit_b
pd.testing.assert_index_equal(encoded_a, encoded_b, exact=True) # ✅
# decode
decoded_a = (encoded_a.round().astype(float) * unit_a + offset).astype(dt.dtype)
decoded_b = (encoded_b.round().astype(float) * unit_b + offset).astype(dt.dtype)
pd.testing.assert_index_equal(decoded_a, decoded_b, exact=True) # ✅
pd.testing.assert_index_equal(dt, decoded_a, exact=True) # ✅
pd.testing.assert_index_equal(dt, decoded_b, exact=True) # ✅
assert ((dt - decoded_a) == 0).all() # ✅
assert ((dt - decoded_b) == 0).all() # ❌ overflow ?!?!
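One hidden difference I can point to is the internal resolution of the two Timedelta scalars: a bare pd.Timedelta defaults to nanoseconds, while .item() on a duration[ms][pyarrow] index presumably returns a millisecond-backed scalar. Whether that is what triggers the overflow is speculation on my part, but it is easy to probe (the expected outputs below are my assumption, not verified):
print(unit_a.unit)  # expected: 'ns' (default Timedelta resolution)
print(unit_b.unit)  # expected: 'ms' (inherited from duration[ms][pyarrow])
# if the resolutions really differ, normalizing unit_b first might
# sidestep the overflow -- untested guess:
unit_c = unit_b.as_unit("ns")
decoded_c = (encoded_b.round().astype(float) * unit_c + offset).astype(dt.dtype)
assert ((dt - decoded_c) == 0).all()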
Moreover, manually computing the difference in pyarrow works as well:
# %% compute differences in pyarrow:
import pyarrow as pa
x = pa.Array.from_pandas(dt)
y = pa.Array.from_pandas(decoded)
pa.compute.subtract(x, y) # ✅
pa.compute.subtract(y, x) # ✅
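So a possible workaround (a sketch, assuming to_pandas with types_mapper=pd.ArrowDtype round-trips to the same ArrowDtype) is to perform the subtraction in pyarrow and convert the result back:
# subtract in pyarrow, then wrap the result back into pandas
diff = pa.compute.subtract(x, y).to_pandas(types_mapper=pd.ArrowDtype)
print(diff.dtype)  # expected: duration[ms][pyarrow]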
Expected Behavior
Either assert_index_equal should show some discrepancy, or dt and decoded should behave interchangeably.
Installed Versions
INSTALLED VERSIONS
commit : d9cdd2e
python : 3.11.7.final.0
python-bits : 64
OS : Linux
OS-release : 6.5.0-41-generic
Version : #41~22.04.2-Ubuntu SMP PREEMPT_DYNAMIC Mon Jun 3 11:32:55 UTC 2
machine : x86_64
processor : x86_64
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : en_US.UTF-8
pandas : 2.2.2
numpy : 2.0.0
pytz : 2024.1
dateutil : 2.9.0.post0
setuptools : 70.1.0
pip : 24.1
Cython : None
pytest : 8.2.2
hypothesis : 6.103.5
sphinx : 7.3.7
blosc : None
feather : None
xlsxwriter : None
lxml.etree : None
html5lib : None
pymysql : None
psycopg2 : None
jinja2 : 3.1.4
IPython : 8.25.0
pandas_datareader : None
adbc-driver-postgresql: None
adbc-driver-sqlite : None
bs4 : 4.12.3
bottleneck : None
dataframe-api-compat : None
fastparquet : None
fsspec : 2024.6.0
gcsfs : None
matplotlib : 3.9.0
numba : None
numexpr : None
odfpy : None
openpyxl : 3.1.4
pandas_gbq : None
pyarrow : 16.1.0
pyreadstat : None
python-calamine : None
pyxlsb : None
s3fs : None
scipy : 1.13.1
sqlalchemy : None
tables : None
tabulate : None
xarray : None
xlrd : None
zstandard : None
tzdata : 2024.1
qtpy : None
pyqt5 : None