Description
What is the problem the feature request solves?
Comet currently delegates to DataFusion for many cast operations, and the behavior is not guaranteed to match Spark. This epic is to track fully implementing Spark-compatible cast and try_cast operations in Comet, with support for ANSI mode.
For each item in this list to be considered complete, we should have Scala tests demonstrating that cast and try_cast produce the same results as Spark, with ANSI mode both enabled and disabled, using fuzz testing to find edge cases. We can update this list with links to issues as we make progress.
For cast operations that we cannot easily support with full compatibility, we should either fall back to Spark or provide a configuration that the user can enable to allow the operation to run in Comet. We should also provide documentation explaining any differences in behavior compared to Spark.
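To make the fuzz-testing requirement concrete, here is a rough sketch of an edge-case input generator. All names here (`CastFuzzGen`, `sample`) are hypothetical and not part of Comet's actual test framework; the real tests would feed each generated string through both Spark and Comet, with ANSI mode on and off, and compare results.

```scala
import scala.util.Random

// Hypothetical fuzz-input generator (illustrative only, not Comet's code).
object CastFuzzGen {
  // Hand-picked boundary strings that commonly expose cast differences.
  private val specialValues = Seq(
    "", " ", "+", "-", "0", "-0", "00123",
    "2147483647", "2147483648", "-2147483648", "-2147483649",
    "1.5", "1e3", "NaN", "Infinity", "\t42\n"
  )

  // Mix the fixed edge cases with short random digit/sign/whitespace noise.
  def sample(r: Random, numRandom: Int): Seq[String] = {
    val chars = "0123456789+-. eE\t"
    val noise = Seq.fill(numRandom) {
      (0 until (1 + r.nextInt(8))).map(_ => chars(r.nextInt(chars.length))).mkString
    }
    specialValues ++ noise
  }
}
```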
- Cast from string to another type
- Boolean - feat: Support ANSI mode in CAST from String to Bool #290
- Integral Types (byte, short, int, long) - feat: Implement Spark-compatible CAST from string to integral types #307
- Floating-point (float, double) - Implement Spark-compatible CAST from String to Floating Point #326
- Decimal - Implement Spark-compatible CAST from String to Decimal #325
- Date - Implement Spark-compatible CAST from String to Date #327
- Timestamp - Implement Spark-compatible CAST from String to Timestamp #328 and Cast string to timestamp remaining work #376
- Cast to string from primitive types
- Boolean - chore: Add more cast tests and improve test framework #351
- Integral Types (byte, short, int, long) - chore: Add more cast tests and improve test framework #351
- Floating-point (float, double) - Implement Spark-compatible CAST float/double to string #312
- Decimal - seems correct but needs tests to confirm; should also fall back to Spark if `spark.sql.legacy.allowNegativeScaleOfDecimal` is true and the scale is negative
- Date - seems correct but needs tests to confirm
- Timestamp - seems correct but needs tests to confirm
- Cast between numeric types
- Integral to Integral - Implement Spark-compatible CAST between integer types #311
- Integral to Boolean - chore: Add more cast tests and improve test framework #351
- Integral to Decimal - Implement Spark-compatible cast from integral types to decimal #2049
- Integral to Floating-point - chore: Add more cast tests and improve test framework #351
- Floating-point to Boolean - chore: Add more cast tests and improve test framework #351
- Floating-point to Decimal - Implement Spark-compatible CAST from floating-point to decimal #371
- Floating-point to Integral - Implement Spark-compatible CAST from float/double to integer types #350
- Decimal to Boolean - fails because Arrow does not support this cast
- Decimal to Integral - same issues as Implement Spark-compatible CAST from float/double to integer types #350
- Decimal to Floating-point - seems correct but needs tests to confirm
- Implement Spark-compatible cast between decimals with different precision and scale #375
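For the decimal-to-decimal item above, the core operation can be sketched as rescale-then-check-precision. This is a simplified illustration (not Comet's implementation), assuming HALF_UP rounding as in Spark's `Decimal.changePrecision`; on overflow, try_cast yields null while ANSI cast would raise an error.

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Sketch (not Comet's implementation) of casting between decimals with
// different precision and scale: rescale, then verify the result still fits.
object DecimalCastSketch {
  def changePrecision(d: JBigDecimal, precision: Int, scale: Int): Option[JBigDecimal] = {
    val rescaled = d.setScale(scale, RoundingMode.HALF_UP) // assumed rounding mode
    if (rescaled.precision > precision) None // overflow: null (try_cast) or error (ANSI)
    else Some(rescaled)
  }
}
```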
- Cast between temporal types
- Date to boolean/int/float/decimal - results are incorrect
- Date to Timestamp / TimestampNTZ
- Timestamp to boolean/int/float/decimal - Implement Spark-compatible CAST from timestamp to numeric types #352
- Timestamp to Date
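For the timestamp-to-numeric items, Spark stores timestamps as microseconds since the Unix epoch, so the conversions reduce to division by 1,000,000: flooring for integral targets, keeping fractional seconds for floating-point targets. A minimal sketch (not Comet's code):

```scala
// Sketch of Spark-style timestamp-to-numeric casts. Spark timestamps are
// microseconds since the epoch; integral casts floor to whole seconds.
object TimestampCastSketch {
  def toLongSeconds(micros: Long): Long = Math.floorDiv(micros, 1000000L)

  def toDoubleSeconds(micros: Long): Double = micros / 1000000.0
}
```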
- Other
In addition to the above tasks, we also need to do the following:
- Implement a mechanism to selectively fall back to Spark for specific cast operations (feat: Disable cast string to timestamp by default #337)
- Write documentation that explains any differences between Comet and Spark
- Add support for TryCast expression in Spark 3.2 and 3.3 #374
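The selective-fallback mechanism could be modeled as a per-(from, to) support table. The sketch below is purely illustrative: the names (`CastSupport`, `supportLevel`) and the specific entries are hypothetical, not Comet's actual API.

```scala
// Hypothetical per-cast support table (illustrative names, not Comet's API).
object CastFallbackSketch {
  sealed trait CastSupport
  case object Native extends CastSupport                    // run in Comet
  case class Incompatible(note: String) extends CastSupport // user opt-in config
  case object FallbackToSpark extends CastSupport           // delegate to Spark

  def supportLevel(from: String, to: String): CastSupport = (from, to) match {
    case ("string", "boolean" | "int" | "long") => Native
    case ("string", "timestamp") =>
      Incompatible("not all Spark timestamp formats are supported")
    case ("decimal", "boolean") => FallbackToSpark // Arrow lacks this cast
    case _ => FallbackToSpark
  }
}
```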