There seems to be no way to define a data class for which the data class encoder produces a Spark schema with fields of type Decimal(38, 0). The natural approach would be to give the data class a field of type BigInteger, but the data class encoder does not support that type.
This can be reproduced with the following code:

```kotlin
import java.math.BigInteger
import org.jetbrains.kotlinx.spark.api.*

data class A(val value: BigInteger)

fun main() = withSpark {
    val ds = dsOf(1, 2)
    val df = ds.`as`<A>() // fails: the encoder cannot handle BigInteger
    println(df.schema())
}
```
which throws:

```
java.lang.IllegalArgumentException: java.math.BigInteger is unsupported
```
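As a possible interim workaround (a sketch, not a fix for the encoder itself): Spark's encoders do support `java.math.BigDecimal` (by default mapped to `DecimalType(38, 18)` rather than `Decimal(38, 0)`), so one could store the value as a zero-scale `BigDecimal` inside the data class and convert to/from `BigInteger` at the application boundary. The data class `B` and its helpers below are hypothetical names introduced only for illustration; the conversion part is plain JDK and runs without Spark:

```kotlin
import java.math.BigDecimal
import java.math.BigInteger

// Hypothetical wrapper: BigDecimal is encodable by Spark, BigInteger is not.
data class B(val value: BigDecimal) {
    // toBigIntegerExact() throws if the value somehow acquired a fractional part,
    // which guards against silent truncation.
    fun asBigInteger(): BigInteger = value.toBigIntegerExact()

    companion object {
        // BigDecimal(BigInteger) produces a scale-0 value, i.e. integer semantics.
        fun of(i: BigInteger) = B(BigDecimal(i))
    }
}

fun main() {
    val b = B.of(BigInteger("123456789012345678901234567890"))
    println(b.asBigInteger()) // round-trips the integer exactly
}
```

This keeps exact integer round-trips on the Kotlin side, though the Spark-side schema would still show the default decimal precision/scale unless the column is cast explicitly.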