pyspark.sql.types.DecimalType
Decimal (decimal.Decimal) data type.
The DecimalType must have a fixed precision (the maximum total number of digits) and scale (the number of digits to the right of the decimal point). For example, (5, 2) can support values in the range [-999.99, 999.99].
The precision can be up to 38; the scale must be less than or equal to the precision.
When creating a DecimalType, the default precision and scale are (10, 0). When inferring a schema from decimal.Decimal objects, the result is DecimalType(38, 18).
Parameters
precision : int, optional
    the maximum (i.e. total) number of digits (default: 10)
scale : int, optional
    the number of digits to the right of the decimal point (default: 0)
Methods
fromInternal(obj)
    Converts an internal SQL object into a native Python object.
json()
jsonValue()
needConversion()
    Does this type need conversion between Python object and internal SQL object.
simpleString()
toInternal(obj)
    Converts a Python object into an internal SQL object.
typeName()
Methods Documentation
needConversion()
    Does this type need conversion between Python object and internal SQL object. This is used to avoid the unnecessary conversion for ArrayType/MapType/StructType.