FloatType#
- class pyspark.sql.types.FloatType#
Float data type, representing single precision floats.
Methods

fromDDL(ddl): Creates DataType for a given DDL-formatted string.
fromInternal(obj): Converts an internal SQL object into a native Python object.
json()
jsonValue()
needConversion(): Does this type need conversion between Python object and internal SQL object.
simpleString()
toInternal(obj): Converts a Python object into an internal SQL object.
typeName()

Methods Documentation
- classmethod fromDDL(ddl)#
Creates DataType for a given DDL-formatted string.

New in version 4.0.0.

- Parameters
- ddl : str
DDL-formatted string representation of types, e.g. pyspark.sql.types.DataType.simpleString, except that the top-level struct type can omit the struct<> wrapper, for compatibility with spark.createDataFrame and Python UDFs.
- Returns
- DataType
Examples

Create a StructType from the corresponding DDL-formatted string.

>>> from pyspark.sql.types import DataType
>>> DataType.fromDDL("b string, a int")
StructType([StructField('b', StringType(), True), StructField('a', IntegerType(), True)])

Create a single DataType from the corresponding DDL-formatted string.

>>> DataType.fromDDL("decimal(10,10)")
DecimalType(10,10)

Create a StructType from the legacy string format.

>>> DataType.fromDDL("b: string, a: int")
StructType([StructField('b', StringType(), True), StructField('a', IntegerType(), True)])
- fromInternal(obj)#
Converts an internal SQL object into a native Python object.
- json()#
- jsonValue()#
- needConversion()#
Does this type need conversion between Python object and internal SQL object.
This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
- simpleString()#
- toInternal(obj)#
Converts a Python object into an internal SQL object.
- classmethod typeName()#