VariantType

class pyspark.sql.types.VariantType
Variant data type, representing semi-structured values.

New in version 4.0.0.

Methods

- fromDDL(ddl): Creates DataType for a given DDL-formatted string.
- fromInternal(obj): Converts an internal SQL object into a native Python object.
- json(): Returns the JSON string representation of this data type.
- toInternal(variant): Converts a Python object into an internal SQL object.
- typeName()

Methods Documentation

classmethod fromDDL(ddl)
Creates DataType for a given DDL-formatted string.

New in version 4.0.0.

Parameters
ddl : str
    DDL-formatted string representation of types, e.g. pyspark.sql.types.DataType.simpleString, except that the top-level struct type can omit the struct<> wrapper, for compatibility with spark.createDataFrame and Python UDFs.
Returns
    DataType
Examples

Create a StructType by the corresponding DDL formatted string.

>>> from pyspark.sql.types import DataType
>>> DataType.fromDDL("b string, a int")
StructType([StructField('b', StringType(), True), StructField('a', IntegerType(), True)])

Create a single DataType by the corresponding DDL formatted string.

>>> DataType.fromDDL("decimal(10,10)")
DecimalType(10,10)

Create a StructType by the legacy string format.

>>> DataType.fromDDL("b: string, a: int")
StructType([StructField('b', StringType(), True), StructField('a', IntegerType(), True)])
json()

jsonValue()

needConversion()
Does this type need conversion between Python objects and internal SQL objects?

This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
simpleString()

classmethod typeName()