StructField

class pyspark.sql.types.StructField(name: str, dataType: pyspark.sql.types.DataType, nullable: bool = True, metadata: Optional[Dict[str, Any]] = None)

A field in StructType.

Parameters
name : str

name of the field.

dataType : DataType

DataType of the field.

nullable : bool, optional

whether the field can be null (None) or not.

metadata : dict, optional

a dict from string to simple type that can be automatically converted to JSON

Examples

>>> from pyspark.sql.types import StringType, StructField
>>> (StructField("f1", StringType(), True)
...      == StructField("f1", StringType(), True))
True
>>> (StructField("f1", StringType(), True)
...      == StructField("f2", StringType(), True))
False

Methods

fromInternal(obj)

Converts an internal SQL object into a native Python object.

fromJson(json)

json()

jsonValue()

needConversion()

Does this type need conversion between Python objects and internal SQL objects?

simpleString()

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Methods Documentation

fromInternal(obj: T) → T

Converts an internal SQL object into a native Python object.

classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.StructField
json() → str
jsonValue() → Dict[str, Any]
needConversion() → bool

Does this type need conversion between Python objects and internal SQL objects?

This is used to avoid the unnecessary conversion for ArrayType/MapType/StructType.

simpleString() → str
toInternal(obj: T) → T

Converts a Python object into an internal SQL object.

typeName() → str