ArrayType

class pyspark.sql.types.ArrayType(elementType: pyspark.sql.types.DataType, containsNull: bool = True)

Array data type.

Parameters
elementType : DataType

DataType of each element in the array.

containsNull : bool, optional

Whether the array can contain null (None) values. Default: True.

Examples

>>> from pyspark.sql.types import ArrayType, StringType, StructField, StructType

The example below demonstrates how to create an ArrayType:

>>> arr = ArrayType(StringType())

The array can contain null (None) values by default:

>>> ArrayType(StringType()) == ArrayType(StringType(), True)
True
>>> ArrayType(StringType(), False) == ArrayType(StringType())
False
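
As a further sketch (using the imports above; the field name "tags" is illustrative), an ArrayType typically describes an array column inside a StructType schema:

>>> schema = StructType([StructField("tags", ArrayType(StringType()), True)])
>>> schema.simpleString()
'struct<tags:array<string>>'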

Methods

fromInternal(obj)

Converts an internal SQL object into a native Python object.

fromJson(json)

Constructs an ArrayType from a dict representation (as produced by jsonValue()).

json()

Returns the JSON string representation of this type.

jsonValue()

Returns a dict representation of this type for JSON serialization.

needConversion()

Does this type need conversion between Python objects and internal SQL objects.

simpleString()

Returns a compact string representation of this type (e.g., array<string>).

toInternal(obj)

Converts a Python object into an internal SQL object.

typeName()

Returns the name of this type ('array').

Methods Documentation

fromInternal(obj: List[Optional[T]]) → List[Optional[T]]

Converts an internal SQL object into a native Python object.

classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.ArrayType

json() → str

jsonValue() → Dict[str, Any]
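
A minimal round-trip sketch (the dict keys shown are those produced by jsonValue() for a string element type):

>>> arr = ArrayType(StringType())
>>> d = arr.jsonValue()
>>> (d['type'], d['elementType'], d['containsNull'])
('array', 'string', True)
>>> ArrayType.fromJson(d) == arr
True
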
needConversion() → bool

Does this type need conversion between Python objects and internal SQL objects.

This is used to avoid unnecessary conversions for ArrayType/MapType/StructType.
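
As an illustration (assuming TimestampType, whose datetime values are converted to an internal microsecond representation, while strings need no conversion):

>>> from pyspark.sql.types import TimestampType
>>> ArrayType(StringType()).needConversion()
False
>>> ArrayType(TimestampType()).needConversion()
True
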

simpleString() → str
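
For example:

>>> ArrayType(StringType()).simpleString()
'array<string>'
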
toInternal(obj: List[Optional[T]]) → List[Optional[T]]

Converts a Python object into an internal SQL object.
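
A round-trip sketch through toInternal and fromInternal, assuming TimestampType elements (which need conversion; naive datetimes are interpreted in the local timezone):

>>> from datetime import datetime
>>> from pyspark.sql.types import TimestampType
>>> ts_arr = ArrayType(TimestampType())
>>> values = [datetime(2024, 1, 1, 12, 30), None]
>>> ts_arr.fromInternal(ts_arr.toInternal(values)) == values
True
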

classmethod typeName() → str
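
For example:

>>> ArrayType.typeName()
'array'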