ArrayType

class pyspark.sql.types.ArrayType(elementType: pyspark.sql.types.DataType, containsNull: bool = True)

    Array data type.
Parameters

    elementType : DataType
        DataType of each element in the array.
    containsNull : bool, optional
        Whether the array can contain null (None) values. Defaults to True.
Examples
>>> from pyspark.sql.types import ArrayType, StringType, StructField, StructType
The example below demonstrates how to create an ArrayType:
>>> arr = ArrayType(StringType())
The array can contain null (None) values by default:
>>> ArrayType(StringType()) == ArrayType(StringType(), True)
True
>>> ArrayType(StringType(), False) == ArrayType(StringType())
False
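ArrayType is commonly used as the type of a field inside a StructType schema. The sketch below assumes an active SparkSession bound to the name spark, which is not created in the examples above:

>>> schema = StructType([StructField("tags", ArrayType(StringType()), True)])
>>> df = spark.createDataFrame([(["a", "b"],), ([],), (None,)], schema)
>>> df.printSchema()
root
 |-- tags: array (nullable = true)
 |    |-- element: string (containsNull = true)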
Methods

fromInternal(obj)       Converts an internal SQL object into a native Python object.
fromJson(json)
json()
needConversion()        Does this type need conversion between Python object and internal SQL object.
toInternal(obj)         Converts a Python object into an internal SQL object.
typeName()

Methods Documentation
fromInternal(obj: List[Optional[T]]) → List[Optional[T]]

    Converts an internal SQL object into a native Python object.
classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.ArrayType
json() → str
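Together, json() and fromJson() round-trip an ArrayType through its JSON representation. A minimal sketch (the standard-library json import is an addition, not part of the examples above):

>>> import json
>>> arr = ArrayType(StringType())
>>> ArrayType.fromJson(json.loads(arr.json())) == arr
True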
needConversion() → bool

    Does this type need conversion between Python object and internal SQL object.

    This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
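Whether conversion is needed is delegated to the element type: string elements pass through unchanged, while timestamp elements must be converted. A small sketch:

>>> from pyspark.sql.types import TimestampType
>>> ArrayType(StringType()).needConversion()
False
>>> ArrayType(TimestampType()).needConversion()
True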
toInternal(obj: List[Optional[T]]) → List[Optional[T]]

    Converts a Python object into an internal SQL object.
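When the element type needs no conversion, toInternal() returns the object unchanged; otherwise each element is converted individually. A small sketch with timestamp elements (the datetime and TimestampType imports are additions):

>>> from datetime import datetime
>>> from pyspark.sql.types import TimestampType
>>> ts_arr = ArrayType(TimestampType())
>>> values = [datetime(2023, 1, 1, 12, 0)]
>>> internal = ts_arr.toInternal(values)  # list of epoch microseconds (int)
>>> ts_arr.fromInternal(internal) == values
True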
classmethod typeName() → str
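For ArrayType the simple type name is the string 'array':

>>> ArrayType.typeName()
'array'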