MapType

class pyspark.sql.types.MapType(keyType: pyspark.sql.types.DataType, valueType: pyspark.sql.types.DataType, valueContainsNull: bool = True)

    Map data type.
Parameters

    keyType : DataType
        DataType of the keys in the map.
    valueType : DataType
        DataType of the values in the map.
    valueContainsNull : bool, optional
        Indicates whether values can contain null (None) values (default True).
Notes
Keys in a map data type are not allowed to be null (None).
Examples
>>> from pyspark.sql.types import IntegerType, FloatType, MapType, StringType
The example below demonstrates how to create a MapType:
>>> map_type = MapType(StringType(), IntegerType())
The values of the map can contain null (None) values by default:

>>> (MapType(StringType(), IntegerType())
...     == MapType(StringType(), IntegerType(), True))
True
>>> (MapType(StringType(), IntegerType(), False)
...     == MapType(StringType(), FloatType()))
False
Methods

fromInternal(obj)
    Converts an internal SQL object into a native Python object.
fromJson(json)
json()
needConversion()
    Does this type need conversion between Python object and internal SQL object.
toInternal(obj)
    Converts a Python object into an internal SQL object.
typeName()

Methods Documentation
fromInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

    Converts an internal SQL object into a native Python object.
classmethod fromJson(json: Dict[str, Any]) → pyspark.sql.types.MapType
json() → str
needConversion() → bool

    Does this type need conversion between Python object and internal SQL object.

    This is used to avoid unnecessary conversion for ArrayType/MapType/StructType.
toInternal(obj: Dict[T, Optional[U]]) → Dict[T, Optional[U]]

    Converts a Python object into an internal SQL object.
classmethod typeName() → str