pyspark.sql.functions.map_concat

pyspark.sql.functions.map_concat(*cols: Union[ColumnOrName, List[ColumnOrName_], Tuple[ColumnOrName_, …]]) → pyspark.sql.column.Column

Returns the union of all the given maps. If the same key appears in more than one map, the result depends on the spark.sql.mapKeyDedupPolicy configuration: with the default EXCEPTION policy (since Spark 3.0) an error is raised, while LAST_WIN keeps the value from the last map that contains the key.

New in version 2.4.0.

Parameters
cols : Column or str
    column names or Column expressions for the maps to merge

Examples

>>> from pyspark.sql.functions import map_concat
>>> df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(3, 'c') as map2")
>>> df.select(map_concat("map1", "map2").alias("map3")).show(truncate=False)
+------------------------+
|map3                    |
+------------------------+
|{1 -> a, 2 -> b, 3 -> c}|
+------------------------+
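The merge semantics can be sketched with plain Python dicts, no Spark session required. This is an illustrative approximation, not PySpark's implementation, and it assumes the LAST_WIN deduplication policy; the helper name merge_maps is hypothetical.

```python
# Plain-Python sketch of map_concat under spark.sql.mapKeyDedupPolicy=LAST_WIN.
# The helper name merge_maps is hypothetical and not part of PySpark.
def merge_maps(*maps):
    merged = {}
    for m in maps:
        merged.update(m)  # later maps overwrite duplicate keys (last wins)
    return merged

map1 = {1: "a", 2: "b"}
map2 = {3: "c"}
print(merge_maps(map1, map2))  # {1: 'a', 2: 'b', 3: 'c'}
```

Under the default EXCEPTION policy, Spark would instead raise an error on a duplicate key rather than overwrite it.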