pyspark.sql.functions.posexplode(col: ColumnOrName) → pyspark.sql.column.Column

Returns a new row for each element, with its position, in the given array or map. Uses the default column name pos for the position, col for elements of the array, and key and value for elements of the map, unless specified otherwise.

New in version 2.1.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str
    target column to work on.


Returns
Column
    one row per array item or map key value including positions as a separate column.


Examples
>>> from pyspark.sql import Row
>>> from pyspark.sql.functions import posexplode
>>> eDF = spark.createDataFrame([Row(a=1, intlist=[1,2,3], mapfield={"a": "b"})])
>>> eDF.select(posexplode(eDF.intlist)).collect()
[Row(pos=0, col=1), Row(pos=1, col=2), Row(pos=2, col=3)]
>>> eDF.select(posexplode(eDF.mapfield)).show()
+---+---+-----+
|pos|key|value|
+---+---+-----+
|  0|  a|    b|
+---+---+-----+
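To make the semantics concrete without a running Spark session, here is a minimal pure-Python sketch of what posexplode computes for an array column: each input row is fanned out into one output row per element, paired with that element's zero-based position (the function name posexplode_rows and the dict-based row representation are illustrative assumptions, not part of the PySpark API).

```python
def posexplode_rows(rows, col):
    """Emit one {'pos': i, 'col': element} row per item in rows[col],
    mirroring posexplode's default output column names for arrays."""
    out = []
    for row in rows:
        for pos, elem in enumerate(row[col]):
            out.append({"pos": pos, "col": elem})
    return out

rows = [{"a": 1, "intlist": [1, 2, 3]}]
print(posexplode_rows(rows, "intlist"))
# [{'pos': 0, 'col': 1}, {'pos': 1, 'col': 2}, {'pos': 2, 'col': 3}]
```

This matches the collect() result shown in the example above: three rows, each carrying a pos and a col value.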