pyspark.sql.functions.element_at(col, extraction)[source]

Collection function: Returns element of array at given index in extraction if col is array. Returns value for the given key in extraction if col is map.

New in version 2.4.0.

col : Column or str

name of column containing array or map

extraction :

index to check for in array or key to check for in map


Note: The position is not zero based, but a 1 based index. Returns None if the index is out of bounds for the array or the key is absent from the map, as the empty array/map rows in the examples below show.


>>> df = spark.createDataFrame([(["a", "b", "c"],), ([],)], ['data'])
>>>, 1)).collect()
[Row(element_at(data, 1)='a'), Row(element_at(data, 1)=None)]
>>> df = spark.createDataFrame([({"a": 1.0, "b": 2.0},), ({},)], ['data'])
>>>, lit("a"))).collect()
[Row(element_at(data, a)=1.0), Row(element_at(data, a)=None)]
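To make the 1-based indexing and the None-on-missing behavior concrete outside of a Spark session, here is a minimal plain-Python sketch of the lookup semantics (an illustration under the behavior shown in the examples above, not the Spark SQL implementation):

```python
def element_at_like(container, extraction):
    """Mimic element_at lookup semantics for a single row value."""
    if isinstance(container, dict):
        # Map case: return the value for the key, or None if absent.
        return container.get(extraction)
    # Array case: 1-based index; None when out of bounds.
    if 1 <= extraction <= len(container):
        return container[extraction - 1]
    return None
```

For example, `element_at_like(["a", "b", "c"], 1)` returns `"a"` while `element_at_like([], 1)` returns `None`, mirroring the two doctest rows above.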