pyspark.sql.functions.try_element_at
- pyspark.sql.functions.try_element_at(col, extraction)
Collection function: (array, index) - Returns the element of the array at the given (1-based) index. If the index is 0, Spark throws an error (see Example 6 below). If the index is negative, elements are accessed from the last to the first. The function always returns NULL if the index exceeds the length of the array.
(map, key) - Returns the value for the given key. The function always returns NULL if the key is not contained in the map.
New in version 3.5.0.
- Parameters
- col : Column or str
  name of column containing array or map
- extraction
  index to check for in array or key to check for in map
Examples
Example 1: Getting the first element of an array
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit(1))).show()
+-----------------------+
|try_element_at(data, 1)|
+-----------------------+
|                      a|
+-----------------------+
Example 2: Getting the last element of an array using a negative index
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit(-1))).show()
+------------------------+
|try_element_at(data, -1)|
+------------------------+
|                       c|
+------------------------+
Example 3: Getting a value from a map using a key
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([({"a": 1.0, "b": 2.0},)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit("a"))).show()
+-----------------------+
|try_element_at(data, a)|
+-----------------------+
|                    1.0|
+-----------------------+
Example 4: Getting a non-existing element from an array
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit(4))).show()
+-----------------------+
|try_element_at(data, 4)|
+-----------------------+
|                   NULL|
+-----------------------+
Example 5: Getting a non-existing value from a map using a key
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([({"a": 1.0, "b": 2.0},)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit("c"))).show()
+-----------------------+
|try_element_at(data, c)|
+-----------------------+
|                   NULL|
+-----------------------+
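Example 6: Using index 0 (error case)
As noted in the description, 0 is not a valid 1-based index; instead of returning NULL, Spark raises an error. A minimal sketch of this behavior (the exact exception type and message are not reproduced here):
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
>>> df.select(sf.try_element_at(df.data, sf.lit(0))).show()  # raises an error: 0 is neither a positive nor a negative index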
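Example 7: Referring to the column by name
Since col accepts a column name (str) as well as a Column, Example 1 can equivalently be written with a string. A minimal sketch mirroring Example 1; the output is the same:
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(["a", "b", "c"],)], ['data'])
>>> df.select(sf.try_element_at("data", sf.lit(1))).show()
+-----------------------+
|try_element_at(data, 1)|
+-----------------------+
|                      a|
+-----------------------+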