pyspark.sql.functions.array_insert¶

pyspark.sql.functions.array_insert(arr: ColumnOrName, pos: Union[ColumnOrName, int], value: Any) → pyspark.sql.column.Column[source]¶

Collection function: adds an item into a given array at a specified array index. Array indices start at 1; a negative index counts from the end of the array. An index beyond the array size appends the value, padding the gap with ‘null’ elements; a negative index beyond the size prepends instead.
New in version 3.4.0.
- Parameters
arr – Column or str: name of the column containing an array.
pos – Column or str or int: position of the insertion, starting at index 1; a negative position counts from the back of the array.
value – a literal value, or a Column expression, to insert into the array.
- Returns
Column
an array of values, including the new specified value
Notes
Supports Spark Connect.
Examples
>>> df = spark.createDataFrame(
...     [(['a', 'b', 'c'], 2, 'd'), (['c', 'b', 'a'], -2, 'd')],
...     ['data', 'pos', 'val']
... )
>>> df.select(array_insert(df.data, df.pos.cast('integer'), df.val).alias('data')).collect()
[Row(data=['a', 'd', 'b', 'c']), Row(data=['c', 'b', 'd', 'a'])]
>>> df.select(array_insert(df.data, 5, 'hello').alias('data')).collect()
[Row(data=['a', 'b', 'c', None, 'hello']), Row(data=['c', 'b', 'a', None, 'hello'])]
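As a further illustration (a minimal sketch, not part of the official examples), the snippet below passes a literal Column as pos to prepend a value at index 1; the import alias F, the DataFrame df2, and the expected output are assumptions based on the 1-based indexing described above.

>>> from pyspark.sql import functions as F
>>> df2 = spark.createDataFrame([(['a', 'b', 'c'],)], ['data'])
>>> # pos may be a plain int or a Column; here a literal integer Column inserts at the front
>>> df2.select(F.array_insert('data', F.lit(1), 'z').alias('data')).collect()
[Row(data=['z', 'a', 'b', 'c'])]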