pyspark.sql.functions.slice

pyspark.sql.functions.slice(x: ColumnOrName, start: Union[ColumnOrName, int], length: Union[ColumnOrName, int]) → pyspark.sql.column.Column

Collection function: returns an array containing all the elements of x from index start, for the specified length. Array indices start at 1; a negative start counts from the end of the array.

New in version 2.4.0.

Parameters
x : Column or str

Column name or Column containing the array to be sliced.

start : Column or str or int

Column name, Column, or int giving the starting index (1-based).

length : Column or str or int

Column name, Column, or int giving the length of the slice.

Examples

>>> from pyspark.sql.functions import slice
>>> df = spark.createDataFrame([([1, 2, 3],), ([4, 5],)], ['x'])
>>> df.select(slice(df.x, 2, 2).alias("sliced")).collect()
[Row(sliced=[2, 3]), Row(sliced=[5])]
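The 1-based indexing and negative-start behavior can be illustrated with a plain-Python sketch (this helper is purely illustrative and is not part of the PySpark API; it assumes the semantics described above, where a negative start counts back from the end of the array):

```python
def slice_1based(xs, start, length):
    """Mimic Spark's slice() semantics on a plain Python list.

    start is 1-based; a negative start counts from the end.
    start == 0 is invalid, matching Spark's behavior.
    """
    if start == 0:
        raise ValueError("start index must not be 0")
    # Convert 1-based / negative start to a 0-based Python index.
    idx = start - 1 if start > 0 else len(xs) + start
    # An out-of-range negative start yields an empty slice.
    if idx < 0:
        return []
    return xs[idx:idx + length]

# Mirrors the DataFrame example above:
slice_1based([1, 2, 3], 2, 2)   # [2, 3]
slice_1based([4, 5], 2, 2)      # [5]
# Negative start counts from the end:
slice_1based([1, 2, 3], -1, 1)  # [3]
```

On the DataFrame side, the same negative-start call would be `slice(df.x, -1, 1)`, returning the last element of each array.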