pyspark.sql.functions.array_repeat(col: ColumnOrName, count: Union[ColumnOrName, int]) → pyspark.sql.column.Column[source]

Collection function: creates an array containing a column repeated count times.

New in version 2.4.0.

col : Column or str

column name or column that contains the element to be repeated

count : Column or str or int

column name, column, or int containing the number of times to repeat the first argument


>>> df = spark.createDataFrame([('ab',)], ['data'])
>>> from pyspark.sql.functions import array_repeat
>>>, 3).alias('r')).collect()
[Row(r=['ab', 'ab', 'ab'])]
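For intuition, the semantics can be sketched in plain Python without a Spark session. Note this is only an illustrative model of what `array_repeat` computes per row, not the PySpark API itself; the function name here merely mirrors the Spark one.

```python
def array_repeat(element, count):
    # Mimics Spark's array_repeat semantics for a single row:
    # the element is repeated `count` times; a non-positive
    # count yields an empty array.
    return [element] * max(count, 0)

# Matches the doctest above: repeating 'ab' three times.
print(array_repeat('ab', 3))
```

In Spark itself, when `count` is given as a column (rather than a literal int), the repeat count is evaluated per row, so different rows can produce arrays of different lengths.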