pyspark.sql.functions.dateadd

pyspark.sql.functions.dateadd(start: ColumnOrName, days: Union[ColumnOrName, int]) → pyspark.sql.column.Column

Returns the date that is days days after start. If days is a negative value, that number of days is subtracted from start.

New in version 3.5.0.

Parameters
start : Column or str

date column to work on.

days : Column or str or int

how many days after the given date to calculate. Accepts negative values as well, to calculate backwards in time.

Returns
Column

a date the given number of days after (or before, if days is negative) the start date.

Examples

>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [('2015-04-08', 2,)], ['dt', 'add']
... ).select(sf.dateadd("dt", 1)).show()
+---------------+
|date_add(dt, 1)|
+---------------+
|     2015-04-09|
+---------------+
>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [('2015-04-08', 2,)], ['dt', 'add']
... ).select(sf.dateadd("dt", sf.lit(2))).show()
+---------------+
|date_add(dt, 2)|
+---------------+
|     2015-04-10|
+---------------+
>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [('2015-04-08', 2,)], ['dt', 'add']
... ).select(sf.dateadd("dt", -1)).show()
+----------------+
|date_add(dt, -1)|
+----------------+
|      2015-04-07|
+----------------+