pyspark.sql.functions.unix_date
pyspark.sql.functions.unix_date(col) [source]

Returns the number of days since 1970-01-01.

New in version 3.5.0.

Parameters
- col : Column or column name
  input column of values to convert.
 
Returns
- Column
  the number of days since 1970-01-01.
 
Examples

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('1970-01-02',), ('2022-01-02',)], ['dt'])
>>> df.select('*', sf.unix_date(sf.to_date('dt'))).show()
+----------+----------------------+
|        dt|unix_date(to_date(dt))|
+----------+----------------------+
|1970-01-02|                     1|
|2022-01-02|                 18994|
+----------+----------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
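For intuition, the value unix_date produces corresponds to the whole-day difference between a date and the Unix epoch (1970-01-01). A minimal pure-Python sketch of that computation, using the standard datetime module (the function name unix_date_py is hypothetical, not part of PySpark):

```python
from datetime import date

# The Unix epoch, the zero point that unix_date counts days from.
EPOCH = date(1970, 1, 1)

def unix_date_py(d: date) -> int:
    # Hypothetical pure-Python analogue of pyspark.sql.functions.unix_date:
    # whole days elapsed since 1970-01-01.
    return (d - EPOCH).days

print(unix_date_py(date(1970, 1, 2)))  # 1
print(unix_date_py(date(2022, 1, 2)))  # 18994
```

These results match the Spark example above; note that unix_date operates on date values, so no time-of-day or time-zone arithmetic is involved in the day count itself.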