pyspark.sql.functions.dayofweek
pyspark.sql.functions.dayofweek(col)
Extract the day of the week of a given date/timestamp as an integer. Ranges from 1 for a Sunday through to 7 for a Saturday.

New in version 2.3.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
    col : Column or column name
        target date/timestamp column to work on.
 
Returns
    Column
        day of the week for the given date/timestamp as an integer.
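Since the numbering is Sunday-based (1 = Sunday, ..., 7 = Saturday), the returned integer can be decoded to a weekday name with a 1-based array lookup via element_at. A minimal sketch; the names array and the day_name alias are illustrative, not part of this API:

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
>>> names = sf.array(*[sf.lit(n) for n in
...     ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat']])
>>> df.select('dt', sf.element_at(names, sf.dayofweek('dt')).alias('day_name')).show()
+----------+--------+
|        dt|day_name|
+----------+--------+
|2015-04-08|     Wed|
|2024-10-31|     Thu|
+----------+--------+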
 
Examples

Example 1: Extract the day of the week from a string column representing dates

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.dayofweek('dt')).show()
+----------+----------+-------------+
|        dt|typeof(dt)|dayofweek(dt)|
+----------+----------+-------------+
|2015-04-08|    string|            4|
|2024-10-31|    string|            5|
+----------+----------+-------------+

Example 2: Extract the day of the week from a string column representing timestamps

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.dayofweek('ts')).show()
+-------------------+----------+-------------+
|                 ts|typeof(ts)|dayofweek(ts)|
+-------------------+----------+-------------+
|2015-04-08 13:08:15|    string|            4|
|2024-10-31 10:09:16|    string|            5|
+-------------------+----------+-------------+

Example 3: Extract the day of the week from a date column

>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.date(2015, 4, 8),),
...     (datetime.date(2024, 10, 31),)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.dayofweek('dt')).show()
+----------+----------+-------------+
|        dt|typeof(dt)|dayofweek(dt)|
+----------+----------+-------------+
|2015-04-08|      date|            4|
|2024-10-31|      date|            5|
+----------+----------+-------------+

Example 4: Extract the day of the week from a timestamp column

>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.datetime(2015, 4, 8, 13, 8, 15),),
...     (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.dayofweek('ts')).show()
+-------------------+----------+-------------+
|                 ts|typeof(ts)|dayofweek(ts)|
+-------------------+----------+-------------+
|2015-04-08 13:08:15| timestamp|            4|
|2024-10-31 10:09:16| timestamp|            5|
+-------------------+----------+-------------+
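Because weekends map to 1 (Sunday) and 7 (Saturday), the result is also handy as a filter predicate. A minimal sketch assuming the same Sunday-based numbering; the sample dates are illustrative (2024-10-26 is a Saturday, 2024-10-28 a Monday):

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2024-10-26',), ('2024-10-28',)], ['dt'])
>>> df.where(sf.dayofweek('dt').isin(1, 7)).show()
+----------+
|        dt|
+----------+
|2024-10-26|
+----------+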