pyspark.sql.functions.hours

pyspark.sql.functions.hours(col: ColumnOrName) → pyspark.sql.column.Column

Partition transform function: A transform for timestamps to partition data into hours.

New in version 3.1.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str

Target date or timestamp column to work on.

Returns
Column

Data partitioned by hours.

Notes

This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.

Examples

>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()