pyspark.sql.functions.hours

pyspark.sql.functions.hours(col)

Partition transform function: a transform for timestamps that partitions data into hours.

New in version 3.1.0.

Notes

This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.

Examples

>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()
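
A more complete sketch is shown below. It assumes a running SparkSession bound to spark and a catalog that supports DataFrameWriterV2 partition transforms (for example, Apache Iceberg); the table name "catalog.db.table" is illustrative only.

>>> from datetime import datetime
>>> from pyspark.sql.functions import hours
>>> # Build a small DataFrame with a timestamp column to partition on.
>>> df = spark.createDataFrame(
...     [(1, datetime(2023, 1, 1, 0, 30)), (2, datetime(2023, 1, 1, 5, 45))],
...     ["id", "ts"],
... )
>>> # Write via DataFrameWriterV2, partitioning rows by the hour of "ts".
>>> # This step requires a configured V2 catalog and will fail without one.
>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()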