pyspark.sql.functions.hours

pyspark.sql.functions.hours(col: ColumnOrName) → pyspark.sql.column.Column

Partition transform function: A transform for timestamps to partition data into hours.
New in version 3.1.0.
Changed in version 3.4.0: Supports Spark Connect.
- Parameters
 - col : Column or str
   target date or timestamp column to work on.
- Returns
 - Column
   data partitioned by hours.
Notes
This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.
Examples
>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()
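Conceptually, this kind of hour transform maps each timestamp to a whole number of hours since the Unix epoch, so all rows falling inside the same hour land in the same partition (this matches the hour partition transform defined by Apache Iceberg, which backs DataFrameWriterV2 partitioning). A minimal pure-Python sketch of that bucketing, independent of Spark, with the function name `hour_bucket` chosen here for illustration:

```python
from datetime import datetime, timezone

def hour_bucket(ts: datetime) -> int:
    # Hour partition transform: number of whole hours elapsed
    # since the Unix epoch (1970-01-01 00:00:00 UTC).
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    return int((ts - epoch).total_seconds() // 3600)

# Two timestamps in the same UTC hour share a bucket;
# the next hour gets the next bucket.
a = hour_bucket(datetime(2021, 3, 1, 12, 5, tzinfo=timezone.utc))
b = hour_bucket(datetime(2021, 3, 1, 12, 55, tzinfo=timezone.utc))
c = hour_bucket(datetime(2021, 3, 1, 13, 0, tzinfo=timezone.utc))
print(a == b, c == a + 1)
```

This is why the transform is only meaningful inside partitionedBy(): it defines how rows are grouped into physical partitions at write time, rather than producing a column for general query use.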