pyspark.sql.functions.round

pyspark.sql.functions.round(col: ColumnOrName, scale: int = 0) → pyspark.sql.column.Column

Round the given value to scale decimal places using HALF_UP rounding mode when scale >= 0, or round the integral part of the value when scale < 0 (e.g. scale=-1 rounds to the nearest ten).

New in version 1.5.0.

Examples

>>> spark.createDataFrame([(2.5,)], ['a']).select(round('a', 0).alias('r')).collect()
[Row(r=3.0)]
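The example above highlights that Spark uses HALF_UP rounding, which differs from Python's builtin round, since Python 3 rounds halves to the nearest even number (banker's rounding). A quick illustration in plain Python, using the standard decimal module to reproduce HALF_UP behavior:

```python
from decimal import Decimal, ROUND_HALF_UP

# Python's builtin round uses banker's rounding: 2.5 rounds down to 2.
assert round(2.5) == 2

# Spark's round uses HALF_UP, matching Decimal with ROUND_HALF_UP:
# 2.5 rounds up to 3, as in the doctest above.
assert Decimal("2.5").quantize(Decimal("1"), rounding=ROUND_HALF_UP) == 3
```

This difference matters when porting computations between driver-side Python code and Spark SQL expressions: the same literal 2.5 rounds to 2 in one and 3 in the other.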