pyspark.sql.functions.bround

pyspark.sql.functions.bround(col: ColumnOrName, scale: int = 0) → pyspark.sql.column.Column

Round the given value to `scale` decimal places using the HALF_EVEN rounding mode if `scale` >= 0, or round at the integral part when `scale` < 0.

New in version 2.0.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str

input column to round.

scale : int, optional, default 0

number of decimal places to round to; a negative value rounds at positions to the left of the decimal point.

Returns
Column

rounded values.

Examples

>>> from pyspark.sql.functions import bround
>>> spark.createDataFrame([(2.5,)], ['a']).select(bround('a', 0).alias('r')).collect()
[Row(r=2.0)]
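HALF_EVEN ("banker's rounding") resolves ties to the nearest even digit, which is why 2.5 rounds down to 2.0 above, unlike the HALF_UP mode used by `pyspark.sql.functions.round`. As a sketch of the same rounding mode without a Spark session, Python's standard `decimal` module can reproduce the behavior locally (the helper `bround_like` below is an illustrative assumption, not part of PySpark):

```python
from decimal import Decimal, ROUND_HALF_EVEN

def bround_like(value: float, scale: int = 0) -> float:
    """Illustrative local analogue of bround: HALF_EVEN rounding at `scale`."""
    # scale=0 -> quantize to 1, scale=1 -> 0.1, scale=-1 -> 10, etc.
    exp = Decimal(1).scaleb(-scale)
    return float(Decimal(str(value)).quantize(exp, rounding=ROUND_HALF_EVEN))

print(bround_like(2.5))        # 2.0 (tie: even neighbor is 2)
print(bround_like(3.5))        # 4.0 (tie: even neighbor is 4)
print(bround_like(25.0, -1))   # 20.0 (negative scale rounds the tens place)
```

The tie-to-even rule avoids the systematic upward bias of always rounding .5 up, which is why it is the default for aggregating financial data.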