pyspark.sql.functions.try_divide

pyspark.sql.functions.try_divide(left: ColumnOrName, right: ColumnOrName) → pyspark.sql.column.Column

Returns left/right, i.e. the dividend divided by the divisor. It always performs floating point division. Its result is always null if the divisor is 0.
New in version 3.5.0.
Examples
>>> df = spark.createDataFrame([(6000, 15), (1990, 2)], ["a", "b"])
>>> df.select(try_divide(df.a, df.b).alias('r')).collect()
[Row(r=400.0), Row(r=995.0)]
>>> df = spark.createDataFrame([(1, 2)], ["year", "month"])
>>> df.select(
...     try_divide(make_interval(df.year), df.month).alias('r')
... ).show(truncate=False)
+--------+
|r       |
+--------+
|6 months|
+--------+
>>> df.select(
...     try_divide(make_interval(df.year, df.month), lit(2)).alias('r')
... ).show(truncate=False)
+--------+
|r       |
+--------+
|7 months|
+--------+
>>> df.select(
...     try_divide(make_interval(df.year, df.month), lit(0)).alias('r')
... ).show(truncate=False)
+----+
|r   |
+----+
|NULL|
+----+
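As a supplementary sketch (not part of the original examples), the same null-on-zero behavior applies to plain numeric columns; this assumes the interactive spark session used above and imports from pyspark.sql.functions.

>>> from pyspark.sql.functions import try_divide
>>> df = spark.createDataFrame([(10, 0), (10, 4)], ["a", "b"])
>>> df.select(try_divide(df.a, df.b).alias('r')).show()
+----+
|   r|
+----+
|NULL|
| 2.5|
+----+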