pyspark.sql.functions.nvl2

pyspark.sql.functions.nvl2(col1, col2, col3)

Returns col2 if col1 is not null, or col3 otherwise.

New in version 3.5.0.

Parameters
col1 : Column or str
col2 : Column or str
col3 : Column or str

Examples

>>> from pyspark.sql.functions import nvl2
>>> df = spark.createDataFrame([(None, 8, 6,), (1, 9, 9,)], ["a", "b", "c"])
>>> df.select(nvl2(df.a, df.b, df.c).alias('r')).collect()
[Row(r=6), Row(r=9)]
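
As a minimal sketch of the same semantics, the result above can also be expressed with a when/otherwise expression on the null check of the first column; the column names below simply reuse the example DataFrame (this equivalent form is an illustration, not part of the nvl2 API):

>>> from pyspark.sql.functions import when
>>> df.select(when(df.a.isNotNull(), df.b).otherwise(df.c).alias('r')).collect()
[Row(r=6), Row(r=9)]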