pyspark.sql.functions.make_dt_interval

pyspark.sql.functions.make_dt_interval(days=None, hours=None, mins=None, secs=None)
Make a DayTimeIntervalType duration from days, hours, mins and secs.
New in version 3.5.0.
- Parameters
- days
  Column or column name, optional. The number of days, positive or negative.
- hours
  Column or column name, optional. The number of hours, positive or negative.
- mins
  Column or column name, optional. The number of minutes, positive or negative.
- secs
  Column or column name, optional. The number of seconds with the fractional part in microsecond precision.
- Returns
Column
A new column that contains a DayTimeIntervalType duration.
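The day, hour, minute and second components may be mixed in sign and are summed into a single duration. Spark cannot run here, but PySpark returns DayTimeIntervalType values to the driver as Python datetime.timedelta objects, which combine components the same way; a minimal driver-side sketch of the sign behavior:

```python
from datetime import timedelta

# Components may be negative and are simply summed into one duration:
# 1 day + (-12 hours) = 12 hours.
d = timedelta(days=1, hours=-12)
print(d)  # 12:00:00
```

This mirrors, for example, make_dt_interval(lit(1), lit(-12)) producing INTERVAL '0 12:00:00' DAY TO SECOND.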
Examples
Example 1: Make DayTimeIntervalType duration from days, hours, mins and secs.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[1, 12, 30, 01.001001]], ['day', 'hour', 'min', 'sec'])
>>> df.select('*', sf.make_dt_interval(df.day, df.hour, df.min, df.sec)).show(truncate=False)
+---+----+---+--------+------------------------------------------+
|day|hour|min|sec     |make_dt_interval(day, hour, min, sec)     |
+---+----+---+--------+------------------------------------------+
|1  |12  |30 |1.001001|INTERVAL '1 12:30:01.001001' DAY TO SECOND|
+---+----+---+--------+------------------------------------------+
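As a driver-side cross-check on the output above (PySpark maps DayTimeIntervalType to Python's datetime.timedelta when rows are collected), the same duration can be rebuilt with the stdlib alone:

```python
from datetime import timedelta

# The interval from Example 1: 1 day, 12 hours, 30 minutes, 1.001001 seconds.
# timedelta carries the fractional seconds at the same microsecond precision
# as DayTimeIntervalType.
d = timedelta(days=1, hours=12, minutes=30, seconds=1.001001)
print(d)                  # 1 day, 12:30:01.001001
print(d.total_seconds())  # 131401.001001
```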
Example 2: Make DayTimeIntervalType duration from days, hours and mins.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[1, 12, 30, 01.001001]], ['day', 'hour', 'min', 'sec'])
>>> df.select('*', sf.make_dt_interval(df.day, 'hour', df.min)).show(truncate=False)
+---+----+---+--------+-----------------------------------+
|day|hour|min|sec     |make_dt_interval(day, hour, min, 0)|
+---+----+---+--------+-----------------------------------+
|1  |12  |30 |1.001001|INTERVAL '1 12:30:00' DAY TO SECOND|
+---+----+---+--------+-----------------------------------+
Example 3: Make DayTimeIntervalType duration from days and hours.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[1, 12, 30, 01.001001]], ['day', 'hour', 'min', 'sec'])
>>> df.select('*', sf.make_dt_interval(df.day, df.hour)).show(truncate=False)
+---+----+---+--------+-----------------------------------+
|day|hour|min|sec     |make_dt_interval(day, hour, 0, 0)  |
+---+----+---+--------+-----------------------------------+
|1  |12  |30 |1.001001|INTERVAL '1 12:00:00' DAY TO SECOND|
+---+----+---+--------+-----------------------------------+
Example 4: Make DayTimeIntervalType duration from days.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[1, 12, 30, 01.001001]], ['day', 'hour', 'min', 'sec'])
>>> df.select('*', sf.make_dt_interval('day')).show(truncate=False)
+---+----+---+--------+-----------------------------------+
|day|hour|min|sec     |make_dt_interval(day, 0, 0, 0)     |
+---+----+---+--------+-----------------------------------+
|1  |12  |30 |1.001001|INTERVAL '1 00:00:00' DAY TO SECOND|
+---+----+---+--------+-----------------------------------+
Example 5: Make an empty interval.
>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.make_dt_interval()).show(truncate=False)
+-----------------------------------+
|make_dt_interval(0, 0, 0, 0)       |
+-----------------------------------+
|INTERVAL '0 00:00:00' DAY TO SECOND|
+-----------------------------------+