pyspark.sql.functions.make_dt_interval
pyspark.sql.functions.make_dt_interval(days: Optional[ColumnOrName] = None, hours: Optional[ColumnOrName] = None, mins: Optional[ColumnOrName] = None, secs: Optional[ColumnOrName] = None) → pyspark.sql.column.Column

Make a DayTimeIntervalType duration from days, hours, mins and secs.
New in version 3.5.0.
Parameters
days : Column or str, optional
    The number of days, positive or negative.
hours : Column or str, optional
    The number of hours, positive or negative.
mins : Column or str, optional
    The number of minutes, positive or negative.
secs : Column or str, optional
    The number of seconds with the fractional part in microsecond precision.

Returns
Column
    A new column of interval.
Examples
>>> df = spark.createDataFrame([[1, 12, 30, 01.001001]],
...     ["day", "hour", "min", "sec"])
>>> df.select(make_dt_interval(
...     df.day, df.hour, df.min, df.sec).alias('r')
... ).show(truncate=False)
+------------------------------------------+
|r                                         |
+------------------------------------------+
|INTERVAL '1 12:30:01.001001' DAY TO SECOND|
+------------------------------------------+
>>> df.select(make_dt_interval(
...     df.day, df.hour, df.min).alias('r')
... ).show(truncate=False)
+-----------------------------------+
|r                                  |
+-----------------------------------+
|INTERVAL '1 12:30:00' DAY TO SECOND|
+-----------------------------------+
>>> df.select(make_dt_interval(
...     df.day, df.hour).alias('r')
... ).show(truncate=False)
+-----------------------------------+
|r                                  |
+-----------------------------------+
|INTERVAL '1 12:00:00' DAY TO SECOND|
+-----------------------------------+
>>> df.select(make_dt_interval(df.day).alias('r')).show(truncate=False)
+-----------------------------------+
|r                                  |
+-----------------------------------+
|INTERVAL '1 00:00:00' DAY TO SECOND|
+-----------------------------------+
>>> df.select(make_dt_interval().alias('r')).show(truncate=False)
+-----------------------------------+
|r                                  |
+-----------------------------------+
|INTERVAL '0 00:00:00' DAY TO SECOND|
+-----------------------------------+