pyspark.sql.functions.make_timestamp_ntz(years: ColumnOrName, months: ColumnOrName, days: ColumnOrName, hours: ColumnOrName, mins: ColumnOrName, secs: ColumnOrName) → pyspark.sql.column.Column

Create a local date-time from years, months, days, hours, mins, and secs fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs; otherwise, it throws an error.

New in version 3.5.0.
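
A quick sketch of the non-ANSI behavior, assuming an active SparkSession bound to spark (as in the examples below) and spark.sql.ansi.enabled left at its default of false; month 13 is out of range, so the result is NULL instead of an error (output layout is illustrative):

>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.ansi.enabled", "false")
>>> df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.0]],  # month 13 is invalid
...     ["year", "month", "day", "hour", "min", "sec"])
>>> df.select(sf.make_timestamp_ntz(
...     df.year, df.month, df.day, df.hour, df.min, df.sec)
... ).show()
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|                                                NULL|
+----------------------------------------------------+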

Parameters
----------
years : Column or str
    the year to represent, from 1 to 9999
months : Column or str
    the month-of-year to represent, from 1 (January) to 12 (December)
days : Column or str
    the day-of-month to represent, from 1 to 31
hours : Column or str
    the hour-of-day to represent, from 0 to 23
mins : Column or str
    the minute-of-hour to represent, from 0 to 59
secs : Column or str
    the second-of-minute and its micro-fraction to represent, from 0 to 60.
    The value can be either an integer like 13, or a fraction like 13.123.
    If the secs argument equals 60, the seconds field is set to 0 and
    1 minute is added to the final timestamp, as the sketch after this
    list shows.
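
A minimal sketch of the secs=60 rollover, again assuming a SparkSession bound to spark; the result rolls forward from 06:30:60 to 06:31:00 (output layout is illustrative):

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 60.0]],  # secs == 60
...     ["year", "month", "day", "hour", "min", "sec"])
>>> df.select(sf.make_timestamp_ntz(
...     df.year, df.month, df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:31:00                                 |
+----------------------------------------------------+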


Examples
--------
>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887]],
...     ["year", "month", "day", "hour", "min", "sec"])
>>> df.select(sf.make_timestamp_ntz(
...     df.year, df.month, df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
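
Because the result is a TIMESTAMP_NTZ value (no time zone), the constructed local date-time does not shift with the session time zone. A hedged sketch, reusing df from the example above:

>>> spark.conf.set("spark.sql.session.timeZone", "UTC")
>>> df.select(sf.make_timestamp_ntz(
...     df.year, df.month, df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)  # same local date-time under a different zone
+----------------------------------------------------+
|make_timestamp_ntz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")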