pyspark.sql.functions.typeof

pyspark.sql.functions.typeof(col)

Return a DDL-formatted type string for the data type of the input.

New in version 3.5.0.

Parameters
col : Column or column name

Examples

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(True, 1, 1.0, 'xyz',)], ['a', 'b', 'c', 'd'])
>>> df.select(sf.typeof(df.a), sf.typeof(df.b), sf.typeof('c'), sf.typeof('d')).show()
+---------+---------+---------+---------+
|typeof(a)|typeof(b)|typeof(c)|typeof(d)|
+---------+---------+---------+---------+
|  boolean|   bigint|   double|   string|
+---------+---------+---------+---------+
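
The DDL formatting is easiest to see with nested types, where the element and key/value types are spelled out in full. The following is an illustrative sketch rather than part of the original docstring; it assumes the same active SparkSession bound to spark as above, and the output shown is indicative of the DDL-formatted strings produced for inferred array and map columns.

>>> from pyspark.sql import functions as sf
>>> df2 = spark.createDataFrame([([1, 2], {'k': 1.0})], ['arr', 'map'])
>>> df2.select(sf.typeof('arr'), sf.typeof('map')).show(truncate=False)
+-------------+------------------+
|typeof(arr)  |typeof(map)       |
+-------------+------------------+
|array<bigint>|map<string,double>|
+-------------+------------------+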