pyspark.sql.Column.alias

Column.alias(*alias, **kwargs)

Returns this column aliased with a new name or names (in the case of expressions that return more than one column, such as explode).

New in version 1.3.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
alias : str

desired column names (collects all positional arguments passed)

Returns
Column

Column aliased with the new name or names.

Other Parameters
metadata : dict

a dict of information to be stored in the metadata attribute of the corresponding StructField (optional, keyword-only argument)

Changed in version 2.2.0: Added optional metadata argument.

Examples

>>> df = spark.createDataFrame(
...      [(2, "Alice"), (5, "Bob")], ["age", "name"])
>>> df.select(df.age.alias("age2")).collect()
[Row(age2=2), Row(age2=5)]
>>> df.select(df.age.alias("age3", metadata={'max': 99})).schema['age3'].metadata['max']
99
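
The multi-name form applies to expressions that produce more than one column. A minimal sketch, assuming a hypothetical DataFrame with a map column named m; exploding a map yields a key and a value column, and alias names both at once:

>>> from pyspark.sql.functions import explode
>>> df2 = spark.createDataFrame([({"a": 1},)], ["m"])
>>> df2.select(explode(df2.m).alias("key", "value")).collect()
[Row(key='a', value=1)]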