pyspark.sql.DataFrameReader.table

DataFrameReader.table(tableName)

Returns the specified table as a DataFrame.

New in version 1.4.0.

Parameters
tableName : str

    name of the table.

Examples

>>> df = spark.read.parquet('python/test_support/sql/parquet_partitioned')
>>> df.createOrReplaceTempView('tmpTable')
>>> spark.read.table('tmpTable').dtypes
[('name', 'string'), ('year', 'int'), ('month', 'int'), ('day', 'int')]
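
The temporary view above lives only for the current session. As a minimal sketch (the table name 'people' is hypothetical and assumes a writable warehouse location; the output assumes the same example data as above), a table persisted to the catalog with DataFrameWriter.saveAsTable can be read back by name in the same way, including with a database-qualified name:

>>> df.write.saveAsTable('people')  # hypothetical table name; persists df to the catalog
>>> spark.read.table('default.people').dtypes  # 'default' is the built-in database
[('name', 'string'), ('year', 'int'), ('month', 'int'), ('day', 'int')]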