Core Classes
SparkSession | The entry point to programming Spark with the Dataset and DataFrame API.
Catalog | User-facing catalog API, accessible through SparkSession.catalog.
DataFrame | A distributed collection of data grouped into named columns.
Column | A column in a DataFrame.
Observation | Class to observe (named) metrics on a DataFrame.
Row | A row in a DataFrame.
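A brief, hedged sketch of how these classes fit together (the app name, data, and column names below are illustrative, not from this page): a SparkSession builds a DataFrame of Rows, Column expressions transform it, an Observation collects named metrics, and the Catalog is reached through SparkSession.catalog.

```python
from pyspark.sql import SparkSession, Row, Observation
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("core-classes-sketch").getOrCreate()

# DataFrame built from Rows
df = spark.createDataFrame([Row(name="Alice", age=34), Row(name="Bob", age=29)])

# Observation attaches named metrics to a DataFrame
obs = Observation("age_metrics")
observed = df.observe(obs, F.avg("age").alias("avg_age"))

# Column expressions drive the transformation; show() is the action that
# also makes the observed metrics available
observed.filter(F.col("age") > 30).show()
print(obs.get)                         # e.g. {'avg_age': 31.5}

# Catalog, via SparkSession.catalog
print(spark.catalog.listDatabases())
```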
GroupedData | A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy().
PandasCogroupedOps | A logical grouping of two GroupedData, created by GroupedData.cogroup().
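As a small illustration (sample data assumed), DataFrame.groupBy() returns a GroupedData whose aggregation methods yield a new DataFrame; GroupedData.cogroup() similarly pairs two grouped DataFrames into a PandasCogroupedOps for pandas-based processing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

sales = spark.createDataFrame(
    [("book", 10.0), ("book", 12.5), ("pen", 1.5)],
    ["item", "price"],
)

grouped = sales.groupBy("item")                    # GroupedData
grouped.agg(F.sum("price").alias("total"),
            F.count("*").alias("n_sales")).show()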
DataFrameNaFunctions | Functionality for working with missing data in a DataFrame.
DataFrameStatFunctions | Functionality for statistic functions with a DataFrame.
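For instance (illustrative data), DataFrame.na exposes DataFrameNaFunctions and DataFrame.stat exposes DataFrameStatFunctions:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

people = spark.createDataFrame(
    [("Alice", 34), ("Bob", None), (None, 29)],
    ["name", "age"],
)

# DataFrameNaFunctions: fill missing values per column
filled = people.na.fill({"name": "unknown", "age": 0})
filled.show()

# DataFrameStatFunctions: e.g. the median of "age" (exact, relativeError=0.0)
print(filled.stat.approxQuantile("age", [0.5], 0.0))
```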
Window | Utility functions for defining windows in DataFrames.
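A minimal sketch of a window definition (sample data assumed): Window builds a WindowSpec that window functions are applied over.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

scores = spark.createDataFrame(
    [("a", 1), ("a", 3), ("b", 2)],
    ["grp", "score"],
)

# Rank rows within each group by descending score
w = Window.partitionBy("grp").orderBy(F.desc("score"))
scores.withColumn("rank", F.row_number().over(w)).show()
```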
DataFrameReader | Interface used to load a DataFrame from external storage systems.
DataFrameWriter | Interface used to write a DataFrame to external storage systems.
DataFrameWriterV2 | Interface used to write a DataFrame to external storage using the v2 API.
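A hedged sketch of the three interfaces (the /tmp path and table name are placeholders): SparkSession.read returns a DataFrameReader, DataFrame.write a DataFrameWriter, and DataFrame.writeTo a DataFrameWriterV2.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("Alice", 34)], ["name", "age"])

# DataFrameWriter: DataFrame.write
df.write.mode("overwrite").parquet("/tmp/people_parquet")

# DataFrameReader: SparkSession.read
spark.read.parquet("/tmp/people_parquet").show()

# DataFrameWriterV2: DataFrame.writeTo (needs a catalog table that supports
# the v2 API, hence left commented here)
# df.writeTo("my_catalog.db.people").createOrReplace()
```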
UDFRegistration | Wrapper for user-defined function registration.
UDTFRegistration | Wrapper for user-defined table function registration.
UserDefinedFunction | A user-defined function in Python.
UserDefinedTableFunction | A user-defined table function in Python.
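A small hedged sketch (the function names are invented for illustration): spark.udf is the UDFRegistration wrapper and spark.udtf the UDTFRegistration wrapper, while the udf/udtf decorators produce UserDefinedFunction and UserDefinedTableFunction objects.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf, udtf
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()

# UserDefinedFunction, registered through spark.udf (UDFRegistration)
@udf(returnType=IntegerType())
def plus_one(x):
    return None if x is None else x + 1

spark.udf.register("plus_one", plus_one)
spark.sql("SELECT plus_one(41)").show()

# UserDefinedTableFunction, registered through spark.udtf (UDTFRegistration)
@udtf(returnType="n: int, doubled: int")
class Doubled:
    def eval(self, n: int):
        yield n, n * 2

spark.udtf.register("doubled", Doubled)
spark.sql("SELECT * FROM doubled(21)").show()
```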
DataSource | A base class for data sources.
DataSourceReader | A base class for data source readers.
DataSourceStreamReader | A base class for streaming data source readers.
DataSourceWriter | A base class for data source writers.
DataSourceRegistration | Wrapper for data source registration.
InputPartition | A base class representing an input partition returned by the partitions() method of DataSourceReader.
WriterCommitMessage | A commit message returned by DataSourceWriter.write(), sent back to the driver as input to DataSourceWriter.commit() or DataSourceWriter.abort().
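A hedged sketch of how these base classes combine on the read path of the Python Data Source API (Spark 4.0+); the source name, option, and numbers below are invented for illustration, and the write-side classes (DataSourceWriter, WriterCommitMessage) follow the same pattern on the save path.

```python
from pyspark.sql import SparkSession
from pyspark.sql.datasource import DataSource, DataSourceReader, InputPartition

class RangeSource(DataSource):
    """Hypothetical batch source producing consecutive integers."""

    @classmethod
    def name(cls):
        return "demo_range"                 # format name used below

    def schema(self):
        return "value INT"

    def reader(self, schema):
        return RangeReader(self.options)

class RangeReader(DataSourceReader):
    def __init__(self, options):
        self.per_partition = int(options.get("n", "3"))

    def partitions(self):
        # Each InputPartition describes one chunk of work for read()
        return [InputPartition(i) for i in range(2)]

    def read(self, partition):
        start = partition.value * self.per_partition
        for i in range(start, start + self.per_partition):
            yield (i,)

spark = SparkSession.builder.getOrCreate()
spark.dataSource.register(RangeSource)      # DataSourceRegistration
spark.read.format("demo_range").option("n", 2).load().show()
```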
VariantVal | A class to represent a Variant value in Python.
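A brief hedged sketch, assuming Spark 4.0+ where the VARIANT data type and pyspark.sql.functions.parse_json are available: collecting a VARIANT column on the driver yields VariantVal objects.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

row = (
    spark.range(1)
    .select(F.parse_json(F.lit('{"a": 1, "b": [2, 3]}')).alias("v"))
    .head()
)
v = row["v"]                  # a VariantVal
print(v.toPython())           # {'a': 1, 'b': [2, 3]}
print(v.toJson())             # back to a JSON string
```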