pyspark.sql.DataFrame.unpersist

DataFrame.unpersist(blocking=False)

Marks the DataFrame as non-persistent, and removes all blocks for it from memory and disk.

New in version 1.3.0.

Notes

The default value of blocking was changed to False in version 2.0 to match the Scala API.