pyspark.sql.SparkSession.sparkContext

property SparkSession.sparkContext

Returns the underlying SparkContext.

New in version 2.0.0.

Returns
SparkContext

Examples

>>> spark.sparkContext
<SparkContext master=... appName=...>

Create an RDD from the underlying SparkContext.

>>> rdd = spark.sparkContext.parallelize([1, 2, 3])
>>> rdd.collect()
[1, 2, 3]
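The context also exposes lower-level utilities such as log-level control and broadcast variables. A minimal sketch (the "WARN" level and the broadcast payload are illustrative values only):

>>> sc = spark.sparkContext
>>> sc.setLogLevel("WARN")  # adjust log verbosity for this application
>>> broadcast = sc.broadcast([1, 2, 3])  # read-only value shipped to executors
>>> broadcast.value
[1, 2, 3]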