pyspark.SparkContext.defaultParallelism

property SparkContext.defaultParallelism

Default level of parallelism to use when not given by the user (e.g. for reduce tasks).

New in version 0.7.0.

Examples

>>> sc.defaultParallelism > 0
True
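
A minimal sketch of how this value is commonly consumed, assuming an active SparkContext bound to sc as in the doctest above: when parallelize is called without an explicit numSlices argument, the resulting RDD's partition count falls back to defaultParallelism (which is typically derived from the cluster configuration, e.g. spark.default.parallelism, or the number of local cores).

>>> rdd = sc.parallelize(range(100))  # no numSlices given, so defaultParallelism is used
>>> rdd.getNumPartitions() == sc.defaultParallelism
True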