SparkContext.defaultParallelism
The default level of parallelism to use when not given by the user (e.g. for reduce tasks).
New in version 0.7.0.
Examples
>>> sc.defaultParallelism > 0
True