1. Choose a Spark release.
2. Choose a package type.
3. Download Spark.
4. Verify this release using the release signatures and checksums and the project release KEYS.
Note: Starting with version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support.
Spark artifacts are hosted in Maven Central. You can add a Maven dependency with the following coordinates:
groupId: org.apache.spark
artifactId: spark-core_2.11
version: 2.3.2
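For example, in an sbt-based Scala project these coordinates could be declared as in the following minimal sketch (the Scala patch version 2.11.12 is an assumption; a Maven pom.xml <dependency> block with the same groupId, artifactId, and version works equally well):

    // build.sbt (sketch)
    // Scala 2.11 to match the spark-core_2.11 artifact; 2.11.12 is an assumed patch release
    scalaVersion := "2.11.12"

    // Spark core from Maven Central, using the coordinates listed above
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.3.2"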
PySpark is now available on PyPI. To install it, just run pip install pyspark.
As new Spark releases come out for each development stream, previous ones will be archived, but they are still available in the Spark release archives.