Lightning-fast unified analytics engine

Download Apache Spark™

  1. Choose a Spark release:

  2. Choose a package type:

  3. Download Spark:

  4. Verify this release using the release signatures and checksums and the project release KEYS.

Note: Starting with version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users should download the Spark source package and build with Scala 2.10 support.

Spark artifacts are hosted in Maven Central. You can add a Maven dependency with the following coordinates:

groupId: org.apache.spark
artifactId: spark-core_2.11
version: 2.3.1
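Using those coordinates, the dependency can be declared in a project's pom.xml with a standard Maven dependency stanza:

```xml
<!-- Spark core, built against Scala 2.11, as listed above -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.1</version>
</dependency>
```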

Installing with PyPI

PySpark is now available on PyPI. To install it, run pip install pyspark.

Spark Source Code Management

If you are interested in working with the newest under-development code or contributing to Apache Spark development, you can also check out the master branch from Git:

# Master development branch
git clone git://

# Maintenance branch with stability fixes on top of Spark 2.3.1
git clone git:// -b branch-2.3

Once you’ve downloaded Spark, you can find instructions for installing and building it on the documentation page.

Release Notes for Stable Releases

    Archived Releases

    As new Spark releases come out for each development stream, previous ones will be archived, but they are still available at Spark release archives.

    Nightly Packages and Artifacts

    For developers, Spark maintains nightly builds and SNAPSHOT artifacts. More information is available on the Developer Tools page.
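    To resolve SNAPSHOT artifacts from Maven, the Apache snapshots repository must be enabled in the build. A sketch of the pom.xml stanza, assuming the standard Apache snapshots repository URL:

```xml
<!-- Enable the Apache snapshots repository for nightly SNAPSHOT artifacts -->
<repositories>
  <repository>
    <id>apache-snapshots</id>
    <url>https://repository.apache.org/snapshots</url>
    <snapshots>
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>
```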