org.apache.spark

SparkConf

class SparkConf extends Cloneable with Logging

Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.

Most of the time, you would create a SparkConf object with new SparkConf(), which will load values from any spark.* Java system properties set in your application as well. In this case, parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call new SparkConf(false) to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write new SparkConf().setMaster("local").setAppName("My app").

Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.
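
As a sketch of typical usage (the master URL, application name, and memory setting below are placeholders):

    import org.apache.spark.SparkConf

    // Loads any spark.* Java system properties; values set here take priority over them.
    val conf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("My app")
      .set("spark.executor.memory", "1g")

    // For unit tests: skip loading external settings entirely.
    val testConf = new SparkConf(false).set("spark.app.name", "test")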

Linear Supertypes
Logging, Cloneable, Cloneable, AnyRef, Any

Instance Constructors

  1. new SparkConf()

    Create a SparkConf that loads defaults from system properties and the classpath

  2. new SparkConf(loadDefaults: Boolean)

    loadDefaults: whether to also load values from Java system properties

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): SparkConf

    Copy this object

    Definition Classes
    SparkConf → AnyRef
  8. def contains(key: String): Boolean

    Does the configuration contain a given parameter?

  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. def get(key: String, defaultValue: String): String

    Get a parameter, falling back to a default if not set

  13. def get(key: String): String

    Get a parameter; throws a NoSuchElementException if it's not set
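
    A minimal sketch of both get overloads (spark.nonexistent is a hypothetical key that is never set):

      val conf = new SparkConf(false).setMaster("local").setAppName("demo")

      conf.get("spark.master")                    // "local"
      conf.get("spark.executor.memory", "512m")   // "512m", falling back to the default
      conf.get("spark.nonexistent")               // throws NoSuchElementException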

  14. def getAkkaConf: Seq[(String, String)]

    Get all Akka conf variables set on this SparkConf

  15. def getAll: Array[(String, String)]

    Get all parameters as a list of pairs

  16. def getAppId: String

    Returns the Spark application id, valid in the Driver after TaskScheduler registration and from the start in the Executor.

  17. def getBoolean(key: String, defaultValue: Boolean): Boolean

    Get a parameter as a boolean, falling back to a default if not set

  18. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  19. def getDouble(key: String, defaultValue: Double): Double

    Get a parameter as a double, falling back to a default if not set

  20. def getExecutorEnv: Seq[(String, String)]

    Get all executor environment variables set on this SparkConf

  21. def getInt(key: String, defaultValue: Int): Int

    Get a parameter as an integer, falling back to a default if not set

  22. def getLong(key: String, defaultValue: Long): Long

    Get a parameter as a long, falling back to a default if not set
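
    A sketch covering the typed getters above (getBoolean, getDouble, getInt, getLong); spark.missing.key is a hypothetical key that is never set:

      val conf = new SparkConf(false)
        .set("spark.task.maxFailures", "8")
        .set("spark.speculation", "true")

      conf.getInt("spark.task.maxFailures", 4)      // 8
      conf.getBoolean("spark.speculation", false)   // true
      conf.getLong("spark.missing.key", 100L)       // 100, the default
      conf.getDouble("spark.missing.key", 0.5)      // 0.5, the default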

  23. def getOption(key: String): Option[String]

    Get a parameter as an Option
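
    Unlike get(key), getOption never throws; a minimal sketch:

      val conf = new SparkConf(false).set("spark.app.name", "demo")

      conf.getOption("spark.app.name")                   // Some("demo")
      conf.getOption("spark.not.set")                    // None
      conf.getOption("spark.not.set").getOrElse("n/a")   // "n/a"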

  24. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  25. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  26. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  27. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  28. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  29. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  30. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  32. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  33. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  34. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  35. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  37. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  38. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  39. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  40. final def notify(): Unit

    Definition Classes
    AnyRef
  41. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  42. def registerKryoClasses(classes: Array[Class[_]]): SparkConf

    Use Kryo serialization and register the given set of classes with Kryo. If called multiple times, this will append the classes from all calls together.
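
    A sketch, assuming MyClass1 and MyClass2 are (hypothetical) classes defined in the application:

      val conf = new SparkConf()
        .setMaster("local")
        .setAppName("Kryo example")
        .registerKryoClasses(Array(classOf[MyClass1], classOf[MyClass2]))
      // Switches the serializer to Kryo and records both classes for registration;
      // a later call with more classes appends to, rather than replaces, this list.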

  43. def remove(key: String): SparkConf

    Remove a parameter from the configuration

  44. def set(key: String, value: String): SparkConf

    Set a configuration variable.

  45. def setAll(settings: Traversable[(String, String)]): SparkConf

    Set multiple parameters together
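
    For example (the property values are placeholders):

      val conf = new SparkConf()
        .set("spark.executor.memory", "2g")
        .setAll(Seq(
          "spark.ui.port"             -> "4041",
          "spark.default.parallelism" -> "8"))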

  46. def setAppName(name: String): SparkConf

    Set a name for your application. Shown in the Spark web UI.

  47. def setExecutorEnv(variables: Array[(String, String)]): SparkConf

    Set multiple environment variables to be used when launching executors. (Java-friendly version.)

  48. def setExecutorEnv(variables: Seq[(String, String)]): SparkConf

    Set multiple environment variables to be used when launching executors. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.

  49. def setExecutorEnv(variable: String, value: String): SparkConf

    Set an environment variable to be used when launching executors for this application. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.
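
    A sketch (the variable names and values are placeholders):

      val conf = new SparkConf()
        .setExecutorEnv("JAVA_OPTS", "-Xss4m")
        .setExecutorEnv(Seq("LD_LIBRARY_PATH" -> "/opt/native", "MY_FLAG" -> "1"))

      conf.get("spark.executorEnv.JAVA_OPTS")   // "-Xss4m"
      conf.getExecutorEnv                       // all executor environment variables as pairs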

  50. def setIfMissing(key: String, value: String): SparkConf

    Set a parameter if it isn't already configured
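
    setIfMissing only writes a value when the key is absent, which pairs naturally with contains; a minimal sketch:

      val conf = new SparkConf(false).set("spark.app.name", "demo")

      conf.setIfMissing("spark.app.name", "fallback")   // no effect: already set
      conf.setIfMissing("spark.master", "local[2]")     // key absent, so this sets it

      conf.contains("spark.master")                     // true
      conf.get("spark.app.name")                        // still "demo"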

  51. def setJars(jars: Array[String]): SparkConf

    Set JAR files to distribute to the cluster. (Java-friendly version.)

  52. def setJars(jars: Seq[String]): SparkConf

    Set JAR files to distribute to the cluster.
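
    A sketch, with placeholder paths standing in for the application's packaged JARs:

      val conf = new SparkConf()
        .setMaster("spark://master:7077")
        .setAppName("With dependencies")
        .setJars(Seq("target/my-app_2.10-1.0.jar", "lib/extra-dependency.jar"))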

  53. def setMaster(master: String): SparkConf

    Set the master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
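
    For example (the standalone master host and port are placeholders):

      new SparkConf().setMaster("local")                 // run locally with one thread
      new SparkConf().setMaster("local[4]")              // run locally with 4 cores
      new SparkConf().setMaster("spark://master:7077")   // connect to a standalone cluster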

  54. def setSparkHome(home: String): SparkConf

    Set the location where Spark is installed on worker nodes.

  55. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  56. def toDebugString: String

    Return a string listing all keys and values, one per line. This is useful to print the configuration out for debugging.
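
    For example, printing the resolved configuration at startup (the exact output depends on what is set):

      val conf = new SparkConf(false)
        .setMaster("local[2]")
        .setAppName("debug demo")

      println(conf.toDebugString)
      // spark.app.name=debug demo
      // spark.master=local[2]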

  57. def toString(): String

    Definition Classes
    AnyRef → Any
  58. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  59. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  60. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
