Packages

  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.
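
    For example, a minimal sketch of these implicit conversions (the local master URL and sample data are assumptions for illustration):

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.rdd.RDD

      val sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("rdd-example"))
      val pairs: RDD[(Int, Int)] = sc.parallelize(Seq((1, 2), (1, 3), (2, 4)))
      // groupByKey and join come from PairRDDFunctions, made available on
      // RDD[(Int, Int)] through the implicit conversions in object RDD.
      val grouped = pairs.groupByKey()
      val joined  = pairs.join(pairs)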

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

    Definition Classes
    apache
  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.
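
    For example, a minimal sketch of running a relational query (the session settings and view name are assumptions):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().master("local[2]").appName("sql-example").getOrCreate()
      spark.range(5).createOrReplaceTempView("nums")             // register a temporary view
      spark.sql("SELECT id, id * 2 AS doubled FROM nums").show() // relational query expressed in SQL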

    Definition Classes
    spark
  • object SparkSession extends Logging with Serializable
    Definition Classes
    sql
    Annotations
    @Stable()
  • Builder

class Builder extends Logging

Builder for SparkSession.

Annotations
@Stable()
Source
SparkSession.scala
Linear Supertypes
Logging, AnyRef, Any

Instance Constructors

  1. new Builder()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def appName(name: String): Builder

    Sets a name for the application, which will be shown in the Spark web UI. If no application name is set, a randomly generated name will be used.

    Since

    2.0.0
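
    A minimal usage sketch (the master URL is an assumption for local testing):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[1]")
        .appName("my-app") // name shown in the Spark web UI
        .getOrCreate()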

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @IntrinsicCandidate()
  7. def config(conf: SparkConf): Builder

    Sets a list of config options based on the given SparkConf.

    Since

    2.0.0
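
    A minimal sketch, assuming local settings for illustration:

      import org.apache.spark.SparkConf
      import org.apache.spark.sql.SparkSession

      val conf = new SparkConf().setMaster("local[2]").setAppName("conf-example")
      val spark = SparkSession.builder().config(conf).getOrCreate()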

  8. def config(map: java.util.Map[String, Any]): Builder

    Sets a config option using a java.util.Map (the Java-friendly counterpart of the Scala Map overload below). Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    3.4.0

  9. def config(map: Map[String, Any]): Builder

    Sets a config option using a Scala Map. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    3.4.0
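
    A minimal sketch of the Scala Map overload (the option values are assumptions):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[2]")
        .config(Map("spark.sql.shuffle.partitions" -> 4, "spark.ui.enabled" -> false))
        .getOrCreate()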

  10. def config(key: String, value: Boolean): Builder

    Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    2.0.0

  11. def config(key: String, value: Double): Builder

    Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    2.0.0

  12. def config(key: String, value: Long): Builder

    Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    2.0.0

  13. def config(key: String, value: String): Builder

    Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.

    Since

    2.0.0
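
    A minimal sketch covering the typed overloads above (the chosen options are assumptions):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[2]")
        .config("spark.sql.shuffle.partitions", 4L)  // Long overload
        .config("spark.ui.enabled", false)           // Boolean overload
        .config("spark.app.name", "config-example")  // String overload
        .getOrCreate()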

  14. def enableHiveSupport(): Builder

    Enables Hive support, including connectivity to a persistent Hive metastore, support for Hive serdes, and Hive user-defined functions.

    Since

    2.0.0
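
    A minimal sketch, assuming a Spark build with Hive classes on the classpath and a hypothetical warehouse path:

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[2]")
        .config("spark.sql.warehouse.dir", "/tmp/spark-warehouse") // assumed path
        .enableHiveSupport()
        .getOrCreate()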

  15. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  18. def getOrCreate(): SparkSession

    Gets an existing SparkSession or, if there is no existing one, creates a new one based on the options set in this builder.

    This method first checks whether there is a valid thread-local SparkSession and, if so, returns that one. It then checks whether there is a valid global default SparkSession and, if so, returns that one. If no valid global default SparkSession exists, the method creates a new SparkSession and assigns it as the global default.

    In case an existing SparkSession is returned, the non-static config options specified in this builder will be applied to the existing SparkSession.

    Since

    2.0.0
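
    A minimal sketch (the master URL, application name, and option values are assumptions):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[4]")
        .appName("builder-example")
        .config("spark.sql.shuffle.partitions", 4L)
        .getOrCreate()

      // A second getOrCreate() returns the same session; non-static options
      // set on the new builder are applied to the existing session.
      val same = SparkSession.builder()
        .config("spark.sql.shuffle.partitions", 8L)
        .getOrCreate()
      assert(same eq spark)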

  19. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  20. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  21. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  22. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  23. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  24. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  25. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  26. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  27. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  32. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def master(master: String): Builder

    Sets the Spark master URL to connect to, such as "local" to run locally, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.

    Since

    2.0.0
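
    A minimal sketch (the URLs follow the formats described above):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[4]") // run locally with 4 cores
        .appName("master-example")
        .getOrCreate()
      // For a standalone cluster, use e.g. .master("spark://master:7077") instead.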

  37. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  38. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  39. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  40. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  41. def toString(): String
    Definition Classes
    AnyRef → Any
  42. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  44. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  45. def withExtensions(f: (SparkSessionExtensions) ⇒ Unit): Builder

    Injects extensions into the SparkSession. This allows a user to add Analyzer rules, Optimizer rules, Planning Strategies or a customized parser.

    Since

    2.2.0
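
    A hypothetical sketch: MyOptimizerRule is an assumed Rule[LogicalPlan] defined elsewhere; injectOptimizerRule is one of the hooks offered by SparkSessionExtensions:

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .master("local[2]")
        .withExtensions { extensions =>
          // Inject a custom optimizer rule (MyOptimizerRule is hypothetical).
          extensions.injectOptimizerRule(session => new MyOptimizerRule(session))
        }
        .getOrCreate()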

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated

Inherited from Logging

Inherited from AnyRef

Inherited from Any
