class Builder extends Logging
- Inheritance: Builder → Logging → AnyRef → Any
Instance Constructors
- new Builder()
Value Members
- def appName(name: String): Builder
  Sets a name for the application, which will be shown in the Spark web UI. If no application name is set, a randomly generated name will be used.
  - Since: 2.0.0
- def config(conf: SparkConf): Builder
  Sets a list of config options based on the given SparkConf.
  - Since: 2.0.0
- def config(map: Map[String, Any]): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 3.4.0
- def config(map: java.util.Map[String, Any]): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 3.4.0
- def config(key: String, value: Boolean): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 2.0.0
- def config(key: String, value: Double): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 2.0.0
- def config(key: String, value: Long): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 2.0.0
- def config(key: String, value: String): Builder
  Sets a config option. Options set using this method are automatically propagated to both SparkConf and SparkSession's own configuration.
  - Since: 2.0.0
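The typed overloads above can be mixed freely with the Map overload. A minimal sketch, assuming a local Spark 3.4+ installation; the option values chosen here are arbitrary and only for illustration:

```scala
import org.apache.spark.sql.SparkSession

// A sketch of combining the typed config overloads with the Map overload.
// "local[1]" and the option values are arbitrary choices for this example.
val spark = SparkSession.builder()
  .master("local[1]")
  .config("spark.sql.shuffle.partitions", 4L)     // Long overload
  .config("spark.sql.adaptive.enabled", true)     // Boolean overload
  .config(Map("spark.app.name" -> "config-demo")) // Map overload, Spark 3.4+
  .getOrCreate()
```

Note that regardless of which overload sets an option, `spark.conf.get` returns its value as a `String`.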
- def enableHiveSupport(): Builder
  Enables Hive support, including connectivity to a persistent Hive metastore, support for Hive serdes, and Hive user-defined functions.
  - Since: 2.0.0
- def getOrCreate(): SparkSession
  Gets an existing SparkSession or, if there is no existing one, creates a new one based on the options set in this builder.
  This method first checks whether there is a valid thread-local SparkSession and, if so, returns that one. It then checks whether there is a valid global default SparkSession and, if so, returns that one. If no valid global default SparkSession exists, the method creates a new SparkSession and assigns it as the global default.
  In case an existing SparkSession is returned, the non-static config options specified in this builder will be applied to the existing SparkSession.
  - Since: 2.0.0
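The lookup order above means that repeated `getOrCreate()` calls in the same JVM return the same session. A minimal sketch, assuming a local Spark installation; the app name and master URL are arbitrary choices:

```scala
import org.apache.spark.sql.SparkSession

// A sketch of the typical builder chain ending in getOrCreate().
val spark = SparkSession.builder()
  .appName("builder-demo")           // arbitrary name for this example
  .master("local[2]")
  .config("spark.ui.enabled", false) // keep the example lightweight
  .getOrCreate()

// A second getOrCreate() finds the existing global default session and
// returns it rather than creating a new one.
val same = SparkSession.builder().getOrCreate()
```

Because the second call returns the existing session, any non-static options set on its builder are applied to that session rather than producing a fresh one.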
- def master(master: String): Builder
  Sets the Spark master URL to connect to, such as "local" to run locally, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
  - Since: 2.0.0
- def withExtensions(f: (SparkSessionExtensions) ⇒ Unit): Builder
  Inject extensions into the SparkSession. This allows a user to add Analyzer rules, Optimizer rules, Planning Strategies or a customized parser.
  - Since: 2.2.0
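A minimal sketch of injecting an optimizer rule through `withExtensions`, assuming a local Spark installation. `NoOpRule` is a hypothetical rule name introduced only for illustration; it leaves every plan unchanged:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical optimizer rule for illustration: returns each plan as-is.
object NoOpRule extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

// Register the rule via SparkSessionExtensions before the session is built.
val spark = SparkSession.builder()
  .master("local[1]")
  .withExtensions(ext => ext.injectOptimizerRule(_ => NoOpRule))
  .getOrCreate()
```

The function passed to `withExtensions` runs against the session's `SparkSessionExtensions` instance, so the same callback can also register analyzer rules, planner strategies, or a custom parser.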