sparkR.init {SparkR}R Documentation

(Deprecated) Initialize a new SparkContext

Description

This function initializes a new SparkContext. It is deprecated; use sparkR.session instead.

Usage

sparkR.init(master = "", appName = "SparkR",
  sparkHome = Sys.getenv("SPARK_HOME"), sparkEnvir = list(),
  sparkExecutorEnv = list(), sparkJars = "", sparkPackages = "")

Arguments

master

The Spark master URL, e.g. "local[2]" or "yarn-client"

appName

Application name to register with cluster manager

sparkHome

The Spark home directory

sparkEnvir

Named list of environment variables to set on worker nodes

sparkExecutorEnv

Named list of environment variables to be used when launching executors

sparkJars

Character vector of jar files to pass to the worker nodes

sparkPackages

Character vector of Maven coordinates of packages to include, e.g. "com.databricks:spark-avro_2.11:2.0.1"
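
The list and vector arguments above have specific expected shapes: the two environment arguments are named lists, while jars and packages are character vectors. A minimal sketch of plausible values (all values here are illustrative, not defaults):

```r
# Named list: configuration/environment set on worker nodes
sparkEnvir <- list(spark.executor.memory = "1g")

# Named list: environment variables for launching executors
# (path below is a placeholder, not a real default)
sparkExecutorEnv <- list(LD_LIBRARY_PATH = "/opt/jvm/lib")

# Character vector: jar files shipped to worker nodes
sparkJars <- c("one.jar", "two.jar", "three.jar")

# Character vector: Maven coordinates in groupId:artifactId:version form
sparkPackages <- c("com.databricks:spark-avro_2.11:2.0.1")
```

Unnamed lists for sparkEnvir or sparkExecutorEnv would not carry the variable names, so each entry must be named.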

Note

sparkR.init since 1.4.0

See Also

sparkR.session

Examples

## Not run: 
##D sc <- sparkR.init("local[2]", "SparkR", "/home/spark")
##D sc <- sparkR.init("local[2]", "SparkR", "/home/spark",
##D                  list(spark.executor.memory="1g"))
##D sc <- sparkR.init("yarn-client", "SparkR", "/home/spark",
##D                  list(spark.executor.memory="4g"),
##D                  list(LD_LIBRARY_PATH="/directory of JVM libraries (libjvm.so) on workers/"),
##D                  c("one.jar", "two.jar", "three.jar"),
##D                  c("com.databricks:spark-avro_2.11:2.0.1"))
## End(Not run)
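
Because sparkR.init is deprecated, the same setup is normally written with sparkR.session, which takes configuration as a named sparkConfig list. A rough, not-run equivalent of the first examples above (paths and memory settings are illustrative, and a local Spark installation is assumed):

```r
## Not run:
library(SparkR)

# Session-based replacement for sparkR.init; sparkConfig plays the role
# of sparkEnvir, and the jar/package arguments keep the same shapes.
sparkR.session(master = "local[2]", appName = "SparkR",
               sparkHome = "/home/spark",
               sparkConfig = list(spark.executor.memory = "1g"),
               sparkJars = c("one.jar", "two.jar", "three.jar"),
               sparkPackages = "com.databricks:spark-avro_2.11:2.0.1")

# Stop the session when finished
sparkR.session.stop()
## End(Not run)
```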

[Package SparkR version 2.3.3 Index]