Class SparkConf
- All Implemented Interfaces:
- Serializable, Cloneable, org.apache.spark.internal.Logging, ReadOnlySparkConf
 Most of the time, you would create a SparkConf object with new SparkConf(), which will load
 values from any spark.* Java system properties set in your application as well. In this case,
 parameters you set directly on the SparkConf object take priority over system properties.
 
 For unit tests, you can also call new SparkConf(false) to skip loading external settings and
 get the same configuration no matter what the system properties are.
 
 All setter methods in this class support chaining. For example, you can write
 new SparkConf().setMaster("local").setAppName("My app").
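 Expanded into a runnable snippet (a minimal sketch; spark.executor.memory is just an
 illustrative setting, not something this class requires):
 
 import org.apache.spark.SparkConf
 
 // Each setter returns this SparkConf, so calls chain into one expression.
 val conf = new SparkConf()
   .setMaster("local[4]")               // run locally with 4 threads
   .setAppName("My app")                // name shown in the Spark web UI
   .set("spark.executor.memory", "2g")  // plain key/value settings also chain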
 
- Parameters:
- loadDefaults - whether to also load values from Java system properties
- Note:
- Once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.
- 
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
- 
Constructor Summary
Constructors:
SparkConf() - Create a SparkConf that loads defaults from system properties and the classpath
SparkConf(boolean loadDefaults)
- 
Method Summary
SparkConf clone() - Copy this object
boolean contains(String key) - Does the configuration contain a given parameter?
scala.Tuple2<String,String>[] getAll() - Get all parameters as a list of pairs
scala.Tuple2<String,String>[] getAllWithPrefix(String prefix) - Get all parameters that start with prefix
<K> scala.Tuple2<K,String>[] getAllWithPrefix(String prefix, scala.Function1<String, K> f) - Get all parameters that start with prefix and apply f
String getAppId() - Returns the Spark application id, valid in the Driver after TaskScheduler registration and from the start in the Executor.
getAvroSchema() - Gets all the Avro schemas in the configuration used in the generic Avro record serializer
static scala.Option<String> getDeprecatedConfig(String key, Map<String, String> conf) - Looks for available deprecated keys for the given config option, and returns the first value available.
getExecutorEnv() - Get all executor environment variables set on this SparkConf
scala.Option<String> getOption(String key) - Get a parameter as an Option
static boolean isExecutorStartupConf(String name) - Return whether the given config should be passed to an executor on start-up.
static boolean isSparkPortConf(String name) - Return true if the given config matches either spark.*.port or spark.port.*.
static void logDeprecationWarning(String key) - Logs a warning message if the given config key is deprecated.
static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
SparkConf registerAvroSchemas(scala.collection.immutable.Seq<org.apache.avro.Schema> schemas) - Use Kryo serialization and register the given set of Avro schemas so that the generic record serializer can decrease network IO
SparkConf registerKryoClasses(Class<?>[] classes) - Use Kryo serialization and register the given set of classes with Kryo.
SparkConf remove(String key) - Remove a parameter from the configuration
SparkConf set(String key, String value) - Set a configuration variable.
SparkConf setAll(scala.collection.Iterable<scala.Tuple2<String, String>> settings) - Set multiple parameters together
SparkConf setAppName(String name) - Set a name for your application.
SparkConf setExecutorEnv(String variable, String value) - Set an environment variable to be used when launching executors for this application.
SparkConf setExecutorEnv(scala.collection.immutable.Seq<scala.Tuple2<String, String>> variables) - Set multiple environment variables to be used when launching executors.
SparkConf setExecutorEnv(scala.Tuple2<String, String>[] variables) - Set multiple environment variables to be used when launching executors.
SparkConf setIfMissing(String key, String value) - Set a parameter if it isn't already configured
SparkConf setJars(scala.collection.immutable.Seq<String> jars) - Set JAR files to distribute to the cluster.
SparkConf setJars(String[] jars) - Set JAR files to distribute to the cluster. (Java-friendly version.)
SparkConf setMaster(String master) - The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
SparkConf setSparkHome(String home) - Set the location where Spark is installed on worker nodes.
String toDebugString() - Return a string listing all keys and values, one per line.

Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.internal.Logging:
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logBasedOnLevel, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, MDC, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext

Methods inherited from interface org.apache.spark.ReadOnlySparkConf:
catchIllegalValue, contains, get, get, getBoolean, getDouble, getenv, getInt, getLong, getSizeAsBytes, getSizeAsBytes, getSizeAsBytes, getSizeAsGb, getSizeAsGb, getSizeAsKb, getSizeAsKb, getSizeAsMb, getSizeAsMb, getTimeAsMs, getTimeAsMs, getTimeAsSeconds, getTimeAsSeconds
- 
Constructor Details
- 
SparkConf
public SparkConf(boolean loadDefaults)
- 
SparkConf
public SparkConf()
Create a SparkConf that loads defaults from system properties and the classpath.
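 As a sketch, the two constructors side by side (passing loadDefaults = false keeps
 tests independent of JVM system properties, as noted in the class description):
 
 import org.apache.spark.SparkConf
 
 // Production: pick up spark.* system properties as defaults.
 val conf = new SparkConf()
 
 // Tests: skip external settings entirely; only explicit set() calls apply.
 val testConf = new SparkConf(false)
   .setMaster("local")
   .setAppName("unit-test")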
 
- 
- 
Method Details
- 
isExecutorStartupConf
public static boolean isExecutorStartupConf(String name)
Return whether the given config should be passed to an executor on start-up. Certain authentication configs are required from the executor when it connects to the scheduler, while the rest of the spark configs can be inherited from the driver later.
- Parameters:
- name - the config key to check
- Returns:
- true if the config should be passed to an executor on start-up
 
- 
isSparkPortConf
public static boolean isSparkPortConf(String name)
Return true if the given config matches either spark.*.port or spark.port.*.
- Parameters:
- name - the config key to check
- Returns:
- true if the key names a port configuration
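 For illustration, a sketch of how keys are classified (spark.driver.port and
 spark.port.maxRetries are real Spark settings used here only as examples; these
 companion-object helpers are rendered as public static methods in this Java view,
 and their visibility from user Scala code may vary):
 
 import org.apache.spark.SparkConf
 
 SparkConf.isSparkPortConf("spark.driver.port")     // true: matches spark.*.port
 SparkConf.isSparkPortConf("spark.port.maxRetries") // true: matches spark.port.*
 SparkConf.isSparkPortConf("spark.app.name")        // false: not a port setting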
 
- 
getDeprecatedConfig
public static scala.Option<String> getDeprecatedConfig(String key, Map<String, String> conf)
Looks for available deprecated keys for the given config option, and returns the first value available.
- Parameters:
- key - the config key to look up
- conf - the configuration map to search
- Returns:
- the value of the first available deprecated key, if any
 
- 
logDeprecationWarning
public static void logDeprecationWarning(String key)
Logs a warning message if the given config key is deprecated.
- Parameters:
- key - the config key to check for deprecation
 
- 
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
- 
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
- 
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
- 
set
public SparkConf set(String key, String value)
Set a configuration variable.
- 
setMaster
public SparkConf setMaster(String master)
The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster.
- Parameters:
- master - the master URL
- Returns:
- this SparkConf, to allow chaining
 
- 
setAppName
public SparkConf setAppName(String name)
Set a name for your application. Shown in the Spark web UI.
- 
setJars
public SparkConf setJars(scala.collection.immutable.Seq<String> jars)
Set JAR files to distribute to the cluster.
- 
setJars
public SparkConf setJars(String[] jars)
Set JAR files to distribute to the cluster. (Java-friendly version.)
- 
setExecutorEnv
public SparkConf setExecutorEnv(String variable, String value)
Set an environment variable to be used when launching executors for this application. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.
- Parameters:
- variable - the environment variable name
- value - the value to set it to
- Returns:
- this SparkConf, to allow chaining
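 A sketch of the equivalence described above (OMP_NUM_THREADS is just an example
 variable name):
 
 import org.apache.spark.SparkConf
 
 val conf = new SparkConf(false)
 // Stored internally as the property spark.executorEnv.OMP_NUM_THREADS.
 conf.setExecutorEnv("OMP_NUM_THREADS", "1")
 assert(conf.get("spark.executorEnv.OMP_NUM_THREADS") == "1")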
 
- 
setExecutorEnv
public SparkConf setExecutorEnv(scala.collection.immutable.Seq<scala.Tuple2<String, String>> variables)
Set multiple environment variables to be used when launching executors. These variables are stored as properties of the form spark.executorEnv.VAR_NAME (for example spark.executorEnv.PATH) but this method makes them easier to set.
- Parameters:
- variables - the (name, value) pairs to set
- Returns:
- this SparkConf, to allow chaining
 
- 
setExecutorEnv
public SparkConf setExecutorEnv(scala.Tuple2<String, String>[] variables)
Set multiple environment variables to be used when launching executors. (Java-friendly version.)
- Parameters:
- variables - the (name, value) pairs to set
- Returns:
- this SparkConf, to allow chaining
 
- 
setSparkHome
public SparkConf setSparkHome(String home)
Set the location where Spark is installed on worker nodes.
- Parameters:
- home - the path to the Spark installation
- Returns:
- this SparkConf, to allow chaining
 
- 
setAll
public SparkConf setAll(scala.collection.Iterable<scala.Tuple2<String, String>> settings)
Set multiple parameters together.
- 
setIfMissing
public SparkConf setIfMissing(String key, String value)
Set a parameter if it isn't already configured.
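 For instance (a minimal sketch; spark.ui.port is a real setting used here only as
 an example):
 
 import org.apache.spark.SparkConf
 
 val conf = new SparkConf(false).set("spark.app.name", "first")
 conf.setIfMissing("spark.app.name", "second") // no effect: key already set
 conf.setIfMissing("spark.ui.port", "4050")    // applied: key was absent
 assert(conf.get("spark.app.name") == "first")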
- 
registerKryoClasses
public SparkConf registerKryoClasses(Class<?>[] classes)
Use Kryo serialization and register the given set of classes with Kryo. If called multiple times, this will append the classes from all calls together.
- Parameters:
- classes - the classes to register
- Returns:
- this SparkConf, to allow chaining
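 A minimal sketch (MyClass1 and MyClass2 are hypothetical application classes):
 
 import org.apache.spark.SparkConf
 
 case class MyClass1(x: Int)     // hypothetical application class
 case class MyClass2(s: String)  // hypothetical application class
 
 // Switches serialization to Kryo and registers both classes; repeated
 // calls append to the previously registered set.
 val conf = new SparkConf()
   .setAppName("kryo-example")
   .registerKryoClasses(Array(classOf[MyClass1], classOf[MyClass2]))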
 
- 
registerAvroSchemas
public SparkConf registerAvroSchemas(scala.collection.immutable.Seq<org.apache.avro.Schema> schemas)
Use Kryo serialization and register the given set of Avro schemas so that the generic record serializer can decrease network IO.
- Parameters:
- schemas - the Avro schemas to register
- Returns:
- this SparkConf, to allow chaining
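 For example (a sketch; the User schema is a hypothetical record built with Avro's
 SchemaBuilder, and in Scala this method accepts schemas as varargs):
 
 import org.apache.avro.SchemaBuilder
 import org.apache.spark.SparkConf
 
 // Hypothetical Avro record schema.
 val userSchema = SchemaBuilder.record("User").fields()
   .requiredString("name")
   .requiredInt("age")
   .endRecord()
 
 // Registered schemas let the generic Avro record serializer avoid shipping
 // the full schema with records, decreasing network IO.
 val conf = new SparkConf().registerAvroSchemas(userSchema)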
 
- 
getAvroSchema
Gets all the Avro schemas in the configuration used in the generic Avro record serializer.
- 
remove
public SparkConf remove(String key)
Remove a parameter from the configuration.
- 
getOption
public scala.Option<String> getOption(String key)
Get a parameter as an Option.
- Specified by:
- getOption in interface ReadOnlySparkConf
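 For instance (a minimal sketch):
 
 import org.apache.spark.SparkConf
 
 val conf = new SparkConf(false).set("spark.app.name", "demo")
 
 // Some(value) when the key is set, None when it is not; never throws.
 val name: Option[String] = conf.getOption("spark.app.name")       // Some("demo")
 val missing: Option[String] = conf.getOption("spark.made.up.key") // None (key assumed unset)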
 
- 
getAll
public scala.Tuple2<String,String>[] getAll()
Get all parameters as a list of pairs.
- Specified by:
- getAll in interface ReadOnlySparkConf
 
- 
getAllWithPrefix
public scala.Tuple2<String,String>[] getAllWithPrefix(String prefix)
Get all parameters that start with prefix.
- Parameters:
- prefix - the key prefix to filter on
- Returns:
- the matching (key, value) pairs
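 A usage sketch (assumption: current Spark releases strip the prefix from the
 returned keys, e.g. ("PATH", ...) rather than ("spark.executorEnv.PATH", ...)):
 
 import org.apache.spark.SparkConf
 
 val conf = new SparkConf(false)
   .set("spark.executorEnv.PATH", "/usr/bin")
   .set("spark.executorEnv.HOME", "/home/spark")
 
 // Yields the spark.executorEnv.* entries, keys with the prefix removed.
 val env: Array[(String, String)] = conf.getAllWithPrefix("spark.executorEnv.")
 env.foreach { case (k, v) => println(s"$k -> $v") }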
 
- 
getAllWithPrefix
public <K> scala.Tuple2<K,String>[] getAllWithPrefix(String prefix, scala.Function1<String, K> f)
Get all parameters that start with prefix and apply f.
- Parameters:
- prefix - the key prefix to filter on
- f - the function applied to each matching key
- Returns:
- the resulting (f(key), value) pairs
 
- 
getExecutorEnv
Get all executor environment variables set on this SparkConf.
- 
getAppId
public String getAppId()
Returns the Spark application id, valid in the Driver after TaskScheduler registration and from the start in the Executor.
- Returns:
- the application id
 
- 
contains
public boolean contains(String key)
Does the configuration contain a given parameter?
- Specified by:
- contains in interface ReadOnlySparkConf
 
- 
clone
public SparkConf clone()
Copy this object.
- 
toDebugString
public String toDebugString()
Return a string listing all keys and values, one per line. This is useful to print the configuration out for debugging.
- Returns:
- a string with one key=value entry per line
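 For example (a sketch; ordering and exact formatting may vary by version, but each
 entry prints as key=value on its own line):
 
 import org.apache.spark.SparkConf
 
 val conf = new SparkConf(false)
   .setMaster("local")
   .setAppName("debug-demo")
 
 // Prints something like:
 //   spark.app.name=debug-demo
 //   spark.master=local
 println(conf.toDebugString)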
 
 