public class SparkConf
extends java.lang.Object
implements scala.Cloneable
Most of the time, you would create a SparkConf object with `new SparkConf()`, which will also load values from any `spark.*` Java system properties set in your application. In this case, parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call `new SparkConf(false)` to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write `new SparkConf().setMaster("local").setAppName("My app")`.

Note that once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.

Parameters: loadDefaults - whether to also load values from Java system properties
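As a concrete illustration of this construction-and-chaining pattern, here is a minimal driver-side sketch; the master URL, app name, and memory value are placeholders rather than recommended settings:

```java
import org.apache.spark.SparkConf;

public class ConfBasics {
    public static void main(String[] args) {
        // Setters return this SparkConf, so calls chain fluently.
        SparkConf conf = new SparkConf()
                .setMaster("local[4]")               // placeholder: 4 local threads
                .setAppName("My app")
                .set("spark.executor.memory", "2g"); // arbitrary example value

        // Values set directly on the object take priority over any
        // spark.* system properties loaded by new SparkConf().
        System.out.println(conf.get("spark.executor.memory")); // 2g
    }
}
```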
| Constructor | Description |
|---|---|
| SparkConf() | Create a SparkConf that loads defaults from system properties and the classpath |
| SparkConf(boolean loadDefaults) | |
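To see what the loadDefaults flag changes in practice, a small sketch of the unit-test pattern described above (the property value is used purely for illustration):

```java
import org.apache.spark.SparkConf;

public class ConfIsolation {
    public static void main(String[] args) {
        // Simulate an ambient system property that a test should not pick up.
        System.setProperty("spark.app.name", "FromSystemProperty");

        SparkConf withDefaults = new SparkConf(true); // loads spark.* system properties
        SparkConf isolated = new SparkConf(false);    // ignores them entirely

        System.out.println(withDefaults.contains("spark.app.name")); // true
        System.out.println(isolated.contains("spark.app.name"));     // false
    }
}
```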
| Modifier and Type | Method and Description |
|---|---|
| SparkConf | clone() Copy this object |
| boolean | contains(java.lang.String key) Does the configuration contain a given parameter? |
| java.lang.String | get(java.lang.String key) Get a parameter; throws a NoSuchElementException if it's not set |
| java.lang.String | get(java.lang.String key, java.lang.String defaultValue) Get a parameter, falling back to a default if not set |
| scala.Tuple2<java.lang.String,java.lang.String>[] | getAll() Get all parameters as a list of pairs |
| java.lang.String | getAppId() Returns the Spark application id, valid in the Driver after TaskScheduler registration and from the start in the Executor. |
| scala.collection.immutable.Map<java.lang.Object,java.lang.String> | getAvroSchema() Gets all the Avro schemas in the configuration used in the generic Avro record serializer |
| boolean | getBoolean(java.lang.String key, boolean defaultValue) Get a parameter as a boolean, falling back to a default if not set |
| static scala.Option<java.lang.String> | getDeprecatedConfig(java.lang.String key, SparkConf conf) Looks for available deprecated keys for the given config option, and returns the first value available. |
| double | getDouble(java.lang.String key, double defaultValue) Get a parameter as a double, falling back to a default if not set |
| scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> | getExecutorEnv() Get all executor environment variables set on this SparkConf |
| int | getInt(java.lang.String key, int defaultValue) Get a parameter as an integer, falling back to a default if not set |
| long | getLong(java.lang.String key, long defaultValue) Get a parameter as a long, falling back to a default if not set |
| scala.Option<java.lang.String> | getOption(java.lang.String key) Get a parameter as an Option |
| long | getSizeAsBytes(java.lang.String key) Get a size parameter as bytes; throws a NoSuchElementException if it's not set. |
| long | getSizeAsBytes(java.lang.String key, long defaultValue) Get a size parameter as bytes, falling back to a default if not set. |
| long | getSizeAsBytes(java.lang.String key, java.lang.String defaultValue) Get a size parameter as bytes, falling back to a default if not set. |
| long | getSizeAsGb(java.lang.String key) Get a size parameter as Gibibytes; throws a NoSuchElementException if it's not set. |
| long | getSizeAsGb(java.lang.String key, java.lang.String defaultValue) Get a size parameter as Gibibytes, falling back to a default if not set. |
| long | getSizeAsKb(java.lang.String key) Get a size parameter as Kibibytes; throws a NoSuchElementException if it's not set. |
| long | getSizeAsKb(java.lang.String key, java.lang.String defaultValue) Get a size parameter as Kibibytes, falling back to a default if not set. |
| long | getSizeAsMb(java.lang.String key) Get a size parameter as Mebibytes; throws a NoSuchElementException if it's not set. |
| long | getSizeAsMb(java.lang.String key, java.lang.String defaultValue) Get a size parameter as Mebibytes, falling back to a default if not set. |
| long | getTimeAsMs(java.lang.String key) Get a time parameter as milliseconds; throws a NoSuchElementException if it's not set. |
| long | getTimeAsMs(java.lang.String key, java.lang.String defaultValue) Get a time parameter as milliseconds, falling back to a default if not set. |
| long | getTimeAsSeconds(java.lang.String key) Get a time parameter as seconds; throws a NoSuchElementException if it's not set. |
| long | getTimeAsSeconds(java.lang.String key, java.lang.String defaultValue) Get a time parameter as seconds, falling back to a default if not set. |
| protected static void | initializeLogIfNecessary(boolean isInterpreter) |
| static boolean | isExecutorStartupConf(java.lang.String name) Return whether the given config should be passed to an executor on start-up. |
| static boolean | isSparkPortConf(java.lang.String name) Return true if the given config matches either spark.*.port or spark.port.*. |
| protected static boolean | isTraceEnabled() |
| protected static org.slf4j.Logger | log() |
| protected static void | logDebug(scala.Function0<java.lang.String> msg) |
| protected static void | logDebug(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable) |
| static void | logDeprecationWarning(java.lang.String key) Logs a warning message if the given config key is deprecated. |
| protected static void | logError(scala.Function0<java.lang.String> msg) |
| protected static void | logError(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable) |
| protected static void | logInfo(scala.Function0<java.lang.String> msg) |
| protected static void | logInfo(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable) |
| protected static java.lang.String | logName() |
| protected static void | logTrace(scala.Function0<java.lang.String> msg) |
| protected static void | logTrace(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable) |
| protected static void | logWarning(scala.Function0<java.lang.String> msg) |
| protected static void | logWarning(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable) |
| SparkConf | registerAvroSchemas(scala.collection.Seq<org.apache.avro.Schema> schemas) Use Kryo serialization and register the given set of Avro schemas so that the generic record serializer can decrease network IO |
| SparkConf | registerKryoClasses(java.lang.Class<?>[] classes) Use Kryo serialization and register the given set of classes with Kryo. |
| SparkConf | remove(java.lang.String key) Remove a parameter from the configuration |
| SparkConf | set(java.lang.String key, java.lang.String value) Set a configuration variable. |
| SparkConf | setAll(scala.collection.Traversable<scala.Tuple2<java.lang.String,java.lang.String>> settings) Set multiple parameters together |
| SparkConf | setAppName(java.lang.String name) Set a name for your application. |
| SparkConf | setExecutorEnv(scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> variables) Set multiple environment variables to be used when launching executors. |
| SparkConf | setExecutorEnv(java.lang.String variable, java.lang.String value) Set an environment variable to be used when launching executors for this application. |
| SparkConf | setExecutorEnv(scala.Tuple2<java.lang.String,java.lang.String>[] variables) Set multiple environment variables to be used when launching executors. |
| SparkConf | setIfMissing(java.lang.String key, java.lang.String value) Set a parameter if it isn't already configured |
| SparkConf | setJars(scala.collection.Seq<java.lang.String> jars) Set JAR files to distribute to the cluster. |
| SparkConf | setJars(java.lang.String[] jars) Set JAR files to distribute to the cluster. |
| SparkConf | setMaster(java.lang.String master) The master URL to connect to, such as "local" to run locally with one thread, "local[4]" to run locally with 4 cores, or "spark://master:7077" to run on a Spark standalone cluster. |
| SparkConf | setSparkHome(java.lang.String home) Set the location where Spark is installed on worker nodes. |
| java.lang.String | toDebugString() Return a string listing all keys and values, one per line. |
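To make the read side of this table concrete, here is a small sketch of the typed getters and Option-based lookups; the key names are chosen for illustration only:

```java
import org.apache.spark.SparkConf;
import scala.Option;
import scala.Tuple2;

public class ConfReads {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf(false)
                .set("spark.ui.port", "4040");

        // Typed getter with a fallback default.
        int port = conf.getInt("spark.ui.port", 8080); // 4040

        // getOption avoids the NoSuchElementException thrown
        // by the single-argument get(key) for missing keys.
        Option<String> missing = conf.getOption("spark.no.such.key");
        System.out.println(port + " " + missing.isDefined()); // 4040 false

        // getAll returns every key/value pair as Scala tuples.
        for (Tuple2<String, String> kv : conf.getAll()) {
            System.out.println(kv._1() + "=" + kv._2());
        }
    }
}
```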
public SparkConf(boolean loadDefaults)
public SparkConf()
public static boolean isExecutorStartupConf(java.lang.String name)
Return whether the given config should be passed to an executor on start-up. Certain authentication configs are required from the executor when it connects to the scheduler, while the rest of the Spark configs can be inherited from the driver later.
Parameters: name - (undocumented)

public static boolean isSparkPortConf(java.lang.String name)
Return true if the given config matches either spark.*.port or spark.port.*.
Parameters: name - (undocumented)

public static scala.Option<java.lang.String> getDeprecatedConfig(java.lang.String key, SparkConf conf)
Looks for available deprecated keys for the given config option, and returns the first value available.
Parameters: key - (undocumented), conf - (undocumented)

public static void logDeprecationWarning(java.lang.String key)
Logs a warning message if the given config key is deprecated.
Parameters: key - (undocumented)

protected static java.lang.String logName()
protected static org.slf4j.Logger log()
protected static void logInfo(scala.Function0<java.lang.String> msg)
protected static void logDebug(scala.Function0<java.lang.String> msg)
protected static void logTrace(scala.Function0<java.lang.String> msg)
protected static void logWarning(scala.Function0<java.lang.String> msg)
protected static void logError(scala.Function0<java.lang.String> msg)
protected static void logInfo(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable)
protected static void logDebug(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable)
protected static void logTrace(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable)
protected static void logWarning(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable)
protected static void logError(scala.Function0<java.lang.String> msg, java.lang.Throwable throwable)
protected static boolean isTraceEnabled()
protected static void initializeLogIfNecessary(boolean isInterpreter)
public SparkConf set(java.lang.String key, java.lang.String value)
public SparkConf setMaster(java.lang.String master)
Parameters: master - (undocumented)

public SparkConf setAppName(java.lang.String name)

public SparkConf setJars(scala.collection.Seq<java.lang.String> jars)

public SparkConf setJars(java.lang.String[] jars)

public SparkConf setExecutorEnv(java.lang.String variable, java.lang.String value)
Parameters: variable - (undocumented), value - (undocumented)

public SparkConf setExecutorEnv(scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> variables)
Parameters: variables - (undocumented)

public SparkConf setExecutorEnv(scala.Tuple2<java.lang.String,java.lang.String>[] variables)
Parameters: variables - (undocumented)

public SparkConf setSparkHome(java.lang.String home)
Parameters: home - (undocumented)

public SparkConf setAll(scala.collection.Traversable<scala.Tuple2<java.lang.String,java.lang.String>> settings)
public SparkConf setIfMissing(java.lang.String key, java.lang.String value)
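A hedged sketch combining these deployment-oriented setters; the jar path, master URL, JAVA_HOME value, and cores setting are placeholders, not prescribed values:

```java
import org.apache.spark.SparkConf;

public class DeploymentConf {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setMaster("spark://master:7077")            // placeholder standalone master
                .setAppName("My app")
                .setJars(new String[] {"target/my-app.jar"}) // hypothetical application jar
                .setExecutorEnv("JAVA_HOME", "/opt/jdk")     // assumed executor-side path
                .setIfMissing("spark.cores.max", "4");       // applies only if not already set

        System.out.println(conf.toDebugString());
    }
}
```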
public SparkConf registerKryoClasses(java.lang.Class<?>[] classes)
Parameters: classes - (undocumented)

public SparkConf registerAvroSchemas(scala.collection.Seq<org.apache.avro.Schema> schemas)
Parameters: schemas - (undocumented)

public scala.collection.immutable.Map<java.lang.Object,java.lang.String> getAvroSchema()
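For Kryo registration, a minimal sketch; Point and Edge are hypothetical user types, and per the method summary the call also opts the job into Kryo serialization:

```java
import org.apache.spark.SparkConf;

public class KryoConf {
    // Hypothetical application classes that will be serialized often.
    static class Point {}
    static class Edge {}

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Kryo example")
                .registerKryoClasses(new Class<?>[] {Point.class, Edge.class});

        // registerKryoClasses switches the serializer setting to Kryo
        // in addition to recording the classes for registration.
        System.out.println(conf.get("spark.serializer"));
    }
}
```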
public SparkConf remove(java.lang.String key)
public java.lang.String get(java.lang.String key)
public java.lang.String get(java.lang.String key, java.lang.String defaultValue)
public long getTimeAsSeconds(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getTimeAsSeconds(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getTimeAsMs(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getTimeAsMs(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getSizeAsBytes(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getSizeAsBytes(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getSizeAsBytes(java.lang.String key, long defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getSizeAsKb(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getSizeAsKb(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getSizeAsMb(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getSizeAsMb(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public long getSizeAsGb(java.lang.String key)
Parameters: key - (undocumented)
Throws: NoSuchElementException

public long getSizeAsGb(java.lang.String key, java.lang.String defaultValue)
Parameters: key - (undocumented), defaultValue - (undocumented)

public scala.Option<java.lang.String> getOption(java.lang.String key)
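These getters parse human-readable suffixes. A sketch, assuming Spark's usual suffix conventions (time units such as s, ms, m, h; size units such as k, m, g; the method name supplies the unit of the returned value):

```java
import org.apache.spark.SparkConf;

public class SuffixedGetters {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf(false)
                .set("spark.network.timeout", "120s")
                .set("spark.shuffle.file.buffer", "32k");

        long timeoutMs = conf.getTimeAsMs("spark.network.timeout");        // 120000
        long bufferKb  = conf.getSizeAsKb("spark.shuffle.file.buffer");    // 32
        long bufferB   = conf.getSizeAsBytes("spark.shuffle.file.buffer"); // 32768

        // String defaults are parsed the same way when the key is absent.
        long waitSec = conf.getTimeAsSeconds("spark.no.such.key", "10s");  // 10

        System.out.println(timeoutMs + " " + bufferKb + " " + bufferB + " " + waitSec);
    }
}
```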
public scala.Tuple2<java.lang.String,java.lang.String>[] getAll()
public int getInt(java.lang.String key, int defaultValue)
public long getLong(java.lang.String key, long defaultValue)
public double getDouble(java.lang.String key, double defaultValue)
public boolean getBoolean(java.lang.String key, boolean defaultValue)
public scala.collection.Seq<scala.Tuple2<java.lang.String,java.lang.String>> getExecutorEnv()
public java.lang.String getAppId()
public boolean contains(java.lang.String key)
public SparkConf clone()
Overrides: clone in class java.lang.Object
public java.lang.String toDebugString()
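Finally, a short sketch of clone() and toDebugString() for inspecting a configuration; per the description above, the output lists one key=value pair per line:

```java
import org.apache.spark.SparkConf;

public class ConfInspection {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf(false).set("spark.app.name", "Inspect me");

        // clone() copies the configuration; edits to the copy do not affect
        // the original, which matters once a conf has been handed to Spark.
        SparkConf copy = conf.clone().set("spark.app.name", "Changed copy");

        System.out.println(conf.toDebugString()); // spark.app.name=Inspect me
        System.out.println(copy.toDebugString()); // spark.app.name=Changed copy
    }
}
```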