pyspark.SparkConf
class pyspark.SparkConf(loadDefaults=True, _jvm=None, _jconf=None)

Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.
Most of the time, you would create a SparkConf object with SparkConf(), which will load values from spark.* Java system properties as well. In this case, any parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call SparkConf(False) to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write conf.setMaster("local").setAppName("My app"); a fuller sketch follows below.

Note: Once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user.
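As an illustrative sketch of the typical flow: build a configuration with chained setters, hand it to a SparkContext, and read a value back. The master URL, app name, and memory setting here are placeholder values, not recommendations:

    from pyspark import SparkConf, SparkContext

    # Setters return the SparkConf itself, so calls chain.
    conf = (SparkConf()                           # also loads spark.* system properties
            .setMaster("local[2]")                # placeholder: local mode, two threads
            .setAppName("My app")
            .set("spark.executor.memory", "1g"))  # placeholder value

    sc = SparkContext(conf=conf)  # from here on, the conf is cloned and frozen
    print(sc.getConf().get("spark.app.name"))  # prints: My app
    sc.stop()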
__init__(loadDefaults=True, _jvm=None, _jconf=None)

Create a new Spark configuration.
Parameters

- loadDefaults – whether to load values from Java system properties (True by default)
- _jvm – internal parameter used to pass a handle to the Java VM; does not need to be set by users
- _jconf – optionally pass in an existing SparkConf handle to use its parameters
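A small sketch of the loadDefaults=False path described above, as used in unit tests; the key and value are arbitrary examples:

    from pyspark import SparkConf

    # Skip spark.* system properties so the test sees only what it sets itself.
    conf = SparkConf(loadDefaults=False).set("spark.app.name", "test-app")

    assert conf.get("spark.app.name") == "test-app"
    assert not conf.contains("spark.master")  # nothing inherited from the environment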
Methods