DefaultParamsWriter¶
class pyspark.ml.util.DefaultParamsWriter(instance: Params)[source]¶
Specialization of MLWriter for Params types.
Class for writing Estimators and Transformers whose parameters are JSON-serializable.
New in version 2.3.0.
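A minimal usage sketch (the MyScaler class, its factor Param, and the output path are hypothetical): a custom Transformer whose Params are all JSON-serializable can mix in DefaultParamsWritable, and its write() method then returns a DefaultParamsWriter:

    from pyspark.ml import Transformer
    from pyspark.ml.param import Param, Params
    from pyspark.ml.util import DefaultParamsReadable, DefaultParamsWritable
    from pyspark.sql import functions as F

    class MyScaler(Transformer, DefaultParamsReadable, DefaultParamsWritable):
        # A JSON-serializable Param, so DefaultParamsWriter can persist it.
        factor = Param(Params._dummy(), "factor", "multiplicative factor")

        def __init__(self, factor=1.0):
            super().__init__()
            self._set(factor=factor)

        def _transform(self, dataset):
            return dataset.withColumn(
                "scaled", F.col("value") * self.getOrDefault(self.factor))

    # write() returns a DefaultParamsWriter for this instance.
    MyScaler(factor=2.0).write().overwrite().save("/tmp/my_scaler")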
Methods

extractJsonParams(instance, skipParams)
option(key, value)
    Adds an option to the underlying MLWriter.
overwrite()
    Overwrites if the output path already exists.
save(path)
    Save the ML instance to the input path.
saveImpl(path)
    save() handles overwriting and then calls this method.
saveMetadata(instance, path, sc[, …])
    Saves metadata + Params to: path + "/metadata"
session(sparkSession)
    Sets the Spark Session to use for saving/loading.

Attributes

sc
    Returns the underlying SparkContext.
sparkSession
    Returns the user-specified Spark Session or the default.
Methods Documentation
option(key: str, value: Any) → pyspark.ml.util.MLWriter¶
Adds an option to the underlying MLWriter. See the documentation for the specific model's writer for possible options. The option name (key) is case-insensitive.
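A hedged sketch (the option key, the model, and the path are assumptions; the options actually available depend on the specific model's writer):

    writer = model.write()                # assumes a fitted, writable ML instance
    writer.option("someOption", "true")   # key is matched case-insensitively
    writer.save("/tmp/model_dir")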
overwrite() → pyspark.ml.util.MLWriter¶
Overwrites if the output path already exists.
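For example (the model and path are assumed to exist), chaining overwrite() lets a later save() replace existing output instead of failing:

    model.write().overwrite().save("/tmp/model_dir")  # replaces /tmp/model_dir if present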
save(path: str) → None¶
Save the ML instance to the input path.
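A minimal sketch (my_transformer and the path are assumptions) of constructing the writer directly and saving:

    from pyspark.ml.util import DefaultParamsWriter

    writer = DefaultParamsWriter(my_transformer)  # any Params instance with JSON-serializable params
    writer.save("/tmp/my_transformer")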
saveImpl(path: str) → None[source]¶
save() handles overwriting and then calls this method. Subclasses should override this method to implement the actual saving of the instance.
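A sketch of overriding saveImpl in a custom writer (the class name and the weights attribute are hypothetical); save() performs the overwrite check and then delegates here:

    from pyspark.ml.util import DefaultParamsWriter

    class MyModelWriter(DefaultParamsWriter):
        def saveImpl(self, path):
            # Persist metadata + Params the default way ...
            super().saveImpl(path)
            # ... then persist any extra, non-Param state the instance carries.
            rows = [(float(w),) for w in self.instance.weights]
            self.sparkSession.createDataFrame(rows, ["weight"]) \
                .write.parquet(path + "/weights")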
static saveMetadata(instance: Params, path: str, sc: pyspark.context.SparkContext, extraMetadata: Optional[Dict[str, Any]] = None, paramMap: Optional[Dict[str, Any]] = None) → None[source]¶
Saves metadata + Params to: path + "/metadata"
    - class
    - timestamp
    - sparkVersion
    - uid
    - paramMap
    - defaultParamMap (since 2.4.0)
    - (optionally, extra metadata)
Parameters
    extraMetadata : dict, optional
        Extra metadata to be saved at the same level as uid, paramMap, etc.
    paramMap : dict, optional
        If given, this is saved in the "paramMap" field.
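A sketch of calling the static helper directly (the path and the extraMetadata contents are assumptions; normally DefaultParamsWriter.saveImpl() calls this for you):

    from pyspark.ml.util import DefaultParamsWriter

    DefaultParamsWriter.saveMetadata(
        instance=my_transformer,            # a Params instance with JSON-serializable params
        path="/tmp/my_transformer",
        sc=spark.sparkContext,
        extraMetadata={"numFeatures": 10},  # stored alongside uid, paramMap, etc.
    )
    # Writes JSON metadata under /tmp/my_transformer/metadata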
session(sparkSession: pyspark.sql.session.SparkSession) → RW¶
Sets the Spark Session to use for saving/loading.
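For instance (spark and model are assumed to exist), session() returns the writer, so it chains with save():

    model.write().session(spark).save("/tmp/model_dir")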
Attributes Documentation
sc¶
Returns the underlying SparkContext.
sparkSession¶
Returns the user-specified Spark Session or the default.