DefaultParamsReader

class pyspark.ml.util.DefaultParamsReader(cls: Type[pyspark.ml.util.DefaultParamsReadable[RL]])

Specialization of MLReader for Params types.

Default MLReader implementation for transformers and estimators that contain basic (json-serializable) params and no data. This will not handle more complex params or types with data (e.g., models with coefficients).

New in version 2.3.0.
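A class whose params are all json-serializable can opt into this reader by mixing in DefaultParamsReadable. The following is a minimal sketch, not part of the API: MyIdentityTransformer is a hypothetical transformer, the path is illustrative, and an active SparkSession is assumed.

    from pyspark.ml import Transformer
    from pyspark.ml.param.shared import HasInputCol, HasOutputCol
    from pyspark.ml.util import DefaultParamsReadable, DefaultParamsWritable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    class MyIdentityTransformer(Transformer, HasInputCol, HasOutputCol,
                                DefaultParamsReadable, DefaultParamsWritable):
        # Hypothetical transformer: its only params (inputCol, outputCol) are
        # json-serializable strings, and there is no model data to persist.
        def _transform(self, dataset):
            return dataset

    tf = MyIdentityTransformer()
    tf.set(tf.inputCol, "features")
    tf.save("/tmp/identity-tf")                            # written by DefaultParamsWriter
    same = MyIdentityTransformer.load("/tmp/identity-tf")  # read back by DefaultParamsReader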
Methods

getAndSetParams(instance, metadata[, skipParams])
    Extract Params from metadata, and set them in the instance.

isPythonParamsInstance(metadata)

load(path)
    Load the ML instance from the input path.

loadMetadata(path, sc[, expectedClassName])
    Load metadata saved using DefaultParamsWriter.saveMetadata().

loadParamsInstance(path, sc)
    Load a Params instance from the given path, and return it.

session(sparkSession)
    Sets the Spark Session to use for saving/loading.
Attributes

sc
    Returns the underlying SparkContext.

sparkSession
    Returns the user-specified Spark Session or the default.
Methods Documentation

static getAndSetParams(instance: RL, metadata: Dict[str, Any], skipParams: Optional[List[str]] = None) → None

Extract Params from metadata, and set them in the instance.
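A minimal sketch of restoring params onto a fresh instance, continuing the hypothetical MyIdentityTransformer example above; skipParams names params to leave untouched:

    from pyspark.ml.util import DefaultParamsReader

    metadata = DefaultParamsReader.loadMetadata("/tmp/identity-tf", spark.sparkContext)
    fresh = MyIdentityTransformer()
    # Restore every persisted param except inputCol, which stays unset.
    DefaultParamsReader.getAndSetParams(fresh, metadata, skipParams=["inputCol"])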
static loadMetadata(path: str, sc: pyspark.context.SparkContext, expectedClassName: str = '') → Dict[str, Any]

Load metadata saved using DefaultParamsWriter.saveMetadata().
Parameters

path : str
sc : pyspark.SparkContext
expectedClassName : str, optional
    If non-empty, this is checked against the loaded metadata.
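A sketch of inspecting the result, reusing the hypothetical path from the example above. The return value is a plain dict; the keys written by DefaultParamsWriter.saveMetadata() include "class", "timestamp", "sparkVersion", "uid", "paramMap", and, for Spark 2.4+, "defaultParamMap".

    from pyspark.ml.util import DefaultParamsReader

    meta = DefaultParamsReader.loadMetadata("/tmp/identity-tf", spark.sparkContext)
    # "paramMap" holds the explicitly set params, "defaultParamMap" the defaults.
    print(meta["class"], meta["uid"], meta["paramMap"])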
static loadParamsInstance(path: str, sc: pyspark.context.SparkContext) → RL

Load a Params instance from the given path, and return it. This assumes the instance inherits from MLReadable.
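This is useful when the concrete class is not known up front; a sketch using the hypothetical path from above:

    from pyspark.ml.util import DefaultParamsReader

    # The concrete Params subclass is resolved from the "class" field of the
    # saved metadata, so the caller does not have to name it.
    instance = DefaultParamsReader.loadParamsInstance("/tmp/identity-tf", spark.sparkContext)
    print(type(instance).__name__)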
session(sparkSession: pyspark.sql.session.SparkSession) → RW

Sets the Spark Session to use for saving/loading.
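session() returns the reader itself, so it chains with load(); a sketch with the hypothetical transformer from above:

    reader = MyIdentityTransformer.read()    # a DefaultParamsReader
    restored = reader.session(spark).load("/tmp/identity-tf")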
Attributes Documentation

sc

Returns the underlying SparkContext.

sparkSession

Returns the user-specified Spark Session or the default.