Package org.apache.spark.ml.feature
Class StandardScaler
- All Implemented Interfaces:
Serializable, org.apache.spark.internal.Logging, StandardScalerParams, Params, HasInputCol, HasOutputCol, DefaultParamsWritable, Identifiable, MLWritable, scala.Serializable
public class StandardScaler
extends Estimator<StandardScalerModel>
implements StandardScalerParams, DefaultParamsWritable
Standardizes features by removing the mean and scaling to unit variance using column summary
statistics on the samples in the training set.
The "unit std" is computed using the corrected sample standard deviation, which is computed as the square root of the unbiased sample variance.
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.SparkShellLoggingFilter
-
Constructor Summary
StandardScaler()
-
Method Summary
Modifier and TypeMethodDescriptionCreates a copy of this instance with the same UID and some extra params.Fits a model to the input data.inputCol()
Param for input column name.static StandardScaler
Param for output column name.static MLReader<T>
read()
setInputCol
(String value) setOutputCol
(String value) setWithMean
(boolean value) setWithStd
(boolean value) transformSchema
(StructType schema) Check transform validity and derive the output schema from the input schema.uid()
An immutable unique ID for the object and its derivatives.withMean()
Whether to center the data with mean before scaling.withStd()
Whether to scale the data to unit standard deviation.Methods inherited from class org.apache.spark.ml.PipelineStage
params
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.ml.util.DefaultParamsWritable
write
Methods inherited from interface org.apache.spark.ml.param.shared.HasInputCol
getInputCol
Methods inherited from interface org.apache.spark.ml.param.shared.HasOutputCol
getOutputCol
Methods inherited from interface org.apache.spark.ml.util.Identifiable
toString
Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq
Methods inherited from interface org.apache.spark.ml.util.MLWritable
save
Methods inherited from interface org.apache.spark.ml.param.Params
clear, copyValues, defaultCopy, defaultParamMap, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, onParamChange, paramMap, params, set, set, set, setDefault, setDefault, shouldOwn
Methods inherited from interface org.apache.spark.ml.feature.StandardScalerParams
getWithMean, getWithStd, validateAndTransformSchema
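A short Scala sketch of the inherited parameter helpers listed above (the values shown in the comments are the documented defaults):

import org.apache.spark.ml.feature.StandardScaler

val s = new StandardScaler()
// explainParams (from Params) describes every param along with its current or default value.
println(s.explainParams())
// getWithMean / getWithStd (from StandardScalerParams) read the current values.
println(s.getWithMean) // false by default
println(s.getWithStd)  // true by default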
-
Constructor Details
-
StandardScaler
public StandardScaler()
-
-
Method Details
-
load
public static StandardScaler load(String path)
-
read
public static MLReader<T> read()
-
withMean
Description copied from interface: StandardScalerParams
Whether to center the data with mean before scaling. It will build a dense output, so take care when applying to sparse input. Default: false
- Specified by:
withMean in interface StandardScalerParams
- Returns:
(undocumented)
-
withStd
Description copied from interface: StandardScalerParams
Whether to scale the data to unit standard deviation. Default: true
- Specified by:
withStd in interface StandardScalerParams
- Returns:
(undocumented)
-
outputCol
Description copied from interface: HasOutputCol
Param for output column name.
- Specified by:
outputCol in interface HasOutputCol
- Returns:
(undocumented)
-
inputCol
Description copied from interface: HasInputCol
Param for input column name.
- Specified by:
inputCol in interface HasInputCol
- Returns:
(undocumented)
-
uid
Description copied from interface: Identifiable
An immutable unique ID for the object and its derivatives.
- Specified by:
uid in interface Identifiable
- Returns:
(undocumented)
-
setInputCol
public StandardScaler setInputCol(String value)
-
setOutputCol
public StandardScaler setOutputCol(String value)
-
setWithMean
public StandardScaler setWithMean(boolean value)
-
setWithStd
public StandardScaler setWithStd(boolean value)
-
fit
Description copied from class: Estimator
Fits a model to the input data.
- Specified by:
fit in class Estimator<StandardScalerModel>
- Parameters:
dataset - (undocumented)
- Returns:
(undocumented)
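A minimal end-to-end sketch in Scala (the toy data, column names, and local SparkSession setup are illustrative assumptions, not part of this reference):

import org.apache.spark.ml.feature.StandardScaler
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("StandardScalerSketch").master("local[*]").getOrCreate()

// Toy input: an "id" column and a "features" vector column.
val df = spark.createDataFrame(Seq(
  (0, Vectors.dense(1.0, 10.0, 100.0)),
  (1, Vectors.dense(2.0, 20.0, 200.0)),
  (2, Vectors.dense(3.0, 30.0, 300.0))
)).toDF("id", "features")

val scaler = new StandardScaler()
  .setInputCol("features")
  .setOutputCol("scaledFeatures")
  .setWithStd(true)   // default: scale to unit standard deviation
  .setWithMean(false) // default: do not center (centering densifies sparse input)

// fit() computes column summary statistics and returns a StandardScalerModel.
val model = scaler.fit(df)
model.transform(df).show(false)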
-
transformSchema
Description copied from class: PipelineStage
Check transform validity and derive the output schema from the input schema.
We check validity for interactions between parameters during transformSchema and raise an exception if any parameter value is invalid. Parameter value checks which do not depend on other parameters are handled by Param.validate().
A typical implementation should first verify schema changes and parameter validity, including complex parameter interaction checks.
- Specified by:
transformSchema in class PipelineStage
- Parameters:
schema - (undocumented)
- Returns:
(undocumented)
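A small Scala sketch of using transformSchema as an early sanity check before fitting, assuming the scaler and df from the sketch under fit above:

// Derives the output schema without touching the data; throws if the parameters are
// invalid for this input schema (for example, a missing or non-vector input column).
val outSchema = scaler.transformSchema(df.schema)
println(outSchema.fieldNames.mkString(", ")) // includes the configured output column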
-
copy
Description copied from interface: Params
Creates a copy of this instance with the same UID and some extra params. Subclasses should implement this method and set the return type properly. See defaultCopy().
- Specified by:
copy in interface Params
- Specified by:
copy in class Estimator<StandardScalerModel>
- Parameters:
extra - (undocumented)
- Returns:
(undocumented)
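A brief Scala sketch of copy, again assuming the scaler instance from the sketch under fit; the override applies only to the returned copy:

import org.apache.spark.ml.param.ParamMap

// Same UID as the original, with withMean overridden in the new instance only.
val centeringScaler: StandardScaler = scaler.copy(ParamMap(scaler.withMean -> true))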
-