Package org.apache.spark.ml.feature
Interface StandardScalerParams
- All Superinterfaces:
  HasInputCol, HasOutputCol, Identifiable, Params, Serializable
- All Known Implementing Classes:
  StandardScaler, StandardScalerModel
Params for StandardScaler and StandardScalerModel.
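A minimal usage sketch (assuming a Dataset<Row> named dataset with a Vector column "features"; both are hypothetical, not part of this page): the withMean and withStd params declared by this interface are set through the typed setters on StandardScaler.

  import org.apache.spark.ml.feature.StandardScaler;
  import org.apache.spark.ml.feature.StandardScalerModel;
  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;

  StandardScaler scaler = new StandardScaler()
      .setInputCol("features")
      .setOutputCol("scaledFeatures")
      .setWithStd(true)     // scale to unit standard deviation (the default)
      .setWithMean(false);  // leave data uncentered (the default)

  StandardScalerModel model = scaler.fit(dataset);  // dataset: Dataset<Row>, assumed
  Dataset<Row> scaled = model.transform(dataset);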
- Method Summary

  Modifier and Type  Method                                          Description
  boolean            getWithMean()
  boolean            getWithStd()
  StructType         validateAndTransformSchema(StructType schema)   Validates and transforms the input schema.
  BooleanParam       withMean()                                      Whether to center the data with mean before scaling.
  BooleanParam       withStd()                                       Whether to scale the data to unit standard deviation.

Methods inherited from interface org.apache.spark.ml.param.shared.HasInputCol
getInputCol, inputCol
Methods inherited from interface org.apache.spark.ml.param.shared.HasOutputCol
getOutputCol, outputCol
Methods inherited from interface org.apache.spark.ml.util.Identifiable
toString, uid
Methods inherited from interface org.apache.spark.ml.param.Params
clear, copy, copyValues, defaultCopy, defaultParamMap, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, onParamChange, paramMap, params, set, set, set, setDefault, setDefault, shouldOwn
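Most of this interface's surface comes from the inherited Params methods. A short sketch of the public ones on a concrete StandardScaler instance (printed output is indicative, not verbatim):

  StandardScaler scaler = new StandardScaler().setWithMean(true);

  System.out.println(scaler.explainParams());    // one line per param: name, doc, default/current value
  System.out.println(scaler.extractParamMap());  // defaults merged with explicitly set values
  System.out.println(scaler.isSet(scaler.withMean()));      // true: set explicitly above
  System.out.println(scaler.hasDefault(scaler.withStd()));  // true: withStd defaults to true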
- Method Details
- getWithMean
  boolean getWithMean()
- getWithStd
  boolean getWithStd()
- validateAndTransformSchema
  StructType validateAndTransformSchema(StructType schema)
  Validates and transforms the input schema.
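The page leaves the body undocumented. As a rough illustration only (a hypothetical stand-in, not Spark's actual implementation), a check of this shape would verify that the input column holds vectors and append the output column to the schema:

  import org.apache.spark.ml.linalg.VectorUDT;
  import org.apache.spark.sql.types.StructType;

  // Hypothetical sketch of "validate and transform", assuming inputCol/outputCol names.
  static StructType validateAndAppend(StructType schema, String inputCol, String outputCol) {
    if (!(schema.apply(inputCol).dataType() instanceof VectorUDT)) {
      throw new IllegalArgumentException("Column " + inputCol + " must be a vector column.");
    }
    return schema.add(outputCol, new VectorUDT(), false);  // append output column, nullable = false
  }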
- withMean
  BooleanParam withMean()
  Whether to center the data with mean before scaling. It will build a dense output, so take care when applying to sparse input.
  Default: false
  Returns: (undocumented)
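Because withMean() returns the BooleanParam itself, the param can also be read through the generic Params machinery; a small sketch:

  StandardScaler scaler = new StandardScaler().setWithMean(true);

  org.apache.spark.ml.param.BooleanParam p = scaler.withMean();
  System.out.println(p.doc());                 // the param's documentation string
  System.out.println(scaler.getOrDefault(p));  // true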
- withStd
  BooleanParam withStd()
  Whether to scale the data to unit standard deviation.
  Default: true
  Returns: (undocumented)
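As with withMean, the returned BooleanParam works with the inherited Params methods; explainParam, for instance, renders its doc string and default:

  StandardScaler scaler = new StandardScaler();
  System.out.println(scaler.explainParam(scaler.withStd()));
  // prints the param's name and doc, with its default of true (exact formatting may vary)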