Package org.apache.spark.ml.feature
Interface StandardScalerParams
All Superinterfaces:
    HasInputCol, HasOutputCol, Identifiable, Params, Serializable
All Known Implementing Classes:
    StandardScaler, StandardScalerModel
Params for StandardScaler and StandardScalerModel.

Method Summary
Modifier and Type   Method                                          Description
boolean             getWithMean()
boolean             getWithStd()
StructType          validateAndTransformSchema(StructType schema)   Validates and transforms the input schema.
BooleanParam        withMean()                                      Whether to center the data with the mean before scaling.
BooleanParam        withStd()                                       Whether to scale the data to unit standard deviation.

Methods inherited from interface org.apache.spark.ml.param.shared.HasInputCol:
    getInputCol, inputCol
Methods inherited from interface org.apache.spark.ml.param.shared.HasOutputCol:
    getOutputCol, outputCol
Methods inherited from interface org.apache.spark.ml.util.Identifiable:
    toString, uid
Methods inherited from interface org.apache.spark.ml.param.Params:
    clear, copy, copyValues, defaultCopy, defaultParamMap, estimateMatadataSize, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, onParamChange, paramMap, params, set, set, set, setDefault, setDefault, shouldOwn
Method Details

getWithMean
    boolean getWithMean()

getWithStd
    boolean getWithStd()

validateAndTransformSchema
    StructType validateAndTransformSchema(StructType schema)
    Validates and transforms the input schema.
withMean
    BooleanParam withMean()
    Whether to center the data with the mean before scaling. Centering builds a dense output, so take care when applying it to sparse input.
    Default: false
    Returns:
        (undocumented)
withStd
    BooleanParam withStd()
    Whether to scale the data to unit standard deviation.
    Default: true
    Returns:
        (undocumented)
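As a concrete illustration of what these two params control, here is a minimal sketch of the per-feature transformation in plain Java. The class and method names are hypothetical (not Spark API), and plain arrays stand in for a DataFrame column; Spark's scaler divides by the corrected (sample) standard deviation, which this sketch assumes.

```java
import java.util.Arrays;

// Hypothetical helper (not Spark API) sketching the per-feature
// transformation that withMean and withStd control in StandardScaler.
public class StandardizeSketch {

    // withMean: subtract the column mean before scaling.
    // withStd: divide by the corrected (sample) standard deviation.
    static double[] standardize(double[] xs, boolean withMean, boolean withStd) {
        int n = xs.length;
        double mean = 0.0;
        for (double x : xs) mean += x;
        mean /= n;
        double ss = 0.0;
        for (double x : xs) ss += (x - mean) * (x - mean);
        double std = n > 1 ? Math.sqrt(ss / (n - 1)) : 0.0;
        double[] out = new double[n];
        for (int i = 0; i < n; i++) {
            double v = withMean ? xs[i] - mean : xs[i];
            out[i] = (withStd && std != 0.0) ? v / std : v;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] col = {1.0, 2.0, 3.0};
        // Defaults (withMean=false, withStd=true): scale only, no centering.
        System.out.println(Arrays.toString(standardize(col, false, true)));
        // Both flags on: center, then scale to unit standard deviation.
        System.out.println(Arrays.toString(standardize(col, true, true)));
        // prints [-1.0, 0.0, 1.0]
    }
}
```

On the actual estimator these params are set through StandardScaler.setWithMean(...) and StandardScaler.setWithStd(...), and note the withMean caveat above: centering densifies sparse vectors.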