StandardScalerModel

class pyspark.mllib.feature.StandardScalerModel(java_model)[source]

Represents a StandardScaler model that can transform vectors.

New in version 1.2.0.
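
A minimal usage sketch (the SparkContext, sample data, and variable names are illustrative, not part of this API): a StandardScalerModel is obtained by fitting a pyspark.mllib.feature.StandardScaler on an RDD of vectors.

from pyspark import SparkContext
from pyspark.mllib.feature import StandardScaler
from pyspark.mllib.linalg import Vectors

sc = SparkContext.getOrCreate()

# Illustrative data: an RDD of dense feature vectors.
data = sc.parallelize([
    Vectors.dense([1.0, 10.0, 100.0]),
    Vectors.dense([2.0, 20.0, 200.0]),
    Vectors.dense([3.0, 30.0, 300.0]),
])

# Fitting a StandardScaler yields a StandardScalerModel.
model = StandardScaler(withMean=True, withStd=True).fit(data)
scaled = model.transform(data)  # RDD of standardized vectors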

Methods

call(name, *a)

Call the given method of the underlying java_model.

setWithMean(withMean)

Set the flag that determines whether the model centers the data with the mean before scaling.

setWithStd(withStd)

Set the flag that determines whether the model scales the data to unit standard deviation.

transform(vector)

Applies standardization transformation on a vector.

Attributes

mean

Return the column mean values.

std

Return the column standard deviation values.

withMean

Returns whether the model centers the data before scaling.

withStd

Returns whether the model scales the data to unit standard deviation.

Methods Documentation

call(name, *a)

Call the given method of the underlying java_model.

setWithMean(withMean)[source]

Set the flag that determines whether the model centers the data with the mean before scaling.

New in version 1.4.0.

setWithStd(withStd)[source]

Set the flag that determines whether the model scales the data to unit standard deviation.

New in version 1.4.0.
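
As a sketch (assuming model is the fitted StandardScalerModel from the example above), each setter updates the corresponding flag and returns the model, so calls can be chained:

model.setWithMean(False)  # do not center columns with the stored means
model.setWithStd(True)    # scale columns to unit standard deviation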

transform(vector)[source]

Applies standardization transformation on a vector.

New in version 1.2.0.

Parameters
vector : pyspark.mllib.linalg.Vector or pyspark.RDD

Input vector(s) to be standardized.

Returns
pyspark.mllib.linalg.Vector or pyspark.RDD

Standardized vector(s). If a column has zero variance, 0.0 is returned for that column.

Notes

In Python, transform cannot currently be used within an RDD transformation or action. Call transform directly on the RDD instead.
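
For illustration (reusing the model and data names from the earlier sketch), transform accepts either a single local vector or an RDD of vectors; per the note above, an RDD should be passed to transform directly:

from pyspark.mllib.linalg import Vectors

scaled_rdd = model.transform(data)                               # RDD in, RDD out
scaled_vec = model.transform(Vectors.dense([2.0, 20.0, 200.0]))  # single vector in, vector out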

Attributes Documentation

mean

Return the column mean values.

New in version 2.0.0.

std

Return the column standard deviation values.

New in version 2.0.0.

withMean

Returns whether the model centers the data before scaling.

New in version 2.0.0.

withStd

Returns whether the model scales the data to unit standard deviation.

New in version 2.0.0.
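
A short sketch of reading these attributes from a fitted model (model as in the earlier example):

print(model.mean)      # per-column means used for centering
print(model.std)       # per-column standard deviations used for scaling
print(model.withMean)  # True if the model centers data before scaling
print(model.withStd)   # True if the model scales data to unit standard deviation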