Package org.apache.spark.ml
Class Transformer

java.lang.Object
  org.apache.spark.ml.PipelineStage
    org.apache.spark.ml.Transformer

- All Implemented Interfaces:
  Serializable, org.apache.spark.internal.Logging, Params, Identifiable
- Direct Known Subclasses:
  Binarizer, ColumnPruner, FeatureHasher, HashingTF, IndexToString, Interaction, Model, SQLTransformer, StopWordsRemover, UnaryTransformer, VectorAssembler, VectorAttributeRewriter, VectorSizeHint, VectorSlicer
Abstract class for transformers that transform one dataset into another.
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
Constructor Summary

Transformer()

Method Summary

abstract Transformer copy(ParamMap extra)
    Creates a copy of this instance with the same UID and some extra params.

Dataset<Row> transform(Dataset<?> dataset)
    Transforms the input dataset.

Dataset<Row> transform(Dataset<?> dataset, ParamMap paramMap)
    Transforms the dataset with provided parameter map as additional parameters.

Dataset<Row> transform(Dataset<?> dataset, ParamPair<?> firstParamPair, ParamPair<?>... otherParamPairs)
    Transforms the dataset with optional parameters.

Dataset<Row> transform(Dataset<?> dataset, ParamPair<?> firstParamPair, scala.collection.immutable.Seq<ParamPair<?>> otherParamPairs)
    Transforms the dataset with optional parameters.

Methods inherited from class org.apache.spark.ml.PipelineStage
params, transformSchema
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.ml.util.Identifiable
toString, uid
Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
Methods inherited from interface org.apache.spark.ml.param.Params
clear, copyValues, defaultCopy, explainParam, explainParams, extractParamMap, extractParamMap, get, getDefault, getOrDefault, getParam, hasDefault, hasParam, isDefined, isSet, onParamChange, set, set, set, setDefault, setDefault, shouldOwn
Constructor Details

Transformer

public Transformer()
Method Details

copy

public abstract Transformer copy(ParamMap extra)

Description copied from interface: Params

Creates a copy of this instance with the same UID and some extra params. Subclasses should implement this method and set the return type properly. See defaultCopy().

- Specified by:
  copy in interface Params
- Specified by:
  copy in class PipelineStage
- Parameters:
  extra - (undocumented)
- Returns:
  (undocumented)
transform

public Dataset<Row> transform(Dataset<?> dataset, ParamPair<?> firstParamPair, ParamPair<?>... otherParamPairs)

Transforms the dataset with optional parameters.

- Parameters:
  dataset - input dataset
  firstParamPair - the first param pair; overwrites embedded params
  otherParamPairs - other param pairs; overwrite embedded params
- Returns:
  transformed dataset
transform

public Dataset<Row> transform(Dataset<?> dataset, ParamPair<?> firstParamPair, scala.collection.immutable.Seq<ParamPair<?>> otherParamPairs)

Transforms the dataset with optional parameters.

- Parameters:
  dataset - input dataset
  firstParamPair - the first param pair; overwrites embedded params
  otherParamPairs - other param pairs; overwrite embedded params
- Returns:
  transformed dataset
transform

public Dataset<Row> transform(Dataset<?> dataset, ParamMap paramMap)

Transforms the dataset with provided parameter map as additional parameters.

- Parameters:
  dataset - input dataset
  paramMap - additional parameters; overwrite embedded params
- Returns:
  transformed dataset
transform

public abstract Dataset<Row> transform(Dataset<?> dataset)

Transforms the input dataset.

- Parameters:
  dataset - (undocumented)
- Returns:
  (undocumented)