package regression
Type Members
- abstract class GeneralizedLinearAlgorithm[M <: GeneralizedLinearModel] extends Logging with Serializable
GeneralizedLinearAlgorithm implements methods to train a Generalized Linear Model (GLM). This class should be extended with an Optimizer to create a new GLM.
- Annotations
- @Since("0.8.0")
- abstract class GeneralizedLinearModel extends Serializable
GeneralizedLinearModel (GLM) represents a model trained using GeneralizedLinearAlgorithm. GLMs consist of a weight vector and an intercept.
- Annotations
- @Since("0.8.0")
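As a concrete illustration of the weight vector and intercept, the following minimal sketch builds a LinearRegressionModel (a GLM defined further down in this listing) by hand; the numeric values are made up:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LinearRegressionModel

// A GLM is fully described by its weight vector and intercept; for linear
// regression, prediction is the dot product of weights and features plus the intercept.
val glm = new LinearRegressionModel(Vectors.dense(2.0, -1.0), 0.5)

glm.weights                          // [2.0,-1.0]
glm.intercept                        // 0.5
glm.predict(Vectors.dense(1.0, 3.0)) // 2.0*1.0 + (-1.0)*3.0 + 0.5 = -0.5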
- class IsotonicRegression extends Serializable
Isotonic regression. Currently implemented using the parallelized pool adjacent violators algorithm. Only the univariate (single feature) algorithm is supported.
Sequential PAV implementation based on: Grotzinger, S. J., and C. Witzgall. "Projections onto order simplexes." Applied mathematics and Optimization 12.1 (1984): 247-270.
Sequential PAV parallelization based on: Kearsley, Anthony J., Richard A. Tapia, and Michael W. Trosset. "An approach to parallelizing isotonic regression." Applied Mathematics and Parallel Computing. Physica-Verlag HD, 1996. 141-147. Available from here
- Annotations
- @Since("1.3.0")
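A hedged usage sketch for this class: it assumes an existing SparkContext named sc, and the (label, feature, weight) triples are invented for illustration.

import org.apache.spark.mllib.regression.IsotonicRegression

// Input is an RDD of (label, feature, weight) triples.
val data = sc.parallelize(Seq(
  (1.0, 1.0, 1.0), (2.0, 2.0, 1.0), (1.5, 3.0, 1.0), (4.0, 4.0, 1.0)))

// isotonic = true fits a non-decreasing function; false fits a non-increasing one.
val model = new IsotonicRegression().setIsotonic(true).run(data)

// Predict for a single feature value or for an RDD of feature values;
// values falling between training features are interpolated linearly.
model.predict(2.5)
model.predict(sc.parallelize(Seq(0.5, 2.5, 5.0)))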
- class IsotonicRegressionModel extends Serializable with Saveable
Regression model for isotonic regression.
- Annotations
- @Since("1.3.0")
- case class LabeledPoint(label: Double, features: Vector) extends Product with Serializable
Class that represents the features and labels of a data point.
- label
Label for this data point.
- features
List of features for this data point.
- Annotations
- @Since("0.8.0")
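A brief sketch of constructing labeled points with dense and sparse feature vectors:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// Label 1.0 with a dense feature vector.
val dense = LabeledPoint(1.0, Vectors.dense(1.0, 0.0, 3.0))

// The same point using a sparse vector: size 3, non-zero entries at indices 0 and 2.
val sparse = LabeledPoint(1.0, Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0)))

dense.label     // 1.0
sparse.features // (3,[0,2],[1.0,3.0])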
- class LassoModel extends GeneralizedLinearModel with RegressionModel with Serializable with Saveable with PMMLExportable
Regression model trained using Lasso.
- Annotations
- @Since("0.8.0")
- class LassoWithSGD extends GeneralizedLinearAlgorithm[LassoModel] with Serializable
Train a regression model with L1-regularization using Stochastic Gradient Descent. This solves the L1-regularized least squares regression formulation
f(weights) = 1/(2n) ||A weights - y||^2 + regParam ||weights||_1.
Here the data matrix A has n rows, and the input RDD holds the set of rows of A, each with its corresponding right-hand-side label y. See also the documentation for the precise formulation.
- Annotations
- @Since("0.8.0")
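A hedged training sketch for LassoWithSGD; it assumes an existing SparkContext named sc and a Spark version in which this constructor is still public (newer releases restrict it in favour of spark.ml). The toy data and parameter values are illustrative only.

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LassoWithSGD}

val training = sc.parallelize(Seq(
  LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(2.0, Vectors.dense(2.0, 1.0)),
  LabeledPoint(3.0, Vectors.dense(3.0, 0.0))))

val lasso = new LassoWithSGD()
lasso.optimizer
  .setNumIterations(100)
  .setStepSize(0.1)
  .setRegParam(0.01)   // the regParam in the formula above (L1 strength)

val model = lasso.run(training)
model.weights          // L1 regularization tends to drive some weights exactly to zero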
- class LinearRegressionModel extends GeneralizedLinearModel with RegressionModel with Serializable with Saveable with PMMLExportable
Regression model trained using LinearRegression.
- Annotations
- @Since("0.8.0")
- class LinearRegressionWithSGD extends GeneralizedLinearAlgorithm[LinearRegressionModel] with Serializable
Train a linear regression model with no regularization using Stochastic Gradient Descent. This solves the least squares regression formulation
f(weights) = 1/n ||A weights - y||^2
(which is the mean squared error). Here the data matrix A has n rows, and the input RDD holds the set of rows of A, each with its corresponding right-hand-side label y. See also the documentation for the precise formulation.
- Annotations
- @Since("0.8.0")
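A hedged end-to-end sketch for LinearRegressionWithSGD under the same constructor caveat as above; it assumes an existing SparkContext named sc and uses a toy dataset:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LinearRegressionWithSGD}

val training = sc.parallelize(Seq(
  LabeledPoint(1.0, Vectors.dense(1.0)),
  LabeledPoint(2.0, Vectors.dense(2.0)),
  LabeledPoint(3.0, Vectors.dense(3.0))))

val algorithm = new LinearRegressionWithSGD()
algorithm.setIntercept(true)   // also fit an intercept term
algorithm.optimizer
  .setNumIterations(200)
  .setStepSize(0.1)

val model = algorithm.run(training)

// Training-set mean squared error, matching the 1/n ||A weights - y||^2 objective.
val mse = training.map { p =>
  val err = model.predict(p.features) - p.label
  err * err
}.mean()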
- trait RegressionModel extends Serializable
- Annotations
- @Since("0.8.0")
- class RidgeRegressionModel extends GeneralizedLinearModel with RegressionModel with Serializable with Saveable with PMMLExportable
Regression model trained using RidgeRegression.
- Annotations
- @Since("0.8.0")
- class RidgeRegressionWithSGD extends GeneralizedLinearAlgorithm[RidgeRegressionModel] with Serializable
Train a regression model with L2-regularization using Stochastic Gradient Descent. This solves the L2-regularized least squares regression formulation
f(weights) = 1/(2n) ||A weights - y||^2 + regParam/2 ||weights||^2.
Here the data matrix A has n rows, and the input RDD holds the set of rows of A, each with its corresponding right-hand-side label y. See also the documentation for the precise formulation.
- Annotations
- @Since("0.8.0")
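Usage mirrors LassoWithSGD with an L2 penalty instead of L1; a short hedged sketch, again assuming an RDD[LabeledPoint] named training (as built in the sketches above) and a Spark version where the constructor is public:

import org.apache.spark.mllib.regression.RidgeRegressionWithSGD

val ridge = new RidgeRegressionWithSGD()
ridge.optimizer
  .setNumIterations(100)
  .setStepSize(0.1)
  .setRegParam(0.01)   // L2 strength (the regParam/2 ||weights||^2 term above)

val model = ridge.run(training)
(model.weights, model.intercept)   // L2 shrinks weights toward zero without making them sparse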
- abstract class StreamingLinearAlgorithm[M <: GeneralizedLinearModel, A <: GeneralizedLinearAlgorithm[M]] extends Logging
StreamingLinearAlgorithm implements methods for continuously training a generalized linear model on streaming data, and using it for prediction on (possibly different) streaming data.
This class takes as type parameters a GeneralizedLinearModel, and a GeneralizedLinearAlgorithm, making it easy to extend to construct streaming versions of any analyses using GLMs. Initial weights must be set before calling trainOn or predictOn. Only weights will be updated, not an intercept. If the model needs an intercept, it should be manually appended to the input data.
For example usage, see StreamingLinearRegressionWithSGD.
NOTE: In some use cases, the order in which trainOn and predictOn are called in an application will affect the results. When called on the same DStream, if trainOn is called before predictOn, then when new data arrive the model will update and the prediction will be based on the new model; whereas if predictOn is called first, the prediction will use the model from the previous update.
NOTE: It is ok to call predictOn repeatedly on multiple streams; this will generate predictions for each one all using the current model. It is also ok to call trainOn on different streams; this will update the model using each of the different sources, in sequence.
- Annotations
- @Since("1.1.0")
- class StreamingLinearRegressionWithSGD extends StreamingLinearAlgorithm[LinearRegressionModel, LinearRegressionWithSGD] with Serializable
Train or predict a linear regression model on streaming data. Training uses Stochastic Gradient Descent to update the model based on each new batch of incoming data from a DStream (see LinearRegressionWithSGD for the model equation). Each batch of data is assumed to be an RDD of LabeledPoints. The number of data points per batch can vary, but the number of features must be constant. An initial weight vector must be provided.
Use a builder pattern to construct a streaming linear regression analysis in an application, like:
val model = new StreamingLinearRegressionWithSGD()
  .setStepSize(0.5)
  .setNumIterations(10)
  .setInitialWeights(Vectors.dense(...))
  .trainOn(DStream)
- Annotations
- @Since("1.1.0")
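Extending the builder sketch above with prediction: trainOn consumes the labeled training stream, while predictOnValues scores a keyed feature stream with the most recently updated model. The StreamingContext and the two DStreams are assumed to be defined elsewhere in the application:

import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.mllib.regression.{LabeledPoint, StreamingLinearRegressionWithSGD}
import org.apache.spark.streaming.dstream.DStream

// Assumed to exist: a labeled training stream and a (key, features) test stream.
val trainingStream: DStream[LabeledPoint] = ???
val testStream: DStream[(Long, Vector)] = ???

val model = new StreamingLinearRegressionWithSGD()
  .setStepSize(0.5)
  .setNumIterations(10)
  .setInitialWeights(Vectors.dense(0.0, 0.0))   // must match the feature dimension

// Update the model on each training batch, then score incoming test batches.
model.trainOn(trainingStream)
model.predictOnValues(testStream).print()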
Value Members
- object IsotonicRegressionModel extends Loader[IsotonicRegressionModel] with Serializable
- Annotations
- @Since("1.4.0")
- object LabeledPoint extends Serializable
Parser for org.apache.spark.mllib.regression.LabeledPoint.
- Annotations
- @Since("1.1.0")
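A one-line sketch of the parser; the string format shown is the one produced by LabeledPoint.toString (label followed by the feature vector):

import org.apache.spark.mllib.regression.LabeledPoint

// Parse a labeled point back from its string representation.
val p = LabeledPoint.parse("(1.0,[1.0,0.0,3.0])")
p.label     // 1.0
p.features  // [1.0,0.0,3.0]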
- object LassoModel extends Loader[LassoModel] with Serializable
- Annotations
- @Since("1.3.0")
- object LinearRegressionModel extends Loader[LinearRegressionModel] with Serializable
- Annotations
- @Since("1.3.0")
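These Loader companions pair with the models' Saveable side. A hedged round-trip sketch, assuming a SparkContext named sc, a model trained as in the sketches above, and a purely illustrative path:

import org.apache.spark.mllib.regression.LinearRegressionModel

// Persist a trained model, then load it back later (or from another application).
model.save(sc, "hdfs:///models/linear-regression")
val reloaded = LinearRegressionModel.load(sc, "hdfs:///models/linear-regression")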
- object RidgeRegressionModel extends Loader[RidgeRegressionModel] with Serializable
- Annotations
- @Since("1.3.0")