Represents a classification model that predicts to which of a set of categories an example belongs.
Classification model trained using Multinomial/Binary Logistic Regression.
Train a classification model for Multinomial/Binary Logistic Regression using Limited-memory BFGS. Standard feature scaling and L2 regularization are used by default.
Earlier implementations of LogisticRegressionWithLBFGS applied a regularization penalty to all elements, including the intercept. If this is called with one of the standard updaters (L1Updater or SquaredL2Updater), it is translated into a call to ml.LogisticRegression; otherwise it uses the existing mllib GeneralizedLinearAlgorithm trainer, which applies the regularization penalty to the intercept as well.
Labels used in Logistic Regression should be {0, 1, ..., k - 1} for a k-class classification problem.
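A minimal training sketch, assuming a SparkContext is running and `training` is an existing RDD[LabeledPoint] (the three-class setup and feature values here are illustrative, not from the source):

```scala
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.linalg.Vectors

// training: RDD[LabeledPoint] with labels in {0, 1, ..., k - 1}
val lr = new LogisticRegressionWithLBFGS()
lr.setNumClasses(3) // multinomial case: k = 3 classes

val model = lr.run(training)

// Predict the class of a single feature vector
val prediction = model.predict(Vectors.dense(0.1, 2.5, -0.3))
```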
Train a classification model for Binary Logistic Regression using Stochastic Gradient Descent. By default L2 regularization is used, which can be changed via LogisticRegressionWithSGD.optimizer. Using LogisticRegressionWithLBFGS is recommended over this.
Labels used in Logistic Regression should be {0, 1, ..., k - 1} for a k-class classification problem.
Trains a Naive Bayes model given an RDD of (label, features) pairs.
This is the Multinomial NB, which can handle all kinds of discrete data. For example, by converting documents into TF-IDF vectors, it can be used for document classification. By making every vector a 0-1 vector, it can also be used as Bernoulli NB. The input feature values must be nonnegative.
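A short sketch of training and prediction, assuming `training` is an existing RDD[LabeledPoint] with nonnegative feature values (the vector used for prediction is illustrative):

```scala
import org.apache.spark.mllib.classification.NaiveBayes
import org.apache.spark.mllib.linalg.Vectors

// training: RDD[LabeledPoint]; feature values must be nonnegative
// (e.g. term frequencies or TF-IDF weights for document classification)
val model = NaiveBayes.train(training, lambda = 1.0) // lambda: additive smoothing

val prediction = model.predict(Vectors.dense(1.0, 0.0, 3.0))
```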
Model for Naive Bayes Classifiers.
Model for Support Vector Machines (SVMs).
Train a Support Vector Machine (SVM) using Stochastic Gradient Descent. By default L2 regularization is used, which can be changed via SVMWithSGD.optimizer. Labels used in SVM should be {0, 1}.
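A minimal sketch, assuming `training` is an existing RDD[LabeledPoint] with labels in {0, 1}; the iteration count and prediction vector are illustrative:

```scala
import org.apache.spark.mllib.classification.SVMWithSGD
import org.apache.spark.mllib.linalg.Vectors

// training: RDD[LabeledPoint] with labels in {0, 1}
val numIterations = 100
val model = SVMWithSGD.train(training, numIterations)

// Clear the default threshold so predict returns raw margins
// instead of hard 0/1 labels
model.clearThreshold()
val score = model.predict(Vectors.dense(0.5, -1.2))
```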
Train or predict a logistic regression model on streaming data. Training uses Stochastic Gradient Descent to update the model based on each new batch of incoming data from a DStream (see LogisticRegressionWithSGD for the model equation).
Each batch of data is assumed to be an RDD of LabeledPoints. The number of data points per batch can vary, but the number of features must be constant. An initial weight vector must be provided.
Use a builder pattern to construct a streaming logistic regression analysis in an application, like:
val model = new StreamingLogisticRegressionWithSGD()
  .setStepSize(0.5)
  .setNumIterations(10)
  .setInitialWeights(Vectors.dense(...))
  .trainOn(DStream)
Top-level methods for calling naive Bayes.
Top-level methods for calling SVM.
Labels used in SVM should be {0, 1}.
Top-level methods for calling Logistic Regression using Stochastic Gradient Descent.
(Since version 2.0.0) Use ml.classification.LogisticRegression or LogisticRegressionWithLBFGS
Labels used in Logistic Regression should be {0, 1}.
Represents a classification model that predicts to which of a set of categories an example belongs. The categories are represented by double values: 0.0, 1.0, 2.0, etc.