object SVMWithSGD extends Serializable
Top-level methods for calling SVM.
- Annotations
- @Since("0.8.0")
- Source
- SVM.scala
- Note
Labels used in SVM should be {0, 1}.
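Every train overload below operates on an RDD[LabeledPoint] whose labels are 0.0 or 1.0. A minimal data-preparation sketch, assuming an existing SparkContext named sc and purely illustrative toy values:

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// Toy training set: labels must be 0.0 or 1.0 for SVM.
// `sc` is assumed to be an already-constructed SparkContext.
val data = sc.parallelize(Seq(
  LabeledPoint(1.0, Vectors.dense(1.0, 2.0)),
  LabeledPoint(1.0, Vectors.dense(2.0, 3.0)),
  LabeledPoint(0.0, Vectors.dense(-1.0, -2.0)),
  LabeledPoint(0.0, Vectors.dense(-2.0, -1.0))
)).cache()
```

The data RDD defined here is reused in the sketches attached to the individual train overloads below.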
- By Inheritance
- SVMWithSGD
- Serializable
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- def train(input: RDD[LabeledPoint], numIterations: Int): SVMModel
Train an SVM model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using a step size of 1.0, and use the entire data set to update the gradient in each iteration. (See the usage sketch after this entry.)
- input
RDD of (label, array of features) pairs.
- numIterations
Number of iterations of gradient descent to run.
- returns
an SVMModel which has the weights and intercept from training.
- Annotations
- @Since("0.8.0")
- Note
Labels used in SVM should be {0, 1}.
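A minimal sketch of this overload, reusing the data RDD from the sketch at the top of this page (sc, data, and the query point are assumptions, not part of the API):

```scala
import org.apache.spark.mllib.classification.{SVMModel, SVMWithSGD}
import org.apache.spark.mllib.linalg.Vectors

// 100 iterations of full-batch gradient descent with the default step size of 1.0.
val model: SVMModel = SVMWithSGD.train(data, 100)

// With the model's default threshold of 0.0, predict returns the class label 0.0 or 1.0.
val prediction = model.predict(Vectors.dense(0.5, 1.5))
```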
- def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, regParam: Double): SVMModel
Train an SVM model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size, and use the entire data set to update the gradient in each iteration. (See the usage sketch after this entry.)
- input
RDD of (label, array of features) pairs.
- numIterations
Number of iterations of gradient descent to run.
- stepSize
Step size to be used for each iteration of gradient descent.
- regParam
Regularization parameter.
- returns
an SVMModel which has the weights and intercept from training.
- Annotations
- @Since("0.8.0")
- Note
Labels used in SVM should be {0, 1}.
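A sketch of this overload with explicit optimization settings (the concrete values are illustrative assumptions; data is the RDD from the first sketch):

```scala
import org.apache.spark.mllib.classification.SVMWithSGD

// 200 iterations, step size 0.1, regularization parameter 0.01.
// The full data set is still used for every gradient update.
val model = SVMWithSGD.train(data, 200, 0.1, 0.01)
```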
- def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, regParam: Double, miniBatchFraction: Double): SVMModel
Train an SVM model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses a miniBatchFraction fraction of the data to calculate the gradient. (See the usage sketch after this entry.)
- input
RDD of (label, array of features) pairs.
- numIterations
Number of iterations of gradient descent to run.
- stepSize
Step size to be used for each iteration of gradient descent.
- regParam
Regularization parameter.
- miniBatchFraction
Fraction of data to be used per iteration.
- Annotations
- @Since("0.8.0")
- Note
Labels used in SVM should be {0, 1}.
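A sketch of the mini-batch variant (values are illustrative assumptions; data is the RDD from the first sketch):

```scala
import org.apache.spark.mllib.classification.SVMWithSGD

// As above, but each iteration samples roughly 10% of `data` to estimate the gradient.
val model = SVMWithSGD.train(data, 200, 0.1, 0.01, 0.1)
```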
- def train(input: RDD[LabeledPoint], numIterations: Int, stepSize: Double, regParam: Double, miniBatchFraction: Double, initialWeights: Vector): SVMModel
Train an SVM model given an RDD of (label, features) pairs. We run a fixed number of iterations of gradient descent using the specified step size. Each iteration uses a miniBatchFraction fraction of the data to calculate the gradient, and the weights used in gradient descent are initialized using the initial weights provided. (See the usage sketch after this entry.)
- input
RDD of (label, array of features) pairs.
- numIterations
Number of iterations of gradient descent to run.
- stepSize
Step size to be used for each iteration of gradient descent.
- regParam
Regularization parameter.
- miniBatchFraction
Fraction of data to be used per iteration.
- initialWeights
Initial set of weights to be used. Array should be equal in size to the number of features in the data.
- Annotations
- @Since("0.8.0")
- Note
Labels used in SVM should be {0, 1}.
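A sketch of warm-starting from explicit initial weights (the zero vector and the other values are illustrative assumptions; data is the RDD from the first sketch):

```scala
import org.apache.spark.mllib.classification.SVMWithSGD
import org.apache.spark.mllib.linalg.Vectors

// The initial weight vector must have one entry per feature (2 in the toy data).
val initialWeights = Vectors.dense(0.0, 0.0)
val model = SVMWithSGD.train(data, 200, 0.1, 0.01, 0.1, initialWeights)
```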
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)