Packages

  • package root
    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower level interfaces. These are subject to change or removal in minor releases.

    Definition Classes
    apache
  • package mllib

    RDD-based machine learning APIs (in maintenance mode).

    The spark.mllib package is in maintenance mode as of the Spark 2.0.0 release to encourage migration to the DataFrame-based APIs under the org.apache.spark.ml package. While in maintenance mode,

    • no new features in the RDD-based spark.mllib package will be accepted, unless they block implementing new features in the DataFrame-based spark.ml package;
    • bug fixes in the RDD-based APIs will still be accepted.

    The developers will continue adding more features to the DataFrame-based APIs in the 2.x series to reach feature parity with the RDD-based APIs. Once feature parity is reached, this package will be deprecated.

    Definition Classes
    spark
    See also

    SPARK-4591 to track the progress of feature parity

  • package classification
    Definition Classes
    mllib
  • ClassificationModel
  • LogisticRegressionModel
  • LogisticRegressionWithLBFGS
  • LogisticRegressionWithSGD
  • NaiveBayes
  • NaiveBayesModel
  • SVMModel
  • SVMWithSGD
  • StreamingLogisticRegressionWithSGD

class SVMModel extends GeneralizedLinearModel with ClassificationModel with Serializable with Saveable with PMMLExportable

Model for Support Vector Machines (SVMs).

Annotations
@Since("0.8.0")
Source
SVM.scala
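
A brief usage sketch (not part of the generated listing): it trains an SVMModel with SVMWithSGD from this package and classifies a point. The local master, application name, toy data, and iteration count are illustrative assumptions.

  import org.apache.spark.SparkContext
  import org.apache.spark.mllib.classification.{SVMModel, SVMWithSGD}
  import org.apache.spark.mllib.linalg.Vectors
  import org.apache.spark.mllib.regression.LabeledPoint

  // Local context and toy data, purely illustrative.
  val sc = new SparkContext("local[2]", "svm-example")
  val training = sc.parallelize(Seq(
    LabeledPoint(1.0, Vectors.dense(1.0, 2.0)),
    LabeledPoint(0.0, Vectors.dense(-1.0, -2.0))
  ))

  // Train an SVM; 100 iterations is an arbitrary choice for this sketch.
  val model: SVMModel = SVMWithSGD.train(training, 100)

  // With the default threshold of 0.0, predict returns 0.0 or 1.0 labels.
  val label = model.predict(Vectors.dense(0.5, 1.5))
  sc.stop()
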
Linear Supertypes
  1. PMMLExportable
  2. Saveable
  3. ClassificationModel
  4. GeneralizedLinearModel
  5. Serializable
  6. AnyRef
  7. Any

Instance Constructors

  1. new SVMModel(weights: Vector, intercept: Double)

    weights

    Weights computed for every feature.

    intercept

    Intercept computed for this model.

    Annotations
    @Since("1.1.0")

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clearThreshold(): SVMModel.this.type

    Clears the threshold so that predict will output raw prediction scores (see the usage sketch after this member list).

    Annotations
    @Since("1.0.0")
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  9. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  10. def getThreshold: Option[Double]

    Returns the threshold (if any) used for converting raw prediction scores into 0/1 predictions.

    Annotations
    @Since("1.3.0")
  11. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  12. val intercept: Double
    Definition Classes
    SVMModel → GeneralizedLinearModel
    Annotations
    @Since("0.8.0")
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  17. def predict(testData: JavaRDD[Vector]): JavaRDD[Double]

    Predict values for examples stored in a JavaRDD.

    testData

    JavaRDD representing data points to be predicted

    returns

    a JavaRDD[java.lang.Double] where each entry contains the corresponding prediction

    Definition Classes
    ClassificationModel
    Annotations
    @Since("1.0.0")
  18. def predict(testData: Vector): Double

    Predict values for a single data point using the model trained.

    testData

    vector representing a single data point

    returns

    Double prediction from the trained model

    Definition Classes
    GeneralizedLinearModel
    Annotations
    @Since("1.0.0")
  19. def predict(testData: RDD[Vector]): RDD[Double]

    Predict values for the given data set using the model trained.

    testData

    RDD representing data points to be predicted

    returns

    RDD[Double] where each entry contains the corresponding prediction

    Definition Classes
    GeneralizedLinearModel
    Annotations
    @Since("1.0.0")
  20. def predictPoint(dataMatrix: Vector, weightMatrix: Vector, intercept: Double): Double

    Predict the result given a data point and the weights learned.

    dataMatrix

    Row vector containing the features for this data point

    weightMatrix

    Column vector containing the weights of the model

    intercept

    Intercept of the model.

    Attributes
    protected
    Definition Classes
    SVMModel → GeneralizedLinearModel
  21. def save(sc: SparkContext, path: String): Unit

    Save this model to the given path (see the usage sketch after this member list).

    This saves:

    • human-readable (JSON) model metadata to path/metadata/
    • Parquet formatted data to path/data/

    The model may be loaded using Loader.load.

    sc

    Spark context used to save model data.

    path

    Path specifying the directory in which to save this model. If the directory already exists, this method throws an exception.

    Definition Classes
    SVMModel → Saveable
    Annotations
    @Since("1.3.0")
  22. def setThreshold(threshold: Double): SVMModel.this.type

    Sets the threshold that separates positive predictions from negative predictions. An example with a prediction score greater than or equal to this threshold is identified as positive, and as negative otherwise. The default value is 0.0.

    Annotations
    @Since("1.0.0")
  23. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  24. def toPMML(): String

    Export the model to a String in PMML format

    Definition Classes
    PMMLExportable
    Annotations
    @Since("1.4.0")
  25. def toPMML(outputStream: OutputStream): Unit

    Export the model to the OutputStream in PMML format

    Definition Classes
    PMMLExportable
    Annotations
    @Since("1.4.0")
  26. def toPMML(sc: SparkContext, path: String): Unit

    Export the model to a directory on a distributed file system in PMML format

    Definition Classes
    PMMLExportable
    Annotations
    @Since("1.4.0")
  27. def toPMML(localPath: String): Unit

    Export the model to a local file in PMML format

    Definition Classes
    PMMLExportable
    Annotations
    @Since("1.4.0")
  28. def toString(): String

    Return a summary of the model as a String.

    Definition Classes
    SVMModel → GeneralizedLinearModel → AnyRef → Any
  29. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  30. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  31. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  32. val weights: Vector
    Definition Classes
    SVMModel → GeneralizedLinearModel
    Annotations
    @Since("1.0.0")

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)

Inherited from PMMLExportable

Inherited from Saveable

Inherited from ClassificationModel

Inherited from GeneralizedLinearModel

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
