org.apache.spark.mllib.tree.configuration

BoostingStrategy

case class BoostingStrategy(treeStrategy: Strategy, loss: Loss, numIterations: Int = 100, learningRate: Double = 0.1, validationTol: Double = 1.0E-5) extends Serializable with Product

:: Experimental :: Configuration options for org.apache.spark.mllib.tree.GradientBoostedTrees.

treeStrategy

Parameters for the tree algorithm. We support regression and binary classification for boosting. The impurity setting is ignored.

loss

Loss function used for minimization during gradient boosting.

numIterations

Number of iterations of boosting. In other words, the number of weak hypotheses used in the final model.

learningRate

Learning rate for shrinking the contribution of each estimator. The learning rate should be in the interval (0, 1].

validationTol

Used only by runWithValidation: if the change in error on the validation input between two iterations is less than validationTol, boosting stops early. Ignored when org.apache.spark.mllib.tree.GradientBoostedTrees.run() is used.
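As a rough illustration of how numIterations, learningRate, and validationTol interact, here is a toy boosting loop in plain Scala. This is a sketch only, not Spark API: the constant-correction "weak hypothesis" and the `fit` method are hypothetical stand-ins, and the absolute error plays the role of the validation error.

```scala
// Illustrative sketch only (no Spark): a toy boosting loop on a single scalar target.
object BoostingSketch {
  // Returns the final prediction and the number of iterations actually run.
  def fit(target: Double,
          numIterations: Int = 100,
          learningRate: Double = 0.1,
          validationTol: Double = 1e-5): (Double, Int) = {
    var prediction = 0.0
    var prevError = Double.MaxValue
    var iters = 0
    while (iters < numIterations) {
      // The "weak hypothesis" here is just the residual (gradient step for squared loss).
      val weakHypothesis = target - prediction
      // learningRate shrinks each estimator's contribution to the ensemble.
      prediction += learningRate * weakHypothesis
      val error = math.abs(target - prediction) // stand-in for validation error
      iters += 1
      // runWithValidation-style early stopping: halt once the error improves
      // by less than validationTol between two consecutive iterations.
      if (prevError - error < validationTol) return (prediction, iters)
      prevError = error
    }
    (prediction, iters)
  }
}
```

With the defaults above, the loop stops before exhausting numIterations because the per-iteration improvement shrinks geometrically and eventually falls below validationTol.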

Annotations
@Since( "1.2.0" ) @Experimental()
Linear Supertypes
Product, Equals, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new BoostingStrategy(treeStrategy: Strategy, loss: Loss, numIterations: Int = 100, learningRate: Double = 0.1, validationTol: Double = 1.0E-5)


    Annotations
    @Since( "1.4.0" )

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  11. def getLearningRate(): Double

    Annotations
    @Since( "1.2.0" )
  12. def getLoss(): Loss

    Annotations
    @Since( "1.2.0" )
  13. def getNumIterations(): Int

    Annotations
    @Since( "1.2.0" )
  14. def getTreeStrategy(): Strategy

    Annotations
    @Since( "1.2.0" )
  15. def getValidationTol(): Double

    Annotations
    @Since( "1.4.0" )
  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. var learningRate: Double

    Learning rate for shrinking the contribution of each estimator. The learning rate should be in the interval (0, 1].

    Annotations
    @Since( "1.2.0" )
  18. var loss: Loss

    Loss function used for minimization during gradient boosting.

    Annotations
    @Since( "1.2.0" )
  19. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  20. final def notify(): Unit

    Definition Classes
    AnyRef
  21. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  22. var numIterations: Int

    Number of iterations of boosting. In other words, the number of weak hypotheses used in the final model.

    Annotations
    @Since( "1.2.0" )
  23. def setLearningRate(arg0: Double): Unit

    Annotations
    @Since( "1.2.0" )
  24. def setLoss(arg0: Loss): Unit

    Annotations
    @Since( "1.2.0" )
  25. def setNumIterations(arg0: Int): Unit

    Annotations
    @Since( "1.2.0" )
  26. def setTreeStrategy(arg0: Strategy): Unit

    Annotations
    @Since( "1.2.0" )
  27. def setValidationTol(arg0: Double): Unit

    Annotations
    @Since( "1.4.0" )
  28. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  29. var treeStrategy: Strategy

    Parameters for the tree algorithm. We support regression and binary classification for boosting. The impurity setting is ignored.

    Annotations
    @Since( "1.2.0" )
  30. var validationTol: Double

    Used only by runWithValidation: if the change in error on the validation input between two iterations is less than validationTol, boosting stops early. Ignored when org.apache.spark.mllib.tree.GradientBoostedTrees.run() is used.

    Annotations
    @Since( "1.4.0" )
  31. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
