case class BoostingStrategy(treeStrategy: Strategy, loss: Loss, numIterations: Int = 100, learningRate: Double = 0.1, validationTol: Double = 0.001) extends Serializable with Product
Configuration options for org.apache.spark.mllib.tree.GradientBoostedTrees.
- treeStrategy
Parameters for the tree algorithm. We support regression and binary classification for boosting. The impurity setting is ignored.
- loss
Loss function used for minimization during gradient boosting.
- numIterations
Number of iterations of boosting; in other words, the number of weak hypotheses used in the final model.
- learningRate
Learning rate for shrinking the contribution of each estimator. The learning rate must be in the interval (0, 1].
- validationTol
Threshold that decides when boosting terminates early if runWithValidation is used. Termination is decided as follows: if the current loss on the validation set is greater than 0.01, the change in validation error is compared against a relative tolerance of validationTol * (current loss on the validation set); if the current loss on the validation set is less than or equal to 0.01, the change in validation error is compared against an absolute tolerance of validationTol * 0.01. Ignored when org.apache.spark.mllib.tree.GradientBoostedTrees.run() is used.
- Annotations: @Since("1.2.0")
- Source: BoostingStrategy.scala
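A minimal usage sketch, not part of the generated API doc: it starts from the classification defaults, tunes a few of the fields documented above, and trains a model with GradientBoostedTrees.train. It assumes a spark-shell session, so `sc` is the pre-built SparkContext, and the toy RDD stands in for real training data.

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy

// Toy training data; real code would typically load a dataset, e.g. with MLUtils.loadLibSVMFile.
val trainingData = sc.parallelize(Seq(
  LabeledPoint(0.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(0.0, Vectors.dense(0.1, 0.9)),
  LabeledPoint(1.0, Vectors.dense(0.9, 0.2))
))

// Start from the default parameters for binary classification, then tune them.
val boostingStrategy = BoostingStrategy.defaultParams("Classification")
boostingStrategy.numIterations = 10         // number of weak hypotheses (trees)
boostingStrategy.learningRate = 0.1         // shrinkage, must lie in (0, 1]
boostingStrategy.treeStrategy.maxDepth = 3  // depth of each weak learner

val model = GradientBoostedTrees.train(trainingData, boostingStrategy)
println(s"Learned a GBT ensemble with ${model.numTrees} trees")
```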
Inheritance
- BoostingStrategy → Product → Equals → Serializable → Serializable → AnyRef → Any
Instance Constructors
- new BoostingStrategy(treeStrategy: Strategy, loss: Loss, numIterations: Int = 100, learningRate: Double = 0.1, validationTol: Double = 0.001)
- treeStrategy
Parameters for the tree algorithm. We support regression and binary classification for boosting. The impurity setting is ignored.
- loss
Loss function used for minimization during gradient boosting.
- numIterations
Number of iterations of boosting; in other words, the number of weak hypotheses used in the final model.
- learningRate
Learning rate for shrinking the contribution of each estimator. The learning rate must be in the interval (0, 1].
- validationTol
Threshold that decides when boosting terminates early if runWithValidation is used. Termination is decided as follows: if the current loss on the validation set is greater than 0.01, the change in validation error is compared against a relative tolerance of validationTol * (current loss on the validation set); if the current loss on the validation set is less than or equal to 0.01, the change in validation error is compared against an absolute tolerance of validationTol * 0.01. Ignored when org.apache.spark.mllib.tree.GradientBoostedTrees.run() is used; only runWithValidation applies it, as sketched below.
- Annotations: @Since("1.4.0")
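A brief sketch of where validationTol enters the picture, again assuming a spark-shell session providing `sc`: runWithValidation takes a separate validation RDD and stops boosting early once the improvement in validation error falls below the tolerance described above, whereas plain run() ignores validationTol.

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.tree.GradientBoostedTrees
import org.apache.spark.mllib.tree.configuration.BoostingStrategy

// Toy training and validation sets; real code would split a larger dataset.
val trainingData = sc.parallelize(Seq(
  LabeledPoint(0.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(0.0, Vectors.dense(0.2, 0.8)),
  LabeledPoint(1.0, Vectors.dense(0.8, 0.1))
))
val validationData = sc.parallelize(Seq(
  LabeledPoint(0.0, Vectors.dense(0.1, 0.95)),
  LabeledPoint(1.0, Vectors.dense(0.95, 0.05))
))

val boostingStrategy = BoostingStrategy.defaultParams("Classification")
boostingStrategy.numIterations = 200     // upper bound; early stopping may use fewer trees
boostingStrategy.validationTol = 0.001   // tolerance used by the stopping rule above

// runWithValidation applies the validationTol-based stopping rule; run() does not.
val model = new GradientBoostedTrees(boostingStrategy)
  .runWithValidation(trainingData, validationData)
```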
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @IntrinsicCandidate()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @IntrinsicCandidate()
- def getLearningRate(): Double
  - Annotations: @Since("1.2.0")
- def getLoss(): Loss
  - Annotations: @Since("1.2.0")
- def getNumIterations(): Int
  - Annotations: @Since("1.2.0")
- def getTreeStrategy(): Strategy
  - Annotations: @Since("1.2.0")
- def getValidationTol(): Double
  - Annotations: @Since("1.4.0")
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- var learningRate: Double
  - Annotations: @Since("1.2.0")
- var loss: Loss
  - Annotations: @Since("1.2.0")
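The loss var can be reassigned to any of the Loss implementations shipped with MLlib (LogLoss for classification, SquaredError and AbsoluteError for regression). A minimal sketch, starting from the regression defaults:

```scala
import org.apache.spark.mllib.tree.configuration.BoostingStrategy
import org.apache.spark.mllib.tree.loss.AbsoluteError

val strategy = BoostingStrategy.defaultParams("Regression")
strategy.loss = AbsoluteError   // replace the default SquaredError with the L1 loss
```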
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @IntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @IntrinsicCandidate()
- var numIterations: Int
  - Annotations: @Since("1.2.0")
- def setLearningRate(arg0: Double): Unit
  - Annotations: @Since("1.2.0")
- def setLoss(arg0: Loss): Unit
  - Annotations: @Since("1.2.0")
- def setNumIterations(arg0: Int): Unit
  - Annotations: @Since("1.2.0")
- def setTreeStrategy(arg0: Strategy): Unit
  - Annotations: @Since("1.2.0")
- def setValidationTol(arg0: Double): Unit
  - Annotations: @Since("1.4.0")
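The get*/set* methods above are the Java-bean style accessors generated for the case-class fields; they are mainly convenient from Java, while Scala code usually reads and writes the vars directly. A small sketch:

```scala
import org.apache.spark.mllib.tree.configuration.BoostingStrategy

val strategy = BoostingStrategy.defaultParams("Regression")
strategy.setNumIterations(50)
strategy.setLearningRate(0.05)
println(strategy.getNumIterations())  // 50
println(strategy.getLoss())           // prints the Loss object; SquaredError is the regression default
```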
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- var treeStrategy: Strategy
  - Annotations: @Since("1.2.0")
- var validationTol: Double
  - Annotations: @Since("1.4.0")
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated
  - Deprecated