object AbsoluteError extends Loss

Class for absolute (L1) error loss calculation (for regression). The absolute error is defined as |y - F(x)|, where y is the label and F(x) is the model prediction for features x.

- Annotations
- @Since("1.2.0")
- Source
- AbsoluteError.scala
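The per-point loss above can be sketched as a plain Scala function. This is an illustrative stand-in, not the Spark source; the name `pointLoss` is hypothetical.

```scala
// Illustrative sketch of the absolute (L1) error for a single point.
// `label` is y and `prediction` is F(x); names are hypothetical, not Spark's API.
object AbsoluteErrorSketch {
  def pointLoss(prediction: Double, label: Double): Double =
    math.abs(label - prediction)

  def main(args: Array[String]): Unit = {
    println(pointLoss(2.5, 3.0)) // |3.0 - 2.5| = 0.5
  }
}
```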
- By Inheritance
- AbsoluteError
- Loss
- Serializable (scala.Serializable)
- Serializable (java.io.Serializable)
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- def computeError(model: TreeEnsembleModel, data: RDD[LabeledPoint]): Double
  Method to calculate the error of the base learner for the gradient boosting calculation.
  - model
    Model of the weak learner.
  - data
    Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint.
  - returns
    Measure of model error on data.
  - Definition Classes: Loss
  - Annotations: @Since("1.2.0")
  - Note
    This method is not used by the gradient boosting algorithm but is useful for debugging purposes.
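The error aggregation can be sketched as follows, with a plain Scala `Seq` standing in for `RDD[LabeledPoint]` and a `Double => Double` function standing in for the tree ensemble model. This assumes the error measure is the mean of the per-point absolute errors; names here are hypothetical.

```scala
// Sketch of computeError-style aggregation for the absolute error loss.
// A Seq of (features, label) pairs stands in for RDD[LabeledPoint], and a
// plain function stands in for TreeEnsembleModel. Assumes the reported
// error is the mean absolute error over the dataset.
object ComputeErrorSketch {
  def computeError(predict: Double => Double,
                   data: Seq[(Double, Double)]): Double = {
    val errors = data.map { case (x, y) => math.abs(y - predict(x)) }
    errors.sum / errors.size
  }

  def main(args: Array[String]): Unit = {
    val model: Double => Double = x => 2.0 * x // hypothetical weak learner
    val data = Seq((1.0, 2.5), (2.0, 3.0))     // (x, y) pairs
    println(computeError(model, data))          // mean of |2.5-2.0| and |3.0-4.0| = 0.75
  }
}
```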
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def gradient(prediction: Double, label: Double): Double
  Method to calculate the gradient for the gradient boosting calculation for least absolute error. The gradient with respect to F(x) is: sign(F(x) - y).
  - prediction
    Predicted label.
  - label
    True label.
  - returns
    Loss gradient.
  - Definition Classes: AbsoluteError → Loss
  - Annotations: @Since("1.2.0")
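The documented gradient formula, sign(F(x) - y), can be sketched as a standalone Scala function. This mirrors the formula above but is not the Spark source itself; note that the tie case prediction == label is assumed here to yield -1.0.

```scala
// Sketch of the L1 loss gradient with respect to the prediction F(x):
// sign(F(x) - y). Mirrors the documented formula; the tie-breaking choice
// at prediction == label (returning -1.0) is an assumption of this sketch.
object GradientSketch {
  def gradient(prediction: Double, label: Double): Double =
    if (label - prediction < 0) 1.0 else -1.0

  def main(args: Array[String]): Unit = {
    println(gradient(3.0, 2.0)) // prediction above label -> 1.0
    println(gradient(1.0, 2.0)) // prediction below label -> -1.0
  }
}
```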
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()