Object org.apache.spark.mllib.tree.RandomForest

object RandomForest extends Serializable with Logging

Annotations
@Since( "1.2.0" )
Source
RandomForest.scala
Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  12. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  13. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  14. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  15. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  16. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  17. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  18. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  19. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  20. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  21. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  22. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  23. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  24. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  25. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  26. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  27. final def notify(): Unit

    Definition Classes
    AnyRef
  28. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  29. val supportedFeatureSubsetStrategies: Array[String]

    List of supported feature subset sampling strategies.

    Annotations
    @Since( "1.2.0" )
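
    Example
    A minimal sketch, assuming a standard Spark setup; it simply prints the accepted strategy names before one of them is passed as featureSubsetStrategy:

      import org.apache.spark.mllib.tree.RandomForest

      // Prints each supported name on its own line: auto, all, sqrt, log2, onethird.
      RandomForest.supportedFeatureSubsetStrategies.foreach(println)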
  30. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  31. def toString(): String

    Definition Classes
    AnyRef → Any
  32. def trainClassifier(input: JavaRDD[LabeledPoint], numClasses: Int, categoricalFeaturesInfo: Map[Integer, Integer], numTrees: Int, featureSubsetStrategy: String, impurity: String, maxDepth: Int, maxBins: Int, seed: Int): RandomForestModel

    Java-friendly API for org.apache.spark.mllib.tree.RandomForest.trainClassifier.

    Annotations
    @Since( "1.2.0" )
  33. def trainClassifier(input: RDD[LabeledPoint], numClasses: Int, categoricalFeaturesInfo: Map[Int, Int], numTrees: Int, featureSubsetStrategy: String, impurity: String, maxDepth: Int, maxBins: Int, seed: Int = Utils.random.nextInt()): RandomForestModel

    Method to train a random forest model for binary or multiclass classification.

    input

    Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint. Labels should take values {0, 1, ..., numClasses-1}.

    numClasses

    Number of classes for classification.

    categoricalFeaturesInfo

    Map storing arity of categorical features. An entry (n to k) indicates that feature n is categorical with k categories indexed from 0: {0, 1, ..., k-1}.

    numTrees

    Number of trees in the random forest.

    featureSubsetStrategy

    Number of features to consider for splits at each node. Supported values: "auto", "all", "sqrt", "log2", "onethird". If "auto" is set, this parameter is set based on numTrees: if numTrees == 1, it is set to "all"; if numTrees > 1 (forest), it is set to "sqrt".

    impurity

    Criterion used for information gain calculation. Supported values: "gini" (recommended) or "entropy".

    maxDepth

    Maximum depth of the tree (e.g. depth 0 means 1 leaf node, depth 1 means 1 internal node + 2 leaf nodes). (suggested value: 4)

    maxBins

    Maximum number of bins used for splitting features (suggested value: 100)

    seed

    Random seed for bootstrapping and choosing feature subsets.

    returns

    RandomForestModel that can be used for prediction.

    Annotations
    @Since( "1.2.0" )
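
    Example
    A minimal usage sketch, assuming a running SparkContext sc and the sample LIBSVM file shipped with Spark; all parameter values are illustrative:

      import org.apache.spark.mllib.tree.RandomForest
      import org.apache.spark.mllib.util.MLUtils

      // Load a labeled dataset and hold out a test split (adjust the path as needed).
      val data = MLUtils.loadLibSVMFile(sc, "data/mllib/sample_libsvm_data.txt")
      val Array(training, test) = data.randomSplit(Array(0.7, 0.3))

      val model = RandomForest.trainClassifier(
        input = training,
        numClasses = 2,
        categoricalFeaturesInfo = Map[Int, Int](),  // all features treated as continuous
        numTrees = 10,
        featureSubsetStrategy = "auto",             // resolves to "sqrt" since numTrees > 1
        impurity = "gini",
        maxDepth = 4,
        maxBins = 100)

      // Fraction of correctly classified test points.
      val accuracy = test
        .map(p => if (model.predict(p.features) == p.label) 1.0 else 0.0)
        .mean()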
  34. def trainClassifier(input: RDD[LabeledPoint], strategy: Strategy, numTrees: Int, featureSubsetStrategy: String, seed: Int): RandomForestModel

    Method to train a random forest model for binary or multiclass classification.

    input

    Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint. Labels should take values {0, 1, ..., numClasses-1}.

    strategy

    Parameters for training each tree in the forest.

    numTrees

    Number of trees in the random forest.

    featureSubsetStrategy

    Number of features to consider for splits at each node. Supported values: "auto", "all", "sqrt", "log2", "onethird". If "auto" is set, this parameter is set based on numTrees: if numTrees == 1, it is set to "all"; if numTrees > 1 (forest), it is set to "sqrt".

    seed

    Random seed for bootstrapping and choosing feature subsets.

    returns

    RandomForestModel that can be used for prediction.

    Annotations
    @Since( "1.2.0" )
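
    Example
    A minimal sketch, assuming trainingData is an existing RDD[LabeledPoint]; the Strategy settings are illustrative:

      import org.apache.spark.mllib.tree.RandomForest
      import org.apache.spark.mllib.tree.configuration.Strategy

      // Start from the default classification strategy and adjust per-tree settings.
      val strategy = Strategy.defaultStrategy("Classification")
      strategy.numClasses = 3
      strategy.maxDepth = 5
      strategy.categoricalFeaturesInfo = Map(0 -> 4)  // feature 0 is categorical with 4 categories

      val model = RandomForest.trainClassifier(
        trainingData, strategy, numTrees = 20, featureSubsetStrategy = "sqrt", seed = 42)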
  35. def trainRegressor(input: JavaRDD[LabeledPoint], categoricalFeaturesInfo: Map[Integer, Integer], numTrees: Int, featureSubsetStrategy: String, impurity: String, maxDepth: Int, maxBins: Int, seed: Int): RandomForestModel

    Java-friendly API for org.apache.spark.mllib.tree.RandomForest.trainRegressor.

    Annotations
    @Since( "1.2.0" )
  36. def trainRegressor(input: RDD[LabeledPoint], categoricalFeaturesInfo: Map[Int, Int], numTrees: Int, featureSubsetStrategy: String, impurity: String, maxDepth: Int, maxBins: Int, seed: Int = Utils.random.nextInt()): RandomForestModel

    Method to train a random forest model for regression.

    input

    Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint. Labels are real numbers.

    categoricalFeaturesInfo

    Map storing arity of categorical features. An entry (n to k) indicates that feature n is categorical with k categories indexed from 0: {0, 1, ..., k-1}.

    numTrees

    Number of trees in the random forest.

    featureSubsetStrategy

    Number of features to consider for splits at each node. Supported values: "auto", "all", "sqrt", "log2", "onethird". If "auto" is set, this parameter is set based on numTrees: if numTrees == 1, it is set to "all"; if numTrees > 1 (forest), it is set to "onethird".

    impurity

    Criterion used for information gain calculation. The only supported value for regression is "variance".

    maxDepth

    Maximum depth of the tree. (e.g., depth 0 means 1 leaf node, depth 1 means 1 internal node + 2 leaf nodes). (suggested value: 4)

    maxBins

    Maximum number of bins used for splitting features. (suggested value: 100)

    seed

    Random seed for bootstrapping and choosing feature subsets.

    returns

    RandomForestModel that can be used for prediction.

    Annotations
    @Since( "1.2.0" )
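
    Example
    A minimal sketch, assuming trainingData and testData are existing RDD[LabeledPoint]s with real-valued labels; parameter values are illustrative:

      import org.apache.spark.mllib.tree.RandomForest

      val model = RandomForest.trainRegressor(
        trainingData,
        categoricalFeaturesInfo = Map[Int, Int](),
        numTrees = 30,
        featureSubsetStrategy = "auto",   // resolves to "onethird" since numTrees > 1
        impurity = "variance",            // the only supported impurity for regression
        maxDepth = 4,
        maxBins = 100)

      // Mean squared error on held-out data.
      val testMSE = testData
        .map(p => math.pow(model.predict(p.features) - p.label, 2))
        .mean()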
  37. def trainRegressor(input: RDD[LabeledPoint], strategy: Strategy, numTrees: Int, featureSubsetStrategy: String, seed: Int): RandomForestModel

    Method to train a random forest model for regression.

    input

    Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint. Labels are real numbers.

    strategy

    Parameters for training each tree in the forest.

    numTrees

    Number of trees in the random forest.

    featureSubsetStrategy

    Number of features to consider for splits at each node. Supported values: "auto", "all", "sqrt", "log2", "onethird". If "auto" is set, this parameter is set based on numTrees: if numTrees == 1, it is set to "all"; if numTrees > 1 (forest), it is set to "onethird".

    seed

    Random seed for bootstrapping and choosing feature subsets.

    returns

    RandomForestModel that can be used for prediction.

    Annotations
    @Since( "1.2.0" )
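
    Example
    A minimal sketch, assuming trainingData is an existing RDD[LabeledPoint] with real-valued labels; settings are illustrative:

      import org.apache.spark.mllib.tree.RandomForest
      import org.apache.spark.mllib.tree.configuration.Strategy

      // The default regression strategy already uses the "variance" impurity.
      val strategy = Strategy.defaultStrategy("Regression")
      strategy.maxDepth = 6
      strategy.maxBins = 64

      val model = RandomForest.trainRegressor(
        trainingData, strategy, numTrees = 50, featureSubsetStrategy = "onethird", seed = 123)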
  38. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
