org.apache.spark.sql.execution

package execution

:: DeveloperApi :: An execution engine for relational query plans that runs on top of Spark and returns RDDs.

Note that the operators in this package are created automatically by a query planner using a SQLContext and are not intended to be used directly by end users of Spark SQL. They are documented here in order to make it easier for others to understand the performance characteristics of query plans that are generated by Spark SQL.
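
For orientation, here is a minimal sketch of how the operators listed below can be inspected from user code. It is illustrative only: it assumes an existing SparkContext and an already registered table named "people", both of which are placeholders rather than part of this API.

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext

    // Sketch only: print the tree of execution.SparkPlan operators
    // (e.g. Exchange, Aggregate, Project) chosen for a query.
    def showExecutedPlan(sc: SparkContext): Unit = {
      val sqlContext = new SQLContext(sc)
      // Assumes a table named "people" has already been registered.
      val query = sqlContext.sql("SELECT name, COUNT(*) FROM people GROUP BY name")
      println(query.queryExecution.executedPlan)
    }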

Linear Supertypes
AnyRef, Any

Type Members

  1. case class Aggregate(partial: Boolean, groupingExpressions: Seq[Expression], aggregateExpressions: Seq[NamedExpression], child: SparkPlan)(sqlContext: SQLContext) extends SparkPlan with UnaryNode with NoBind with Product with Serializable

    :: DeveloperApi :: Groups input data by groupingExpressions and computes the aggregateExpressions for each group (a plain-Scala sketch of the partial/final aggregation split follows this member list).

  2. case class BroadcastNestedLoopJoin(streamed: SparkPlan, broadcast: SparkPlan, joinType: JoinType, condition: Option[Expression])(sqlContext: SQLContext) extends SparkPlan with BinaryNode with Product with Serializable

    :: DeveloperApi ::

  3. sealed abstract class BuildSide extends AnyRef

    Annotations
    @DeveloperApi()
  4. case class CacheCommand(tableName: String, doCache: Boolean)(context: SQLContext) extends SparkPlan with LeafNode with Command with Product with Serializable

    :: DeveloperApi ::

  5. case class CartesianProduct(left: SparkPlan, right: SparkPlan) extends SparkPlan with BinaryNode with Product with Serializable

    :: DeveloperApi ::

  6. trait Command extends AnyRef

  7. case class DescribeCommand(child: SparkPlan, output: Seq[Attribute])(context: SQLContext) extends SparkPlan with LeafNode with Command with Product with Serializable

    :: DeveloperApi ::

  8. case class Exchange(newPartitioning: Partitioning, child: SparkPlan) extends SparkPlan with UnaryNode with NoBind with Product with Serializable

    :: DeveloperApi ::

  9. case class ExistingRdd(output: Seq[Attribute], rdd: RDD[catalyst.expressions.Row]) extends SparkPlan with LeafNode with Product with Serializable

    :: DeveloperApi ::

  10. case class ExplainCommand(logicalPlan: LogicalPlan, output: Seq[Attribute])(context: SQLContext) extends SparkPlan with LeafNode with Command with Product with Serializable

    An explain command that lets users see how a query will be executed.

  11. case class Filter(condition: Expression, child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi ::

  12. case class Generate(generator: Generator, join: Boolean, outer: Boolean, child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi :: Applies a Generator to a stream of input rows, combining the output of each into a new stream of rows (see the flatMap-style sketch after this member list).

  13. case class HashJoin(leftKeys: Seq[Expression], rightKeys: Seq[Expression], buildSide: BuildSide, left: SparkPlan, right: SparkPlan) extends SparkPlan with BinaryNode with Product with Serializable

    :: DeveloperApi ::

  14. case class LeftSemiJoinBNL(streamed: SparkPlan, broadcast: SparkPlan, condition: Option[Expression])(sqlContext: SQLContext) extends SparkPlan with BinaryNode with Product with Serializable

    :: DeveloperApi :: Uses BroadcastNestedLoopJoin to compute the left semi join result when there are no join keys suitable for a hash join.

  15. case class LeftSemiJoinHash(leftKeys: Seq[Expression], rightKeys: Seq[Expression], left: SparkPlan, right: SparkPlan) extends SparkPlan with BinaryNode with Product with Serializable

    :: DeveloperApi :: Builds the right table's join keys into a HashSet, then streams through the left table, checking whether each row's join keys appear in that set (see the sketch after this member list).

  16. case class Limit(limit: Int, child: SparkPlan)(sqlContext: SQLContext) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi :: Takes the first limit elements.

  17. case class Project(projectList: Seq[NamedExpression], child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi ::

  18. class QueryExecutionException extends Exception

  19. case class Sample(fraction: Double, withReplacement: Boolean, seed: Long, child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi ::

  20. case class SetCommand(key: Option[String], value: Option[String], output: Seq[Attribute])(context: SQLContext) extends SparkPlan with LeafNode with Command with Product with Serializable

    :: DeveloperApi ::

  21. case class Sort(sortOrder: Seq[SortOrder], global: Boolean, child: SparkPlan) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi ::

  22. case class SparkLogicalPlan(alreadyPlanned: SparkPlan) extends LogicalPlan with MultiInstanceRelation with Product with Serializable

    :: DeveloperApi :: Allows already planned SparkQueries to be linked into logical query plans.

  23. abstract class SparkPlan extends QueryPlan[SparkPlan] with Logging

    :: DeveloperApi ::

  24. case class TakeOrdered(limit: Int, sortOrder: Seq[SortOrder], child: SparkPlan)(sqlContext: SQLContext) extends SparkPlan with UnaryNode with Product with Serializable

    :: DeveloperApi :: Takes the first limit elements as defined by the sortOrder (see the bounded-sort sketch after this member list).

  25. case class Union(children: Seq[SparkPlan])(sqlContext: SQLContext) extends SparkPlan with Product with Serializable

    :: DeveloperApi ::

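The partial/final split referred to by Aggregate's partial flag can be pictured with plain Scala collections. This is an illustrative sketch with invented sample data, not SparkPlan code.

    // Each inner Seq stands in for one partition of (group key, value) rows.
    val partitions: Seq[Seq[(String, Int)]] =
      Seq(Seq(("a", 1), ("b", 2)), Seq(("a", 3), ("b", 4), ("a", 5)))

    // partial = true: every partition computes its own per-group sums.
    val partialSums: Seq[Map[String, Int]] =
      partitions.map(_.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).sum })

    // partial = false: after an Exchange repartitions by the grouping key,
    // the partial results are merged into the final per-group sums.
    val finalSums: Map[String, Int] =
      partialSums.flatten.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).sum }
    // finalSums == Map("a" -> 9, "b" -> 6)
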
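Generate behaves like a flatMap over rows. The following plain-Scala sketch, with invented types and data, shows the shape of that transformation when join is true.

    case class InputRow(id: Int, tags: Seq[String])

    val rows = Seq(InputRow(1, Seq("x", "y")), InputRow(2, Seq.empty))

    // Each generated value is combined with its originating row (join = true).
    // Row 2 produces nothing and is dropped unless `outer` semantics retain it.
    val generated: Seq[(Int, String)] =
      rows.flatMap(r => r.tags.map(tag => (r.id, tag)))
    // generated == Seq((1, "x"), (1, "y"))
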
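The hashing strategy described for LeftSemiJoinHash reduces to the following plain-Scala idea; the data is invented for illustration.

    val left  = Seq((1, "a"), (2, "b"), (3, "c"))
    val right = Seq((2, "x"), (3, "y"), (3, "z"))

    // Build the right table's join keys into a hash set...
    val rightKeys: Set[Int] = right.map(_._1).toSet

    // ...then stream the left table, keeping rows whose key is in the set.
    // Each left row is emitted at most once, which is semi-join semantics.
    val semiJoined = left.filter { case (key, _) => rightKeys.contains(key) }
    // semiJoined == Seq((2, "b"), (3, "c"))
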
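TakeOrdered corresponds to the ORDER BY ... LIMIT n pattern. Its result is equivalent to the bounded-sort sketch below, although the physical operator can typically avoid a full global sort.

    val values = Seq(42, 7, 19, 3, 88)
    val limit  = 2

    // Same result as a full sort followed by take(limit).
    val topN: Seq[Int] = values.sorted.take(limit)
    // topN == Seq(3, 7)
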
Value Members

  1. object BuildLeft extends BuildSide with Product with Serializable

    Annotations
    @DeveloperApi()
  2. object BuildRight extends BuildSide with Product with Serializable

    Annotations
    @DeveloperApi()
  3. object ExistingRdd extends Serializable

    :: DeveloperApi ::

  4. package debug

    :: DeveloperApi :: Contains methods for debugging query execution (a usage sketch follows this member list).

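A hedged usage sketch for the debug package: the usual pattern is to import its implicits and call debug() on a query. The table name used here is a placeholder assumed to be registered already, and the availability of the debug() enrichment depends on the Spark SQL version at hand.

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.execution.debug._

    def debugQuery(sc: SparkContext): Unit = {
      val sqlContext = new SQLContext(sc)
      // "src" is a placeholder table name, assumed to be registered already.
      // debug() is provided by the debug package's implicits (version-dependent).
      sqlContext.sql("SELECT key FROM src").debug()
    }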