org.apache.spark.sql.hive.execution

InsertIntoHiveTable

case class InsertIntoHiveTable(table: CatalogTable, partition: Map[String, Option[String]], query: LogicalPlan, overwrite: Boolean, ifPartitionNotExists: Boolean, outputColumnNames: Seq[String]) extends LogicalPlan with SaveAsHiveFile with Product with Serializable

Command for writing data out to a Hive table.

This class is mostly a mess, for legacy reasons (it evolved in organic ways and had to closely follow Hive's internal implementations, which were themselves a mess). Please don't blame Reynold for this! He was just moving code around!

In the future we should converge the write path for Hive with the normal data source write path, as defined in org.apache.spark.sql.execution.datasources.FileFormatWriter.
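For context, a hedged sketch of how this command typically arises: an INSERT statement against a Hive-backed table is analyzed into an InsertIntoHiveTable command before execution. The table name, schema, and session setup below are hypothetical, not taken from this page.

```scala
// Sketch: an INSERT against a Hive table is planned into InsertIntoHiveTable
// by the Hive analysis rules. Table name `sales` is hypothetical.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("insert-into-hive-example")
  .enableHiveSupport() // required so Hive tables resolve to the Hive write path
  .getOrCreate()

spark.sql("CREATE TABLE IF NOT EXISTS sales (id INT, amount DOUBLE) STORED AS PARQUET")

// This statement is analyzed into an InsertIntoHiveTable logical command
// (overwrite = false, empty partition spec) before execution.
spark.sql("INSERT INTO sales VALUES (1, 9.99), (2, 19.99)")
```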

table

the metadata of the table.

partition

a map from each partition key to its (optional) partition value. If any partition value is None, a dynamic partition insert is performed. For example, INSERT INTO tbl PARTITION (a=1, b=2) AS ... would have

Map('a' -> Some('1'), 'b' -> Some('2'))

and INSERT INTO tbl PARTITION (a=1, b) AS ... would have

Map('a' -> Some('1'), 'b' -> None).
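A hedged sketch of a dynamic partition insert that would produce the second map; the table and column names are hypothetical, and the config key shown is the standard Hive setting rather than anything stated on this page.

```scala
// Sketch: with b left unvalued in the PARTITION clause, its value is taken
// from the query output per row (dynamic partition insert). Names hypothetical.
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
spark.sql("""
  INSERT INTO tbl PARTITION (a=1, b)
  SELECT value, b FROM src
""")
// Planned with partition = Map("a" -> Some("1"), "b" -> None)
```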

query

the logical plan representing the data to write.

overwrite

whether to overwrite the existing table or partitions.

ifPartitionNotExists

If true, only write if the partition does not exist. Only valid for static partitions.
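The ifPartitionNotExists flag corresponds to Hive's IF NOT EXISTS clause on an overwrite with a fully static partition spec. A hedged sketch (table and column names hypothetical):

```scala
// Sketch: IF NOT EXISTS is only allowed with INSERT OVERWRITE and a fully
// static partition spec. Table and partition names are hypothetical.
spark.sql("""
  INSERT OVERWRITE TABLE events PARTITION (dt='2019-01-01') IF NOT EXISTS
  SELECT id, payload FROM staged_events
""")
// If partition dt='2019-01-01' already exists, the write is skipped.
```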

Source
InsertIntoHiveTable.scala
Linear Supertypes
Serializable, Serializable, SaveAsHiveFile, DataWritingCommand, Command, LogicalPlan, Logging, QueryPlanConstraints, ConstraintHelper, LogicalPlanStats, AnalysisHelper, QueryPlan[LogicalPlan], TreeNode[LogicalPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new InsertIntoHiveTable(table: CatalogTable, partition: Map[String, Option[String]], query: LogicalPlan, overwrite: Boolean, ifPartitionNotExists: Boolean, outputColumnNames: Seq[String])

    table

    the metadata of the table.

    partition

    a map from each partition key to its (optional) partition value. If any partition value is None, a dynamic partition insert is performed. For example, INSERT INTO tbl PARTITION (a=1, b=2) AS ... would have

    Map('a' -> Some('1'), 'b' -> Some('2'))

    and INSERT INTO tbl PARTITION (a=1, b) AS ... would have

    Map('a' -> Some('1'), 'b' -> None).

    query

    the logical plan representing the data to write.

    overwrite

    whether to overwrite the existing table or partitions.

    ifPartitionNotExists

    If true, only write if the partition does not exist. Only valid for static partitions.

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. lazy val allAttributes: AttributeSeq
    Definition Classes
    QueryPlan
  5. def analyzed: Boolean
    Definition Classes
    AnalysisHelper
  6. def apply(number: Int): TreeNode[_]
    Definition Classes
    TreeNode
  7. def argString(maxFields: Int): String
    Definition Classes
    TreeNode
  8. def asCode: String
    Definition Classes
    TreeNode
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. def assertNotAnalysisRule(): Unit
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  11. def basicWriteJobStatsTracker(hadoopConf: Configuration): BasicWriteJobStatsTracker
    Definition Classes
    DataWritingCommand
  12. final lazy val canonicalized: LogicalPlan
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  13. final def children: Seq[LogicalPlan]
    Definition Classes
    DataWritingCommand → Command → TreeNode
  14. def childrenResolved: Boolean
    Definition Classes
    LogicalPlan
  15. def clone(): LogicalPlan
    Definition Classes
    TreeNode → AnyRef
  16. def collect[B](pf: PartialFunction[LogicalPlan, B]): Seq[B]
    Definition Classes
    TreeNode
  17. def collectFirst[B](pf: PartialFunction[LogicalPlan, B]): Option[B]
    Definition Classes
    TreeNode
  18. def collectLeaves(): Seq[LogicalPlan]
    Definition Classes
    TreeNode
  19. def collectWithSubqueries[B](f: PartialFunction[LogicalPlan, B]): Seq[B]
    Definition Classes
    QueryPlan
  20. def conf: SQLConf
    Definition Classes
    QueryPlan
  21. lazy val constraints: ExpressionSet
    Definition Classes
    QueryPlanConstraints
  22. def constructIsNotNullConstraints(constraints: Set[Expression], output: Seq[Attribute]): Set[Expression]
    Definition Classes
    ConstraintHelper
  23. lazy val containsChild: Set[TreeNode[_]]
    Definition Classes
    TreeNode
  24. def copyTagsFrom(other: LogicalPlan): Unit
    Attributes
    protected
    Definition Classes
    TreeNode
  25. val createdTempDir: Option[Path]
    Definition Classes
    SaveAsHiveFile
  26. def deleteExternalTmpPath(hadoopConf: Configuration): Unit
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  27. def doCanonicalize(): LogicalPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  28. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  29. final def expressions: Seq[Expression]
    Definition Classes
    QueryPlan
  30. def fastEquals(other: TreeNode[_]): Boolean
    Definition Classes
    TreeNode
  31. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  32. def find(f: (LogicalPlan) ⇒ Boolean): Option[LogicalPlan]
    Definition Classes
    TreeNode
  33. def flatMap[A](f: (LogicalPlan) ⇒ TraversableOnce[A]): Seq[A]
    Definition Classes
    TreeNode
  34. def foreach(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  35. def foreachUp(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  36. def formattedNodeName: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  37. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], append: (String) ⇒ Unit, verbose: Boolean, prefix: String, addSuffix: Boolean, maxFields: Int, printNodeId: Boolean): Unit
    Definition Classes
    TreeNode
  38. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  39. def getExternalTmpPath(sparkSession: SparkSession, hadoopConf: Configuration, path: Path): Path
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  40. def getTagValue[T](tag: TreeNodeTag[T]): Option[T]
    Definition Classes
    TreeNode
  41. def hashCode(): Int
    Definition Classes
    TreeNode → AnyRef → Any
  42. val ifPartitionNotExists: Boolean
  43. def inferAdditionalConstraints(constraints: Set[Expression]): Set[Expression]
    Definition Classes
    ConstraintHelper
  44. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  45. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  46. def innerChildren: Seq[QueryPlan[_]]
    Definition Classes
    QueryPlan → TreeNode
  47. def inputSet: AttributeSet
    Definition Classes
    QueryPlan
  48. final def invalidateStatsCache(): Unit
    Definition Classes
    LogicalPlanStats
  49. def isCanonicalizedPlan: Boolean
    Attributes
    protected
    Definition Classes
    QueryPlan
  50. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  51. def isStreaming: Boolean
    Definition Classes
    LogicalPlan
  52. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  53. def jsonFields: List[JField]
    Attributes
    protected
    Definition Classes
    TreeNode
  54. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  55. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  56. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  57. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  58. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  59. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  60. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  61. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  62. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  63. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  64. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  65. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  66. def makeCopy(newArgs: Array[AnyRef]): LogicalPlan
    Definition Classes
    TreeNode
  67. def map[A](f: (LogicalPlan) ⇒ A): Seq[A]
    Definition Classes
    TreeNode
  68. def mapChildren(f: (LogicalPlan) ⇒ LogicalPlan): LogicalPlan
    Definition Classes
    TreeNode
  69. def mapExpressions(f: (Expression) ⇒ Expression): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  70. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]
    Attributes
    protected
    Definition Classes
    TreeNode
  71. def maxRows: Option[Long]
    Definition Classes
    LogicalPlan
  72. def maxRowsPerPartition: Option[Long]
    Definition Classes
    LogicalPlan
  73. lazy val metrics: Map[String, SQLMetric]
    Definition Classes
    DataWritingCommand
  74. final def missingInput: AttributeSet
    Definition Classes
    QueryPlan
  75. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  76. def nodeName: String
    Definition Classes
    TreeNode
  77. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  78. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  79. def numberedTreeString: String
    Definition Classes
    TreeNode
  80. val origin: Origin
    Definition Classes
    TreeNode
  81. def otherCopyArgs: Seq[AnyRef]
    Attributes
    protected
    Definition Classes
    TreeNode
  82. def output: Seq[Attribute]
    Definition Classes
    Command → QueryPlan
  83. val outputColumnNames: Seq[String]
    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
  84. def outputColumns: Seq[Attribute]
    Definition Classes
    DataWritingCommand
  85. def outputOrdering: Seq[SortOrder]
    Definition Classes
    LogicalPlan
  86. lazy val outputSet: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  87. val overwrite: Boolean
  88. def p(number: Int): LogicalPlan
    Definition Classes
    TreeNode
  89. val partition: Map[String, Option[String]]
  90. def prettyJson: String
    Definition Classes
    TreeNode
  91. def printSchema(): Unit
    Definition Classes
    QueryPlan
  92. def producedAttributes: AttributeSet
    Definition Classes
    QueryPlan
  93. val query: LogicalPlan
    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
  94. lazy val references: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  95. def refresh(): Unit
    Definition Classes
    LogicalPlan
  96. def resolve(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  97. def resolve(schema: StructType, resolver: Resolver): Seq[Attribute]
    Definition Classes
    LogicalPlan
  98. def resolveChildren(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  99. def resolveExpressions(r: PartialFunction[Expression, Expression]): LogicalPlan
    Definition Classes
    AnalysisHelper
  100. def resolveOperators(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  101. def resolveOperatorsDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  102. def resolveOperatorsUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  103. def resolveQuoted(name: String, resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  104. lazy val resolved: Boolean
    Definition Classes
    LogicalPlan
  105. def run(sparkSession: SparkSession, child: SparkPlan): Seq[Row]

    Inserts all the rows in the table into Hive. Row objects are properly serialized with the org.apache.hadoop.hive.serde2.SerDe and the org.apache.hadoop.mapred.OutputFormat provided by the table definition.

    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
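    To illustrate the SerDe/OutputFormat dependence noted above, a hedged sketch: the storage clauses declared at table creation fix how run serializes rows. The table name and SerDe choice are hypothetical examples, not taken from this page.

    ```scala
    // Sketch: the ROW FORMAT / STORED AS clauses determine the SerDe and
    // OutputFormat that run() uses when writing rows. Name hypothetical.
    spark.sql("""
      CREATE TABLE logs (msg STRING)
      ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
      STORED AS TEXTFILE
    """)
    // Rows from this insert are serialized with LazySimpleSerDe and written
    // through the TextFile OutputFormat declared on the table.
    spark.sql("INSERT INTO logs VALUES ('hello')")
    ```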
  106. def sameOutput(other: LogicalPlan): Boolean
    Definition Classes
    LogicalPlan
  107. final def sameResult(other: LogicalPlan): Boolean
    Definition Classes
    QueryPlan
  108. def saveAsHiveFile(sparkSession: SparkSession, plan: SparkPlan, hadoopConf: Configuration, fileSinkConf: ShimFileSinkDesc, outputLocation: String, customPartitionLocations: Map[TablePartitionSpec, String] = Map.empty, partitionAttributes: Seq[Attribute] = Nil): Set[String]
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  109. lazy val schema: StructType
    Definition Classes
    QueryPlan
  110. def schemaString: String
    Definition Classes
    QueryPlan
  111. final def semanticHash(): Int
    Definition Classes
    QueryPlan
  112. def setTagValue[T](tag: TreeNodeTag[T], value: T): Unit
    Definition Classes
    TreeNode
  113. def simpleString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  114. def simpleStringWithNodeId(): String
    Definition Classes
    QueryPlan → TreeNode
  115. def statePrefix: String
    Attributes
    protected
    Definition Classes
    LogicalPlan → QueryPlan
  116. def stats: Statistics
    Definition Classes
    Command → LogicalPlanStats
  117. val statsCache: Option[Statistics]
    Attributes
    protected
    Definition Classes
    LogicalPlanStats
  118. def stringArgs: Iterator[Any]
    Attributes
    protected
    Definition Classes
    TreeNode
  119. def subqueries: Seq[LogicalPlan]
    Definition Classes
    QueryPlan
  120. def subqueriesAll: Seq[LogicalPlan]
    Definition Classes
    QueryPlan
  121. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  122. val table: CatalogTable
  123. def toJSON: String
    Definition Classes
    TreeNode
  124. def toString(): String
    Definition Classes
    TreeNode → AnyRef → Any
  125. def transform(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  126. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    AnalysisHelper → QueryPlan
  127. def transformDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  128. def transformExpressions(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  129. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  130. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  131. def transformUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  132. def treeString(append: (String) ⇒ Unit, verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): Unit
    Definition Classes
    TreeNode
  133. final def treeString(verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): String
    Definition Classes
    TreeNode
  134. final def treeString: String
    Definition Classes
    TreeNode
  135. def unsetTagValue[T](tag: TreeNodeTag[T]): Unit
    Definition Classes
    TreeNode
  136. lazy val validConstraints: Set[Expression]
    Attributes
    protected
    Definition Classes
    QueryPlanConstraints
  137. def verboseString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  138. def verboseStringWithOperatorId(): String
    Definition Classes
    QueryPlan
  139. def verboseStringWithSuffix(maxFields: Int): String
    Definition Classes
    LogicalPlan → TreeNode
  140. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  141. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  142. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  143. def withNewChildren(newChildren: Seq[LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode

Inherited from Serializable

Inherited from Serializable

Inherited from SaveAsHiveFile

Inherited from DataWritingCommand

Inherited from Command

Inherited from LogicalPlan

Inherited from Logging

Inherited from QueryPlanConstraints

Inherited from ConstraintHelper

Inherited from LogicalPlanStats

Inherited from AnalysisHelper

Inherited from QueryPlan[LogicalPlan]

Inherited from TreeNode[LogicalPlan]

Inherited from Product

Inherited from Equals

Inherited from AnyRef

Inherited from Any
