public class PreInsertCastAndRename
extends org.apache.spark.sql.catalyst.rules.Rule<org.apache.spark.sql.catalyst.plans.logical.LogicalPlan>
Before inserting into an `InsertableRelation`, this rule is used to make sure that the columns to be inserted have the correct data types and that the fields have the correct names.

| Constructor and Description |
|---|
| `PreInsertCastAndRename()` |
| Modifier and Type | Method and Description |
|---|---|
| `static org.apache.spark.sql.catalyst.plans.logical.LogicalPlan` | `apply(org.apache.spark.sql.catalyst.plans.logical.LogicalPlan plan)` |
| `static org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable` | `castAndRenameChildOutput(org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable insertInto, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> expectedOutput, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child)` If necessary, cast data types and rename fields to the expected types and names. |
**Methods inherited from class org.apache.spark.sql.catalyst.rules.Rule:**
apply, isTraceEnabled, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning, org$apache$spark$Logging$$log__$eq, org$apache$spark$Logging$$log_, ruleName

**Methods inherited from class Object:**
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

**Methods inherited from interface org.apache.spark.Logging:**
initializeIfNecessary, initializeLogging, log_
**apply**

public static org.apache.spark.sql.catalyst.plans.logical.LogicalPlan apply(org.apache.spark.sql.catalyst.plans.logical.LogicalPlan plan)

**castAndRenameChildOutput**

public static org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable castAndRenameChildOutput(org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable insertInto, scala.collection.Seq<org.apache.spark.sql.catalyst.expressions.Attribute> expectedOutput, org.apache.spark.sql.catalyst.plans.logical.LogicalPlan child)

If necessary, cast data types and rename fields to the expected types and names.
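The effect of `castAndRenameChildOutput` can be sketched without Spark dependencies: for each column of the child plan's output, compare it against the expected attribute at the same position; where the data type differs, wrap the column in a cast, and where only the name differs, add an alias. The sketch below models this in plain Java; `Attribute` and the generated projection strings are simplified stand-ins, not Spark's Catalyst classes.

```java
import java.util.ArrayList;
import java.util.List;

public class CastAndRenameSketch {
    // Simplified stand-in for Catalyst's Attribute: a named, typed column.
    record Attribute(String name, String dataType) {}

    // For each child column, emit the projection expression the rule would
    // generate: a cast when types differ, an alias when only names differ,
    // and the column unchanged when both already match.
    static List<String> castAndRename(List<Attribute> expected, List<Attribute> child) {
        List<String> projections = new ArrayList<>();
        for (int i = 0; i < expected.size(); i++) {
            Attribute want = expected.get(i);
            Attribute have = child.get(i);
            if (!want.dataType().equals(have.dataType())) {
                // Type mismatch: cast, and name the result after the target column.
                projections.add("CAST(" + have.name() + " AS " + want.dataType()
                        + ") AS " + want.name());
            } else if (!want.name().equals(have.name())) {
                // Same type, wrong name: alias only.
                projections.add(have.name() + " AS " + want.name());
            } else {
                projections.add(have.name());
            }
        }
        return projections;
    }

    public static void main(String[] args) {
        List<Attribute> expected = List.of(
                new Attribute("id", "LongType"),
                new Attribute("name", "StringType"));
        List<Attribute> child = List.of(
                new Attribute("id", "IntegerType"),
                new Attribute("n", "StringType"));
        // Prints: CAST(id AS LongType) AS id, n AS name
        System.out.println(String.join(", ", castAndRename(expected, child)));
    }
}
```

In the real rule the resulting expressions become a `Project` node placed between the `InsertIntoTable` and its child, so the data fed to the relation already matches the table's schema.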