Package org.apache.spark.sql.jdbc
Class MySQLDialect

java.lang.Object
org.apache.spark.sql.jdbc.JdbcDialect
org.apache.spark.sql.jdbc.MySQLDialect
- All Implemented Interfaces:
- Serializable, org.apache.spark.internal.Logging, org.apache.spark.sql.catalyst.SQLConfHelper, NoLegacyJDBCError, scala.Equals, scala.Product

public class MySQLDialect
extends JdbcDialect
implements org.apache.spark.sql.catalyst.SQLConfHelper, NoLegacyJDBCError, scala.Product, Serializable
Nested Class Summary
- Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
- org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
- 
Constructor Summary
- MySQLDialect()
- 
Method Summary

- abstract static R apply()
- boolean canHandle(String url)
  Check if this dialect instance can handle a certain jdbc url.
- Throwable classifyException(Throwable e, String condition, scala.collection.immutable.Map<String, String> messageParameters, String description, boolean isRuntime)
  Gets a dialect exception, classifies it and wraps it in AnalysisException.
- scala.Option<String> compileExpression(Expression expr)
  Converts a V2 expression to a String representing a SQL expression.
- String createIndex(String indexName, Identifier tableIdent, NamedReference[] columns, Map<NamedReference, Map<String, String>> columnsProperties, Map<String, String> properties)
  Build a create index SQL statement.
- String dropIndex(String indexName, Identifier tableIdent)
  Build a drop index SQL statement.
- String dropSchema(String schema, boolean cascade)
- scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md)
  Get the custom datatype mapping for the given jdbc meta information.
- JdbcSQLQueryBuilder getJdbcSQLQueryBuilder(org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
  Returns the SQL builder for the SELECT statement.
- scala.Option<JdbcType> getJDBCType(DataType dt)
  Retrieve the jdbc / sql type for a given datatype.
- String getRenameColumnQuery(String tableName, String columnName, String newName, int dbMajorVersion)
- String getSchemaCommentQuery(String schema, String comment)
- String getTableCommentQuery(String table, String comment)
- String getUpdateColumnNullabilityQuery(String tableName, String columnName, boolean isNullable)
- String getUpdateColumnTypeQuery(String tableName, String columnName, String newDataType)
- boolean indexExists(Connection conn, String indexName, Identifier tableIdent, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
  Checks whether an index exists.
- scala.Option<Object> isCascadingTruncateTable()
  Return Some[true] iff TRUNCATE TABLE causes cascading by default.
- boolean isObjectNotFoundException(SQLException e)
- boolean isSupportedFunction(String funcName)
  Returns whether the database supports a given function.
- boolean isSyntaxErrorBestEffort(SQLException exception)
  Attempts to determine if the given SQLException is a SQL syntax error.
- TableIndex[] listIndexes(Connection conn, Identifier tableIdent, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
  Lists all the indexes in this table.
- String[][] listSchemas(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
  Lists all the schemas visible through this connection.
- String quoteIdentifier(String colName)
  Quotes the identifier.
- String removeSchemaCommentQuery(String schema)
- boolean schemasExists(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options, String schema)
  Check whether a schema exists.
- boolean supportsHint()
- boolean supportsJoin()
  Returns true if the dialect supports the JOIN operator.
- boolean supportsLimit()
  Returns true if the dialect supports the LIMIT clause.
- boolean supportsOffset()
  Returns true if the dialect supports the OFFSET clause.
- static String toString()

Methods inherited from class org.apache.spark.sql.jdbc.JdbcDialect:
alterTable, beforeFetch, classifyException, compileAggregate, compileValue, convertJavaDateToDate, convertJavaTimestampToTimestamp, convertJavaTimestampToTimestampNTZ, convertTimestampNTZToJavaTimestamp, createConnectionFactory, createSchema, createTable, dropTable, functions, getAddColumnQuery, getDayTimeIntervalAsMicros, getDeleteColumnQuery, getFullyQualifiedQuotedTableName, getLimitClause, getOffsetClause, getSchemaQuery, getTableExistsQuery, getTableSample, getTruncateQuery, getTruncateQuery, getYearMonthIntervalAsMonths, insertIntoTable, renameTable, renameTable, supportsTableSample, updateExtraColumnMeta

Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface scala.Equals:
canEqual, equals

Methods inherited from interface org.apache.spark.internal.Logging:
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logBasedOnLevel, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, MDC, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext

Methods inherited from interface scala.Product:
productArity, productElement, productElementName, productElementNames, productIterator, productPrefix

Methods inherited from interface org.apache.spark.sql.catalyst.SQLConfHelper:
conf, withSQLConf
- 
Constructor Details
- 
MySQLDialect
public MySQLDialect()
 
- 
- 
Method Details
- 
apply
public abstract static R apply()
- 
toString
public static String toString()
- 
canHandle
public boolean canHandle(String url)
Description copied from class: JdbcDialect
Check if this dialect instance can handle a certain jdbc url.
- Specified by:
- canHandle in class JdbcDialect
- Parameters:
- url - the jdbc url.
- Returns:
- True if the dialect can be applied on the given jdbc url.
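For example, a quick check of which JDBC URLs this dialect accepts (the URLs below are illustrative), in Scala:

    import org.apache.spark.sql.jdbc.MySQLDialect

    val dialect = new MySQLDialect()
    dialect.canHandle("jdbc:mysql://localhost:3306/test")      // true
    dialect.canHandle("jdbc:postgresql://localhost:5432/test") // false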
 
- 
isSupportedFunction
public boolean isSupportedFunction(String funcName)
Description copied from class: JdbcDialect
Returns whether the database supports a given function.
- Overrides:
- isSupportedFunction in class JdbcDialect
- Parameters:
- funcName - Upper-cased function name
- Returns:
- True if the database supports the function.
 
- 
isObjectNotFoundException
public boolean isObjectNotFoundException(SQLException e)
- Overrides:
- isObjectNotFoundException in class JdbcDialect
 
- 
compileExpression
public scala.Option<String> compileExpression(Expression expr)
Description copied from class: JdbcDialect
Converts a V2 expression to a String representing a SQL expression.
- Overrides:
- compileExpression in class JdbcDialect
- Parameters:
- expr - The V2 expression to be converted.
- Returns:
- Converted value.
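As an illustrative sketch, compiling the simplest V2 expression, a bare column reference (FieldReference comes from org.apache.spark.sql.connector.expressions; the exact rendered SQL may differ between Spark versions):

    import org.apache.spark.sql.connector.expressions.FieldReference
    import org.apache.spark.sql.jdbc.MySQLDialect

    val dialect = new MySQLDialect()
    // MySQL quotes identifiers with backticks, so the compiled SQL for a
    // plain column reference is the backtick-quoted name.
    dialect.compileExpression(FieldReference("price")) // e.g. Some(`price`)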
 
- 
getCatalystType
public scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md)
Description copied from class: JdbcDialect
Get the custom datatype mapping for the given jdbc meta information.
Guidelines for mapping database defined timestamps to Spark SQL timestamps:
- TIMESTAMP WITHOUT TIME ZONE if preferTimestampNTZ -> TimestampNTZType
- TIMESTAMP WITHOUT TIME ZONE if !preferTimestampNTZ -> TimestampType(LTZ)
- TIMESTAMP WITH TIME ZONE -> TimestampType(LTZ)
- TIMESTAMP WITH LOCAL TIME ZONE -> TimestampType(LTZ)
- If the TIMESTAMP cannot be distinguished by sqlType and typeName, preferTimestampNTZ is respected for now, but we may need to add another option in the future if necessary.
 - Overrides:
- getCatalystType in class JdbcDialect
- Parameters:
- sqlType - Refers to java.sql.Types constants, or other constants defined by the target database, e.g. -101 is Oracle's TIMESTAMP WITH TIME ZONE type. This value is returned by ResultSetMetaData.getColumnType(int).
- typeName - The column type name used by the database (e.g. "BIGINT UNSIGNED"). This is sometimes used to determine the target data type when sqlType is not sufficient, e.g. when multiple database types are conflated into a single id. This value is returned by ResultSetMetaData.getColumnTypeName(int).
- size - The size of the type, e.g. the maximum precision for numeric types, the length for character strings, etc. This value is returned by ResultSetMetaData.getPrecision(int).
- md - Result metadata associated with this type. This contains additional information from ResultSetMetaData or user specified options. It includes:
  - isTimestampNTZ: Whether to read a TIMESTAMP WITHOUT TIME ZONE value as TimestampNTZType or not. This is configured by JDBCOptions.preferTimestampNTZ.
  - scale: The length of the fractional part, as returned by ResultSetMetaData.getScale(int).
- Returns:
- An Option with the actual DataType (a subclass of DataType), or None if the default type mapping should be used.
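The NTZ mapping above can be exercised from the read path via the preferTimestampNTZ JDBC option; a minimal sketch assuming an active SparkSession named spark (connection details are placeholders):

    // Map MySQL TIMESTAMP (no time zone) columns to TimestampNTZType on read.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/test") // placeholder URL
      .option("dbtable", "events")                       // placeholder table
      .option("preferTimestampNTZ", "true")
      .load()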
 
- 
quoteIdentifier
public String quoteIdentifier(String colName)
Description copied from class: JdbcDialect
Quotes the identifier. This is used to put quotes around the identifier in case the column name is a reserved keyword, or in case it contains characters that require quotes (e.g. space).
- Overrides:
- quoteIdentifier in class JdbcDialect
- Parameters:
- colName - (undocumented)
- Returns:
- (undocumented)
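MySQL quotes identifiers with backticks, which makes reserved words safe to use as column names; a minimal sketch:

    import org.apache.spark.sql.jdbc.MySQLDialect

    val dialect = new MySQLDialect()
    dialect.quoteIdentifier("order") // `order` (backtick-quoted)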
 
- 
schemasExists
public boolean schemasExists(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options, String schema)
Description copied from class: JdbcDialect
Check whether a schema exists.
- Overrides:
- schemasExists in class JdbcDialect
- Parameters:
- conn - (undocumented)
- options - (undocumented)
- schema - (undocumented)
- Returns:
- (undocumented)
 
- 
listSchemas
public String[][] listSchemas(Connection conn, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Lists all the schemas visible through this connection.
- Overrides:
- listSchemas in class JdbcDialect
- Parameters:
- conn - (undocumented)
- options - (undocumented)
- Returns:
- (undocumented)
 
- 
isCascadingTruncateTable
public scala.Option<Object> isCascadingTruncateTable()
Description copied from class: JdbcDialect
Return Some[true] iff TRUNCATE TABLE causes cascading by default.
- Some[true]: TRUNCATE TABLE causes cascading.
- Some[false]: TRUNCATE TABLE does not cause cascading.
- None: The behavior of TRUNCATE TABLE is unknown (default).
- Overrides:
- isCascadingTruncateTable in class JdbcDialect
- Returns:
- (undocumented)
 
- 
isSyntaxErrorBestEffort
public boolean isSyntaxErrorBestEffort(SQLException exception)
Description copied from class: JdbcDialect
Attempts to determine if the given SQLException is a SQL syntax error.
This check is best-effort: it may not detect all syntax errors across all JDBC dialects. However, if this method returns true, the exception is guaranteed to be a syntax error. This is used to decide whether to wrap the exception in a more appropriate Spark exception.
- Overrides:
- isSyntaxErrorBestEffort in class JdbcDialect
- Parameters:
- exception - (undocumented)
- Returns:
- true if the exception is confidently identified as a syntax error; false otherwise.
 
- 
getUpdateColumnTypeQuery
public String getUpdateColumnTypeQuery(String tableName, String columnName, String newDataType)
- Overrides:
- getUpdateColumnTypeQuery in class JdbcDialect
 
- 
getRenameColumnQuery
public String getRenameColumnQuery(String tableName, String columnName, String newName, int dbMajorVersion)
- Overrides:
- getRenameColumnQuery in class JdbcDialect
 
- 
getUpdateColumnNullabilityQuery
public String getUpdateColumnNullabilityQuery(String tableName, String columnName, boolean isNullable)
- Overrides:
- getUpdateColumnNullabilityQuery in class JdbcDialect
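A sketch of the kind of DDL these builders produce (table and column names are invented; the exact SQL text can differ between Spark versions):

    import org.apache.spark.sql.jdbc.MySQLDialect

    val dialect = new MySQLDialect()
    // Roughly: ALTER TABLE `db`.`events` MODIFY COLUMN price DECIMAL(10,2)
    dialect.getUpdateColumnTypeQuery("`db`.`events`", "price", "DECIMAL(10,2)")
    // dbMajorVersion lets the dialect pick version-appropriate syntax;
    // MySQL 8+ understands RENAME COLUMN directly.
    dialect.getRenameColumnQuery("`db`.`events`", "price", "unit_price", 8)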
 
- 
getTableCommentQuery
public String getTableCommentQuery(String table, String comment)
- Overrides:
- getTableCommentQuery in class JdbcDialect
 
- 
getJDBCType
public scala.Option<JdbcType> getJDBCType(DataType dt)
Description copied from class: JdbcDialect
Retrieve the jdbc / sql type for a given datatype.
- Overrides:
- getJDBCType in class JdbcDialect
- Parameters:
- dt - The datatype (e.g. StringType)
- Returns:
- The new JdbcType if there is an override for this DataType
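Overriding this hook is the usual way to customize write-side type mapping in a user-defined dialect; a minimal sketch (the dialect object, URL prefix, and TEXT choice are invented for illustration):

    import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
    import org.apache.spark.sql.types.{DataType, StringType}

    // Hypothetical dialect that stores Spark strings as MySQL TEXT.
    object MyTextDialect extends JdbcDialect {
      override def canHandle(url: String): Boolean = url.startsWith("jdbc:mysql")
      override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
        case _: StringType => Some(JdbcType("TEXT", java.sql.Types.VARCHAR))
        case _             => None // defer to the default mapping
      }
    }

    // Registered dialects are consulted before the built-in ones.
    JdbcDialects.registerDialect(MyTextDialect)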
 
- 
getSchemaCommentQuery
public String getSchemaCommentQuery(String schema, String comment)
- Overrides:
- getSchemaCommentQuery in class JdbcDialect
 
- 
removeSchemaCommentQuery
public String removeSchemaCommentQuery(String schema)
- Overrides:
- removeSchemaCommentQuery in class JdbcDialect
 
- 
createIndex
public String createIndex(String indexName, Identifier tableIdent, NamedReference[] columns, Map<NamedReference, Map<String, String>> columnsProperties, Map<String, String> properties)
Description copied from class: JdbcDialect
Build a create index SQL statement.
- Overrides:
- createIndex in class JdbcDialect
- Parameters:
- indexName - the name of the index to be created
- tableIdent - the table on which the index is to be created
- columns - the columns on which the index is to be created
- columnsProperties - the properties of the columns on which the index is to be created
- properties - the properties of the index to be created
- Returns:
- the SQL statement to use for creating the index.
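A sketch of generating this DDL directly (identifier, column, and index names are invented; the exact SQL text may vary across Spark versions):

    import java.util.Collections
    import org.apache.spark.sql.connector.catalog.Identifier
    import org.apache.spark.sql.connector.expressions.{FieldReference, NamedReference}
    import org.apache.spark.sql.jdbc.MySQLDialect

    val dialect = new MySQLDialect()
    val sql = dialect.createIndex(
      "idx_events_ts",
      Identifier.of(Array("testdb"), "events"),
      Array[NamedReference](FieldReference("ts")),
      Collections.emptyMap(), // no per-column properties
      Collections.emptyMap()) // no index properties
    // Roughly: CREATE INDEX `idx_events_ts` ON `testdb`.`events` (`ts`)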
 
- 
indexExists
public boolean indexExists(Connection conn, String indexName, Identifier tableIdent, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Checks whether an index exists.
- Overrides:
- indexExists in class JdbcDialect
- Parameters:
- conn - (undocumented)
- indexName - the name of the index
- tableIdent - the table on which the index is to be checked
- options - JDBCOptions of the table
- Returns:
- true if the index with indexName exists in the table with tableName, false otherwise
 
- 
dropIndex
public String dropIndex(String indexName, Identifier tableIdent)
Description copied from class: JdbcDialect
Build a drop index SQL statement.
- Overrides:
- dropIndex in class JdbcDialect
- Parameters:
- indexName - the name of the index to be dropped.
- tableIdent - the table on which the index is to be dropped.
- Returns:
- the SQL statement to use for dropping the index.
 
- 
listIndexes
public TableIndex[] listIndexes(Connection conn, Identifier tableIdent, org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Lists all the indexes in this table.
- Overrides:
- listIndexes in class JdbcDialect
- Parameters:
- conn - (undocumented)
- tableIdent - (undocumented)
- options - (undocumented)
- Returns:
- (undocumented)
 
- 
classifyException
public Throwable classifyException(Throwable e, String condition, scala.collection.immutable.Map<String, String> messageParameters, String description, boolean isRuntime)
Description copied from class: JdbcDialect
Gets a dialect exception, classifies it and wraps it in AnalysisException.
- Specified by:
- classifyException in interface NoLegacyJDBCError
- Overrides:
- classifyException in class JdbcDialect
- Parameters:
- e - The dialect specific exception.
- condition - The error condition assigned in the case of an unclassified e
- messageParameters - The message parameters of errorClass
- description - The error description
- isRuntime - Whether the exception is a runtime exception or not.
- Returns:
- SparkThrowable + Throwable, or its sub-class.
 
- 
dropSchema
public String dropSchema(String schema, boolean cascade)
- Overrides:
- dropSchema in class JdbcDialect
 
- 
getJdbcSQLQueryBuilder
public JdbcSQLQueryBuilder getJdbcSQLQueryBuilder(org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Returns the SQL builder for the SELECT statement.
- Overrides:
- getJdbcSQLQueryBuilder in class JdbcDialect
- Parameters:
- options - (undocumented)
- Returns:
- (undocumented)
 
- 
supportsLimit
public boolean supportsLimit()
Description copied from class: JdbcDialect
Returns true if the dialect supports the LIMIT clause.
Note: Some built-in dialects support the LIMIT clause with a workaround; see OracleDialect.OracleSQLQueryBuilder and MsSqlServerDialect.MsSqlServerSQLQueryBuilder.
- Overrides:
- supportsLimit in class JdbcDialect
- Returns:
- (undocumented)
 
- 
supportsOffset
public boolean supportsOffset()
Description copied from class: JdbcDialect
Returns true if the dialect supports the OFFSET clause.
Note: Some built-in dialects support the OFFSET clause with a workaround; see OracleDialect.OracleSQLQueryBuilder and MySQLDialect.MySQLSQLQueryBuilder.
- Overrides:
- supportsOffset in class JdbcDialect
- Returns:
- (undocumented)
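These capabilities feed LIMIT/OFFSET pushdown on the read path; a minimal sketch assuming an active SparkSession named spark (connection details are placeholders):

    // With pushdown enabled, the limit below can be compiled into the
    // generated MySQL query instead of being applied after the fetch.
    val firstTen = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/test") // placeholder URL
      .option("dbtable", "events")                       // placeholder table
      .option("pushDownLimit", "true")
      .load()
      .limit(10)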
 
- 
supportsHint
public boolean supportsHint()
- Overrides:
- supportsHint in class JdbcDialect
 
- 
supportsJoin
public boolean supportsJoin()
Description copied from class: JdbcDialect
Returns true if the dialect supports the JOIN operator.
- Overrides:
- supportsJoin in class JdbcDialect
- Returns:
- (undocumented)
 
 