Package org.apache.spark.sql.jdbc
Class MsSqlServerDialect
java.lang.Object
  org.apache.spark.sql.jdbc.JdbcDialect
    org.apache.spark.sql.jdbc.MsSqlServerDialect
- All Implemented Interfaces:
Serializable, org.apache.spark.internal.Logging, scala.Equals, scala.Product
-
Nested Class Summary
Modifier and Type    Class
class                MsSqlServerDialect.MsSqlServerSQLQueryBuilder
class
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
-
Constructor Summary
MsSqlServerDialect()
-
Method Summary
boolean canHandle(String url)
    Check if this dialect instance can handle a certain jdbc url.
AnalysisException classifyException(Throwable e, String errorClass, scala.collection.immutable.Map<String, String> messageParameters, String description)
    Gets a dialect exception, classifies it and wraps it by AnalysisException.
scala.Option<String> compileExpression(Expression expr)
    Converts V2 expression to String representing a SQL expression.
Object compileValue(Object value)
    Converts value to SQL expression.
static final int GEOGRAPHY()
static final int GEOMETRY()
String getAddColumnQuery(String tableName, String columnName, String dataType)
scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md)
    Get the custom datatype mapping for the given jdbc meta information.
JdbcSQLQueryBuilder getJdbcSQLQueryBuilder(org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
    Returns the SQL builder for the SELECT statement.
scala.Option<JdbcType> getJDBCType(DataType dt)
    Retrieve the jdbc / sql type for a given datatype.
String getLimitClause(Integer limit)
    Returns the LIMIT clause for the SELECT statement.
String getRenameColumnQuery(String tableName, String columnName, String newName, int dbMajorVersion)
String getTableCommentQuery(String table, String comment)
String getUpdateColumnNullabilityQuery(String tableName, String columnName, boolean isNullable)
scala.Option<Object> isCascadingTruncateTable()
    Return Some[true] iff TRUNCATE TABLE causes cascading.
boolean isSupportedFunction(String funcName)
    Returns whether the database supports the function.
String renameTable(Identifier oldTable, Identifier newTable)
    Rename an existing table.
boolean supportsLimit()
    Returns true if the dialect supports the LIMIT clause.
Methods inherited from class org.apache.spark.sql.jdbc.JdbcDialect
alterTable, beforeFetch, classifyException, compileAggregate, convertJavaDateToDate, convertJavaTimestampToTimestamp, convertJavaTimestampToTimestampNTZ, convertTimestampNTZToJavaTimestamp, createConnectionFactory, createIndex, createSchema, createTable, dropIndex, dropSchema, dropTable, functions, getDayTimeIntervalAsMicros, getDeleteColumnQuery, getFullyQualifiedQuotedTableName, getOffsetClause, getSchemaCommentQuery, getSchemaQuery, getTableExistsQuery, getTableSample, getTruncateQuery, getTruncateQuery, getUpdateColumnTypeQuery, getYearMonthIntervalAsMonths, indexExists, insertIntoTable, listIndexes, listSchemas, quoteIdentifier, removeSchemaCommentQuery, renameTable, schemasExists, supportsOffset, supportsTableSample, updateExtraColumnMeta
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface scala.Equals
canEqual, equals
Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
Methods inherited from interface scala.Product
productArity, productElement, productElementName, productElementNames, productIterator, productPrefix
-
Constructor Details
-
MsSqlServerDialect
public MsSqlServerDialect()
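Although the constructor is public here, the usual way to reach this dialect is through the JdbcDialects registry, which is how Spark itself resolves dialects from JDBC URLs. A minimal spark-shell style sketch (the URL is a placeholder); the method examples below reuse this setup:

  import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

  // The SQL Server dialect is registered with Spark by default, so the registry
  // resolves it for any jdbc:sqlserver URL (placeholder host and database below).
  val url = "jdbc:sqlserver://localhost;databaseName=testdb"
  val dialect: JdbcDialect = JdbcDialects.get(url)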
-
-
Method Details
-
GEOMETRY
public static final int GEOMETRY()
-
GEOGRAPHY
public static final int GEOGRAPHY()
-
canHandle
public boolean canHandle(String url)
Description copied from class: JdbcDialect
Check if this dialect instance can handle a certain jdbc url.
- Specified by:
canHandle in class JdbcDialect
- Parameters:
url - the jdbc url.
- Returns:
True if the dialect can be applied on the given jdbc url.
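For illustration, a hedged spark-shell sketch (host names are placeholders) showing how URL matching is expected to behave, assuming the dialect is resolved through JdbcDialects as above:

  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // SQL Server JDBC URLs are accepted; URLs for other databases are handled by other dialects.
  dialect.canHandle("jdbc:sqlserver://prod-host:1433;databaseName=sales")   // expected: true
  dialect.canHandle("jdbc:mysql://prod-host:3306/sales")                    // expected: false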
-
compileValue
public Object compileValue(Object value)
Description copied from class: JdbcDialect
Converts value to SQL expression.
- Overrides:
compileValue in class JdbcDialect
- Parameters:
value - The value to be converted.
- Returns:
Converted value.
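A hedged sketch of rendering driver-side values as SQL literal text; the exact formatting (quoting, escaping, date and timestamp syntax) is dialect-specific, so the comments describe intent rather than guaranteed output:

  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // Strings are escaped and quoted so they can be embedded in a pushed-down predicate.
  val nameLiteral = dialect.compileValue("O'Brien")
  // Timestamps are rendered as SQL literals suitable for the target database.
  val tsLiteral = dialect.compileValue(java.sql.Timestamp.valueOf("2024-01-15 10:30:00"))
  println(s"$nameLiteral / $tsLiteral")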
-
isSupportedFunction
public boolean isSupportedFunction(String funcName)
Description copied from class: JdbcDialect
Returns whether the database supports the function.
- Overrides:
isSupportedFunction in class JdbcDialect
- Parameters:
funcName - Upper-cased function name
- Returns:
True if the database supports the function.
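A short sketch probing the pushdown allow-list; the function names below are examples of Spark's upper-cased V2 function names, and whether each one is supported by this dialect is exactly what the call reports:

  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // Names must be upper-cased, matching Spark's V2 expression pushdown conventions.
  Seq("ABS", "COALESCE", "LN", "MD5").foreach { fn =>
    println(s"$fn supported: ${dialect.isSupportedFunction(fn)}")
  }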
-
compileExpression
public scala.Option<String> compileExpression(Expression expr)
Description copied from class: JdbcDialect
Converts V2 expression to String representing a SQL expression.
- Overrides:
compileExpression in class JdbcDialect
- Parameters:
expr - The V2 expression to be converted.
- Returns:
Converted value.
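A hedged sketch built with the public connector expression helpers; the column name is a placeholder, and None simply means the dialect declines to translate the expression:

  import org.apache.spark.sql.connector.expressions.{Expression, Expressions, GeneralScalarExpression}
  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // ABS(amount) as a V2 scalar expression; Some(sqlText) if translatable, None otherwise.
  val abs = new GeneralScalarExpression("ABS", Array[Expression](Expressions.column("amount")))
  println(dialect.compileExpression(abs))
  // A bare column reference compiles to the quoted identifier when supported.
  println(dialect.compileExpression(Expressions.column("amount")))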
-
getCatalystType
public scala.Option<DataType> getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md)
Description copied from class: JdbcDialect
Get the custom datatype mapping for the given jdbc meta information.
Guidelines for mapping database defined timestamps to Spark SQL timestamps:
- TIMESTAMP WITHOUT TIME ZONE if preferTimestampNTZ -> TimestampNTZType
- TIMESTAMP WITHOUT TIME ZONE if !preferTimestampNTZ -> TimestampType (LTZ)
- TIMESTAMP WITH TIME ZONE -> TimestampType (LTZ)
- TIMESTAMP WITH LOCAL TIME ZONE -> TimestampType (LTZ)
- If the TIMESTAMP cannot be distinguished by sqlType and typeName, preferTimestampNTZ is respected for now, but we may need to add another option in the future if necessary.
- Overrides:
getCatalystType in class JdbcDialect
- Parameters:
sqlType - Refers to Types constants, or other constants defined by the target database, e.g. -101 is Oracle's TIMESTAMP WITH TIME ZONE type. This value is returned by ResultSetMetaData.getColumnType(int).
typeName - The column type name used by the database (e.g. "BIGINT UNSIGNED"). This is sometimes used to determine the target data type when sqlType is not sufficient if multiple database types are conflated into a single id. This value is returned by ResultSetMetaData.getColumnTypeName(int).
size - The size of the type, e.g. the maximum precision for numeric types, length for character strings, etc. This value is returned by ResultSetMetaData.getPrecision(int).
md - Result metadata associated with this type. This contains additional information from ResultSetMetaData or user specified options.
  - isTimestampNTZ: Whether to read a TIMESTAMP WITHOUT TIME ZONE value as TimestampNTZType or not. This is configured by JDBCOptions.preferTimestampNTZ.
  - scale: The length of the fractional part (ResultSetMetaData.getScale(int)).
- Returns:
An option of the actual DataType (a subclass of DataType) or None if the default type mapping should be used.
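A sketch of calling the mapping directly with values a JDBC driver would report via ResultSetMetaData; the type codes and names are illustrative, and Some(...) appears only where SQL Server needs a non-default mapping:

  import java.sql.Types
  import org.apache.spark.sql.jdbc.JdbcDialects
  import org.apache.spark.sql.types.MetadataBuilder

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // None means Spark falls back to the generic JDBC-to-Catalyst mapping.
  val smallint = dialect.getCatalystType(Types.SMALLINT, "smallint", 5, new MetadataBuilder())
  val varchar  = dialect.getCatalystType(Types.VARCHAR, "varchar", 255, new MetadataBuilder())
  println(s"smallint -> $smallint, varchar -> $varchar")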
-
getJDBCType
public scala.Option<JdbcType> getJDBCType(DataType dt)
Description copied from class: JdbcDialect
Retrieve the jdbc / sql type for a given datatype.
- Overrides:
getJDBCType in class JdbcDialect
- Parameters:
dt - The datatype (e.g. StringType)
- Returns:
The new JdbcType if there is an override for this DataType
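A sketch of the write-side mapping; JdbcType carries the database type name and JDBC type code Spark uses when creating SQL Server tables, and None means the generic mapping applies:

  import org.apache.spark.sql.jdbc.JdbcDialects
  import org.apache.spark.sql.types.{BooleanType, StringType, TimestampType}

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  Seq(BooleanType, StringType, TimestampType).foreach { dt =>
    dialect.getJDBCType(dt) match {
      case Some(jdbcType) => println(s"$dt -> ${jdbcType.databaseTypeDefinition}")
      case None           => println(s"$dt -> default JDBC mapping")
    }
  }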
-
isCascadingTruncateTable
public scala.Option<Object> isCascadingTruncateTable()
Description copied from class: JdbcDialect
Return Some[true] iff TRUNCATE TABLE causes cascading.
- Some[true]: TRUNCATE TABLE causes cascading.
- Some[false]: TRUNCATE TABLE does not cause cascading.
- None: The behavior of TRUNCATE TABLE is unknown (default).
- Overrides:
isCascadingTruncateTable in class JdbcDialect
- Returns:
(undocumented)
-
renameTable
public String renameTable(Identifier oldTable, Identifier newTable)
Description copied from class: JdbcDialect
Rename an existing table.
- Overrides:
renameTable in class JdbcDialect
- Parameters:
oldTable - The existing table.
newTable - New name of the table.
- Returns:
The SQL statement to use for renaming the table.
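A sketch of building (not executing) the rename statement; the "dbo" namespace and table names are placeholders:

  import org.apache.spark.sql.connector.catalog.Identifier
  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  val oldTable = Identifier.of(Array("dbo"), "employees_staging")
  val newTable = Identifier.of(Array("dbo"), "employees")
  // Only the SQL text is returned; Spark executes it elsewhere.
  println(dialect.renameTable(oldTable, newTable))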
-
getAddColumnQuery
public String getAddColumnQuery(String tableName, String columnName, String dataType)
- Overrides:
getAddColumnQuery in class JdbcDialect
-
getRenameColumnQuery
public String getRenameColumnQuery(String tableName, String columnName, String newName, int dbMajorVersion)
- Overrides:
getRenameColumnQuery in class JdbcDialect
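The two column DDL helpers above likewise only produce SQL text. A sketch with placeholder table, column, and version values (15 stands in for a SQL Server major version):

  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  // ALTER TABLE / sp_rename text tailored to SQL Server syntax; nothing is executed here.
  println(dialect.getAddColumnQuery("dbo.employees", "hire_date", "DATE"))
  println(dialect.getRenameColumnQuery("dbo.employees", "hire_date", "start_date", 15))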
-
getUpdateColumnNullabilityQuery
public String getUpdateColumnNullabilityQuery(String tableName, String columnName, boolean isNullable)
- Overrides:
getUpdateColumnNullabilityQuery in class JdbcDialect
-
getTableCommentQuery
public String getTableCommentQuery(String table, String comment)
- Overrides:
getTableCommentQuery in class JdbcDialect
-
getLimitClause
public String getLimitClause(Integer limit)
Description copied from class: JdbcDialect
Returns the LIMIT clause for the SELECT statement.
- Overrides:
getLimitClause in class JdbcDialect
- Parameters:
limit - (undocumented)
- Returns:
(undocumented)
-
classifyException
public AnalysisException classifyException(Throwable e, String errorClass, scala.collection.immutable.Map<String, String> messageParameters, String description)
Description copied from class: JdbcDialect
Gets a dialect exception, classifies it and wraps it by AnalysisException.
- Overrides:
classifyException in class JdbcDialect
- Parameters:
e - The dialect specific exception.
errorClass - The error class assigned in the case of an unclassified e
messageParameters - The message parameters of errorClass
description - The error description
- Returns:
AnalysisException or its sub-class.
-
getJdbcSQLQueryBuilder
public JdbcSQLQueryBuilder getJdbcSQLQueryBuilder(org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions options)
Description copied from class: JdbcDialect
Returns the SQL builder for the SELECT statement.
- Overrides:
getJdbcSQLQueryBuilder in class JdbcDialect
- Parameters:
options - (undocumented)
- Returns:
(undocumented)
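A heavily hedged sketch: JDBCOptions lives in an internal package, so the option keys ("url", "dbtable") and the fluent withColumns/withLimit/build calls are assumptions about internal API that may change between releases, and the SQL Server JDBC driver must be on the classpath for JDBCOptions to resolve the URL:

  import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions
  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  val options = new JDBCOptions(Map(
    "url" -> "jdbc:sqlserver://localhost;databaseName=testdb",
    "dbtable" -> "dbo.employees"))
  // The dialect-specific builder is expected to emit the limit as TOP rather than LIMIT.
  val select = dialect.getJdbcSQLQueryBuilder(options)
    .withColumns(Array("id", "name"))
    .withLimit(10)
    .build()
  println(select)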
-
supportsLimit
public boolean supportsLimit()
Description copied from class: JdbcDialect
Returns true if the dialect supports the LIMIT clause.
Note: Some built-in dialects support the LIMIT clause with some trick; see OracleDialect.OracleSQLQueryBuilder and MsSqlServerDialect.MsSqlServerSQLQueryBuilder.
- Overrides:
supportsLimit in class JdbcDialect
- Returns:
(undocumented)
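A brief sketch tying this to getLimitClause above: the dialect can report LIMIT support while emitting the actual limit elsewhere in the generated query (per the note about MsSqlServerSQLQueryBuilder), so the clause text itself may be empty:

  import org.apache.spark.sql.jdbc.JdbcDialects

  val dialect = JdbcDialects.get("jdbc:sqlserver://localhost;databaseName=testdb")
  println(dialect.supportsLimit())     // whether Spark may push a LIMIT to this dialect
  println(dialect.getLimitClause(10))  // clause text; may be empty if the limit surfaces as TOP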
-