Class

org.apache.spark.sql.jdbc

JdbcDialect

abstract class JdbcDialect extends Serializable

:: DeveloperApi :: Encapsulates everything (extensions, workarounds, quirks) needed to handle the SQL dialect of a certain database or JDBC driver. Lots of databases define types that aren't explicitly supported by the JDBC spec. Some JDBC drivers also report inaccurate information: for instance, BIT(n>1) is commonly reported as a BIT type even though BIT in JDBC is meant for single-bit values. Also, there does not appear to be a standard name for an unbounded string or binary type; we use BLOB and CLOB by default but override with database-specific alternatives when these are absent or do not behave correctly.

Currently, the only thing done by the dialect is type mapping. getCatalystType is used when reading from a JDBC table and getJDBCType is used when writing to a JDBC table. If getCatalystType returns None, the default type handling is used for the given JDBC type. Similarly, if getJDBCType returns None, the default type handling is used for the given Catalyst type.
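
As an illustrative sketch, a custom dialect overrides canHandle to claim a URL scheme plus the two type-mapping hooks, and registers itself through JdbcDialects.registerDialect. The "jdbc:exampledb" scheme and the specific mappings below are assumptions for the example, not a real driver:

    import java.sql.Types

    import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
    import org.apache.spark.sql.types._

    // Hypothetical dialect for a database reached via jdbc:exampledb URLs.
    object ExampleDialect extends JdbcDialect {

      // Dispatch on the URL prefix; Spark passes the full connection URL.
      override def canHandle(url: String): Boolean =
        url.startsWith("jdbc:exampledb")

      // Read side: a multi-bit BIT(n) column becomes BinaryType instead of
      // falling through to the single-bit default.
      override def getCatalystType(
          sqlType: Int,
          typeName: String,
          size: Int,
          md: MetadataBuilder): Option[DataType] =
        if (sqlType == Types.BIT && size > 1) Some(BinaryType) else None

      // Write side: use TEXT as the unbounded string type when writing.
      override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
        case StringType => Some(JdbcType("TEXT", Types.VARCHAR))
        case _          => None // fall back to the default mapping
      }
    }

    // Make the dialect visible to Spark's JDBC data source,
    // e.g. from an application's setup code.
    JdbcDialects.registerDialect(ExampleDialect)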

Annotations
@DeveloperApi()
Source
JdbcDialects.scala
Linear Supertypes
Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new JdbcDialect()

Abstract Value Members

  1. abstract def canHandle(url: String): Boolean

    Check if this dialect instance can handle a certain JDBC URL.

    url

    the JDBC URL.

    returns

    True if the dialect can be applied to the given JDBC URL.

    Exceptions thrown

    NullPointerException if the URL is null.
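
    For example, URL dispatch is typically a prefix check on the JDBC scheme; the built-in PostgreSQL dialect matches this way (the object name here is illustrative):

      import org.apache.spark.sql.jdbc.JdbcDialect

      // Claim every connection string using the PostgreSQL JDBC scheme.
      object PostgresLikeDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:postgresql")
      }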

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def beforeFetch(connection: Connection, properties: Map[String, String]): Unit

    Override connection-specific properties to run before a select is made. This is in place to allow dialects that need special treatment to optimize behavior.

    connection

    The connection object

    properties

    The connection properties. This is passed through from the relation.
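
    A sketch of why a dialect might hook in here, modeled on PostgreSQL-style cursor fetches; the URL prefix and the "fetchsize" property key are assumptions for the example. Some drivers only honor a cursor-based fetch size when autocommit is off, so the dialect disables it before the select runs:

      import java.sql.Connection

      import org.apache.spark.sql.jdbc.JdbcDialect

      object StreamingFetchDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:exampledb")

        // Turn off autocommit when a positive fetch size is requested, so the
        // driver can stream rows instead of buffering the whole result set.
        override def beforeFetch(connection: Connection, properties: Map[String, String]): Unit = {
          super.beforeFetch(connection, properties)
          if (properties.getOrElse("fetchsize", "0").toInt > 0) {
            connection.setAutoCommit(false)
          }
        }
      }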

  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. def getCatalystType(sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType]

    Get the custom datatype mapping for the given JDBC meta information.

    sqlType

    The SQL type (see java.sql.Types)

    typeName

    The SQL type name (e.g. "BIGINT UNSIGNED")

    size

    The size of the type.

    md

    Result metadata associated with this type.

    returns

    The actual DataType (a subclass of org.apache.spark.sql.types.DataType), or None if the default type mapping should be used.
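
    For instance, a read-side sketch that widens MySQL-style "BIGINT UNSIGNED" columns so values above Long.MaxValue cannot overflow the signed LongType default (the dialect object and URL prefix are illustrative):

      import java.sql.Types

      import org.apache.spark.sql.jdbc.JdbcDialect
      import org.apache.spark.sql.types.{DataType, DecimalType, MetadataBuilder}

      object UnsignedAwareDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:exampledb")

        // DecimalType(20, 0) holds the full unsigned 64-bit range.
        override def getCatalystType(
            sqlType: Int,
            typeName: String,
            size: Int,
            md: MetadataBuilder): Option[DataType] =
          if (sqlType == Types.BIGINT && typeName == "BIGINT UNSIGNED") {
            Some(DecimalType(20, 0))
          } else {
            None // use the default mapping
          }
      }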

  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def getJDBCType(dt: DataType): Option[JdbcType]

    Retrieve the JDBC/SQL type for a given datatype.

    dt

    The datatype (e.g. org.apache.spark.sql.types.StringType)

    returns

    The new JdbcType if there is an override for this DataType, or None if the default type mapping should be used.
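
    A write-side sketch; mapping BooleanType to CHAR(1) is a workaround for databases without a native boolean type, and the object and URL prefix are illustrative:

      import java.sql.Types

      import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
      import org.apache.spark.sql.types.{BooleanType, DataType}

      object BooleanAsCharDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:exampledb")

        // Generated DDL will declare BooleanType columns as CHAR(1).
        override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
          case BooleanType => Some(JdbcType("CHAR(1)", Types.CHAR))
          case _           => None // defer to the default mapping
        }
      }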

  13. def getTableExistsQuery(table: String): String

    Get the SQL query that should be used to find if the given table exists. Dialects can override this method to return a query that works best in a particular database.

    table

    The name of the table.

    returns

    The SQL query to use for checking the table.
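
    A sketch of an override; the LIMIT 1 probe touches at most one row, which some databases handle more cheaply than a zero-row SELECT * ... WHERE 1=0 default (the object and URL prefix are illustrative):

      import org.apache.spark.sql.jdbc.JdbcDialect

      object Limit1ProbeDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:exampledb")

        // Existence check that reads at most a single row.
        override def getTableExistsQuery(table: String): String =
          s"SELECT 1 FROM $table LIMIT 1"
      }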

  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. def quoteIdentifier(colName: String): String

    Quotes the identifier. This is used to put quotes around the identifier in case the column name is a reserved keyword, or in case it contains characters that require quotes (e.g. space).
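
    A sketch replacing the double-quote default with backtick quoting, which is the MySQL convention (the object and URL prefix are illustrative):

      import org.apache.spark.sql.jdbc.JdbcDialect

      object BacktickQuotingDialect extends JdbcDialect {
        override def canHandle(url: String): Boolean =
          url.startsWith("jdbc:exampledb")

        // MySQL-style identifier quoting: `colName` instead of "colName".
        override def quoteIdentifier(colName: String): String =
          s"`$colName`"
      }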

  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
