Package org.apache.spark.sql.jdbc

package jdbc


Type Members

  1. abstract class JdbcDialect extends Serializable

    :: DeveloperApi :: Encapsulates everything (extensions, workarounds, quirks) to handle the SQL dialect of a certain database or JDBC driver. Lots of databases define types that aren't explicitly supported by the JDBC spec. Some JDBC drivers also report inaccurate information; for instance, BIT(n>1) being reported as a BIT type is quite common, even though BIT in JDBC is meant for single-bit values. Also, there does not appear to be a standard name for an unbounded string or binary type; we use BLOB and CLOB by default but override with database-specific alternatives when these are absent or do not behave correctly.

    Currently, the only thing done by the dialect is type mapping. getCatalystType is used when reading from a JDBC table and getJDBCType is used when writing to a JDBC table. If getCatalystType returns None, the default type handling is used for the given JDBC type. Similarly, if getJDBCType returns None, the default type handling is used for the given Catalyst type. A minimal sketch of a custom dialect is shown after this member list.

    Annotations
    @DeveloperApi() @Evolving()
  2. case class JdbcType(databaseTypeDefinition: String, jdbcNullType: Int) extends Product with Serializable

    :: DeveloperApi :: A database type definition coupled with the jdbc type needed to send null values to the database.

    databaseTypeDefinition
      The database type definition.

    jdbcNullType
      The JDBC type (as defined in java.sql.Types) used to send a null value to the database.

    Annotations
    @DeveloperApi() @Evolving()
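
As an illustration of the members above, here is a minimal sketch of a custom dialect, not taken from Spark itself: the dialect name, the jdbc:mydb URL prefix, and the specific type mappings are assumptions chosen for the example. It overrides canHandle, getCatalystType, and getJDBCType, and constructs JdbcType values to describe write-side types.

  import java.sql.Types

  import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
  import org.apache.spark.sql.types._

  // Hypothetical dialect for databases reachable through "jdbc:mydb" URLs.
  object MyDatabaseDialect extends JdbcDialect {

    // Claim only the URLs this dialect understands.
    override def canHandle(url: String): Boolean = url.startsWith("jdbc:mydb")

    // Read side: map a reported JDBC type to a Catalyst type.
    // Returning None keeps the default type handling.
    override def getCatalystType(
        sqlType: Int,
        typeName: String,
        size: Int,
        md: MetadataBuilder): Option[DataType] = {
      // Example workaround: treat multi-bit BIT(n) columns as binary data.
      if (sqlType == Types.BIT && size > 1) Some(BinaryType) else None
    }

    // Write side: map a Catalyst type to a database type definition plus the
    // java.sql.Types code used when sending null values.
    override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
      case StringType => Some(JdbcType("TEXT", Types.CLOB))
      case BinaryType => Some(JdbcType("BYTEA", Types.BLOB))
      case _          => None
    }
  }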

Value Members

  1. object JdbcDialects

    :: DeveloperApi :: Registry of dialects that apply to every new jdbc org.apache.spark.sql.DataFrame.

    If multiple matching dialects are registered then all matching ones will be tried in reverse order. A user-added dialect will thus be applied first, overwriting the defaults.

    Annotations
    @DeveloperApi() @Evolving()
    Note

    All new dialects are applied to new jdbc DataFrames only. Make sure to register your dialects first; a registration sketch follows below.
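
Registration can be sketched as follows, assuming the MyDatabaseDialect object from the Type Members example above; the connection URL and table name are placeholders.

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.jdbc.JdbcDialects

  val spark = SparkSession.builder().appName("jdbc-dialect-example").getOrCreate()

  // Register before any JDBC DataFrame that should use the dialect is created.
  JdbcDialects.registerDialect(MyDatabaseDialect)

  // The registered dialect is selected because its canHandle matches this URL.
  val df = spark.read
    .format("jdbc")
    .option("url", "jdbc:mydb://host:5432/example")
    .option("dbtable", "public.events")
    .load()

  // A dialect can also be removed again.
  JdbcDialects.unregisterDialect(MyDatabaseDialect)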
