Packages

  • package root
    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions; see the sketch after this package list.

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

    Definition Classes
    apache
  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.

    Definition Classes
    spark
  • package api

    Contains API classes that are specific to a single language (i.e. Java).

    Definition Classes
    sql
  • package artifact
    Definition Classes
    sql
  • package avro
    Definition Classes
    sql
  • package catalog
    Definition Classes
    sql
  • package catalyst
    Definition Classes
    sql
  • package columnar
    Definition Classes
    sql
  • package connector
    Definition Classes
    sql
  • package exceptions
    Definition Classes
    sql
  • package expressions
    Definition Classes
    sql
  • package jdbc
    Definition Classes
    sql
  • JdbcConnectionProvider
  • JdbcDialect
  • JdbcDialects
  • JdbcSQLQueryBuilder
  • JdbcType
  • NoLegacyJDBCError
  • package ml
    Definition Classes
    sql
  • package scripting
    Definition Classes
    sql
  • package sources

    A set of APIs for adding data sources to Spark SQL.

    Definition Classes
    sql
  • package streaming
    Definition Classes
    sql
  • package types

    Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.

    Definition Classes
    sql
  • package util
    Definition Classes
    sql
  • package vectorized
    Definition Classes
    sql
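
    A minimal sketch of the implicit conversions mentioned in the org.apache.spark package notes above (local mode only; the app name and master URL are illustrative):

      import org.apache.spark.{SparkConf, SparkContext}

      val sc = new SparkContext(new SparkConf().setAppName("pairs-example").setMaster("local[*]"))
      val pairs = sc.parallelize(Seq((1, "a"), (1, "b"), (2, "c")))
      // groupByKey is defined on org.apache.spark.rdd.PairRDDFunctions, not on RDD
      // itself; it becomes available here because the element type is a pair.
      val grouped = pairs.groupByKey().collect()
      sc.stop()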

package jdbc


Type Members

  1. abstract class JdbcConnectionProvider extends AnyRef

    ::DeveloperApi:: Connection provider which opens connections toward various databases (a database-specific instance is needed). If any authentication is required, it is the provider's responsibility to set all the parameters. Note that connection providers within a JVM may be used from multiple threads, so adding internal state is not advised; if any state is added, it must be synchronized properly.

    Annotations
    @DeveloperApi() @Unstable()
    Since

    3.1.0
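
    A minimal sketch of a custom provider, assuming the member signatures of recent Spark releases (the provider name, option keys, and URL handling below are hypothetical). Providers are typically discovered through Java's ServiceLoader mechanism, i.e. a META-INF/services entry for org.apache.spark.sql.jdbc.JdbcConnectionProvider:

      import java.sql.{Connection, Driver}
      import java.util.Properties
      import org.apache.spark.sql.jdbc.JdbcConnectionProvider

      class ExampleConnectionProvider extends JdbcConnectionProvider {
        // Unique provider name; the class is kept stateless, per the
        // thread-safety note above.
        override val name: String = "example"

        // Claim only connections that explicitly request this provider.
        override def canHandle(driver: Driver, options: Map[String, String]): Boolean =
          options.get("connectionProvider").contains(name)

        // Opening the connection, including any authentication parameters,
        // is entirely this provider's responsibility.
        override def getConnection(driver: Driver, options: Map[String, String]): Connection = {
          val props = new Properties()
          options.foreach { case (k, v) => props.setProperty(k, v) }
          driver.connect(options("url"), props)
        }

        // Whether opening a connection mutates JVM-global security state
        // (e.g. a JAAS login); declared in newer Spark releases.
        override def modifiesSecurityContext(driver: Driver, options: Map[String, String]): Boolean =
          false
      }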

  2. abstract class JdbcDialect extends Serializable with Logging

    :: DeveloperApi :: Encapsulates everything (extensions, workarounds, quirks) needed to handle the SQL dialect of a certain database or JDBC driver. Lots of databases define types that aren't explicitly supported by the JDBC spec. Some JDBC drivers also report inaccurate information---for instance, BIT(n>1) being reported as a BOOLEAN type is quite common, even though BIT in JDBC is meant for single-bit values. Also, there does not appear to be a standard name for an unbounded string or binary type; we use BLOB and CLOB by default but override with database-specific alternatives when these are absent or do not behave correctly.

    Currently, the only thing done by the dialect is type mapping. getCatalystType is used when reading from a JDBC table and getJDBCType is used when writing to a JDBC table. If getCatalystType returns null, the default type handling is used for the given JDBC type. Similarly, if getJDBCType returns (null, None), the default type handling is used for the given Catalyst type.

    Annotations
    @DeveloperApi()
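
    A minimal sketch of a custom dialect, assuming a hypothetical "jdbc:mydb" URL scheme; the two type mappings are illustrative only:

      import java.sql.Types
      import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcType}
      import org.apache.spark.sql.types._

      object MyDbDialect extends JdbcDialect {
        // Claim URLs of the form jdbc:mydb://...
        override def canHandle(url: String): Boolean = url.startsWith("jdbc:mydb")

        // Reading: map a quirky JDBC type to a Catalyst type;
        // None falls back to Spark's default handling.
        override def getCatalystType(
            sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] =
          if (sqlType == Types.BIT && size > 1) Some(BinaryType) else None

        // Writing: map a Catalyst type to a database type definition;
        // None falls back to Spark's default handling.
        override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
          case StringType => Some(JdbcType("TEXT", Types.VARCHAR))
          case _          => None
        }
      }
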
  3. class JdbcSQLQueryBuilder extends AnyRef

    A builder for constructing a single SELECT query.

    Note: All the withXXX methods will be invoked at most once. The invocation order does not matter, as all these clauses follow the natural SQL order: sample the table first, then filter, then group by, then sort, then offset, then limit.

    Since

    3.5.0

  4. case class JdbcType(databaseTypeDefinition: String, jdbcNullType: Int) extends Product with Serializable

    :: DeveloperApi :: A database type definition coupled with the JDBC type needed to send null values to the database.

    databaseTypeDefinition
      The database type definition.
    jdbcNullType
      The JDBC type (as defined in java.sql.Types) used to send a null value to the database.

    Annotations
    @DeveloperApi()
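
    For instance, a minimal sketch (the DECIMAL definition is illustrative):

      import java.sql.Types
      import org.apache.spark.sql.jdbc.JdbcType

      // "DECIMAL(38,18)" appears in generated DDL; Types.DECIMAL is the code
      // used with PreparedStatement.setNull for null values in such a column.
      val decimalType = JdbcType("DECIMAL(38,18)", Types.DECIMAL)
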
  5. trait NoLegacyJDBCError extends JdbcDialect

    Makes the classifyException method throw the original exception.
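
    A minimal sketch of mixing the trait into a dialect (the URL scheme is hypothetical; canHandle is the only member a dialect must implement):

      import org.apache.spark.sql.jdbc.{JdbcDialect, NoLegacyJDBCError}

      // Default dialect behavior everywhere, except that classifyException
      // now surfaces the driver's original exception.
      object ExampleDialect extends JdbcDialect with NoLegacyJDBCError {
        override def canHandle(url: String): Boolean = url.startsWith("jdbc:exampledb")
      }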

Value Members

  1. object JdbcDialects

    :: DeveloperApi :: Registry of dialects that apply to every new jdbc org.apache.spark.sql.DataFrame.

    If multiple matching dialects are registered then all matching ones will be tried in reverse order. A user-added dialect will thus be applied first, overwriting the defaults.

    Annotations
    @DeveloperApi()
    Note

    All new dialects are applied to new jdbc DataFrames only. Make sure to register your dialects first.
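
    A minimal sketch of registering a custom dialect (reusing the hypothetical MyDbDialect from the JdbcDialect entry above) before any JDBC DataFrame is created:

      import org.apache.spark.sql.jdbc.JdbcDialects

      // Register before the first JDBC read so the dialect takes effect;
      // a user-added dialect is tried before the built-in defaults.
      JdbcDialects.registerDialect(MyDbDialect)

      // ... spark.read.format("jdbc") ... now runs with MyDbDialect applied ...

      // A dialect can also be removed:
      JdbcDialects.unregisterDialect(MyDbDialect)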
