Packages

  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.
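
    For example, a minimal sketch of pair-RDD operations becoming available through these implicit conversions (the local SparkContext below is assumed purely for illustration):

      import org.apache.spark.{SparkConf, SparkContext}

      // Hypothetical local setup, for illustration only.
      val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("pairs-example"))

      val pairs = sc.parallelize(Seq((1, 2), (1, 3), (2, 4)))   // RDD[(Int, Int)]
      val other = sc.parallelize(Seq((1, 10), (2, 20)))

      // groupByKey and join come from PairRDDFunctions via implicit conversion.
      pairs.groupByKey().collect().foreach(println)
      pairs.join(other).collect().foreach(println)

      sc.stop()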

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.
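
    A minimal sketch of such a relational query (the local SparkSession setup below is assumed purely for illustration):

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().master("local[*]").appName("sql-example").getOrCreate()
      import spark.implicits._

      // Register a tiny in-memory table and query it with SQL.
      Seq((1, "a"), (2, "b")).toDF("id", "name").createOrReplaceTempView("t")
      spark.sql("SELECT id, name FROM t WHERE id > 1").show()

      spark.stop()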

  • package exceptions
  • SqlScriptingException

org.apache.spark.sql.exceptions

SqlScriptingException

class SqlScriptingException extends Exception with SparkThrowable

Inherited
  1. SqlScriptingException
  2. SparkThrowable
  3. Exception
  4. Throwable
  5. Serializable
  6. AnyRef
  7. Any

Instance Constructors

  1. new SqlScriptingException(errorClass: String, cause: Throwable, origin: Origin, messageParameters: Map[String, String] = Map.empty)

Value Members

  1. final def addSuppressed(arg0: Throwable): Unit
    Definition Classes
    Throwable
  2. def fillInStackTrace(): Throwable
    Definition Classes
    Throwable
  3. def getCause(): Throwable
    Definition Classes
    Throwable
  4. def getErrorClass(): String
  5. def getLocalizedMessage(): String
    Definition Classes
    Throwable
  6. def getMessage(): String
    Definition Classes
    Throwable
  7. def getMessageParameters(): Map[String, String]
  8. def getQueryContext(): Array[QueryContext]
    Definition Classes
    SparkThrowable
  9. def getSqlState(): String
    Definition Classes
    SparkThrowable
  10. def getStackTrace(): Array[StackTraceElement]
    Definition Classes
    Throwable
  11. final def getSuppressed(): Array[Throwable]
    Definition Classes
    Throwable
  12. def initCause(arg0: Throwable): Throwable
    Definition Classes
    Throwable
  13. def isInternalError(): Boolean
    Definition Classes
    SparkThrowable
  14. val origin: Origin
  15. def printStackTrace(arg0: PrintWriter): Unit
    Definition Classes
    Throwable
  16. def printStackTrace(arg0: PrintStream): Unit
    Definition Classes
    Throwable
  17. def printStackTrace(): Unit
    Definition Classes
    Throwable
  18. def setStackTrace(arg0: Array[StackTraceElement]): Unit
    Definition Classes
    Throwable
  19. def toString(): String
    Definition Classes
    Throwable → AnyRef → Any
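
A hedged usage sketch: like any error implementing SparkThrowable, a SqlScriptingException exposes an error class, SQLSTATE, and message parameters through the accessors listed above. The snippet below catches a SparkThrowable raised by an intentionally failing query (the local SparkSession and the failing statement are assumptions for illustration; a SqlScriptingException itself would be raised during SQL script execution):

  import org.apache.spark.SparkThrowable
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder().master("local[*]").appName("error-example").getOrCreate()

  try {
    spark.sql("SELECT * FROM table_that_does_not_exist")
  } catch {
    case e: SparkThrowable =>
      // Accessors defined on SparkThrowable and shown in the member list above.
      println(s"error class: ${e.getErrorClass}")
      println(s"sqlState:    ${e.getSqlState}")
      println(s"internal:    ${e.isInternalError}")
      println(s"parameters:  ${e.getMessageParameters}")
  }

  spark.stop()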