org.apache.spark.sql

package sql

Allows the execution of relational queries, including those expressed in SQL using Spark.

Linear Supertypes
AnyRef, Any

Type Members

  1. class AnalysisException extends Exception with Serializable

    :: DeveloperApi :: Thrown when a query fails to analyze, usually because the query itself is invalid.

  2. class Column extends Logging

    :: Experimental :: A column in a DataFrame.

  3. class ColumnName extends Column

    :: Experimental :: A convenient class used for constructing schema.

  4. class DataFrame extends Serializable

    :: Experimental :: A distributed collection of data organized into named columns.

  5. final class DataFrameNaFunctions extends AnyRef

    :: Experimental :: Functionality for working with missing data in DataFrames.

  6. class DataFrameReader extends Logging

    :: Experimental :: Interface used to load a DataFrame from external storage systems (e.g. file systems, key-value stores).

  7. final class DataFrameStatFunctions extends AnyRef

    :: Experimental :: Statistic functions for DataFrames.

  8. final class DataFrameWriter extends AnyRef

    :: Experimental :: Interface used to write a DataFrame to external storage systems (e.g. file systems, key-value stores).

  9. class ExperimentalMethods extends AnyRef

    :: Experimental :: Holder for experimental methods for the bravest.

  10. class GroupedData extends AnyRef

    :: Experimental :: A set of methods for aggregations on a DataFrame, created by DataFrame.groupBy.

  11. trait Row extends Serializable

    Represents one row of output from a relational operator.

  12. class RowFactory extends AnyRef

  13. class SQLContext extends Logging with Serializable

    The entry point for working with structured data (rows and columns) in Spark.

  14. class SaveMode extends Enum[SaveMode]

  15. type Strategy = GenericStrategy[SparkPlan]

    Converts a logical plan into zero or more SparkPlans. This API is exposed for experimenting with the query planner and is not designed to be stable across Spark releases. Developers writing libraries should instead consider using the stable APIs provided in org.apache.spark.sql.sources.

    Annotations
    @DeveloperApi()
  16. class UDFRegistration extends Logging

    Functions for registering user-defined functions.

  17. case class UserDefinedFunction(f: AnyRef, dataType: DataType, inputTypes: Seq[DataType] = Nil) extends Product with Serializable

    A user-defined function.

  18. type SchemaRDD = DataFrame

    Type alias for DataFrame. Kept here for backward source compatibility for Scala.

    Annotations
    @deprecated
    Deprecated

    (Since version 1.3.0) use DataFrame
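
A minimal sketch of how the types above fit together, assuming a Spark 1.x application with spark-sql on the classpath and a local master (the app name and column names are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("sql-sketch").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)   // SQLContext: the entry point (13)
import sqlContext.implicits._         // enables $"col" syntax and .toDF

// A DataFrame (4) built from a local collection:
val df = Seq(("alice", 1), ("bob", 2)).toDF("name", "count")

// Column expressions (2) and aggregation via GroupedData (10):
val filtered = df.filter($"count" > 1)
val grouped  = df.groupBy($"name").sum("count")

// DataFrameReader (6) / DataFrameWriter (8) for external storage, e.g.:
// val loaded = sqlContext.read.json("path/to/file.json")
// filtered.write.parquet("path/to/out")
```

The `import sqlContext.implicits._` line is what brings `toDF` and the `$"name"` column shorthand into scope; without it only the explicit `Column` constructors are available.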

Value Members

  1. object Row extends Serializable

  2. object SQLContext extends Serializable

    This SQLContext object contains utility functions to create a singleton SQLContext instance, or to get the last created SQLContext instance.

  3. package api

    Contains API classes that are specific to a single language (i.e. Java).

  4. package expressions

  5. object functions

    :: Experimental :: Functions available for DataFrame.

  6. package hive

    Support for running Spark SQL queries using functionality from Apache Hive (does not require an existing Hive installation).

  7. package jdbc

  8. package sources

    A set of APIs for adding data sources to Spark SQL.

  9. package types

    Contains a type system for attributes produced by relations, including complex types like structs, arrays and maps.
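
As a small sketch of the value members above, Row instances can be constructed and inspected directly through the Row companion object, and the functions object supplies column expressions for DataFrame operations (the DataFrame `df` in the commented lines is a hypothetical one with a "count" column):

```scala
import org.apache.spark.sql.Row

// Constructing and inspecting a Row (trait 11 and its companion object):
val row = Row("alice", 1, true)
val name  = row.getString(0)   // "alice"
val count = row.getInt(1)      // 1
val flag  = row.getBoolean(2)  // true

// The functions object (5) collects column expressions, e.g.:
// import org.apache.spark.sql.functions._
// df.select(avg(col("count")), max(col("count")))
```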
