org.apache.spark.sql.test

TestSQLContext

object TestSQLContext extends SQLContext

A SQLContext that can be used for local testing.
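
For example, a test can import this object's members directly to get a SQLContext backed by a local SparkContext. A minimal sketch, assuming the Spark 1.0-era SchemaRDD API and a hypothetical Record case class:

    import org.apache.spark.sql.test.TestSQLContext._

    case class Record(key: Int, value: String)

    // createSchemaRDD (inherited from SQLContext) implicitly turns the
    // RDD[Record] into a SchemaRDD, so registerAsTable is available.
    val rdd = sparkContext.parallelize((1 to 10).map(i => Record(i, "val_" + i)))
    rdd.registerAsTable("records")

    assert(sql("SELECT COUNT(*) FROM records").collect().head.getLong(0) == 10)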

Linear Supertypes
SQLContext, Serializable, Serializable, ExpressionConversions, com.typesafe.scalalogging.slf4j.Logging, AnyRef, Any

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  6. abstract class QueryExecution extends AnyRef

    The primary workflow for executing relational queries using Spark.

  7. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
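
The Dsl* implicit classes above, together with the implicit literal conversions listed under Value Members, form a small expression DSL over Scala symbols. A sketch of how the pieces compose, assuming a hypothetical Person case class and the SchemaRDD query methods (where, select) from this API version:

    import org.apache.spark.sql.test.TestSQLContext._

    case class Person(name: String, age: Int)
    val people = sparkContext.parallelize(
      Seq(Person("Alice", 30), Person("Bob", 17)))

    // 'age and 'name become attribute references via DslSymbol /
    // symbolToUnresolvedAttribute; 21 becomes a Literal via intToLiteral,
    // so 'age > 21 is a catalyst expression tree.
    people.where('age > 21).select('name).collect()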

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  9. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  10. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  11. def cacheTable(tableName: String): Unit

    Caches the specified table in-memory.
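
    For example, assuming a table named "records" has already been registered:

    cacheTable("records")
    // Subsequent scans of "records" are served from the in-memory columnar cache.
    sql("SELECT COUNT(*) FROM records").collect()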

    Definition Classes
    SQLContext
  12. lazy val catalog: Catalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  13. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  14. def createParquetFile[A <: Product](path: String, allowExisting: Boolean = true, conf: Configuration = new Configuration())(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    :: Experimental :: Creates an empty parquet file with the schema of class A, which can be registered as a table. This registered table can be used as the target of future insertInto operations.

    val sqlContext = new SQLContext(...)
    import sqlContext._
    
    case class Person(name: String, age: Int)
    createParquetFile[Person]("path/to/file.parquet").registerAsTable("people")
    sql("INSERT INTO people SELECT 'michael', 29")
    A

    A case class type that describes the desired schema of the parquet file to be created.

    path

    The path where the directory containing parquet metadata should be created. Data inserted into this table will also be stored at this location.

    allowExisting

    When false, an exception will be thrown if this directory already exists.

    conf

    A Hadoop configuration object that can be used to specify options to the parquet output format.

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  15. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Creates a SchemaRDD from an RDD of case classes.
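
    Because the conversion is implicit, an RDD of case classes can be used wherever a SchemaRDD is expected. A sketch with a hypothetical Person class:

    case class Person(name: String, age: Int)
    val rdd = sparkContext.parallelize(Seq(Person("Alice", 30)))
    // createSchemaRDD is applied implicitly, making registerAsTable available.
    rdd.registerAsTable("people")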

    Definition Classes
    SQLContext
  16. implicit def decimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  17. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  18. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  20. def executePlan(plan: LogicalPlan): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  21. def executeSql(sql: String): QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  22. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  23. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  24. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  25. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  26. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    Logging
  29. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    :: DeveloperApi :: Allows catalyst LogicalPlans to be executed as a SchemaRDD. Note that the LogicalPlan interface is considered internal, and thus not guaranteed to be stable. As a result, using them directly is not recommended.

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  30. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  31. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  32. final def notify(): Unit

    Definition Classes
    AnyRef
  33. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  34. val optimizer: Optimizer.type

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  35. def parquetFile(path: String): SchemaRDD

    Loads a Parquet file, returning the result as a SchemaRDD.
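
    For example, assuming Parquet data was previously written to the given path:

    val parquetData = parquetFile("path/to/file.parquet")
    parquetData.registerAsTable("parquetTable")
    sql("SELECT * FROM parquetTable").collect()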

    Definition Classes
    SQLContext
  36. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  37. val parser: SqlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  38. val planner: SparkPlanner

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  39. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Prepares a planned SparkPlan for execution by binding references to specific ordinals, and inserting shuffle operations as needed.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  40. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Registers the given RDD as a temporary table in the catalog. Temporary tables exist only during the lifetime of this instance of SQLContext.
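
    A sketch, assuming an existing SchemaRDD (registerAsTable on SchemaRDD provides the same behavior as a convenience):

    val people = parquetFile("path/to/people.parquet") // hypothetical data
    registerRDDAsTable(people, "people")
    // "people" is visible only for the lifetime of this SQLContext.
    sql("SELECT COUNT(*) FROM people").collect()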

    Definition Classes
    SQLContext
  41. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  42. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  43. def sql(sqlText: String): SchemaRDD

    Executes a SQL query using Spark, returning the result as a SchemaRDD.
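
    The result is a SchemaRDD, so ordinary RDD operations also apply. A sketch assuming a registered people table with name and age columns:

    val teenagers = sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")
    teenagers.map(row => "Name: " + row(0)).collect().foreach(println)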

    Definition Classes
    SQLContext
  44. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  45. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  46. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  47. def table(tableName: String): SchemaRDD

    Returns the specified table as a SchemaRDD.
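
    For example, assuming "people" was registered earlier:

    val people = table("people") // roughly equivalent to sql("SELECT * FROM people")
    people.count()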

    Definition Classes
    SQLContext
  48. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  49. def toString(): String

    Definition Classes
    AnyRef → Any
  50. def uncacheTable(tableName: String): Unit

    Removes the specified table from the in-memory cache.
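
    For example, to release a table cached earlier with cacheTable:

    uncacheTable("records")
    // "records" stays registered; only the in-memory copy is dropped.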

    Definition Classes
    SQLContext
  51. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  52. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  53. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
