org.apache.spark.sql.hive.test

TestHiveContext

class TestHiveContext extends HiveContext

A locally running test instance of Spark's Hive execution engine.

Data from testTables will be automatically loaded whenever a query is run over those tables. Calling reset will delete all tables and other state in the database, leaving the database in a "clean" state.

TestHive is the singleton object version of this class because instantiating multiple copies of the Hive metastore seems to lead to weird non-deterministic failures. Therefore, the execution of test cases that rely on TestHive must be serialized.
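
A minimal usage sketch (src is one of the Hive test tables this context registers; the query itself is illustrative):

    import org.apache.spark.sql.hive.test.TestHive
    import TestHive._

    // Querying a test table triggers loading of its data on demand.
    val rows = sql("SELECT key, value FROM src").collect()

    // Return the metastore and warehouse to a "clean" state between suites.
    TestHive.reset()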

Self Type
TestHiveContext
Linear Supertypes
HiveContext, SQLContext, Serializable, Serializable, UDFRegistration, ExpressionConversions, CacheManager, SQLConf, Logging, AnyRef, Any

Instance Constructors

  1. new TestHiveContext(sc: SparkContext)

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. class HiveQLQueryExecution extends QueryExecution

    Attributes
    protected[org.apache.spark.sql.hive]
  6. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  7. abstract class QueryExecution extends TestHiveContext.QueryExecution

    Overrides QueryExecution with a special debug workflow.

  8. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. implicit class SqlCmd extends AnyRef

    Attributes
    protected[org.apache.spark.sql.hive]
  10. case class TestTable(name: String, commands: () ⇒ Unit*) extends Product with Serializable

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def abs(e: Expression): Abs

    Definition Classes
    ExpressionConversions
  7. def analyze(tableName: String): Unit

    Analyzes the given table in the current database to generate statistics, which will be used in query optimizations.

    Right now, it only supports Hive tables and it only updates the size of a Hive table in the Hive metastore.

    Definition Classes
    HiveContext
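
    A minimal sketch (assuming the src Hive test table; loadTestTable is specific to TestHiveContext):

    loadTestTable("src")   // make sure the Hive table exists and is populated
    analyze("src")         // writes the table's size statistics to the Hive metastore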
  8. lazy val analyzer: Analyzer { val extendedRules: List[org.apache.spark.sql.catalyst.rules.Rule[org.apache.spark.sql.catalyst.plans.logical.LogicalPlan]] }

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  9. def applySchema(rowRDD: RDD[Row], schema: StructType): SchemaRDD

    :: DeveloperApi :: Creates a SchemaRDD from an RDD containing Rows by applying a schema to this RDD. It is important to make sure that the structure of every Row of the provided RDD matches the provided schema. Otherwise, a runtime exception will be thrown. Example:

    import org.apache.spark.sql._
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    
    val schema =
      StructType(
        StructField("name", StringType, false) ::
        StructField("age", IntegerType, true) :: Nil)
    
    val people =
      sc.textFile("examples/src/main/resources/people.txt").map(
        _.split(",")).map(p => Row(p(0), p(1).trim.toInt))
    val peopleSchemaRDD = sqlContext.applySchema(people, schema)
    peopleSchemaRDD.printSchema
    // root
    // |-- name: string (nullable = false)
    // |-- age: integer (nullable = true)
    
    peopleSchemaRDD.registerTempTable("people")
    sqlContext.sql("select name from people").collect.foreach(println)
    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  10. def approxCountDistinct(e: Expression, rsd: Double): ApproxCountDistinct

    Definition Classes
    ExpressionConversions
  11. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  12. def avg(e: Expression): Average

    Definition Classes
    ExpressionConversions
  13. implicit def baseRelationToSchemaRDD(baseRelation: BaseRelation): SchemaRDD

    Definition Classes
    SQLContext
  14. implicit def bigDecimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  15. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  16. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  17. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  18. def cacheTable(tableName: String): Unit

    Caches the specified table in-memory.

    Definition Classes
    CacheManager
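
    A small sketch of the caching round trip (the table name is illustrative):

    cacheTable("src")
    assert(isCached("src"))
    sql("SELECT COUNT(*) FROM src").collect()   // served from the in-memory columnar cache
    uncacheTable("src")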
  19. var cacheTables: Boolean

  20. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  21. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. def configure(): Unit

    Sets up the system initially or after a RESET command.

    Attributes
    protected
  23. def count(e: Expression): Count

    Definition Classes
    ExpressionConversions
  24. def countDistinct(e: Expression*): CountDistinct

    Definition Classes
    ExpressionConversions
  25. def createParquetFile[A <: Product](path: String, allowExisting: Boolean = true, conf: Configuration = new Configuration())(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    :: Experimental :: Creates an empty parquet file with the schema of class A, which can be registered as a table. This registered table can be used as the target of future insertInto operations.

    val sqlContext = new SQLContext(...)
    import sqlContext._
    
    case class Person(name: String, age: Int)
    createParquetFile[Person]("path/to/file.parquet").registerTempTable("people")
    sql("INSERT INTO people SELECT 'michael', 29")
    A

    A case class type that describes the desired schema of the parquet file to be created.

    path

    The path where the directory containing parquet metadata should be created. Data inserted into this table will also be stored at this location.

    allowExisting

    When false, an exception will be thrown if this directory already exists.

    conf

    A Hadoop configuration object that can be used to specify options to the parquet output format.

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  26. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Creates a SchemaRDD from an RDD of case classes.

    Definition Classes
    SQLContext
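
    A minimal sketch of the implicit conversion (the case class and data are illustrative):

    case class Person(name: String, age: Int)
    val people = sparkContext.parallelize(Seq(Person("alice", 30), Person("bob", 25)))

    // createSchemaRDD converts RDD[Person] into a SchemaRDD, so SchemaRDD methods are available directly.
    people.registerTempTable("people")
    sql("SELECT name FROM people WHERE age > 26").collect()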
  27. def createTable[A <: Product](tableName: String, allowExisting: Boolean = true)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Unit

    Creates a table using the schema of the given class.

    A

    A case class that is used to describe the schema of the table to be created.

    tableName

    The name of the table to create.

    allowExisting

    When false, an exception will be thrown if the table already exists.

    Definition Classes
    HiveContext
  28. implicit def dateToLiteral(d: Date): Literal

    Definition Classes
    ExpressionConversions
  29. val ddlParser: DDLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  30. implicit def decimalToLiteral(d: Decimal): Literal

    Definition Classes
    ExpressionConversions
  31. val describedTable: Regex

  32. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  33. def dropTempTable(tableName: String): Unit

    Drops the temporary table with the given table name in the catalog. If the table has been cached/persisted before, it's also unpersisted.

    tableName

    the name of the table to be unregistered.

    Definition Classes
    SQLContext
  34. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  35. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  36. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  37. def executePlan(plan: LogicalPlan): QueryExecution

    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  38. def executeSql(sql: String): TestHiveContext.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  39. var extraStrategies: Seq[Strategy]

    :: DeveloperApi :: Allows extra strategies to be injected into the query planner at runtime. Note this API should be considered experimental and is not intended to be stable across releases.

    Definition Classes
    SQLContext
  40. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  41. def first(e: Expression): First

    Definition Classes
    ExpressionConversions
  42. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  43. lazy val functionRegistry: HiveFunctionRegistry with OverrideFunctionRegistry

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  44. def getAllConfs: Map[String, String]

    Return all the configuration properties that have been set (i.e. not the default). This creates a new copy of the config properties in the form of a Map.

    Definition Classes
    SQLConf
  45. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  46. def getConf(key: String, defaultValue: String): String

    Return the value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue.

    Definition Classes
    SQLConf
  47. def getConf(key: String): String

    Return the value of Spark SQL configuration property for the given key.

    Definition Classes
    SQLConf
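
    A small sketch of reading and writing configuration (the values are illustrative):

    setConf("spark.sql.shuffle.partitions", "10")
    getConf("spark.sql.shuffle.partitions")            // "10"
    getConf("spark.sql.some.unset.key", "fallback")    // "fallback", since the key is not set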
  48. def getHiveFile(path: String): File

  49. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  50. lazy val hiveDevHome: Option[File]

    The location of the hive source code.

  51. val hiveFilesTemp: File

  52. lazy val hiveHome: Option[File]

    The location of the compiled Hive distribution.

  53. val hivePlanner: SparkPlanner with HiveStrategies

    Definition Classes
    HiveContext
  54. val hiveQTestUtilTables: Seq[TestTable]

  55. lazy val hiveconf: HiveConf

    SQLConf and HiveConf contracts:

    1. Reuse the existing started SessionState if any.
    2. When the Hive session is first initialized, params in HiveConf will get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside sql() will be set in the SQLConf *as well as* in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
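
    A sketch of the contract in practice (the property value is illustrative):

    sql("SET spark.sql.shuffle.partitions=5")
    getConf("spark.sql.shuffle.partitions")    // "5", visible through the SQLConf
    // Per the contract above, the property is also set in the underlying HiveConf.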
  56. val inRepoTests: File

  57. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  58. def isCached(tableName: String): Boolean

    Returns true if the table is currently cached in-memory.

    Definition Classes
    CacheManager
  59. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  60. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  61. def jsonFile(path: String, samplingRatio: Double): SchemaRDD

    :: Experimental ::

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  62. def jsonFile(path: String, schema: StructType): SchemaRDD

    :: Experimental :: Loads a JSON file (one object per line) and applies the given schema, returning the result as a SchemaRDD.

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  63. def jsonFile(path: String): SchemaRDD

    Loads a JSON file (one object per line), returning the result as a SchemaRDD. It goes through the entire dataset once to determine the schema.

    Definition Classes
    SQLContext
  64. def jsonRDD(json: RDD[String], samplingRatio: Double): SchemaRDD

    :: Experimental ::

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  65. def jsonRDD(json: RDD[String], schema: StructType): SchemaRDD

    :: Experimental :: Loads an RDD[String] storing JSON objects (one object per record) and applies the given schema, returning the result as a SchemaRDD.

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  66. def jsonRDD(json: RDD[String]): SchemaRDD

    Loads an RDD[String] storing JSON objects (one object per record), returning the result as a SchemaRDD. It goes through the entire dataset once to determine the schema.

    Definition Classes
    SQLContext
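
    A minimal sketch of schema inference over JSON records (the data is illustrative):

    val json = sparkContext.parallelize(
      """{"name":"alice","age":30}""" :: """{"name":"bob","age":25}""" :: Nil)
    val peopleJson = jsonRDD(json)        // scans the dataset once to infer the schema
    peopleJson.printSchema
    peopleJson.registerTempTable("people_json")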
  67. def last(e: Expression): Last

    Definition Classes
    ExpressionConversions
  68. def loadTestTable(name: String): Unit

  69. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  70. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  72. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  73. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  75. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  77. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  78. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  79. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  80. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    :: DeveloperApi :: Allows catalyst LogicalPlans to be executed as a SchemaRDD. Note that the LogicalPlan interface is considered internal, and thus not guaranteed to be stable. As a result, using them directly is not recommended.

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  82. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  83. def lower(e: Expression): Lower

    Definition Classes
    ExpressionConversions
  84. def max(e: Expression): Max

    Definition Classes
    ExpressionConversions
  85. lazy val metastorePath: String

  86. def min(e: Expression): Min

    Definition Classes
    ExpressionConversions
  87. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  88. final def notify(): Unit

    Definition Classes
    AnyRef
  89. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  90. lazy val optimizer: Optimizer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  91. val originalUdfs: Set[String]

    Records the UDFs present when the server starts, so we can delete ones that are created by tests.

    Attributes
    protected
  92. lazy val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  93. def parquetFile(path: String): SchemaRDD

    Loads a Parquet file, returning the result as a SchemaRDD.

    Definition Classes
    SQLContext
  94. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  95. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  96. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Prepares a planned SparkPlan for execution by inserting shuffle operations as needed.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  97. def registerFunction[T](name: String, func: Function22[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  98. def registerFunction[T](name: String, func: Function21[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  99. def registerFunction[T](name: String, func: Function20[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  100. def registerFunction[T](name: String, func: Function19[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  101. def registerFunction[T](name: String, func: Function18[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  102. def registerFunction[T](name: String, func: Function17[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  103. def registerFunction[T](name: String, func: Function16[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  104. def registerFunction[T](name: String, func: Function15[_, _, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  105. def registerFunction[T](name: String, func: Function14[_, _, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  106. def registerFunction[T](name: String, func: Function13[_, _, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  107. def registerFunction[T](name: String, func: Function12[_, _, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  108. def registerFunction[T](name: String, func: Function11[_, _, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  109. def registerFunction[T](name: String, func: Function10[_, _, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  110. def registerFunction[T](name: String, func: Function9[_, _, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  111. def registerFunction[T](name: String, func: Function8[_, _, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  112. def registerFunction[T](name: String, func: Function7[_, _, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  113. def registerFunction[T](name: String, func: Function6[_, _, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  114. def registerFunction[T](name: String, func: Function5[_, _, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  115. def registerFunction[T](name: String, func: Function4[_, _, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  116. def registerFunction[T](name: String, func: Function3[_, _, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  117. def registerFunction[T](name: String, func: Function2[_, _, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    Definition Classes
    UDFRegistration
  118. def registerFunction[T](name: String, func: Function1[_, T])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Unit

    registerFunction 1-22 were generated by this script:

    (1 to 22).map { x =>
      val types = (1 to x).map(x => "_").reduce(_ + ", " + _)
      s"""
      def registerFunction[T: TypeTag](name: String, func: Function$x[$types, T]): Unit = {
        def builder(e: Seq[Expression]) = ScalaUdf(func, ScalaReflection.schemaFor[T].dataType, e)
        functionRegistry.registerFunction(name, builder)
      }
      """
    }

    Definition Classes
    UDFRegistration
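
    A minimal sketch of registering and calling a one-argument UDF (the name and logic are illustrative):

    registerFunction("strLen", (s: String) => s.length)
    sql("SELECT strLen(value) FROM src").collect()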
  119. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Registers the given RDD as a temporary table in the catalog. Temporary tables exist only during the lifetime of this instance of SQLContext.

    Definition Classes
    SQLContext
  120. def registerTestTable(testTable: TestTable): HashMap[String, TestTable]

  121. def reset(): Unit

    Resets the test instance by deleting any tables that have been created. TODO: also clear out UDFs, views, etc.

  122. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Execute the command using Hive and return the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  123. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Definition Classes
    TestHiveContext → HiveContext
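
    A small sketch (the statement is illustrative):

    val tables: Seq[String] = runSqlHive("SHOW TABLES")   // each element is one result row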
  124. lazy val sessionState: SessionState

    SQLConf and HiveConf contracts:

    1. Reuse the existing started SessionState if any.
    2. When the Hive session is first initialized, params in HiveConf will get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside sql() will be set in the SQLConf *as well as* in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  125. def setConf(key: String, value: String): Unit

    Set the given Spark SQL configuration property.

    Definition Classes
    HiveContext → SQLConf
  126. def setConf(props: Properties): Unit

    Set Spark SQL configuration properties.

    Definition Classes
    SQLConf
  127. val settings: Map[String, String]

    Only a low degree of contention is expected for conf, thus NOT using ConcurrentHashMap.

    Attributes
    protected[org.apache.spark]
    Definition Classes
    SQLConf
  128. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  129. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  130. def sql(sqlText: String): SchemaRDD

    Executes a SQL query using Spark, returning the result as a SchemaRDD. The dialect that is used for SQL parsing can be configured with 'spark.sql.dialect'.

    Definition Classes
    HiveContext → SQLContext
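
    For example (a sketch, assuming the src test table registered by this context):

    val result: SchemaRDD = sql("SELECT key, value FROM src WHERE key < 10")
    result.collect().foreach(println)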
  131. val sqlParser: SparkSQLParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  132. def sqrt(e: Expression): Sqrt

    Definition Classes
    ExpressionConversions
  133. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  134. def sum(e: Expression): Sum

    Definition Classes
    ExpressionConversions
  135. def sumDistinct(e: Expression): SumDistinct

    Definition Classes
    ExpressionConversions
  136. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
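
    A sketch of the Symbol-based DSL these conversions enable (the table and columns are illustrative):

    // 'age and 'name are turned into attribute references by these implicits.
    val people = table("people")
    people.where('age >= 18).select('name).collect()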
  137. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  138. def table(tableName: String): SchemaRDD

    Returns the specified table as a SchemaRDD.

    Definition Classes
    SQLContext
  139. lazy val testTables: HashMap[String, TestTable]

    A list of test tables and the DDL required to initialize them. A test table is loaded on demand when a query is run against it.
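
    A sketch of defining and loading a test table (the name, DDL, and use of runSqlHive are illustrative):

    registerTestTable(TestTable("my_test_table",
      () => runSqlHive("CREATE TABLE my_test_table (key INT, value STRING)")))
    loadTestTable("my_test_table")   // runs the table's commands and marks it as loaded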

  140. val testTempDir: File

  141. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  142. def toString(): String

    Definition Classes
    AnyRef → Any
  143. def uncacheTable(tableName: String): Unit

    Removes the specified table from the in-memory cache.

    Definition Classes
    CacheManager
  144. def upper(e: Expression): Upper

    Definition Classes
    ExpressionConversions
  145. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  146. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  147. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  148. lazy val warehousePath: String

Deprecated Value Members

  1. def hiveql(hqlQuery: String): SchemaRDD

    Definition Classes
    HiveContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.1)

  2. def hql(hqlQuery: String): SchemaRDD

    Definition Classes
    HiveContext
    Annotations
    @deprecated
    Deprecated

    (Since version 1.1)
