org.apache.spark.sql.hive.test

TestHive

object TestHive extends TestHiveContext

Linear Supertypes
TestHiveContext, LocalHiveContext, HiveContext, SQLContext, Serializable, Serializable, ExpressionConversions, SQLConf, com.typesafe.scalalogging.slf4j.Logging, AnyRef, Any

Type Members

  1. implicit class DslAttribute extends AnyRef

    Definition Classes
    ExpressionConversions
  2. implicit class DslExpression extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  3. implicit class DslString extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  4. implicit class DslSymbol extends ImplicitAttribute

    Definition Classes
    ExpressionConversions
  5. class HiveQLQueryExecution extends QueryExecution

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    TestHiveContext
  6. abstract class ImplicitAttribute extends ImplicitOperators

    Definition Classes
    ExpressionConversions
  7. abstract class QueryExecution extends TestHiveContext.QueryExecution

    Overrides QueryExecution with a special debug workflow.

  8. class SparkPlanner extends SparkStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  9. implicit class SqlCmd extends AnyRef

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    TestHiveContext
  10. case class TestTable(name: String, commands: () ⇒ Unit*) extends Product with Serializable

    Definition Classes
    TestHiveContext

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. lazy val analyzer: Analyzer

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def avg(e: Expression): Average

    Definition Classes
    ExpressionConversions
  9. implicit def binaryToLiteral(a: Array[Byte]): Literal

    Definition Classes
    ExpressionConversions
  10. implicit def booleanToLiteral(b: Boolean): Literal

    Definition Classes
    ExpressionConversions
  11. implicit def byteToLiteral(b: Byte): Literal

    Definition Classes
    ExpressionConversions
  12. def cacheTable(tableName: String): Unit

    Caches the specified table in-memory.

    Definition Classes
    SQLContext
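
    A minimal sketch of the cache lifecycle, assuming a table named "src" is already registered (as with the standard test tables):

    import org.apache.spark.sql.hive.test.TestHive._

    cacheTable("src")        // subsequent queries read from the in-memory cache
    assert(isCached("src"))
    uncacheTable("src")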
  13. var cacheTables: Boolean

    Definition Classes
    TestHiveContext
  14. lazy val catalog: HiveMetastoreCatalog with OverrideCatalog

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. def configure(): Unit

    Sets up the system initially or after a RESET command.

    Attributes
    protected
    Definition Classes
    LocalHiveContext
  17. def contains(key: String): Boolean

    Definition Classes
    SQLConf
  18. def count(e: Expression): Count

    Definition Classes
    ExpressionConversions
  19. def countDistinct(e: Expression*): CountDistinct

    Definition Classes
    ExpressionConversions
  20. def createParquetFile[A <: Product](path: String, allowExisting: Boolean = true, conf: Configuration = new Configuration())(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    :: Experimental :: Creates an empty parquet file with the schema of class A, which can be registered as a table. This registered table can be used as the target of future insertInto operations.

    val sqlContext = new SQLContext(...)
    import sqlContext._
    
    case class Person(name: String, age: Int)
    createParquetFile[Person]("path/to/file.parquet").registerAsTable("people")
    sql("INSERT INTO people SELECT 'michael', 29")

    A

    A case class type that describes the desired schema of the parquet file to be created.

    path

    The path where the directory containing parquet metadata should be created. Data inserted into this table will also be stored at this location.

    allowExisting

    When false, an exception will be thrown if this directory already exists.

    conf

    A Hadoop configuration object that can be used to specify options to the parquet output format.

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  21. implicit def createSchemaRDD[A <: Product](rdd: RDD[A])(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): SchemaRDD

    Creates a SchemaRDD from an RDD of case classes.

    Definition Classes
    SQLContext
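
    A minimal sketch of the implicit conversion; the Record case class and table name are illustrative:

    import org.apache.spark.sql.hive.test.TestHive._

    case class Record(key: Int, value: String)

    // The implicit createSchemaRDD turns the RDD[Record] into a SchemaRDD,
    // which can then be registered and queried.
    val records = sparkContext.parallelize((1 to 10).map(i => Record(i, s"val_$i")))
    records.registerAsTable("records")
    sql("SELECT key, value FROM records WHERE key > 5").collect()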
  22. def createTable[A <: Product](tableName: String, allowExisting: Boolean = true)(implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[A]): Unit

    Creates a table using the schema of the given class.

    A

    A case class that is used to describe the schema of the table to be created.

    tableName

    The name of the table to create.

    allowExisting

    When false, an exception will be thrown if the table already exists.

    Definition Classes
    HiveContext
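
    A minimal sketch, with a hypothetical Person case class:

    import org.apache.spark.sql.hive.test.TestHive._

    case class Person(name: String, age: Int)

    // Creates a Hive table "people" with columns derived from Person.
    // With allowExisting = true (the default), this is a no-op if the
    // table already exists.
    createTable[Person]("people")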
  23. implicit def decimalToLiteral(d: BigDecimal): Literal

    Definition Classes
    ExpressionConversions
  24. val describedTable: Regex

    Definition Classes
    TestHiveContext
  25. implicit def doubleToLiteral(d: Double): Literal

    Definition Classes
    ExpressionConversions
  26. lazy val emptyResult: RDD[Row]

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  27. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  29. def executePlan(plan: LogicalPlan): QueryExecution

    Definition Classes
    TestHiveContext → HiveContext → SQLContext
  30. def executeSql(sql: String): TestHive.QueryExecution

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  31. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  32. def first(e: Expression): First

    Definition Classes
    ExpressionConversions
  33. implicit def floatToLiteral(f: Float): Literal

    Definition Classes
    ExpressionConversions
  34. def get(key: String, defaultValue: String): String

    Definition Classes
    SQLConf
  35. def get(key: String): String

    Definition Classes
    SQLConf
  36. def getAll: Array[(String, String)]

    Definition Classes
    SQLConf
  37. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  38. def getHiveFile(path: String): File

    Definition Classes
    TestHiveContext
  39. def getOption(key: String): Option[String]

    Definition Classes
    SQLConf
  40. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  41. lazy val hiveDevHome: Option[File]

    The location of the hive source code.

    Definition Classes
    TestHiveContext
  42. val hiveFilesTemp: File

    Definition Classes
    TestHiveContext
  43. lazy val hiveHome: Option[File]

    The location of the compiled hive distribution.

    Definition Classes
    TestHiveContext
  44. val hivePlanner: SparkPlanner with HiveStrategies

    Definition Classes
    HiveContext
  45. val hiveQTestUtilTables: Seq[TestTable]

    Definition Classes
    TestHiveContext
  46. lazy val hiveconf: HiveConf

    SQLConf and HiveConf contracts: when the hive session is first initialized, params in HiveConf will get picked up by the SQLConf. Additionally, any properties set by set() or a SET command inside hql() or sql() will be set in the SQLConf *as well as* in the HiveConf.

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
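
    A sketch of the contract described above; the property values are illustrative:

    import org.apache.spark.sql.hive.test.TestHive._

    // Both routes update the SQLConf *and* the session HiveConf.
    set("spark.sql.shuffle.partitions", "10")
    hql("SET hive.exec.dynamic.partition=true")
    get("spark.sql.shuffle.partitions")   // "10", read back from the SQLConf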
  47. def hiveql(hqlQuery: String): SchemaRDD

    Executes a query expressed in HiveQL using Spark, returning the result as a SchemaRDD.

    Definition Classes
    HiveContext
  48. def hql(hqlQuery: String): SchemaRDD

    An alias for hiveql.

    Definition Classes
    HiveContext
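
    A minimal sketch covering both hiveql and hql, assuming the standard "src" test table has been loaded:

    import org.apache.spark.sql.hive.test.TestHive._

    // hql is shorthand for hiveql; both parse the query with the HiveQL parser.
    val rows = hql("SELECT key, value FROM src WHERE key < 10")
    rows.collect().foreach(println)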
  49. val inRepoTests: File

    Definition Classes
    TestHiveContext
  50. implicit def intToLiteral(i: Int): Literal

    Definition Classes
    ExpressionConversions
  51. def isCached(tableName: String): Boolean

    Returns true if the table is currently cached in-memory.

    Definition Classes
    SQLContext
  52. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  53. def jsonFile(path: String, samplingRatio: Double): SchemaRDD

    :: Experimental ::

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  54. def jsonFile(path: String): SchemaRDD

    Loads a JSON file (one object per line), returning the result as a SchemaRDD. It goes through the entire dataset once to determine the schema.

    Definition Classes
    SQLContext
  55. def jsonRDD(json: RDD[String], samplingRatio: Double): SchemaRDD

    :: Experimental ::

    Definition Classes
    SQLContext
    Annotations
    @Experimental()
  56. def jsonRDD(json: RDD[String]): SchemaRDD

    Loads an RDD[String] storing JSON objects (one object per record), returning the result as a SchemaRDD. It goes through the entire dataset once to determine the schema.

    Definition Classes
    SQLContext
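
    A minimal sketch of JSON schema inference with jsonRDD (jsonFile works the same way on a file path); the records are illustrative:

    import org.apache.spark.sql.hive.test.TestHive._

    val json = sparkContext.parallelize(
      """{"name": "michael", "age": 29}""" :: """{"name": "andy", "age": 30}""" :: Nil)

    // The dataset is scanned once to infer the schema before any query runs.
    val people = jsonRDD(json)
    people.registerAsTable("people")
    sql("SELECT name FROM people WHERE age > 29").collect()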
  57. def loadTestTable(name: String): Unit

    Definition Classes
    TestHiveContext
  58. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    Logging
  59. implicit def logicalPlanToSparkQuery(plan: LogicalPlan): SchemaRDD

    :: DeveloperApi :: Allows catalyst LogicalPlans to be executed as a SchemaRDD. Note that the LogicalPlan interface is considered internal, and thus not guaranteed to be stable. As a result, using them directly is not recommended.

    Definition Classes
    SQLContext
    Annotations
    @DeveloperApi()
  60. implicit def longToLiteral(l: Long): Literal

    Definition Classes
    ExpressionConversions
  61. def lower(e: Expression): Lower

    Definition Classes
    ExpressionConversions
  62. def max(e: Expression): Max

    Definition Classes
    ExpressionConversions
  63. lazy val metastorePath: String

    Definition Classes
    TestHiveContext → LocalHiveContext
  64. def min(e: Expression): Min

    Definition Classes
    ExpressionConversions
  65. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  66. final def notify(): Unit

    Definition Classes
    AnyRef
  67. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  68. val optimizer: Optimizer.type

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  69. val originalUdfs: Set[String]

    Records the UDFs present when the server starts, so we can delete ones that are created by tests.

    Attributes
    protected
    Definition Classes
    TestHiveContext
  70. val outputBuffer: OutputStream { ... /* 4 definitions in type refinement */ }

    Attributes
    protected
    Definition Classes
    HiveContext
  71. def parquetFile(path: String): SchemaRDD

    Loads a Parquet file, returning the result as a SchemaRDD.

    Definition Classes
    SQLContext
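
    A minimal sketch; the path is hypothetical and would normally point at data written earlier (for example via createParquetFile plus inserts):

    import org.apache.spark.sql.hive.test.TestHive._

    val parquet = parquetFile("path/to/file.parquet")
    parquet.registerAsTable("parquetTable")
    sql("SELECT * FROM parquetTable").collect()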
  72. def parseSql(sql: String): LogicalPlan

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  73. val parser: SqlParser

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  74. val planner: SparkPlanner with HiveStrategies

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    HiveContext → SQLContext
  75. val prepareForExecution: RuleExecutor[SparkPlan] { val batches: List[this.Batch] }

    Prepares a planned SparkPlan for execution by binding references to specific ordinals, and inserting shuffle operations as needed.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SQLContext
  76. def registerRDDAsTable(rdd: SchemaRDD, tableName: String): Unit

    Registers the given RDD as a temporary table in the catalog. Temporary tables exist only during the lifetime of this instance of SQLContext.

    Definition Classes
    SQLContext
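
    A minimal sketch; this is the explicit form of what SchemaRDD.registerAsTable does, and the table lives only as long as this context:

    import org.apache.spark.sql.hive.test.TestHive._

    case class Record(key: Int, value: String)

    val rdd = createSchemaRDD(sparkContext.parallelize(Seq(Record(1, "a"))))
    registerRDDAsTable(rdd, "records")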
  77. def registerTestTable(testTable: TestTable): HashMap[String, TestTable]

    Definition Classes
    TestHiveContext
  78. def reset(): Unit

    Resets the test instance by deleting any tables that have been created. TODO: also clear out UDFs, views, etc.

    Definition Classes
    TestHiveContext
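
    A sketch of typical test-harness usage, assuming the standard "src" fixture:

    import org.apache.spark.sql.hive.test.TestHive

    // Wipe tables created by the previous test, then reload a fixture on demand.
    TestHive.reset()
    TestHive.loadTestTable("src")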
  79. def runHive(cmd: String, maxRows: Int = 1000): Seq[String]

    Executes the command using Hive and returns the results as a sequence. Each element in the sequence is one row.

    Attributes
    protected
    Definition Classes
    HiveContext
  80. def runSqlHive(sql: String): Seq[String]

    Runs the specified SQL query using Hive.

    Definition Classes
    TestHiveContext → HiveContext
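
    A minimal sketch; the statement runs inside Hive itself rather than through Spark execution:

    import org.apache.spark.sql.hive.test.TestHive._

    // Each element of the returned sequence is one row of Hive's output.
    val tables: Seq[String] = runSqlHive("SHOW TABLES")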
  81. lazy val sessionState: SessionState

    Attributes
    protected[org.apache.spark.sql.hive]
    Definition Classes
    HiveContext
  82. def set(key: String, value: String): Unit

    Definition Classes
    HiveContext → SQLConf
  83. def set(props: Properties): Unit

    Definition Classes
    SQLConf
  84. implicit def shortToLiteral(s: Short): Literal

    Definition Classes
    ExpressionConversions
  85. val sparkContext: SparkContext

    Definition Classes
    SQLContext
  86. def sql(sqlText: String): SchemaRDD

    Executes a SQL query using Spark, returning the result as a SchemaRDD.

    Definition Classes
    SQLContext
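
    A minimal sketch, assuming the standard "src" test table has been loaded; note that this goes through the Spark SQL parser rather than the HiveQL parser used by hql:

    import org.apache.spark.sql.hive.test.TestHive._

    val result = sql("SELECT key, value FROM src WHERE key < 10")
    result.collect()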
  87. implicit def stringToLiteral(s: String): Literal

    Definition Classes
    ExpressionConversions
  88. def sum(e: Expression): Sum

    Definition Classes
    ExpressionConversions
  89. def sumDistinct(e: Expression): SumDistinct

    Definition Classes
    ExpressionConversions
  90. implicit def symbolToUnresolvedAttribute(s: Symbol): UnresolvedAttribute

    Definition Classes
    ExpressionConversions
  91. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  92. def table(tableName: String): SchemaRDD

    Returns the specified table as a SchemaRDD.

    Definition Classes
    SQLContext
  93. lazy val testTables: HashMap[String, TestTable]

    A list of test tables and the DDL required to initialize them. A test table is loaded on demand when a query is run against it.

    Definition Classes
    TestHiveContext
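
    A sketch of registering a new on-demand test table; the table name and DDL are illustrative:

    import org.apache.spark.sql.hive.test.TestHive._

    // The commands run the first time a query touches the table
    // (or on an explicit loadTestTable call).
    registerTestTable(TestTable("mytable",
      () => runSqlHive("CREATE TABLE mytable (key INT, value STRING)")))
    loadTestTable("mytable")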
  94. implicit def timestampToLiteral(t: Timestamp): Literal

    Definition Classes
    ExpressionConversions
  95. def toDebugString: String

    Definition Classes
    SQLConf
  96. def toString(): String

    Definition Classes
    AnyRef → Any
  97. def uncacheTable(tableName: String): Unit

    Removes the specified table from the in-memory cache.

    Definition Classes
    SQLContext
  98. def upper(e: Expression): Upper

    Definition Classes
    ExpressionConversions
  99. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  100. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  101. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  102. lazy val warehousePath: String

    Definition Classes
    TestHiveContext → LocalHiveContext
