org.apache.spark.sql

package parquet

Type Members

  1. class DefaultSource extends RelationProvider

    Allows creation of Parquet-based tables using the syntax CREATE TEMPORARY TABLE ... USING org.apache.spark.sql.parquet (see the example after this list).

  2. case class InsertIntoParquetTable(relation: ParquetRelation, child: SparkPlan, overwrite: Boolean = false) extends SparkPlan with UnaryNode with SparkHadoopMapReduceUtil with Product with Serializable

    :: DeveloperApi :: Operator that acts as a sink for queries on RDDs and can be used to store the output inside a directory of Parquet files.

  3. case class ParquetRelation2(path: String)(sqlContext: SQLContext) extends CatalystScan with Logging with Product with Serializable

    An alternative to ParquetRelation that plugs in using the data sources API.

  4. case class ParquetTableScan(attributes: Seq[Attribute], relation: ParquetRelation, columnPruningPred: Seq[Expression]) extends SparkPlan with LeafNode with Product with Serializable

    :: DeveloperApi :: Parquet table scan operator.
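
Example

A minimal sketch of how the CREATE TEMPORARY TABLE ... USING syntax handled by DefaultSource might be used from a SQLContext. The object name, table name (events), file path, and query below are illustrative assumptions only; they are not defined by this package.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext

  object ParquetSourceExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(
        new SparkConf().setAppName("ParquetSourceExample").setMaster("local[*]"))
      val sqlContext = new SQLContext(sc)

      // Register a temporary table backed by an existing directory of Parquet files.
      // The table name and path are placeholders, not values defined by this package.
      sqlContext.sql(
        """CREATE TEMPORARY TABLE events
          |USING org.apache.spark.sql.parquet
          |OPTIONS (path "/data/events.parquet")""".stripMargin)

      // Query the registered table like any other SQL table.
      sqlContext.sql("SELECT COUNT(*) FROM events").collect().foreach(println)

      sc.stop()
    }
  }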
