public class DefaultSource extends Object implements RelationProvider, SchemaRelationProvider, CreatableRelationProvider
Allows creation of Parquet-backed tables using the syntax:

    CREATE TEMPORARY TABLE ... USING org.apache.spark.sql.parquet OPTIONS (...)

Supported options include:
- `path`: Required. When reading Parquet files, `path` should point to the location of the Parquet file(s). It can be either a single raw Parquet file or a directory of Parquet files. In the latter case, this data source tries to discover partitioning information if the directory is structured in the same style as Hive partitioned tables. When writing Parquet files, `path` should point to the destination folder.
- `mergeSchema`: Optional. Indicates whether we should merge potentially different (but compatible) schemas stored in all Parquet part-files.
- `partition.defaultName`: Optional. Partition name used when a value of a partition column is null or an empty string. This is similar to the `hive.exec.default.partition.name` configuration in Hive.
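For instance, a temporary table over a partitioned Parquet directory could be declared as follows (the table name and path are illustrative, not from this API):

```sql
CREATE TEMPORARY TABLE events
USING org.apache.spark.sql.parquet
OPTIONS (
  path "/data/events",        -- directory of Parquet part-files
  mergeSchema "true"          -- merge compatible schemas across part-files
)
```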
| Constructor and Description |
|---|
| `DefaultSource()` |
| Modifier and Type | Method and Description |
|---|---|
| `BaseRelation` | `createRelation(SQLContext sqlContext, scala.collection.immutable.Map<String,String> parameters)` Returns a new base relation with the given parameters. |
| `BaseRelation` | `createRelation(SQLContext sqlContext, scala.collection.immutable.Map<String,String> parameters, StructType schema)` Returns a new base relation with the given parameters and schema. |
| `BaseRelation` | `createRelation(SQLContext sqlContext, SaveMode mode, scala.collection.immutable.Map<String,String> parameters, DataFrame data)` Returns a new base relation with the given parameters and saves the given data into it. |
public BaseRelation createRelation(SQLContext sqlContext, scala.collection.immutable.Map<String,String> parameters)

Specified by: `createRelation` in interface `RelationProvider`

public BaseRelation createRelation(SQLContext sqlContext, scala.collection.immutable.Map<String,String> parameters, StructType schema)

Specified by: `createRelation` in interface `SchemaRelationProvider`

public BaseRelation createRelation(SQLContext sqlContext, SaveMode mode, scala.collection.immutable.Map<String,String> parameters, DataFrame data)

Specified by: `createRelation` in interface `CreatableRelationProvider`
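These overloads are not usually called directly: Spark SQL looks up the provider named in the `USING` clause (or passed to the load/save API) and dispatches to the matching overload. A minimal sketch of that flow, assuming a Spark 1.3-era `SQLContext` already exists and the paths are illustrative:

```scala
import org.apache.spark.sql.{SQLContext, SaveMode}

// Assumption: `sqlContext` is an existing SQLContext in a running Spark application.

// Reading: resolves to createRelation(sqlContext, parameters),
// or to the schema-taking overload if a schema is supplied.
val df = sqlContext.load(
  "org.apache.spark.sql.parquet",
  Map("path" -> "/data/events", "mergeSchema" -> "true"))

// Writing: resolves to createRelation(sqlContext, mode, parameters, data),
// which saves the DataFrame's rows to the destination folder.
df.save(
  "org.apache.spark.sql.parquet",
  SaveMode.Append,
  Map("path" -> "/data/events_out"))
```

The provider string is simply the package containing this `DefaultSource`; Spark appends `.DefaultSource` when resolving it.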