trait SchemaRelationProvider extends AnyRef
Implemented by objects that produce relations for a specific kind of data source with a given schema. When Spark SQL is given a DDL operation with a USING clause (to specify the implemented SchemaRelationProvider) and a user-defined schema, this interface is used to pass in the parameters specified by the user.
Users may specify the fully qualified class name of a given data source. When that class is not found, Spark SQL will append the class name DefaultSource to the path, allowing for less verbose invocation. For example, 'org.apache.spark.sql.json' would resolve to the data source 'org.apache.spark.sql.json.DefaultSource'.
A new instance of this class will be instantiated each time a DDL call is made.
The difference between a RelationProvider and a SchemaRelationProvider is that users must provide a schema when using a SchemaRelationProvider. A relation provider can inherit both RelationProvider and SchemaRelationProvider if it can support both schema inference and user-specified schemas.
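As an illustration, a provider supporting both schema inference and user-specified schemas might look like the following sketch. The class and package names are hypothetical, and a real relation would also mix in a scan trait such as TableScan to actually produce rows:

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.sources.{BaseRelation, RelationProvider, SchemaRelationProvider}
import org.apache.spark.sql.types.StructType

// Hypothetical relation that only carries its schema; a real
// implementation would also implement a scan trait such as TableScan.
class MyRelation(val sqlContext: SQLContext, val schema: StructType)
  extends BaseRelation

// Named DefaultSource so that USING "com.example.mysource" resolves here.
class DefaultSource extends RelationProvider with SchemaRelationProvider {

  // Called when the user supplies an explicit schema.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String],
      schema: StructType): BaseRelation =
    new MyRelation(sqlContext, schema)

  // Called when no schema is given; delegate after inferring one.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    createRelation(sqlContext, parameters, inferSchema(parameters))

  // Placeholder for inference, e.g. from files at parameters("path").
  private def inferSchema(parameters: Map[String, String]): StructType =
    new StructType()
}
```

Because a new instance is created for each DDL call, the provider itself should be stateless; any state belongs in the returned BaseRelation.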
- Annotations
- @Stable()
- Source
- interfaces.scala
- Since
1.3.0
Abstract Value Members
- abstract def createRelation(sqlContext: SQLContext, parameters: Map[String, String], schema: StructType): BaseRelation
Returns a new base relation with the given parameters and user-defined schema.
- Note
The keys of parameters are case insensitive, and this insensitivity is enforced by the Map implementation that is passed to the function.
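For context, a user-specified schema reaches createRelation when a data source is loaded as below. The format name, option, and path are hypothetical; the schema and options map are the values Spark SQL forwards to this method:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StringType, StructField, StructType}

val spark = SparkSession.builder().appName("example").getOrCreate()

// The schema given here is passed to SchemaRelationProvider.createRelation,
// along with the options as the (case-insensitive) parameters map.
val schema = StructType(Seq(StructField("name", StringType)))
val df = spark.read
  .format("com.example.mysource") // resolves com.example.mysource.DefaultSource
  .schema(schema)
  .option("path", "/data/input")  // arrives as parameters("path")
  .load()
```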