org.apache.spark.sql.sources
Interface SchemaRelationProvider


public interface SchemaRelationProvider

::DeveloperApi:: Implemented by objects that produce relations for a specific kind of data source with a given schema. When Spark SQL is given a DDL operation with a USING clause (to specify the implemented SchemaRelationProvider) and a user-defined schema, this interface is used to pass in the parameters specified by the user.

Users may specify the fully qualified class name of a given data source. When that class is not found, Spark SQL will append the class name DefaultSource to the path, allowing for less verbose invocation. For example, 'org.apache.spark.sql.json' would resolve to the data source 'org.apache.spark.sql.json.DefaultSource'.

A new instance of this class will be instantiated each time a DDL call is made.

The difference between a RelationProvider and a SchemaRelationProvider is that users need to provide a schema when using a SchemaRelationProvider. A relation provider can inherit both RelationProvider and SchemaRelationProvider if it can support both schema inference and user-specified schemas.
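
For illustration, a minimal provider supporting both schema inference and user-specified schemas might look like the following sketch (the package, class, and relation names are hypothetical; only the createRelation signatures come from this API):

package org.example.mysource

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, RelationProvider, SchemaRelationProvider, TableScan}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Named DefaultSource so that USING 'org.example.mysource' resolves to this class.
class DefaultSource extends RelationProvider with SchemaRelationProvider {

  // Invoked when the DDL statement omits a schema: fall back to a fixed single-column schema.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    createRelation(sqlContext, parameters, StructType(StructField("value", StringType) :: Nil))

  // Invoked when the user supplies a schema in the DDL statement.
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String],
      schema: StructType): BaseRelation =
    new DummyRelation(sqlContext, schema)
}

// Trivial relation that exposes the given schema and returns no rows;
// a real source would read data according to options such as 'path' in buildScan().
class DummyRelation(
    override val sqlContext: SQLContext,
    override val schema: StructType) extends BaseRelation with TableScan {

  override def buildScan(): RDD[Row] = sqlContext.sparkContext.emptyRDD[Row]
}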

Since:
1.3.0

Method Summary
 BaseRelation createRelation(SQLContext sqlContext, scala.collection.immutable.Map<String,String> parameters, StructType schema)
          Returns a new base relation with the given parameters and user-defined schema.
 

Method Detail

createRelation

BaseRelation createRelation(SQLContext sqlContext,
                            scala.collection.immutable.Map<String,String> parameters,
                            StructType schema)
Returns a new base relation with the given parameters and user-defined schema. Note: the parameters' keywords are case-insensitive, and this insensitivity is enforced by the Map that is passed to the function.

Parameters:
sqlContext - the Spark SQL context
parameters - the case-insensitive options specified by the user (for example, in the OPTIONS clause of a DDL statement)
schema - the user-defined schema for the relation
Returns:
a new BaseRelation configured with the given parameters and schema
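
For example, assuming the hypothetical org.example.mysource.DefaultSource sketched above is on the classpath, a user-defined schema and options could reach createRelation through a DDL statement such as:

// Spark SQL passes Map("path" -> "/data/events") as parameters and the
// (name STRING, ts TIMESTAMP) schema as a StructType to createRelation.
sqlContext.sql("""
  CREATE TEMPORARY TABLE events (name STRING, ts TIMESTAMP)
  USING org.example.mysource
  OPTIONS (path '/data/events')
""")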