@Experimental public interface CatalogPlugin
Implementations can provide catalog functions by implementing additional interfaces for tables, views, and functions.
Catalog implementations must implement this marker interface to be loaded by
Catalogs.load(String, SQLConf). The loader will instantiate catalog classes using the
required public no-arg constructor. After creating an instance, it will be configured by calling
initialize(String, CaseInsensitiveStringMap) to pass the catalog's name and options.
Catalog implementations are registered to a name by adding a configuration option to Spark:
spark.sql.catalog.catalog-name=com.example.YourCatalogClass. All configuration properties
in the Spark configuration that share the catalog name prefix,
spark.sql.catalog.catalog-name.(key)=(value), will be passed in the case-insensitive
string map of options at initialization, with the prefix removed. The catalog's
name is also passed; in this example, "catalog-name".
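The prefix-stripping behavior described above can be sketched as plain Java. This is a hypothetical helper mirroring what the loader does, not Spark's actual implementation; the class and method names are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

public class CatalogOptionPrefixDemo {
    // Hypothetical helper mirroring the loader's behavior: collect every
    // property under spark.sql.catalog.<name>. and strip that prefix,
    // yielding the options map passed to initialize(...).
    static Map<String, String> catalogOptions(Map<String, String> conf, String name) {
        String prefix = "spark.sql.catalog." + name + ".";
        Map<String, String> options = new HashMap<>();
        for (Map.Entry<String, String> e : conf.entrySet()) {
            if (e.getKey().startsWith(prefix)) {
                options.put(e.getKey().substring(prefix.length()), e.getValue());
            }
        }
        return options;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        // Registers the catalog class under the name "catalog-name".
        conf.put("spark.sql.catalog.catalog-name", "com.example.YourCatalogClass");
        // A prefixed property; the catalog sees it as "url".
        conf.put("spark.sql.catalog.catalog-name.url", "jdbc:example://host");
        Map<String, String> options = catalogOptions(conf, "catalog-name");
        System.out.println(options.get("url")); // prints jdbc:example://host
    }
}
```

Note that the registration entry itself (no trailing dot after the catalog name) is not included in the options map; only properties under the dotted prefix are.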
void initialize(String name, CaseInsensitiveStringMap options)
This method is called once, just after the provider is instantiated.
name - the name used to identify and load this catalog
options - a case-insensitive string map of configuration properties for this catalog
String name()
Called to get this catalog's name. This method is only called after
initialize(String, CaseInsensitiveStringMap) is
called to pass the catalog's name.
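A minimal implementation of the contract above might look like the following sketch. It substitutes a plain Map for Spark's CaseInsensitiveStringMap (and declares a local stand-in for the interface) so the example compiles on its own; in a real project you would implement org.apache.spark.sql.connector.catalog.CatalogPlugin directly, and ExampleCatalog is a hypothetical name.

```java
import java.util.Map;

// Stand-in for Spark's CatalogPlugin so this sketch is self-contained;
// the real interface uses CaseInsensitiveStringMap instead of Map.
interface CatalogPlugin {
    void initialize(String name, Map<String, String> options);
    String name();
}

// A minimal catalog that records the name and options it was given.
class ExampleCatalog implements CatalogPlugin {
    private String name;
    private Map<String, String> options;

    // Required public no-arg constructor: the loader uses it to instantiate
    // the class named by spark.sql.catalog.catalog-name.
    public ExampleCatalog() {}

    @Override
    public void initialize(String name, Map<String, String> options) {
        this.name = name;       // "catalog-name" from the registration key
        this.options = options; // prefix-stripped configuration properties
    }

    @Override
    public String name() {
        // Only meaningful after initialize(...) has been called.
        return this.name;
    }
}
```

The loader, not user code, drives this lifecycle: construct via the no-arg constructor, then call initialize once, after which name() becomes valid.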