trait DriverPlugin extends AnyRef
:: DeveloperApi ::
Driver component of a SparkPlugin.
- Annotations
- @DeveloperApi()
- Source
- DriverPlugin.java
- Since
3.0.0
Inheritance
- DriverPlugin
- AnyRef
- Any
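As orientation before the member listing, here is a minimal sketch (class names hypothetical) of how a driver component is supplied to Spark: a SparkPlugin implementation returns it from driverPlugin(), and the plugin is enabled through the spark.plugins configuration.

```scala
import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, SparkPlugin}

// Hypothetical wiring: enabled with spark.plugins=com.example.MySparkPlugin.
class MySparkPlugin extends SparkPlugin {
  // The driver component; a sketch of MyDriverPlugin appears under init below.
  override def driverPlugin(): DriverPlugin = new MyDriverPlugin()

  // No executor component in this sketch.
  override def executorPlugin(): ExecutorPlugin = null
}
```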
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def init(sc: SparkContext, pluginContext: PluginContext): Map[String, String]
Initialize the plugin.
This method is called early in the initialization of the Spark driver. Specifically, it is called before the Spark driver's task scheduler is initialized, so many other Spark subsystems may not have been initialized yet. This call also blocks driver initialization.
It's recommended that plugins be careful about what operations are performed in this call, preferably performing expensive operations in a separate thread or postponing them until the application has fully started (see the sketch after this entry).
- sc
The SparkContext loading the plugin.
- pluginContext
Additional plugin-specific information about the Spark application where the plugin is running.
- returns
A map that will be provided to the ExecutorPlugin#init(PluginContext, Map) method.
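A minimal sketch of an implementation, assuming a hypothetical MyDriverPlugin class and configuration key: expensive setup moves to a background thread so the call returns quickly, and the returned map is what ExecutorPlugin#init receives on each executor.

```scala
import java.util.{Collections, Map => JMap}

import org.apache.spark.SparkContext
import org.apache.spark.api.plugin.{DriverPlugin, PluginContext}

class MyDriverPlugin extends DriverPlugin {

  override def init(sc: SparkContext, pluginContext: PluginContext): JMap[String, String] = {
    // Defer expensive work to a separate thread, since this call
    // blocks driver initialization.
    new Thread(() => expensiveSetup(), "my-plugin-init").start()

    // Delivered to ExecutorPlugin#init on every executor; the key is hypothetical.
    Collections.singletonMap("my.plugin.mode", "fast")
  }

  // Placeholder for expensive, plugin-specific initialization.
  private def expensiveSetup(): Unit = ()
}
```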
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- def receive(message: AnyRef): AnyRef
RPC message handler.
Plugins can use Spark's RPC system to send messages from executors to the driver (but not the other way around, currently). Messages sent by the executor component of the plugin will be delivered to this method, and the returned value will be sent back to the executor as the reply, if the executor has requested one.
Any exception thrown will be sent back to the executor as an error, in case it is expecting a reply. In case a reply is not expected, a log message will be written to the driver log.
The implementation of this handler should be thread-safe.
Note that all plugins share RPC dispatch threads and this method is called synchronously, so performing expensive operations in this handler may affect the operation of other active plugins. Internal Spark endpoints are not directly affected, though, since they use different threads.
Spark guarantees that the driver component will be ready to receive messages through this handler when executors are started (see the sketch after this entry).
- message
The incoming message.
- returns
Value to be returned to the caller. Ignored if the caller does not expect a reply.
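A sketch of a thread-safe handler, assuming a hypothetical message protocol in which executors send plain strings and expect a counter value back:

```scala
import java.util.concurrent.atomic.AtomicLong

import org.apache.spark.api.plugin.DriverPlugin

class CountingDriverPlugin extends DriverPlugin {
  // An AtomicLong keeps the handler thread-safe without locking.
  private val counter = new AtomicLong(0L)

  override def receive(message: AnyRef): AnyRef = message match {
    case "increment" => java.lang.Long.valueOf(counter.incrementAndGet())
    case "get"       => java.lang.Long.valueOf(counter.get())
    case other =>
      // Sent back to the executor as an error if it expects a reply;
      // otherwise logged on the driver.
      throw new IllegalArgumentException(s"Unexpected message: $other")
  }
}
```

On the executor side, the plugin component would send such messages with PluginContext#send or PluginContext#ask.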
- def registerMetrics(appId: String, pluginContext: PluginContext): Unit
Register metrics published by the plugin with Spark's metrics system.
This method is called later in the initialization of the Spark application, after most subsystems are up and the application ID is known. If there are metrics registered in the registry (PluginContext#metricRegistry()), then a metrics source with the plugin name will be created. Note that even though the metric registry is still accessible afterwards, registering new metrics after this method is called may result in the metrics not being available (see the sketch after this entry).
- appId
The application ID from the cluster manager.
- pluginContext
Additional plugin-specific information about the Spark application where the plugin is running.
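A sketch, assuming a hypothetical gauge over plugin state; everything is registered inside this call, since later registrations may not become visible:

```scala
import com.codahale.metrics.Gauge

import org.apache.spark.api.plugin.{DriverPlugin, PluginContext}

class GaugeDriverPlugin extends DriverPlugin {
  // Hypothetical plugin state exposed as a metric.
  @volatile private var itemsProcessed: Long = 0L

  override def registerMetrics(appId: String, pluginContext: PluginContext): Unit = {
    // Register all metrics up front; metrics added after this call
    // may not appear in Spark's metrics system.
    pluginContext.metricRegistry().register("itemsProcessed", new Gauge[Long] {
      override def getValue: Long = itemsProcessed
    })
  }
}
```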
- def shutdown(): Unit
Informs the plugin that the Spark application is shutting down.
This method is called during the driver shutdown phase. It is recommended that plugins not use any Spark functions (e.g. sending RPC messages) during this call.
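A sketch that releases plugin-owned resources without calling back into Spark, assuming a hypothetical thread pool owned by the plugin:

```scala
import java.util.concurrent.{ExecutorService, Executors, TimeUnit}

import org.apache.spark.api.plugin.DriverPlugin

class PooledDriverPlugin extends DriverPlugin {
  // Hypothetical resource created and owned by this plugin.
  private val pool: ExecutorService = Executors.newSingleThreadExecutor()

  override def shutdown(): Unit = {
    // Release only plugin-owned resources; avoid Spark functionality
    // (e.g. RPC messages) since the driver is shutting down.
    pool.shutdown()
    pool.awaitTermination(5, TimeUnit.SECONDS)
  }
}
```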
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)