trait DriverPlugin extends AnyRef

Driver component of a SparkPlugin.

Annotations
@DeveloperApi()
Source
DriverPlugin.java
Since

3.0.0

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  8. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  9. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  10. def init(sc: SparkContext, pluginContext: PluginContext): Map[String, String]

    Initialize the plugin.

    This method is called early in the initialization of the Spark driver. Specifically, it is called before the Spark driver's task scheduler is initialized, which means that many other Spark subsystems may not have been initialized yet. This call also blocks driver initialization.

    It is recommended that plugins be careful about what operations are performed in this call, preferably performing expensive operations in a separate thread, or postponing them until the application has fully started.

    sc

    The SparkContext loading the plugin.

    pluginContext

    Additional plugin-specific information about the Spark application where the plugin is running.

    returns

    A map that will be provided to the ExecutorPlugin#init(PluginContext,Map) method.
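    A minimal sketch of an init implementation following the advice above: it keeps the blocking call fast by deferring expensive work to a background thread, and returns configuration for the executor side. It assumes Spark is on the classpath; the class name, thread name, and config key are hypothetical.

    ```scala
    import java.util.{Map => JMap}
    import org.apache.spark.SparkContext
    import org.apache.spark.api.plugin.{DriverPlugin, PluginContext}

    class ExampleDriverPlugin extends DriverPlugin {
      override def init(sc: SparkContext, ctx: PluginContext): JMap[String, String] = {
        // Keep this method fast: it blocks driver initialization.
        // Expensive setup (cache warm-up, remote connections, ...) runs elsewhere.
        val warmup = new Thread(() => {
          // hypothetical expensive setup goes here
        }, "example-plugin-init")
        warmup.setDaemon(true)
        warmup.start()

        // The returned map is passed to ExecutorPlugin#init on each executor.
        java.util.Collections.singletonMap("example.plugin.mode", "enabled")
      }
    }
    ```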

  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  15. def receive(message: AnyRef): AnyRef

    RPC message handler.

    Plugins can use Spark's RPC system to send messages from executors to the driver (but not, currently, the other way around). Messages sent by the executor component of the plugin will be delivered to this method, and the returned value will be sent back to the executor as the reply, if the executor has requested one.

    Any exception thrown will be sent back to the executor as an error, in case it is expecting a reply. In case a reply is not expected, a log message will be written to the driver log.

    The implementation of this handler should be thread-safe.

    Note that all plugins share RPC dispatch threads, and this method is called synchronously. Performing expensive operations in this handler may therefore affect the operation of other active plugins. Internal Spark endpoints are not directly affected, though, since they use different threads.

    Spark guarantees that the driver component will be ready to receive messages through this handler when executors are started.

    message

    The incoming message.

    returns

    Value to be returned to the caller. Ignored if the caller does not expect a reply.
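    A minimal sketch of a thread-safe handler under these constraints: it does only cheap work (so shared RPC dispatch threads are not held up) and uses an atomic counter because the handler may be invoked concurrently. The class name and "ping" message protocol are hypothetical.

    ```scala
    import java.util.concurrent.atomic.AtomicLong
    import org.apache.spark.api.plugin.DriverPlugin

    class CountingDriverPlugin extends DriverPlugin {
      // Thread-safe state: receive() may be called from multiple dispatch threads.
      private val pings = new AtomicLong(0)

      override def receive(message: AnyRef): AnyRef = message match {
        // Reply quickly; expensive work here would delay other plugins' RPC traffic.
        case "ping" => java.lang.Long.valueOf(pings.incrementAndGet())
        // Thrown exceptions are delivered back to the executor as an error.
        case other  => throw new IllegalArgumentException(s"Unexpected message: $other")
      }
    }
    ```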

  16. def registerMetrics(appId: String, pluginContext: PluginContext): Unit

    Register metrics published by the plugin with Spark's metrics system.

    This method is called later in the initialization of the Spark application, after most subsystems are up and the application ID is known. If there are metrics registered in the registry (PluginContext#metricRegistry()), then a metrics source with the plugin name will be created.

    Note that even though the metric registry is still accessible after this method is called, metrics registered afterwards may not be available.

    appId

    The application ID from the cluster manager.

    pluginContext

    Additional plugin-specific information about the Spark application where the plugin is running.
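    A minimal sketch of registering a metric at the right time, i.e. inside this callback rather than afterwards, since later registrations may not be picked up. It assumes Spark and the Dropwizard metrics library on the classpath; the class, gauge name, and gauge value are hypothetical.

    ```scala
    import com.codahale.metrics.Gauge
    import org.apache.spark.api.plugin.{DriverPlugin, PluginContext}

    class MetricsDriverPlugin extends DriverPlugin {
      @volatile private var currentAppId: String = _

      override def registerMetrics(appId: String, ctx: PluginContext): Unit = {
        currentAppId = appId
        // Register here: Spark creates a metrics source named after the plugin
        // from whatever is in the registry at this point.
        ctx.metricRegistry().register("exampleGauge", new Gauge[Long] {
          override def getValue: Long = 42L // hypothetical placeholder value
        })
      }
    }
    ```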

  17. def shutdown(): Unit

    Informs the plugin that the Spark application is shutting down.

    This method is called during the driver shutdown phase. It is recommended that plugins not use any Spark functions (e.g. sending RPC messages) during this call.
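    A minimal sketch of a shutdown implementation respecting that advice: it releases only local resources and makes no Spark calls. The class name and the thread pool being cleaned up are hypothetical.

    ```scala
    import java.util.concurrent.Executors
    import org.apache.spark.api.plugin.DriverPlugin

    class CleanupDriverPlugin extends DriverPlugin {
      // Hypothetical local resource owned by the plugin.
      private val executor = Executors.newSingleThreadExecutor()

      override def shutdown(): Unit = {
        // Local cleanup only; avoid Spark functionality (e.g. RPC) here,
        // since the driver is already shutting down.
        executor.shutdownNow()
      }
    }
    ```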

  18. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  19. def toString(): String
    Definition Classes
    AnyRef → Any
  20. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  21. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  22. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
