abstract class Receiver[T] extends Serializable
:: DeveloperApi ::
Abstract class of a receiver that can be run on worker nodes to receive external data. A custom receiver can be defined by defining the functions onStart() and onStop(). onStart() should define the setup steps necessary to start receiving data, and onStop() should define the cleanup steps necessary to stop receiving data. Exceptions while receiving can be handled either by restarting the receiver with restart(...) or by stopping it completely with stop(...).
A custom receiver in Scala would look like this.
```scala
class MyReceiver(storageLevel: StorageLevel) extends Receiver[String](storageLevel) {
  def onStart() {
    // Setup stuff (start threads, open sockets, etc.) to start receiving data.
    // Must start new thread to receive data, as onStart() must be non-blocking.

    // Call store(...) in those threads to store received data into Spark's memory.

    // Call stop(...), restart(...) or reportError(...) on any thread based on how
    // different errors need to be handled.
    // See corresponding method documentation for more details
  }

  def onStop() {
    // Cleanup stuff (stop threads, close sockets, etc.) to stop receiving data.
  }
}
```
A custom receiver in Java would look like this.
```java
class MyReceiver extends Receiver<String> {
  public MyReceiver(StorageLevel storageLevel) {
    super(storageLevel);
  }

  public void onStart() {
    // Setup stuff (start threads, open sockets, etc.) to start receiving data.
    // Must start new thread to receive data, as onStart() must be non-blocking.

    // Call store(...) in those threads to store received data into Spark's memory.

    // Call stop(...), restart(...) or reportError(...) on any thread based on how
    // different errors need to be handled.
    // See corresponding method documentation for more details
  }

  public void onStop() {
    // Cleanup stuff (stop threads, close sockets, etc.) to stop receiving data.
  }
}
```
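For context, a receiver like the ones above is typically plugged into a streaming application via StreamingContext.receiverStream. A minimal sketch (the object name and master setting are illustrative; assumes the MyReceiver class from the Scala example and a Spark Streaming dependency on the classpath):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

object CustomReceiverDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("CustomReceiverDemo").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Create an input DStream backed by the custom receiver defined above.
    val lines = ssc.receiverStream(new MyReceiver(StorageLevel.MEMORY_AND_DISK_2))
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that at least two local threads (`local[2]`) are needed: one to run the receiver and one to process the received data.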
- Annotations
- @DeveloperApi()
- Source
- Receiver.scala
Instance Constructors
- new Receiver(storageLevel: StorageLevel)
Abstract Value Members
- abstract def onStart(): Unit
This method is called by the system when the receiver is started. This function must initialize all resources (threads, buffers, etc.) necessary for receiving data. This function must be non-blocking, so receiving the data must occur on a different thread. Received data can be stored with Spark by calling store(data).
If there are errors in threads started here, the following options are available: (i) reportError(...) can be called to report the error to the driver; the receiving of data will continue uninterrupted. (ii) stop(...) can be called to stop receiving data; this will call onStop() to clear up all resources allocated (threads, buffers, etc.) during onStart(). (iii) restart(...) can be called to restart the receiver; this will call onStop() immediately, and then onStart() after a delay.
- abstract def onStop(): Unit
This method is called by the system when the receiver is stopped. All resources (threads, buffers, etc.) set up in onStart() must be cleaned up in this method.
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @IntrinsicCandidate() @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def isStarted(): Boolean
Check if the receiver has started or not.
- def isStopped(): Boolean
Check if the receiver has been marked for stopping. Use this to identify when the receiving of data should be stopped.
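As a sketch of how isStopped() is typically used, the worker thread started in onStart() loops until the receiver is marked for stopping (the socket-based source, and the host and port fields, are illustrative; assumes this method lives inside a Receiver[String] subclass):

```scala
// Hypothetical worker-thread body for a socket-based receiver.
private def receive(): Unit = {
  var socket: java.net.Socket = null
  try {
    socket = new java.net.Socket(host, port)  // host/port: illustrative fields
    val reader = new java.io.BufferedReader(
      new java.io.InputStreamReader(
        socket.getInputStream, java.nio.charset.StandardCharsets.UTF_8))
    var line = reader.readLine()
    while (!isStopped && line != null) {
      store(line)               // hand each record to Spark
      line = reader.readLine()
    }
    // The stream ended on its own: schedule a restart so the receiver reconnects.
    restart("Trying to connect again")
  } catch {
    case e: java.io.IOException => restart("Error receiving data", e)
  } finally {
    if (socket != null) socket.close()
  }
}
```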
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @IntrinsicCandidate() @native()
- def preferredLocation: Option[String]
Override this to specify a preferred location (hostname).
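A sketch of overriding preferredLocation in a custom receiver (the constructor parameters are illustrative):

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

class MyReceiver(host: String, storageLevel: StorageLevel)
  extends Receiver[String](storageLevel) {

  // Ask the scheduler to run this receiver on the executor co-located
  // with the data source, if one is available on that host.
  override def preferredLocation: Option[String] = Some(host)

  def onStart(): Unit = { /* start receiving threads */ }
  def onStop(): Unit = { /* release resources */ }
}
```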
- def reportError(message: String, throwable: Throwable): Unit
Report exceptions in receiving data.
- def restart(message: String, error: Throwable, millisecond: Int): Unit
Restart the receiver. This method schedules the restart and returns immediately. The stopping and subsequent starting of the receiver (by calling onStop() and onStart()) is performed asynchronously in a background thread.
- def restart(message: String, error: Throwable): Unit
Restart the receiver. This method schedules the restart and returns immediately. The stopping and subsequent starting of the receiver (by calling onStop() and onStart()) is performed asynchronously in a background thread. The delay between the stopping and the starting is defined by the Spark configuration spark.streaming.receiverRestartDelay. The message and exception will be reported to the driver.
- def restart(message: String): Unit
Restart the receiver. This method schedules the restart and returns immediately. The stopping and subsequent starting of the receiver (by calling onStop() and onStart()) is performed asynchronously in a background thread. The delay between the stopping and the starting is defined by the Spark configuration spark.streaming.receiverRestartDelay. The message will be reported to the driver.
- def stop(message: String, error: Throwable): Unit
Stop the receiver completely due to an exception.
- def stop(message: String): Unit
Stop the receiver completely.
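To illustrate the difference between the restart(...), stop(...) and reportError(...) families, a sketch of error handling inside a receiving thread (receiveData and the exception types are chosen for illustration; assumes this runs inside a Receiver subclass):

```scala
try {
  receiveData()  // hypothetical method doing the blocking reads
} catch {
  case e: java.net.SocketTimeoutException =>
    // Transient problem: schedule an asynchronous stop-then-start.
    restart("Timed out reading from source", e)
  case e: SecurityException =>
    // Unrecoverable problem: stop the receiver for good (onStop() will run).
    stop("Not authorized to read from source", e)
  case scala.util.control.NonFatal(e) =>
    // Keep receiving, but surface the error to the driver.
    reportError("Non-fatal error while receiving", e)
}
```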
- val storageLevel: StorageLevel
- def store(bytes: ByteBuffer, metadata: Any): Unit
Store the bytes of received data as a data block into Spark's memory. The metadata will be associated with this block of data for being used in the corresponding InputDStream.
- def store(bytes: ByteBuffer): Unit
Store the bytes of received data as a data block into Spark's memory. Note that the data in the ByteBuffer must be serialized using the same serializer that Spark is configured to use.
- def store(dataIterator: Iterator[T], metadata: Any): Unit
Store an iterator of received data as a data block into Spark's memory. The metadata will be associated with this block of data for being used in the corresponding InputDStream.
- def store(dataIterator: Iterator[T]): Unit
Store an iterator of received data as a data block into Spark's memory.
- def store(dataIterator: java.util.Iterator[T], metadata: Any): Unit
Store an iterator of received data as a data block into Spark's memory. The metadata will be associated with this block of data for being used in the corresponding InputDStream.
- def store(dataIterator: java.util.Iterator[T]): Unit
Store an iterator of received data as a data block into Spark's memory.
- def store(dataBuffer: ArrayBuffer[T], metadata: Any): Unit
Store an ArrayBuffer of received data as a data block into Spark's memory. The metadata will be associated with this block of data for being used in the corresponding InputDStream.
- def store(dataBuffer: ArrayBuffer[T]): Unit
Store an ArrayBuffer of received data as a data block into Spark's memory.
- def store(dataItem: T): Unit
Store a single item of received data to Spark's memory. These single items will be aggregated together into data blocks before being pushed into Spark's memory.
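A sketch contrasting the store(...) variants (fetchBatch and the metadata value are illustrative; larger units give the receiver control over block boundaries, whereas single items are batched into blocks by Spark itself):

```scala
// Inside the receiver's worker thread:
store("one record")                          // single item: aggregated into blocks by Spark

val batch: Iterator[String] = fetchBatch()   // hypothetical read from the source
store(batch)                                 // a whole block at once

import scala.collection.mutable.ArrayBuffer
val buf = ArrayBuffer("a", "b", "c")
store(buf, Map("source" -> "demo"))          // block plus metadata for the InputDStream
```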
- def streamId: Int
Get the unique identifier of the receiver input stream that this receiver is associated with.
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)