An input stream that reads blocks of serialized objects from a given network address.
The blocks are inserted directly into the block store. This is the fastest way to get
data into Spark Streaming, though it requires the sender to batch data and serialize it
in the serialization format that the system is configured with.
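Because the receiver stores each incoming block as-is, the sender must do the batching and framing itself. The sketch below assumes the wire format is a 4-byte big-endian length prefix followed by one serialized block (an assumption for illustration; check the receiver implementation for the exact framing), and uses plain Java serialization, which is illustrative only since the real block encoding must match the serializer Spark is configured with:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sender-side helper (not part of the Spark API).
public class RawSender {
    // Serialize one batch of records into a single block and prepend a
    // 4-byte big-endian length, so a raw-socket receiver can read it
    // frame by frame. Java serialization here is illustrative only.
    public static byte[] frame(List<String> batch) throws IOException {
        ByteArrayOutputStream body = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(body)) {
            oos.writeObject(new ArrayList<>(batch));
        }
        byte[] block = body.toByteArray();

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(out);
        dos.writeInt(block.length); // big-endian length prefix
        dos.write(block);           // the opaque serialized block
        dos.flush();
        return out.toByteArray();
    }
}
```

In production the framed bytes would be written to the TCP socket that the input stream connects to, one frame per batch.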
Modifier and Type: Receiver<T>
Method and Description: getReceiver()
Gets the receiver object that will be sent to the worker nodes to receive data.
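On the worker side, the receiver's job is the mirror image of the sender's framing: read a length prefix, then read exactly that many bytes as one opaque block and hand it to the block store without deserializing it. A minimal, self-contained sketch of that read loop (hypothetical helper, assuming the 4-byte big-endian length-prefix framing described above):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a raw-socket receiver's read loop
// (not part of the Spark API).
public class RawFrameReader {
    public static List<byte[]> readFrames(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        List<byte[]> frames = new ArrayList<>();
        try {
            while (true) {
                int len = din.readInt();   // blocking read of the length prefix
                byte[] block = new byte[len];
                din.readFully(block);      // read exactly one serialized block
                frames.add(block);         // would be stored as-is in the block store
            }
        } catch (EOFException eof) {
            // sender closed the connection; no more frames
        }
        return frames;
    }
}
```

Note that the block bytes are never deserialized here; that is what makes this path fast and why the sender's serialization format must match Spark's configuration.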
Methods inherited from class org.apache.spark.streaming.dstream.DStream:
cache, checkpoint, checkpointData, checkpointDuration, clearCheckpointData, context, count, countByValue, countByValueAndWindow, countByWindow, creationSite, filter, flatMap, foreach, foreach, foreachRDD, foreachRDD, generatedRDDs, generateJob, getCreationSite, getOrCompute, glom, graph, initialize, isInitialized, map, mapPartitions, mustCheckpoint, parentRememberDuration, persist, persist, print, reduce, reduceByWindow, reduceByWindow, register, remember, rememberDuration, repartition, restoreCheckpointData, saveAsObjectFiles, saveAsTextFiles, setContext, setGraph, slice, slice, ssc, storageLevel, transform, transform, transformWith, transformWith, union, updateCheckpointData, validate, window, window, zeroTime
Methods inherited from class Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.Logging:
initializeIfNecessary, initializeLogging, isTraceEnabled, log_, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning
Constructor:
RawInputDStream(StreamingContext ssc_, String host, int port, StorageLevel storageLevel, scala.reflect.ClassTag<T> evidence$1)