
org.apache.spark.serializer

DummySerializerInstance

final class DummySerializerInstance extends SerializerInstance

Unfortunately, we need a serializer instance in order to construct a DiskBlockObjectWriter. Our shuffle write path doesn't actually use this serializer (since we end up calling the OutputStream write() methods directly), but DiskBlockObjectWriter still calls some methods on it. To work around this, we pass a dummy no-op serializer.
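The workaround described above can be sketched as follows. Note this is a simplified, hypothetical stand-in interface (`MiniSerializerInstance`), not Spark's actual `SerializerInstance` API; the point is only to illustrate the pattern of a placeholder serializer whose methods fail loudly because the write path is expected to bypass them:

```java
import java.nio.ByteBuffer;

// Hypothetical, simplified stand-in for Spark's SerializerInstance.
interface MiniSerializerInstance {
    <T> ByteBuffer serialize(T t);
    <T> T deserialize(ByteBuffer bytes);
}

// A no-op serializer: every method throws, because the shuffle write
// path is expected to write raw bytes through the OutputStream and
// never actually invoke serialization.
final class MiniDummySerializerInstance implements MiniSerializerInstance {
    @Override
    public <T> ByteBuffer serialize(T t) {
        throw new UnsupportedOperationException("serialize() should never be called");
    }

    @Override
    public <T> T deserialize(ByteBuffer bytes) {
        throw new UnsupportedOperationException("deserialize() should never be called");
    }
}

public class Demo {
    public static void main(String[] args) {
        MiniSerializerInstance ser = new MiniDummySerializerInstance();
        boolean threw = false;
        try {
            ser.serialize("payload"); // must never be reached in the real write path
        } catch (UnsupportedOperationException e) {
            threw = true;
        }
        System.out.println("serialize threw: " + threw);
    }
}
```

The design choice here is to fail fast: if any code path unexpectedly does try to serialize through the dummy instance, it surfaces immediately as an exception rather than silently corrupting shuffle output.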

Annotations
@Private()
Source
DummySerializerInstance.java
Linear Supertypes
SerializerInstance, AnyRef, Any

Abstract Value Members

  1. abstract def deserialize[T](bytes: ByteBuffer, loader: ClassLoader)(implicit arg0: ClassTag[T]): T
    Definition Classes
    SerializerInstance
  2. abstract def deserialize[T](bytes: ByteBuffer)(implicit arg0: ClassTag[T]): T
    Definition Classes
    SerializerInstance
  3. abstract def serialize[T](t: T)(implicit arg0: ClassTag[T]): ByteBuffer
    Definition Classes
    SerializerInstance

Concrete Value Members

  1. def deserialize[T](bytes: ByteBuffer, ev1: ClassTag[T]): T
    Annotations
    @Override()
  2. def deserialize[T](bytes: ByteBuffer, loader: ClassLoader, ev1: ClassTag[T]): T
    Annotations
    @Override()
  3. def deserializeStream(s: InputStream): DeserializationStream
    Definition Classes
    DummySerializerInstance → SerializerInstance
    Annotations
    @Override()
  4. def serialize[T](t: T, ev1: ClassTag[T]): ByteBuffer
    Annotations
    @Override()
  5. def serializeStream(s: OutputStream): SerializationStream
    Definition Classes
    DummySerializerInstance → SerializerInstance
    Annotations
    @Override()