package io
IO codecs used for compression. See org.apache.spark.io.CompressionCodec.
- Source
- package.scala
Type Members
- trait CompressionCodec extends AnyRef
:: DeveloperApi :: CompressionCodec allows customization of the compression implementation used in block storage.
- Annotations
- @DeveloperApi()
- Note
The wire protocol for a codec is not guaranteed to be compatible across versions of Spark. This is intended for use as an internal compression utility within a single Spark application.
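Spark instantiates the configured codec class reflectively from the spark.io.compression.codec setting, so a custom implementation needs only the trait's two stream-factory methods plus, like the built-in codecs, a constructor taking a SparkConf. A minimal sketch, assuming those two abstract methods; the Deflate-backed codec here is hypothetical:

import java.io.{InputStream, OutputStream}
import java.util.zip.{DeflaterOutputStream, InflaterInputStream}
import org.apache.spark.SparkConf
import org.apache.spark.io.CompressionCodec

// Hypothetical Deflate-based codec. The SparkConf constructor parameter
// mirrors the shape of Spark's built-in codec classes.
class DeflateCompressionCodec(conf: SparkConf) extends CompressionCodec {
  override def compressedOutputStream(s: OutputStream): OutputStream =
    new DeflaterOutputStream(s)
  override def compressedInputStream(s: InputStream): InputStream =
    new InflaterInputStream(s)
}

It could then be selected by setting spark.io.compression.codec to the codec's fully qualified class name.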
- class LZ4CompressionCodec extends CompressionCodec
:: DeveloperApi :: LZ4 implementation of org.apache.spark.io.CompressionCodec. Block size can be configured by spark.io.compression.lz4.blockSize.
- Annotations
- @DeveloperApi()
- Note
The wire protocol for this codec is not guaranteed to be compatible across versions of Spark. This is intended for use as an internal compression utility within a single Spark application.
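A minimal sketch of selecting LZ4 and tuning its block size; the 512k value is an arbitrary illustration:

import org.apache.spark.SparkConf

// Choose the LZ4 codec and raise its block size from the 32k default.
val conf = new SparkConf()
  .set("spark.io.compression.codec", "lz4")
  .set("spark.io.compression.lz4.blockSize", "512k")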
- class LZFCompressionCodec extends CompressionCodec
:: DeveloperApi :: LZF implementation of org.apache.spark.io.CompressionCodec.
- Annotations
- @DeveloperApi()
- Note
The wire protocol for this codec is not guaranteed to be compatible across versions of Spark. This is intended for use as an internal compression utility within a single Spark application.
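A codec can also be used directly to wrap arbitrary streams. A hedged round-trip sketch, assuming the class takes a SparkConf constructor argument like the other codecs:

import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import org.apache.spark.SparkConf
import org.apache.spark.io.LZFCompressionCodec

val codec = new LZFCompressionCodec(new SparkConf())
// Compress a payload into a byte array ...
val bos = new ByteArrayOutputStream()
val out = codec.compressedOutputStream(bos)
out.write("hello spark".getBytes("UTF-8"))
out.close()
// ... and read it back through the matching input stream.
val in = codec.compressedInputStream(new ByteArrayInputStream(bos.toByteArray))
val restored = new String(in.readAllBytes(), "UTF-8") // readAllBytes requires Java 9+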
- final class NioBufferedFileInputStream extends InputStream
InputStream implementation which uses a direct buffer to read a file, avoiding the extra copy of data between Java and native memory that occurs when using java.io.BufferedInputStream. Unfortunately, this is not something already available in the JDK: sun.nio.ch.ChannelInputStream supports reading a file using NIO, but does not support buffering.
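A hedged usage sketch, assuming the (file, bufferSizeInBytes) constructor; the path and buffer size are illustrative:

import java.io.File
import org.apache.spark.io.NioBufferedFileInputStream

val in = new NioBufferedFileInputStream(new File("/tmp/data.bin"), 8192)
try {
  val buf = new Array[Byte](4096)
  var n = in.read(buf)
  while (n != -1) {
    // process buf(0 until n) here
    n = in.read(buf)
  }
} finally {
  in.close()
}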
- class ReadAheadInputStream extends InputStream
InputStream implementation which asynchronously reads ahead from the underlying input stream once a specified amount of data has been read from the current buffer. It does this by maintaining two buffers: an active buffer and a read-ahead buffer. The active buffer contains the data returned when a read() call is issued. The read-ahead buffer is used to asynchronously read from the underlying input stream; once the current active buffer is exhausted, the two buffers are flipped so that reading can continue from the read-ahead buffer without blocking on disk I/O.
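A hedged usage sketch, assuming the two-argument (inputStream, bufferSizeInBytes) constructor (the signature has varied across Spark versions):

import java.io.FileInputStream
import org.apache.spark.io.ReadAheadInputStream

// Wrap a slow underlying stream; the read-ahead buffer is filled
// asynchronously while the caller drains the active buffer.
val in = new ReadAheadInputStream(new FileInputStream("/tmp/data.bin"), 1024 * 1024)
try {
  val buf = new Array[Byte](8192)
  while (in.read(buf) != -1) { /* consume the bytes */ }
} finally {
  in.close()
}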
- class SnappyCompressionCodec extends CompressionCodec
:: DeveloperApi :: Snappy implementation of org.apache.spark.io.CompressionCodec. Block size can be configured by spark.io.compression.snappy.blockSize.
- Annotations
- @DeveloperApi()
- Note
The wire protocol for this codec is not guaranteed to be compatible across versions of Spark. This is intended for use as an internal compression utility within a single Spark application.
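As with LZ4, a minimal configuration sketch; the 64k value is illustrative:

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.io.compression.codec", "snappy")
  .set("spark.io.compression.snappy.blockSize", "64k")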
- class ZStdCompressionCodec extends CompressionCodec
:: DeveloperApi :: ZStandard implementation of org.apache.spark.io.CompressionCodec. For more details see http://facebook.github.io/zstd/.
- Annotations
- @DeveloperApi()
- Note
The wire protocol for this codec is not guaranteed to be compatible across versions of Spark. This is intended for use as an internal compression utility within a single Spark application.
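A minimal configuration sketch; the compression level shown is illustrative (spark.io.compression.zstd.level trades speed for compression ratio):

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.io.compression.codec", "zstd")
  .set("spark.io.compression.zstd.level", "3")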