Package org.apache.spark.rdd

Class SequenceFileRDDFunctions<K,V>

java.lang.Object
  org.apache.spark.rdd.SequenceFileRDDFunctions<K,V>

All Implemented Interfaces:
  Serializable, org.apache.spark.internal.Logging

public class SequenceFileRDDFunctions<K,V>
extends Object
implements org.apache.spark.internal.Logging, Serializable
Extra functions available on RDDs of (key, value) pairs to create a Hadoop SequenceFile, through an implicit conversion.

Note:
  This can't be part of PairRDDFunctions because we need more implicit parameters to convert our keys and values to Writable.
Nested Class Summary

Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging:
  org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
Constructor Summary

SequenceFileRDDFunctions(RDD<scala.Tuple2<K,V>> self,
                         Class<? extends org.apache.hadoop.io.Writable> _keyWritableClass,
                         Class<? extends org.apache.hadoop.io.Writable> _valueWritableClass,
                         scala.Function1<K,org.apache.hadoop.io.Writable> evidence$1,
                         scala.reflect.ClassTag<K> evidence$2,
                         scala.Function1<V,org.apache.hadoop.io.Writable> evidence$3,
                         scala.reflect.ClassTag<V> evidence$4)
Method Summary

void saveAsSequenceFile(String path,
                        scala.Option<Class<? extends org.apache.hadoop.io.compress.CompressionCodec>> codec)
  Output the RDD as a Hadoop SequenceFile using the Writable types we infer from the RDD's key and value types.

Methods inherited from class java.lang.Object:
  equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.internal.Logging:
  initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logBasedOnLevel, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, MDC, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
Constructor Details

SequenceFileRDDFunctions

public SequenceFileRDDFunctions(RDD<scala.Tuple2<K,V>> self,
                                Class<? extends org.apache.hadoop.io.Writable> _keyWritableClass,
                                Class<? extends org.apache.hadoop.io.Writable> _valueWritableClass,
                                scala.Function1<K,org.apache.hadoop.io.Writable> evidence$1,
                                scala.reflect.ClassTag<K> evidence$2,
                                scala.Function1<V,org.apache.hadoop.io.Writable> evidence$3,
                                scala.reflect.ClassTag<V> evidence$4)
Method Details

saveAsSequenceFile

public void saveAsSequenceFile(String path,
                               scala.Option<Class<? extends org.apache.hadoop.io.compress.CompressionCodec>> codec)

Output the RDD as a Hadoop SequenceFile using the Writable types we infer from the RDD's key and value types. If the key or value is a Writable, its class is used directly; otherwise primitive types such as Int and Double are mapped to IntWritable, DoubleWritable, etc., byte arrays to BytesWritable, and Strings to Text. The path can be on any Hadoop-supported file system.

Parameters:
  path - output path; may be on any Hadoop-supported file system
  codec - optional compression codec to apply to the output
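The methods of this class are not usually called on an instance directly; a pair RDD gains saveAsSequenceFile through the implicit conversion noted above. A minimal sketch of typical usage (the app name, master, and output paths are placeholders, and a local SparkContext is assumed):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.hadoop.io.compress.GzipCodec

object SequenceFileExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("seqfile-example").setMaster("local[*]"))

    // Int keys and String values: inferred as IntWritable and Text per the
    // mapping described in saveAsSequenceFile.
    val pairs = sc.parallelize(Seq((1, "one"), (2, "two"), (3, "three")))

    // Uncompressed output; the codec parameter defaults to None.
    pairs.saveAsSequenceFile("/tmp/seqfile-plain")

    // Compressed output via the optional codec parameter.
    pairs.saveAsSequenceFile("/tmp/seqfile-gzip", Some(classOf[GzipCodec]))

    sc.stop()
  }
}
```

The implicit conversion from RDD[(K, V)] to SequenceFileRDDFunctions is supplied by the RDD companion object in recent Spark versions, so no extra import is needed beyond SparkContext itself.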