Package org.apache.spark

Core Spark classes in Scala.

Interface Summary
AccumulableParam<R,T> Helper object defining how to accumulate values of a particular type.
AccumulatorParam<T> A simpler version of AccumulableParam where the only data type you can add in is the same type as the accumulated value.
CleanupTask Classes that represent cleaning tasks.
FutureAction<T> A future for the result of an action, supporting cancellation (see the sketch after this table).
Logging :: DeveloperApi :: Utility trait for classes that want to log data.
Partition An identifier for a partition in an RDD.
SparkJobInfo Exposes information about Spark Jobs.
SparkStageInfo Exposes information about Spark Stages.
TaskEndReason :: DeveloperApi :: Various possible reasons why a task ended.
TaskFailedReason :: DeveloperApi :: Various possible reasons why a task failed.
 
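The FutureAction interface above backs Spark's asynchronous actions, which return a handle that can be awaited or cancelled instead of blocking the caller. A minimal sketch in Scala (the object name, app name, and local master URL are illustrative assumptions; countAsync comes from the standard RDD implicits in recent releases):

    import scala.concurrent.Await
    import scala.concurrent.duration._

    import org.apache.spark.{FutureAction, SparkConf, SparkContext}

    object FutureActionExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("future-action").setMaster("local[2]"))

        // countAsync returns a FutureAction[Long] instead of blocking like count().
        val counting: FutureAction[Long] = sc.parallelize(1 to 1000000, 8).countAsync()

        // The handle can be cancelled while the underlying job is running:
        //   counting.cancel()
        // or awaited like any scala.concurrent.Future:
        val n = Await.result(counting, 30.seconds)
        println(s"count = $n")

        sc.stop()
      }
    }

SimpleFutureAction and ComplexFutureAction in the Class Summary below are the concrete forms returned for single-job and multi-job actions, respectively.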

Class Summary
Accumulable<R,T> A data type that can be accumulated, i.e. has a commutative and associative "add" operation, but where the result type, R, may be different from the element type being added, T.
Accumulator<T> A simpler form of Accumulable where the result type being accumulated is the same as the type of the elements being merged (see the sketch after this table).
AccumulatorParam.DoubleAccumulatorParam$  
AccumulatorParam.FloatAccumulatorParam$  
AccumulatorParam.IntAccumulatorParam$  
AccumulatorParam.LongAccumulatorParam$  
Aggregator<K,V,C> :: DeveloperApi :: A set of functions used to aggregate data.
CleanAccum  
CleanBroadcast  
CleanCheckpoint  
CleanRDD  
CleanShuffle  
CleanupTaskWeakReference A WeakReference associated with a CleanupTask.
ComplexFutureAction<T> A FutureAction for actions that could trigger multiple Spark jobs.
Dependency<T> :: DeveloperApi :: Base class for dependencies.
ExceptionFailure :: DeveloperApi :: Task failed due to a runtime exception.
ExecutorLostFailure :: DeveloperApi :: The task failed because the executor that it was running on was lost.
FetchFailed :: DeveloperApi :: Task failed to fetch shuffle data from a remote node.
HashPartitioner A Partitioner that implements hash-based partitioning using Java's Object.hashCode.
InterruptibleIterator<T> :: DeveloperApi :: An iterator that wraps around an existing iterator to provide task killing functionality.
JavaSparkListener Java clients should extend this class instead of implementing SparkListener directly.
NarrowDependency<T> :: DeveloperApi :: Base class for dependencies where each partition of the child RDD depends on a small number of partitions of the parent RDD.
OneToOneDependency<T> :: DeveloperApi :: Represents a one-to-one dependency between partitions of the parent and child RDDs.
Partitioner An object that defines how the elements in a key-value pair RDD are partitioned by key.
RangeDependency<T> :: DeveloperApi :: Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs.
RangePartitioner<K,V> A Partitioner that partitions sortable records by range into roughly equal ranges.
Resubmitted :: DeveloperApi :: A ShuffleMapTask that completed successfully earlier, but we lost the executor before the stage completed.
SerializableWritable<T extends Writable>  
ShuffleDependency<K,V,C> :: DeveloperApi :: Represents a dependency on the output of a shuffle stage.
SimpleFutureAction<T> A FutureAction holding the result of an action that triggers a single job.
SparkConf Configuration for a Spark application.
SparkContext Main entry point for Spark functionality.
SparkContext.DoubleAccumulatorParam$  
SparkContext.FloatAccumulatorParam$  
SparkContext.IntAccumulatorParam$  
SparkContext.LongAccumulatorParam$  
SparkEnv :: DeveloperApi :: Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, Akka actor system, block manager, map output tracker, etc.
SparkFiles Resolves paths to files added through SparkContext.addFile().
SparkFirehoseListener Class that allows users to receive all SparkListener events.
SparkJobInfoImpl  
SparkStageInfoImpl  
SparkStatusTracker Low-level status reporting APIs for monitoring job and stage progress.
Success :: DeveloperApi :: Task succeeded.
TaskCommitDenied :: DeveloperApi :: Task requested the driver to commit, but was denied.
TaskContext Contextual information about a task which can be read or mutated during execution.
TaskKilled :: DeveloperApi :: Task was killed intentionally and needs to be rescheduled.
TaskResultLost :: DeveloperApi :: The task finished successfully, but the result was lost from the executor's block manager before it was fetched.
UnknownReason :: DeveloperApi :: We don't know why the task ended -- for example, because of a ClassNotFound exception when deserializing the task result.
 
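Several of the classes above appear together in almost every application: a SparkConf configures the SparkContext, an Accumulator collects simple statistics from the executors, and a HashPartitioner controls how a key-value RDD is laid out across partitions. A minimal sketch in Scala (the object name, app name, and local master URL are illustrative assumptions):

    import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

    object PackageTourExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("package-tour").setMaster("local[2]")
        val sc = new SparkContext(conf)

        // Accumulator[Int] backed by the built-in IntAccumulatorParam.
        val total = sc.accumulator(0)
        sc.parallelize(1 to 100).foreach(x => total += x)
        println(s"sum = ${total.value}")   // 5050

        // Hash-partition a key-value RDD so equal keys land in the same partition.
        val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3), ("c", 4)))
        val partitioned = pairs.partitionBy(new HashPartitioner(4))
        println(partitioned.reduceByKey(_ + _).collect().toSeq)

        sc.stop()
      }
    }

Where keys have an ordering and roughly equal-sized partitions are preferred, RangePartitioner can be substituted for HashPartitioner in the same way.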

Enum Summary
JobExecutionStatus  
 

Exception Summary
SparkException  
TaskKilledException :: DeveloperApi :: Exception thrown when a task is explicitly killed (i.e., task failure is expected).
 

Package org.apache.spark Description

Core Spark classes in Scala. A few classes here, such as Accumulator and StorageLevel, are also used in Java, but the org.apache.spark.api.java package contains the main Java API.