package spark

Core Spark functionality. SparkContext serves as the main entry point to Spark, while RDD is the data type representing a distributed collection and provides most of the parallel operations.

In addition, PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; DoubleRDDFunctions contains operations available only on RDDs of Doubles; and SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions when you import org.apache.spark.SparkContext._.
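
The enrichment mechanism itself is plain Scala: an implicit conversion wraps a value in a class carrying the extra operations, and the compiler applies it only when the element type matches. A rough standalone sketch, using a local Seq in place of an RDD (the names here are illustrative, not Spark's):

```scala
object PairOpsSketch {
  // Hypothetical stand-in for PairRDDFunctions: operations that appear
  // only when the element type is a key-value pair, via an implicit
  // conversion brought into scope.
  implicit class PairOps[K, V](val data: Seq[(K, V)]) extends AnyVal {
    // Local analogue of groupByKey: gather all values sharing a key.
    def groupByKeyLocal: Map[K, Seq[V]] =
      data.groupBy(_._1).map { case (k, kvs) => (k, kvs.map(_._2)) }
  }

  def main(args: Array[String]): Unit = {
    val pairs = Seq((1, "a"), (2, "b"), (1, "c"))
    // The conversion applies because the elements are pairs.
    println(pairs.groupByKeyLocal(1)) // prints List(a, c)
  }
}
```

In Spark, importing SparkContext._ plays the role of bringing such conversions into scope, which is why the pair-only methods simply "appear" on an RDD[(K, V)].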


Type Members

  1. class Accumulable[R, T] extends Serializable

    A datatype that can be accumulated, i.e. one with a commutative and associative "add" operation, where the result type R may differ from the element type T being added.

  2. trait AccumulableParam[R, T] extends Serializable

    Helper object defining how to accumulate values of a particular type.

  3. class Accumulator[T] extends Accumulable[T, T]

    A simpler value of Accumulable where the result type being accumulated is the same as the types of elements being merged.

  4. trait AccumulatorParam[T] extends AccumulableParam[T, T]

    A simpler version of AccumulableParam where the only datatype you can add in is the same type as the accumulated value.
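
The contract behind these accumulator params can be sketched without Spark as two plain functions: one folds an element of type T into a result of type R, the other merges two partial results of type R (as happens when per-task accumulators are combined). Everything below is an illustrative stand-in, not Spark's API:

```scala
object AccumSketch {
  // Stand-in for AccumulableParam[R, T]: R is the accumulated result
  // type, T the element type being added.
  trait AccParam[R, T] {
    def addAccumulator(acc: R, elem: T): R // fold one element in
    def addInPlace(a: R, b: R): R          // merge two partial results
  }

  // An AccumulatorParam-style case where R == T (here: summing Ints).
  val intSum: AccParam[Int, Int] = new AccParam[Int, Int] {
    def addAccumulator(acc: Int, elem: Int): Int = acc + elem
    def addInPlace(a: Int, b: Int): Int = a + b
  }

  // Accumulate one "partition" of elements into a running result.
  def accumulate[R, T](zero: R, elems: Seq[T], p: AccParam[R, T]): R =
    elems.foldLeft(zero)(p.addAccumulator)

  def main(args: Array[String]): Unit = {
    val part1 = accumulate(0, Seq(1, 2, 3), intSum) // one task's result
    val part2 = accumulate(0, Seq(4, 5), intSum)    // another task's
    println(intSum.addInPlace(part1, part2))        // prints 15
  }
}
```

Commutativity and associativity of both operations are what make the result independent of how elements are split across tasks and in what order partial results are merged.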

  5. case class Aggregator[K, V, C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C) extends Product with Serializable

    A set of functions used to aggregate data.
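
The roles of the three functions mirror combineByKey: createCombiner turns the first value seen for a key into a combiner of type C, mergeValue folds further values into it, and mergeCombiners merges combiners built on different partitions. A standalone sketch with C = (count, sum), using only local collections (names are illustrative):

```scala
object AggregatorSketch {
  // Illustrative combiner type C = (count, sum) built from Int values.
  val createCombiner: Int => (Int, Int) = v => (1, v)
  val mergeValue: ((Int, Int), Int) => (Int, Int) = {
    case ((n, s), v) => (n + 1, s + v)
  }
  val mergeCombiners: ((Int, Int), (Int, Int)) => (Int, Int) = {
    case ((n1, s1), (n2, s2)) => (n1 + n2, s1 + s2)
  }

  // Local analogue of combining values by key within one partition:
  // the first value for a key goes through createCombiner, later
  // values through mergeValue.
  def combineLocally[K](pairs: Seq[(K, Int)]): Map[K, (Int, Int)] =
    pairs.foldLeft(Map.empty[K, (Int, Int)]) { case (acc, (k, v)) =>
      acc + (k -> acc.get(k).fold(createCombiner(v))(mergeValue(_, v)))
    }
}
```

Across partitions, the per-partition maps would then be merged key by key with mergeCombiners, which is why that function must be able to combine two already-built combiners.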

  6. class ComplexFutureAction[T] extends FutureAction[T]

    A FutureAction for actions that could trigger multiple Spark jobs.

  7. abstract class Dependency[T] extends Serializable

    Base class for dependencies.

  8. trait FutureAction[T] extends Future[T]

    A future for the result of an action.

  9. class HashPartitioner extends Partitioner

    A Partitioner that implements hash-based partitioning using Java's Object.hashCode.
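
The core of hash partitioning fits in a few lines; the sketch below is a simplified stand-in, not Spark's exact implementation. The partition is the key's hashCode modulo the partition count, forced non-negative because the JVM's % operator can yield negative results for negative hash codes:

```scala
object HashPartitionSketch {
  // Simplified getPartition: the same key always lands in the same
  // partition, and the result is always in [0, numPartitions).
  def partitionFor(key: Any, numPartitions: Int): Int = {
    val raw = key.hashCode % numPartitions
    if (raw < 0) raw + numPartitions else raw
  }
}
```

Determinism is the important property here: every node computes the same partition for the same key, so co-partitioned data for operations like join ends up on the same reducer without coordination.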

  10. class InterruptibleIterator[+T] extends Iterator[T]

    An iterator that wraps around an existing iterator to provide task killing functionality.

  11. trait Logging extends AnyRef

    Utility trait for classes that want to log data.

  12. abstract class NarrowDependency[T] extends Dependency[T]

    Base class for dependencies where each partition of the parent RDD is used by at most one partition of the child RDD.

  13. class OneToOneDependency[T] extends NarrowDependency[T]

    Represents a one-to-one dependency between partitions of the parent and child RDDs.

  14. trait Partition extends Serializable

    A partition of an RDD.

  15. abstract class Partitioner extends Serializable

    An object that defines how the elements in a key-value pair RDD are partitioned by key.

  16. class RangeDependency[T] extends NarrowDependency[T]

    Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs.
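
The difference between the two narrow dependencies comes down to how a child partition's parent partitions are computed. A hypothetical standalone sketch of that bookkeeping (Spark's actual classes also carry a reference to the parent RDD):

```scala
object DependencySketch {
  // One-to-one: child partition i depends only on parent partition i.
  def oneToOneParents(childPartition: Int): Seq[Int] =
    Seq(childPartition)

  // Range: child partitions [outStart, outStart + length) map onto
  // parent partitions starting at inStart, as when several RDDs'
  // partitions are laid end to end (e.g. by union).
  def rangeParents(childPartition: Int, inStart: Int,
                   outStart: Int, length: Int): Seq[Int] =
    if (childPartition >= outStart && childPartition < outStart + length)
      Seq(childPartition - outStart + inStart)
    else
      Seq.empty
}
```

In both cases each parent partition feeds at most one child partition, which is what lets the scheduler pipeline such stages without a shuffle.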

  17. class RangePartitioner[K, V] extends Partitioner

    A Partitioner that partitions sortable records by range into roughly equal ranges.
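
Range partitioning can be sketched as a lookup against sorted upper bounds, one fewer than the number of partitions. Spark derives the bounds by sampling the RDD's keys so the ranges come out roughly equal; in this illustrative stand-in they are supplied directly:

```scala
object RangePartitionSketch {
  // Keys at or below upperBounds(i) go to partition i; keys above the
  // last bound go to the final partition. Bounds of size n - 1 thus
  // define n partitions covering contiguous ranges of the key space.
  def partitionFor(key: Int, upperBounds: Seq[Int]): Int = {
    val i = upperBounds.indexWhere(key <= _)
    if (i < 0) upperBounds.length else i
  }
}
```

Because partition index order matches key order, an RDD partitioned this way can be sorted globally by sorting each partition locally, which is how sortByKey works.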

  18. class SerializableWritable[T <: Writable] extends Serializable

  19. class ShuffleDependency[K, V] extends Dependency[Product2[K, V]]

    Represents a dependency on the output of a shuffle stage.

  20. class SimpleFutureAction[T] extends FutureAction[T]

    The future holding the result of an action that triggers a single job.

  21. class SparkContext extends Logging

    Main entry point for Spark functionality.

  22. class SparkEnv extends AnyRef

    Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, Akka actor system, block manager, map output tracker, etc.

  23. class SparkException extends Exception

  24. class SparkFiles extends AnyRef

  25. class TaskContext extends Serializable

Value Members

  1. object Partitioner extends Serializable

  2. object SparkContext extends AnyRef

    The SparkContext object contains a number of implicit conversions and parameters for use with various Spark features.

  3. object SparkEnv extends Logging

  4. object SparkFiles extends AnyRef

  5. package api

  6. package broadcast

  7. package deploy

  8. package executor

  9. package io

  10. package metrics

  11. package network

  12. package partial

  13. package rdd

  14. package scheduler

  15. package serializer

  16. package storage

  17. package util