package spark

Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.
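
For orientation, a minimal sketch of these entry points, assuming a local build of Spark on the classpath (the application name and master URL are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    object SketchApp {
      def main(args: Array[String]): Unit = {
        // Configure and create the entry point; "local[2]" uses two local threads.
        val conf = new SparkConf().setAppName("sketch").setMaster("local[2]")
        val sc = new SparkContext(conf)

        // An RDD is a distributed collection offering parallel operations.
        val rdd = sc.parallelize(1 to 100)
        println(rdd.map(_ * 2).reduce(_ + _)) // 10100

        sc.stop()
      }
    }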

In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.
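
A sketch of those implicit conversions at work; `sc` is an existing SparkContext, and the explicit import is only needed on older releases (the SparkContext object entry below lists the conversions):

    import org.apache.spark.SparkContext._ // implicit conversions; automatic in recent releases

    val pairs = sc.parallelize(Seq((1, "a"), (1, "b"), (2, "c"))) // RDD[(Int, String)]
    val other = sc.parallelize(Seq((1, "x"), (2, "y")))

    // groupByKey and join are defined on PairRDDFunctions, not on RDD itself.
    pairs.groupByKey().collect() // e.g. Array((1, Seq(a, b)), (2, Seq(c)))
    pairs.join(other).collect()  // e.g. Array((1, (a, x)), (1, (b, x)), (2, (c, y)))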

Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

Source
package.scala
Linear Supertypes
AnyRef, Any

Type Members

  1. class Accumulable[R, T] extends Serializable

    A data type that can be accumulated, i.e. has a commutative and associative "add" operation, but where the result type, R, may be different from the element type being added, T.

  2. trait AccumulableParam[R, T] extends Serializable

    Helper object defining how to accumulate values of a particular type.

  3. class Accumulator[T] extends Accumulable[T, T]

    A simpler value of Accumulable where the result type being accumulated is the same as the types of elements being merged, i.e. variables that are only "added" to through an associative operation and can therefore be efficiently supported in parallel (see the usage sketch after this list).

  4. trait AccumulatorParam[T] extends AccumulableParam[T, T]

    A simpler version of org.apache.spark.AccumulableParam where the only data type you can add in is the same type as the accumulated value.

  5. case class Aggregator[K, V, C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C) extends Product with Serializable

    :: DeveloperApi :: A set of functions used to aggregate data.

  6. class ComplexFutureAction[T] extends FutureAction[T]

    A FutureAction for actions that could trigger multiple Spark jobs.

  7. abstract class Dependency[T] extends Serializable

    :: DeveloperApi :: Base class for dependencies.

  8. case class ExceptionFailure(className: String, description: String, stackTrace: Array[StackTraceElement], fullStackTrace: String, metrics: Option[TaskMetrics], exceptionWrapper: Option[ThrowableSerializationWrapper]) extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: Task failed due to a runtime exception.

  9. case class ExecutorLostFailure(execId: String, exitCausedByApp: Boolean = true, reason: Option[String]) extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: The task failed because the executor that it was running on was lost.

  10. case class FetchFailed(bmAddress: BlockManagerId, shuffleId: Int, mapId: Int, reduceId: Int, message: String) extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: Task failed to fetch shuffle data from a remote node.

  11. trait FutureAction[T] extends Future[T]

    A future for the result of an action to support cancellation.

  12. class HashPartitioner extends Partitioner

    A org.apache.spark.Partitioner that implements hash-based partitioning using Java's Object.hashCode (see the usage sketch after this list).

  13. class InterruptibleIterator[+T] extends Iterator[T]

    :: DeveloperApi :: An iterator that wraps around an existing iterator to provide task killing functionality.

  14. class JavaSparkListener extends SparkListener

  15. class JobExecutionStatus extends Enum[JobExecutionStatus]

  16. trait Logging extends AnyRef

    Utility trait for classes that want to log data.

  17. abstract class NarrowDependency[T] extends Dependency[T]

    :: DeveloperApi :: Base class for dependencies where each partition of the child RDD depends on a small number of partitions of the parent RDD.

  18. class OneToOneDependency[T] extends NarrowDependency[T]

    :: DeveloperApi :: Represents a one-to-one dependency between partitions of the parent and child RDDs.

  19. trait Partition extends Serializable

    An identifier for a partition in an RDD.

  20. abstract class Partitioner extends Serializable

    An object that defines how the elements in a key-value pair RDD are partitioned by key.

  21. class RangeDependency[T] extends NarrowDependency[T]

    :: DeveloperApi :: Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs.

  22. class RangePartitioner[K, V] extends Partitioner

    A org.apache.spark.Partitioner that partitions sortable records by range into roughly equal ranges.

  23. class SerializableWritable[T <: Writable] extends Serializable

    Annotations
    @DeveloperApi()
  24. class ShuffleDependency[K, V, C] extends Dependency[Product2[K, V]]

    :: DeveloperApi :: Represents a dependency on the output of a shuffle stage.

  25. class SimpleFutureAction[T] extends FutureAction[T]

    A FutureAction holding the result of an action that triggers a single job.

  26. class SparkConf extends Cloneable with Logging

    Configuration for a Spark application.

  27. class SparkContext extends Logging with ExecutorAllocationClient

    Main entry point for Spark functionality.

  28. class SparkEnv extends Logging

    :: DeveloperApi :: Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, Akka actor system, block manager, map output tracker, etc.

  29. class SparkException extends Exception

  30. class SparkFirehoseListener extends SparkListener

  31. trait SparkJobInfo extends Serializable

  32. trait SparkStageInfo extends Serializable

  33. class SparkStatusTracker extends AnyRef

    Low-level status reporting APIs for monitoring job and stage progress.

  34. case class TaskCommitDenied(jobID: Int, partitionID: Int, attemptNumber: Int) extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: Task requested the driver to commit, but was denied.

  35. abstract class TaskContext extends Serializable

    Contextual information about a task which can be read or mutated during execution.

  36. sealed trait TaskEndReason extends AnyRef

    :: DeveloperApi :: Various possible reasons why a task ended.

  37. sealed trait TaskFailedReason extends TaskEndReason

    :: DeveloperApi :: Various possible reasons why a task failed.

  38. class TaskKilledException extends RuntimeException

    :: DeveloperApi :: Exception thrown when a task is explicitly killed (i.e., task failure is expected).
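
A sketch of two of the members above, Accumulator and HashPartitioner, as forward-referenced in their entries; `sc` is an existing SparkContext and the counter name is a placeholder:

    import org.apache.spark.HashPartitioner

    // Accumulator: tasks may only add to it; the driver reads the merged value.
    val errors = sc.accumulator(0, "errors") // Accumulator[Int] via an implicit AccumulatorParam

    sc.parallelize(1 to 1000).foreach { n =>
      if (n % 100 == 0) errors += 1
    }
    println(errors.value) // 10

    // HashPartitioner: route equal keys to the same partition by hashCode.
    val byKey = sc.parallelize(Seq((1, "a"), (2, "b"), (1, "c")))
      .partitionBy(new HashPartitioner(4))
    println(byKey.partitioner) // Some(...) -- the RDD now remembers its partitioner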

Value Members

  1. object AccumulatorParam extends Serializable

  2. object HeartbeatReceiver

  3. object Partitioner extends Serializable

  4. object Resubmitted extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: A org.apache.spark.scheduler.ShuffleMapTask that completed successfully earlier, but we lost the executor before the stage completed.

  5. val SPARK_VERSION: String

  6. object SparkContext extends Logging

    The SparkContext object contains a number of implicit conversions and parameters for use with various Spark features.

  7. object SparkEnv extends Logging

  8. object SparkFiles

    Resolves paths to files added through SparkContext.addFile() (see the usage sketch after this list).

  9. object Success extends TaskEndReason with Product with Serializable

    :: DeveloperApi :: Task succeeded.

  10. object TaskContext extends Serializable

  11. object TaskKilled extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: Task was killed intentionally and needs to be rescheduled.

  12. object TaskResultLost extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: The task finished successfully, but the result was lost from the executor's block manager before it was fetched.

  13. object UnknownReason extends TaskFailedReason with Product with Serializable

    :: DeveloperApi :: We don't know why the task ended -- for example, because of a ClassNotFound exception when deserializing the task result.

  14. object WritableConverter extends Serializable

  15. object WritableFactory extends Serializable

  16. package annotation

    Spark annotations to mark an API experimental or intended only for advanced usages by developers.

  17. package api

  18. package bagel

    Bagel: An implementation of Pregel in Spark.

  19. package broadcast

    Spark's broadcast variables, used to broadcast immutable datasets to all nodes.

  20. package graphx

    ALPHA COMPONENT GraphX is a graph processing framework built on top of Spark.

  21. package input

  22. package io

    IO codecs used for compression.

  23. package launcher

  24. package mapred

  25. package ml

    Spark ML is a BETA component that adds a new set of machine learning APIs to let users quickly assemble and configure practical machine learning pipelines.

  26. package mllib

    Spark's machine learning library.

  27. package partial

    :: Experimental ::

  28. package rdd

    Provides several RDD implementations.

  29. package scheduler

    Spark's scheduling components.

  30. package serializer

    Pluggable serializers for RDD and shuffle data.

  31. package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.

  32. package status

  33. package storage

  34. package streaming

    Spark Streaming functionality.

  35. package ui

  36. package util

    Spark utilities.
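
As forward-referenced in the SparkFiles entry, a sketch of resolving a shipped file on the executors; the path is a placeholder and `sc` is an existing SparkContext:

    import org.apache.spark.SparkFiles

    sc.addFile("/path/to/lookup.txt") // copied to every node of the cluster

    val localPaths = sc.parallelize(1 to 2).map { _ =>
      SparkFiles.get("lookup.txt")    // absolute local path to this executor's copy
    }.collect()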
