PySpark
Class Hierarchy
object: The most base type.
    pyspark.accumulators.Accumulator: A shared variable that can be accumulated, i.e., one that has a commutative and associative "add" operation.
    pyspark.accumulators.AccumulatorParam: Helper object that defines how to accumulate values of a given type.
    pyspark.broadcast.Broadcast
    pyspark.rdd.RDD: A Resilient Distributed Dataset (RDD), the basic abstraction in Spark.
    pyspark.context.SparkContext: Main entry point for Spark functionality.
    pyspark.files.SparkFiles: Resolves paths to files added through SparkContext.addFile().
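To illustrate the AccumulatorParam contract named above, here is a minimal pure-Python sketch of a custom accumulator type. It assumes only the two methods the hierarchy's description implies, zero() and addInPlace(), and runs without Spark; the VectorAccumulatorParam class itself is a hypothetical example, not part of PySpark.

```python
class VectorAccumulatorParam:
    """Hypothetical AccumulatorParam-style helper that accumulates
    fixed-length numeric vectors by elementwise addition."""

    def zero(self, initial_value):
        # Identity element for the "add" operation: a zero vector
        # of the same length as the initial value.
        return [0.0] * len(initial_value)

    def addInPlace(self, v1, v2):
        # Elementwise addition is commutative and associative, so
        # partial results from different workers can be merged in
        # any order, as the Accumulator description requires.
        return [a + b for a, b in zip(v1, v2)]


param = VectorAccumulatorParam()
total = param.zero([0.0, 0.0, 0.0])
for partial in ([1.0, 2.0, 3.0], [10.0, 20.0, 30.0]):
    total = param.addInPlace(total, partial)
# total is now [11.0, 22.0, 33.0]
```

Because addInPlace is commutative and associative, merging the same partial vectors in any order yields the same total, which is what lets Spark combine per-worker accumulator updates safely.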
Generated by Epydoc 3.0.1 on Tue Feb 26 22:47:39 2013
http://epydoc.sourceforge.net