class RangePartitioner[K, V] extends Partitioner
An org.apache.spark.Partitioner that partitions sortable records by range into roughly equal ranges. The ranges are determined by sampling the content of the RDD passed in.
- Source
- Partitioner.scala
- Note
The actual number of partitions created by the RangePartitioner might not be the same as the partitions parameter, in the case where the number of sampled records is less than the value of partitions.
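A minimal usage sketch follows; the local master, application name, and toy data are illustrative assumptions, not part of the API. It builds the partitioner from a pair RDD whose keys have an implicit Ordering, then applies it with repartitionAndSortWithinPartitions.

```scala
import org.apache.spark.{RangePartitioner, SparkConf, SparkContext}

object RangePartitionerSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("range-partitioner-sketch").setMaster("local[2]"))

    // A pair RDD whose keys have an implicit Ordering (Int does).
    val pairs = sc.parallelize(1 to 1000).map(i => (i, i.toString))

    // Sample the RDD to choose range boundaries for (up to) 8 partitions;
    // ascending and samplePointsPerPartitionHint fall back to their defaults.
    val partitioner = new RangePartitioner(8, pairs)

    // Repartition by those ranges and sort records within each partition by key.
    val partitioned = pairs.repartitionAndSortWithinPartitions(partitioner)

    // Per the note above, this can be smaller than the 8 partitions requested.
    println(s"partitions actually created: ${partitioner.numPartitions}")
    println(s"records: ${partitioned.count()}")

    sc.stop()
  }
}
```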
- Inheritance
RangePartitioner → Partitioner → Serializable → AnyRef → Any
Instance Constructors
- new RangePartitioner(partitions: Int, rdd: RDD[_ <: Product2[K, V]], ascending: Boolean)(implicit arg0: Ordering[K], arg1: ClassTag[K])
- new RangePartitioner(partitions: Int, rdd: RDD[_ <: Product2[K, V]], ascending: Boolean = true, samplePointsPerPartitionHint: Int = 20)(implicit arg0: Ordering[K], arg1: ClassTag[K])
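A short fragment contrasting the two constructors, continuing the sketch above (it reuses the pairs RDD); the hint value of 100 is only an illustrative assumption, not a recommendation.

```scala
// Auxiliary constructor: explicit sort direction, default samplePointsPerPartitionHint (20).
val descending = new RangePartitioner(8, pairs, false)

// Primary constructor: a larger (illustrative) hint samples more points per partition,
// which can produce tighter range boundaries for skewed key distributions.
val withHint =
  new RangePartitioner(8, pairs, ascending = true, samplePointsPerPartitionHint = 100)

println(withHint.samplePointsPerPartitionHint) // 100
```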
Value Members
- def equals(other: Any): Boolean
- Definition Classes
- RangePartitioner → AnyRef → Any
- def getPartition(key: Any): Int
- Definition Classes
- RangePartitioner → Partitioner
- def hashCode(): Int
- Definition Classes
- RangePartitioner → AnyRef → Any
- def numPartitions: Int
- Definition Classes
- RangePartitioner → Partitioner
- val samplePointsPerPartitionHint: Int
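To make the members above concrete, the fragment below continues the sketch from the class description (it reuses pairs and partitioner); the behaviour shown follows from the range semantics described above.

```scala
// Keys are assigned by range, so a smaller key never lands in a later partition than a larger one.
assert(partitioner.getPartition(10) <= partitioner.getPartition(900))

// Echoes the note above: this can be smaller than the 8 partitions requested.
println(partitioner.numPartitions)

// equals is overridden to compare the computed range boundaries (and sort direction)
// rather than reference identity, so an equivalently built partitioner can compare equal.
val other = new RangePartitioner(8, pairs)
println(partitioner == other)
```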