Package org.apache.spark.util
Class AccumulatorContext
java.lang.Object
    org.apache.spark.util.AccumulatorContext

An internal class used by Spark itself to track accumulators.
Constructor Summary
AccumulatorContext()
Method Summary
static void clear()
    Clears all registered AccumulatorV2s.

static scala.Option<AccumulatorV2<?, ?>> get(long id)
    Returns the AccumulatorV2 registered with the given ID, if any.

static scala.Option<Object> internOption(scala.Option<Object> value)
    Naive way to reduce the duplicate Some objects for the values 0 and -1. TODO: if this spreads out to more values, Guava's weak interner would be a better solution.

static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)

static long newId()
    Returns a globally unique ID for a new AccumulatorV2.

static int numAccums()
    Returns the number of accumulators registered.

static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

static void register(AccumulatorV2<?, ?> a)
    Registers an AccumulatorV2 created on the driver such that it can be used on the executors.

static void remove(long id)
    Unregisters the AccumulatorV2 with the given ID, if any.
Constructor Details
AccumulatorContext
public AccumulatorContext()
Method Details
newId
public static long newId()

Returns a globally unique ID for a new AccumulatorV2. Note: once you copy the AccumulatorV2, the ID is no longer unique.

Returns:
    (undocumented)
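Example (illustrative only; AccumulatorContext is Spark-internal, so the direct calls below are a sketch of the contract rather than typical user code):

    import org.apache.spark.util.AccumulatorContext

    // Each call hands out a fresh ID; Spark uses this when an accumulator
    // is first registered on the driver.
    val id1 = AccumulatorContext.newId()
    val id2 = AccumulatorContext.newId()
    assert(id1 != id2)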
numAccums
public static int numAccums()

Returns the number of accumulators registered. Used in testing.
register
public static void register(AccumulatorV2<?, ?> a)

Registers an AccumulatorV2 created on the driver such that it can be used on the executors.

All accumulators registered here can later be used as a container for accumulating partial values across multiple tasks. This is what org.apache.spark.scheduler.DAGScheduler does. Note: if an accumulator is registered here, it should also be registered with the active context cleaner for cleanup so as to avoid memory leaks.

If an AccumulatorV2 with the same ID was already registered, this does nothing instead of overwriting it. We will never register the same accumulator twice; this is just a sanity check.

Parameters:
    a - (undocumented)
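Example (a hedged sketch of the driver-side registration path; demoRegistration is a hypothetical helper, and the direct AccumulatorContext calls are shown only for illustration since the object is Spark-internal):

    import org.apache.spark.SparkContext
    import org.apache.spark.util.AccumulatorContext

    def demoRegistration(sc: SparkContext): Unit = {   // sc: assumed existing context
      // sc.longAccumulator assigns an ID (via newId) and registers the new
      // accumulator here on the driver, along with the context cleaner.
      val acc = sc.longAccumulator("records")

      // The registry can now resolve the accumulator by ID; this lookup is how
      // partial values reported by tasks are merged back into this instance.
      assert(AccumulatorContext.get(acc.id).isDefined)

      // Registering the same accumulator again is a no-op, not an overwrite.
      AccumulatorContext.register(acc)
    }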
remove
public static void remove(long id)

Unregisters the AccumulatorV2 with the given ID, if any.

Parameters:
    id - (undocumented)
get
public static scala.Option<AccumulatorV2<?, ?>> get(long id)

Returns the AccumulatorV2 registered with the given ID, if any.

Parameters:
    id - (undocumented)
Returns:
    (undocumented)
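Example (an illustrative lookup-and-unregister sequence covering get and remove; demoLookup and accId are hypothetical, with accId assumed to come from a previously registered AccumulatorV2):

    import org.apache.spark.util.AccumulatorContext

    def demoLookup(accId: Long): Unit = {
      AccumulatorContext.get(accId) match {
        case Some(acc) => println(s"accumulator $accId is registered, name = ${acc.name}")
        case None      => println(s"no accumulator registered under $accId")
      }

      // After remove(), lookups for that ID return None.
      AccumulatorContext.remove(accId)
      assert(AccumulatorContext.get(accId).isEmpty)
    }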
clear
public static void clear()

Clears all registered AccumulatorV2s. For testing only.
internOption
public static scala.Option<Object> internOption(scala.Option<Object> value)

Naive way to reduce the duplicate Some objects for the values 0 and -1. TODO: if this spreads out to more values, Guava's weak interner would be a better solution.

Parameters:
    value - (undocumented)
Returns:
    (undocumented)
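Example (a sketch of the interning behaviour described above, assuming shared instances are returned for the common values 0 and -1 as the description suggests):

    import org.apache.spark.util.AccumulatorContext

    val a = AccumulatorContext.internOption(Some(0L))
    val b = AccumulatorContext.internOption(Some(0L))
    // Both results are expected to be the same cached Some instance for 0.
    assert(a eq b)

    // Values other than 0 and -1 pass through unchanged.
    assert(AccumulatorContext.internOption(Some(42L)) == Some(42L))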
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

org$apache$spark$internal$Logging$$log__$eq

public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)