Package org.apache.spark.util
Interface SparkErrorUtils
- All Superinterfaces:
org.apache.spark.internal.Logging
public interface SparkErrorUtils
extends org.apache.spark.internal.Logging
-
Nested Class Summary
Nested classes/interfaces inherited from interface org.apache.spark.internal.Logging
org.apache.spark.internal.Logging.LogStringContext, org.apache.spark.internal.Logging.SparkShellLoggingFilter
-
Method Summary
<R extends Closeable, T> T
    tryInitializeResource(scala.Function0<R> createResource, scala.Function1<R, T> initialize)
    Try to initialize a resource.

<T> T
    tryOrIOException(scala.Function0<T> block)
    Execute a block of code that returns a value, re-throwing any non-fatal uncaught exceptions as IOException.

<R extends Closeable, T> T
    tryWithResource(scala.Function0<R> createResource, scala.Function1<R, T> f)

<T> T
    tryWithSafeFinally(scala.Function0<T> block, scala.Function0<scala.runtime.BoxedUnit> finallyBlock)
    Execute a block of code, then a finally block, but if exceptions happen in the finally block, do not suppress the original exception.

Methods inherited from interface org.apache.spark.internal.Logging
initializeForcefully, initializeLogIfNecessary, initializeLogIfNecessary, initializeLogIfNecessary$default$2, isTraceEnabled, log, logDebug, logDebug, logDebug, logDebug, logError, logError, logError, logError, logInfo, logInfo, logInfo, logInfo, logName, LogStringContext, logTrace, logTrace, logTrace, logTrace, logWarning, logWarning, logWarning, logWarning, org$apache$spark$internal$Logging$$log_, org$apache$spark$internal$Logging$$log__$eq, withLogContext
-
Method Details
-
tryOrIOException
<T> T tryOrIOException(scala.Function0<T> block)

Execute a block of code that returns a value, re-throwing any non-fatal uncaught exceptions as IOException. This is used when implementing Externalizable and Serializable's read and write methods, since Java's serializer will not report non-IOExceptions properly; see SPARK-4080 for more context.

Parameters:
block - the block of code to execute
Returns:
the value returned by block
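A minimal usage sketch, not taken from this page: it assumes the Scala definition takes a by-name block (as the scala.Function0 rendering suggests) and that an implementation of this trait is reachable as SparkErrorUtils. Spark's companion object is package-private, so this is illustrative only, and the class and file names are hypothetical.

    import java.io.{Externalizable, ObjectInput, ObjectOutput}
    import org.apache.spark.util.SparkErrorUtils // package-private in Spark; assumed accessible here

    class MyState(var value: Int) extends Externalizable {
      def this() = this(0) // Externalizable requires a public no-arg constructor

      // Any non-fatal failure in the block is re-thrown as IOException,
      // which Java serialization reports properly (see SPARK-4080).
      override def writeExternal(out: ObjectOutput): Unit =
        SparkErrorUtils.tryOrIOException {
          out.writeInt(value)
        }

      override def readExternal(in: ObjectInput): Unit =
        SparkErrorUtils.tryOrIOException {
          value = in.readInt()
        }
    }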
-
tryWithResource
<R extends Closeable, T> T tryWithResource(scala.Function0<R> createResource, scala.Function1<R, T> f)
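This method carries no description here; from the name and signature it presumably creates the resource, applies f to it, and closes the resource whether f completes or throws (try-with-resources semantics). A sketch under that assumption, using the same curried by-name calling convention assumed above (the file path is hypothetical):

    import java.io.{BufferedReader, FileReader}
    import org.apache.spark.util.SparkErrorUtils

    // Open a reader, read one line, and let tryWithResource close the
    // reader afterwards, including when readLine throws.
    val firstLine: String =
      SparkErrorUtils.tryWithResource(new BufferedReader(new FileReader("data.txt"))) { reader =>
        reader.readLine()
      }
-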
tryInitializeResource
<R extends Closeable, T> T tryInitializeResource(scala.Function0<R> createResource, scala.Function1<R, T> initialize)

Try to initialize a resource. If an exception is thrown during initialization, closes the resource before propagating the error. Otherwise, the caller is responsible for closing the resource. This means that T should provide some way to close the resource.

Parameters:
createResource - creates the resource to be initialized
initialize - the initialization step to apply to the newly created resource
Returns:
the result of initialize
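A sketch of the wrap-a-stream pattern this enables, under the same calling-convention assumptions as the earlier example (file name hypothetical):

    import java.io.{FileOutputStream, ObjectOutputStream}
    import org.apache.spark.util.SparkErrorUtils

    // If the ObjectOutputStream constructor throws, the FileOutputStream is
    // closed before the error propagates. On success, closing the returned
    // stream is the caller's responsibility; doing so also closes the
    // wrapped FileOutputStream.
    val oos: ObjectOutputStream =
      SparkErrorUtils.tryInitializeResource(new FileOutputStream("out.bin")) { fos =>
        new ObjectOutputStream(fos)
      }
    try oos.writeObject("payload") finally oos.close()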
-
tryWithSafeFinally
<T> T tryWithSafeFinally(scala.Function0<T> block, scala.Function0<scala.runtime.BoxedUnit> finallyBlock)

Execute a block of code, then a finally block, but if exceptions happen in the finally block, do not suppress the original exception.

This is primarily an issue with finally { out.close() } blocks, where close needs to be called to clean up out, but if an exception happened in out.write, it's likely that out may be corrupted and out.close will fail as well. This would then suppress the original, likely more meaningful, exception from the original out.write call.

Parameters:
block - the block of code to execute
finallyBlock - the finally block, run after block completes or throws
Returns:
the value returned by block
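A sketch of the write-then-close pattern described above, under the same assumptions as the earlier examples (in Spark's implementation the finally-block failure is attached to the original exception rather than replacing it; file name hypothetical):

    import java.io.FileOutputStream
    import org.apache.spark.util.SparkErrorUtils

    val out = new FileOutputStream("result.bin")
    // If out.write throws and out.close then fails too, the close failure
    // does not mask the original write exception.
    SparkErrorUtils.tryWithSafeFinally {
      out.write(Array[Byte](1, 2, 3))
    } {
      out.close()
    }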
-
stackTraceToString
String stackTraceToString(Throwable t)
Returns the stack trace of the given Throwable rendered as a String.
-