Package org.apache.spark.streaming.util
Class WriteAheadLogUtils
java.lang.Object
org.apache.spark.streaming.util.WriteAheadLogUtils
A helper class with utility functions related to the WriteAheadLog interface.
Constructor Summary

    WriteAheadLogUtils()

Method Summary

    static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
        Create a WriteAheadLog for the driver.

    static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
        Create a WriteAheadLog for the receiver.

    static boolean enableReceiverLog(SparkConf conf)

    static long getBatchingTimeout(SparkConf conf)
        How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt to unblock receivers.

    static int getMaxFailures(SparkConf conf, boolean isDriver)

    static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)

    static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)

    static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)

    static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

    static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

    static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
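The boolean and numeric getters above each read a corresponding SparkConf setting. A spark-defaults.conf sketch of the relevant keys follows; the key names are taken from Spark's streaming WAL code and should be verified against your Spark version, and com.example.MyWriteAheadLog is a hypothetical placeholder class:

```
# Sketch only -- verify key names against your Spark version.
spark.streaming.receiver.writeAheadLog.enable              true
# Optional custom WAL implementations (checked by createLogForDriver/Receiver):
# spark.streaming.driver.writeAheadLog.class               com.example.MyWriteAheadLog
# spark.streaming.receiver.writeAheadLog.class             com.example.MyWriteAheadLog
# Batching of driver WAL writes (isBatchingEnabled / getBatchingTimeout, ms):
spark.streaming.driver.writeAheadLog.allowBatching         true
spark.streaming.driver.writeAheadLog.batchingTimeout       5000
# File rolling and write retries (getRollingIntervalSecs / getMaxFailures):
spark.streaming.receiver.writeAheadLog.rollingIntervalSecs 60
spark.streaming.receiver.writeAheadLog.maxFailures         3
# Close the file after each write (shouldCloseFileAfterWrite), e.g. for S3:
spark.streaming.receiver.writeAheadLog.closeFileAfterWrite false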
Constructor Details

WriteAheadLogUtils

public WriteAheadLogUtils()

Method Details
enableReceiverLog

public static boolean enableReceiverLog(SparkConf conf)

getRollingIntervalSecs

public static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)

getMaxFailures

public static int getMaxFailures(SparkConf conf, boolean isDriver)

isBatchingEnabled

public static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)

getBatchingTimeout

public static long getBatchingTimeout(SparkConf conf)

How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt to unblock receivers.

Parameters:
conf - (undocumented)
Returns:
(undocumented)

shouldCloseFileAfterWrite

public static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
createLogForDriver

public static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)

Create a WriteAheadLog for the driver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.

Parameters:
sparkConf - (undocumented)
fileWalLogDirectory - (undocumented)
fileWalHadoopConf - (undocumented)
Returns:
(undocumented)
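A minimal driver-side usage sketch follows; it assumes Spark and Hadoop are on the classpath, and the app name and log directory path are placeholders, not values from this page:

```scala
import java.nio.ByteBuffer

import org.apache.hadoop.conf.Configuration
import org.apache.spark.SparkConf
import org.apache.spark.streaming.util.WriteAheadLogUtils

object WalSketch {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("wal-sketch")

    // With no custom WAL class configured, this returns the default
    // FileBasedWriteAheadLog writing under the given directory.
    val wal = WriteAheadLogUtils.createLogForDriver(
      sparkConf,
      "hdfs:///checkpoints/receivedBlockMetadata", // fileWalLogDirectory (assumed path)
      new Configuration())                         // fileWalHadoopConf

    // Write a record tagged with a timestamp; the returned handle
    // can be used to read that record back later.
    val handle = wal.write(
      ByteBuffer.wrap("record".getBytes("UTF-8")),
      System.currentTimeMillis())

    val roundTrip = wal.read(handle)

    wal.close()
  }
}
```

createLogForReceiver has the same shape and is called the same way on the receiver side.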
createLogForReceiver

public static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)

Create a WriteAheadLog for the receiver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.

Parameters:
sparkConf - (undocumented)
fileWalLogDirectory - (undocumented)
fileWalHadoopConf - (undocumented)
Returns:
(undocumented)
org$apache$spark$internal$Logging$$log_

public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()

org$apache$spark$internal$Logging$$log__$eq

public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)

LogStringContext

public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)