Package org.apache.spark.streaming.util
Class WriteAheadLogUtils
java.lang.Object
  org.apache.spark.streaming.util.WriteAheadLogUtils
A helper class with utility functions related to the WriteAheadLog interface.

Constructor Summary
- WriteAheadLogUtils()

Method Summary
- static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the driver.
- static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the receiver.
- static boolean enableReceiverLog(SparkConf conf)
- static long getBatchingTimeout(SparkConf conf)
  How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt to unblock receivers.
- static int getMaxFailures(SparkConf conf, boolean isDriver)
- static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)
- static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)
- static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
- static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
- static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
- static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
Constructor Details
- WriteAheadLogUtils
  public WriteAheadLogUtils()
Method Details
- enableReceiverLog
  public static boolean enableReceiverLog(SparkConf conf)
- getRollingIntervalSecs
  public static int getRollingIntervalSecs(SparkConf conf, boolean isDriver)
- getMaxFailures
  public static int getMaxFailures(SparkConf conf, boolean isDriver)
- isBatchingEnabled
  public static boolean isBatchingEnabled(SparkConf conf, boolean isDriver)
- getBatchingTimeout
  public static long getBatchingTimeout(SparkConf conf)
  How long we will wait for the wrappedLog in the BatchedWriteAheadLog to write the records before we fail the write attempt to unblock receivers.
  Parameters:
  - conf - (undocumented)
  Returns:
  - (undocumented)
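Several of these getters take an isDriver flag and read a driver- or receiver-scoped configuration key accordingly. Below is a minimal, Spark-free sketch of that key-selection pattern; the key names and the default value are assumptions for illustration, not taken from Spark's source.

```java
import java.util.HashMap;
import java.util.Map;

public class WalConfSketch {
    // Assumed key names and default value, for illustration only.
    static final String DRIVER_KEY   = "spark.streaming.driver.writeAheadLog.maxFailures";
    static final String RECEIVER_KEY = "spark.streaming.receiver.writeAheadLog.maxFailures";

    // Mirrors the shape of getMaxFailures(conf, isDriver): pick the driver- or
    // receiver-scoped key, then fall back to a default when it is unset.
    static int getMaxFailures(Map<String, String> conf, boolean isDriver) {
        String key = isDriver ? DRIVER_KEY : RECEIVER_KEY;
        return Integer.parseInt(conf.getOrDefault(key, "3"));
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put(DRIVER_KEY, "5");
        System.out.println(getMaxFailures(conf, true));   // driver-scoped value is set
        System.out.println(getMaxFailures(conf, false));  // receiver key unset, default used
    }
}
```

The same pattern applies to getRollingIntervalSecs, isBatchingEnabled, and shouldCloseFileAfterWrite, each of which also takes an isDriver flag.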
- shouldCloseFileAfterWrite
  public static boolean shouldCloseFileAfterWrite(SparkConf conf, boolean isDriver)
- createLogForDriver
  public static WriteAheadLog createLogForDriver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the driver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.
  Parameters:
  - sparkConf - (undocumented)
  - fileWalLogDirectory - (undocumented)
  - fileWalHadoopConf - (undocumented)
  Returns:
  - (undocumented)
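As a sketch of how a custom WAL implementation could be configured for the driver (com.example.MyWriteAheadLog is a hypothetical class, and the key name should be verified against your Spark version's configuration reference):

```
# spark-defaults.conf (sketch; key name assumed)
spark.streaming.driver.writeAheadLog.class  com.example.MyWriteAheadLog
```

When no such class is configured, createLogForDriver falls back to the default FileBasedWriteAheadLog rooted at fileWalLogDirectory.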
- createLogForReceiver
  public static WriteAheadLog createLogForReceiver(SparkConf sparkConf, String fileWalLogDirectory, org.apache.hadoop.conf.Configuration fileWalHadoopConf)
  Create a WriteAheadLog for the receiver. If configured with a custom WAL class, it will try to create an instance of that class; otherwise it will create the default FileBasedWriteAheadLog.
  Parameters:
  - sparkConf - (undocumented)
  - fileWalLogDirectory - (undocumented)
  - fileWalHadoopConf - (undocumented)
  Returns:
  - (undocumented)
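The receiver-side log is only used when receiver write-ahead logging is enabled in the SparkConf (the check exposed by enableReceiverLog above). A configuration sketch; verify the key name against your Spark version's configuration reference:

```
# spark-defaults.conf (sketch; key name assumed)
spark.streaming.receiver.writeAheadLog.enable  true
```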
- org$apache$spark$internal$Logging$$log_
  public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
- org$apache$spark$internal$Logging$$log__$eq
  public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
- LogStringContext
  public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)