public interface StreamingQuery
Modifier and Type | Method and Description |
---|---|
void | awaitTermination(): Waits for the termination of this query, either by query.stop() or by an exception. |
boolean | awaitTermination(long timeoutMs): Waits for the termination of this query, either by query.stop() or by an exception. |
scala.Option<StreamingQueryException> | exception(): Returns the StreamingQueryException if the query was terminated by an exception. |
void | explain(): Prints the physical plan to the console for debugging purposes. |
void | explain(boolean extended): Prints the physical plan to the console for debugging purposes. |
java.util.UUID | id(): Returns the unique id of this query that persists across restarts from checkpoint data. |
boolean | isActive(): Returns true if this query is actively running. |
StreamingQueryProgress | lastProgress(): Returns the most recent StreamingQueryProgress update of this streaming query. |
String | name(): Returns the user-specified name of the query, or null if not specified. |
void | processAllAvailable(): Blocks until all available data in the source has been processed and committed to the sink. |
StreamingQueryProgress[] | recentProgress(): Returns an array of the most recent StreamingQueryProgress updates for this query. |
java.util.UUID | runId(): Returns the unique id of this run of the query. |
SparkSession | sparkSession(): Returns the SparkSession associated with this query. |
StreamingQueryStatus | status(): Returns the current status of the query. |
void | stop(): Stops the execution of this query if it is running. |
void awaitTermination() throws StreamingQueryException

Waits for the termination of this query, either by query.stop() or by an exception. If the query has terminated with an exception, then the exception will be thrown.

If the query has terminated, then all subsequent calls to this method will either return immediately (if the query was terminated by stop()), or throw the exception immediately (if the query has terminated with exception).

Throws:
StreamingQueryException - if the query has terminated with an exception.
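A minimal sketch of the typical usage. The "rate" source, console sink, and the app name are assumptions chosen for illustration, not taken from this page:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("await-example").getOrCreate()

// Hypothetical input: the built-in "rate" source written to the console sink.
val query = spark.readStream
  .format("rate")
  .load()
  .writeStream
  .format("console")
  .start()

// Blocks the current thread until the query is stopped or fails;
// if the query failed, the StreamingQueryException is rethrown here.
query.awaitTermination()
```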
boolean awaitTermination(long timeoutMs) throws StreamingQueryException

Waits for the termination of this query, either by query.stop() or by an exception. If the query has terminated with an exception, then the exception will be thrown. Otherwise, it returns whether the query has terminated within timeoutMs milliseconds.

If the query has terminated, then all subsequent calls to this method will either return true immediately (if the query was terminated by stop()), or throw the exception immediately (if the query has terminated with exception).

Parameters:
timeoutMs - the timeout in milliseconds to wait for termination
Throws:
StreamingQueryException - if the query has terminated with an exception
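A sketch of waiting with a bounded timeout, assuming `query` is an already started StreamingQuery; the 10-second value is an arbitrary example:

```scala
// Wait up to 10 seconds; returns true once the query has terminated.
val terminated = query.awaitTermination(10000L)
if (!terminated) {
  // Still running: inspect it, keep waiting, or stop it.
  println(s"Query '${query.name}' still running: ${query.status.message}")
}
```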
scala.Option<StreamingQueryException> exception()

Returns the StreamingQueryException if the query was terminated by an exception.
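A sketch of inspecting the failure cause after termination, assuming `query` is a StreamingQuery that is no longer active (note that in Scala the accessor is a parameterless method):

```scala
query.exception match {
  case Some(e) => println(s"Query failed: ${e.getMessage}") // terminated by an exception
  case None    => println("Query was terminated by stop()")
}
```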
void explain()

Prints the physical plan to the console for debugging purposes.

void explain(boolean extended)

Prints the physical plan to the console for debugging purposes.

Parameters:
extended - whether to do extended explain or not
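A sketch, assuming `query` is a started StreamingQuery:

```scala
query.explain()      // prints the physical plan to the console
query.explain(true)  // extended explain with additional plan detail
```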
java.util.UUID id()

Returns the unique id of this query that persists across restarts from checkpoint data. That is, this id is generated when a query is started for the first time, and will be the same every time it is restarted from checkpoint data. Also see runId().
boolean isActive()

Returns true if this query is actively running.
StreamingQueryProgress lastProgress()

Returns the most recent StreamingQueryProgress update of this streaming query.
String name()

Returns the user-specified name of the query, or null if not specified. This name can be specified in the org.apache.spark.sql.streaming.DataStreamWriter as dataframe.writeStream.queryName("query").start().

This name, if set, must be unique across all active queries.
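A sketch of naming a query at start time. The streaming DataFrame `df`, the query name, and the memory sink are assumptions for illustration:

```scala
// The name set here is what query.name returns; it must be unique among active queries.
val query = df.writeStream
  .queryName("events_per_minute") // hypothetical name
  .format("memory")               // hypothetical sink for illustration
  .start()

assert(query.name == "events_per_minute")
```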
void processAllAvailable()

Blocks until all available data in the source has been processed and committed to the sink. This method is intended for testing. Note that in the case of continually arriving data, this method may block forever. Additionally, this method is only guaranteed to block until data that has been synchronously appended to the org.apache.spark.sql.execution.streaming.Source prior to invocation (i.e. getOffset must immediately reflect the addition).
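A sketch of the test pattern this enables, assuming `query` writes to a hypothetical memory sink registered as `events_per_minute` and `spark` is the active SparkSession:

```scala
// In a test: push known data into the source, then block until it has been
// processed and committed to the sink, so assertions are deterministic.
query.processAllAvailable()

// The committed results can now be inspected through the hypothetical memory sink table.
val rows = spark.sql("SELECT * FROM events_per_minute").collect()
```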
StreamingQueryProgress[] recentProgress()

Returns an array of the most recent StreamingQueryProgress updates for this query. The number of progress updates retained for each stream is configured by Spark session configuration spark.sql.streaming.numRecentProgressUpdates.
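A sketch of monitoring throughput from progress updates, assuming `query` is an active StreamingQuery; the printed fields are standard StreamingQueryProgress members:

```scala
// Most recent update; may be null before the first progress has been reported.
val last = query.lastProgress
if (last != null) {
  println(s"batch=${last.batchId} rows/s=${last.processedRowsPerSecond}")
}

// Retained history; its length is capped by spark.sql.streaming.numRecentProgressUpdates.
query.recentProgress.foreach(p => println(p.json))
```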
java.util.UUID runId()

Returns the unique id of this run of the query. Every start or restart of a query generates a new runId; a query restarted from checkpoint data will therefore have the same id but different runIds.
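A short sketch contrasting the two identifiers, assuming `query` is a StreamingQuery restarted from an existing checkpoint:

```scala
// id survives restarts from the same checkpoint; runId is new on every (re)start.
println(s"id=${query.id} runId=${query.runId}")
```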
SparkSession sparkSession()

Returns the SparkSession associated with this query.
StreamingQueryStatus status()

Returns the current status of the query.
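A sketch of a simple liveness check, assuming `query` is a started StreamingQuery; the fields read from the status object are standard StreamingQueryStatus members:

```scala
if (query.isActive) {
  val s = query.status
  println(s"message=${s.message} dataAvailable=${s.isDataAvailable} triggerActive=${s.isTriggerActive}")
}
```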
void stop() throws java.util.concurrent.TimeoutException

Stops the execution of this query if it is running.

By default stop will block indefinitely. You can configure a timeout by the configuration spark.sql.streaming.stopTimeout. A timeout of 0 (or negative) milliseconds will block indefinitely. If a TimeoutException is thrown, users can retry stopping the stream. If the issue persists, it is advisable to kill the Spark application.

Throws:
java.util.concurrent.TimeoutException
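A sketch of stopping with a bounded wait and one retry, assuming `spark` is the active SparkSession, `query` is running, and the 30-second timeout is an arbitrary example value:

```scala
import java.util.concurrent.TimeoutException

// Bound how long stop() blocks; 0 (or negative) means block indefinitely, the default.
spark.conf.set("spark.sql.streaming.stopTimeout", "30s")

try {
  query.stop()
} catch {
  case _: TimeoutException =>
    // Retry once; if the issue persists, killing the Spark application is advisable.
    query.stop()
}
```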