Interface SupportsPushDownFilters
- All Superinterfaces:
ScanBuilder
A mix-in interface for ScanBuilder. Data sources can implement this interface to push down filters to the data source and reduce the size of the data to be read.
- Since:
- 3.0.0
Method Summary
- Filter[] pushedFilters()
  Returns the filters that are pushed to the data source via pushFilters(Filter[]).
- Filter[] pushFilters(Filter[] filters)
  Pushes down filters, and returns filters that need to be evaluated after scanning.
Methods inherited from interface org.apache.spark.sql.connector.read.ScanBuilder:
- build
Method Details

pushFilters
Filter[] pushFilters(Filter[] filters)
Pushes down filters, and returns filters that need to be evaluated after scanning.
Rows should be returned from the data source if and only if all of the filters match. That is, filters must be interpreted as ANDed together.
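A minimal sketch of how a ScanBuilder might implement this contract, assuming a hypothetical source that can only push down equality filters. The Filter, EqualTo, and GreaterThan classes below are local stand-ins for Spark's org.apache.spark.sql.sources hierarchy, defined here only so the sketch is self-contained; ExampleScanBuilder is an invented name, not a Spark class.

```java
import java.util.ArrayList;
import java.util.List;

// Local stand-ins for Spark's org.apache.spark.sql.sources.Filter classes.
abstract class Filter {}

class EqualTo extends Filter {
    final String attribute; final Object value;
    EqualTo(String attribute, Object value) { this.attribute = attribute; this.value = value; }
}

class GreaterThan extends Filter {
    final String attribute; final Object value;
    GreaterThan(String attribute, Object value) { this.attribute = attribute; this.value = value; }
}

// Hypothetical builder for a source that can only push down equality filters.
class ExampleScanBuilder {
    private Filter[] pushed = new Filter[0];

    // Accepts EqualTo filters; everything else must be evaluated after scanning.
    public Filter[] pushFilters(Filter[] filters) {
        List<Filter> pushable = new ArrayList<>();
        List<Filter> postScan = new ArrayList<>();
        for (Filter f : filters) {
            if (f instanceof EqualTo) pushable.add(f); else postScan.add(f);
        }
        pushed = pushable.toArray(new Filter[0]);
        return postScan.toArray(new Filter[0]);
    }

    // Reports the filters accepted in pushFilters; empty if it was never called.
    public Filter[] pushedFilters() { return pushed; }
}

public class PushDownDemo {
    public static void main(String[] args) {
        ExampleScanBuilder builder = new ExampleScanBuilder();
        Filter[] postScan = builder.pushFilters(new Filter[] {
            new EqualTo("id", 1), new GreaterThan("age", 18)
        });
        // EqualTo was pushed; GreaterThan remains for Spark to evaluate.
        System.out.println(builder.pushedFilters().length + " pushed, "
            + postScan.length + " post-scan");
    }
}
```

Because the filters are ANDed together, returning a filter from pushFilters is always safe: Spark simply re-evaluates it on the scanned rows, so a conservative implementation may return everything it was given.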
pushedFilters
Filter[] pushedFilters()
Returns the filters that are pushed to the data source via pushFilters(Filter[]).
There are 3 kinds of filters:
- pushable filters which don't need to be evaluated again after scanning.
- pushable filters which still need to be evaluated after scanning, e.g. a Parquet row group filter.
- non-pushable filters.
Both case 1 and 2 should be considered pushed filters and should be returned by this method.
It's possible that there are no filters in the query and pushFilters(Filter[]) is never called; an empty array should be returned in this case.
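Case 2 above means a filter can legitimately appear in both lists: reported by pushedFilters() and also returned from pushFilters() for re-evaluation. A sketch of that pattern, modeling a hypothetical Parquet-like source whose pushdown prunes whole row groups but cannot guarantee per-row accuracy; RowGroupScanBuilder is an invented class, and filters are represented as plain Strings for brevity rather than Spark's Filter type:

```java
import java.util.Arrays;

// Hypothetical builder for a Parquet-like source: every filter is pushed at
// row-group granularity (case 2), so it must ALSO be re-evaluated by Spark.
class RowGroupScanBuilder {
    private String[] pushed = new String[0];

    public String[] pushFilters(String[] filters) {
        // Accept everything for row-group pruning...
        pushed = Arrays.copyOf(filters, filters.length);
        // ...but return everything too: pruning is approximate, not per-row.
        return filters;
    }

    // Reports case-1 and case-2 filters; empty if pushFilters was never called.
    public String[] pushedFilters() { return pushed; }
}

public class RowGroupDemo {
    public static void main(String[] args) {
        RowGroupScanBuilder builder = new RowGroupScanBuilder();
        String[] postScan = builder.pushFilters(new String[] { "age > 18" });
        // The same filter shows up as both pushed and post-scan.
        System.out.println(builder.pushedFilters().length + " " + postScan.length);
    }
}
```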