org.apache.spark.sql.parquet

ParquetFilters

object ParquetFilters

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. val PARQUET_FILTER_DATA: String

  7. val PARQUET_FILTER_PUSHDOWN_ENABLED: String

  8. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. def createFilter(expression: Expression): Option[CatalystFilter]
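The translation performed by createFilter is partial: predicates that can be pushed down to Parquet yield Some(filter), while everything else yields None and stays in Spark's higher-level filters. A minimal, self-contained sketch of that shape, using simplified stand-in types (Expression, ToyFilter, and all constructors here are hypothetical, not Spark's actual Catalyst classes):

```scala
// Simplified stand-ins for Catalyst expressions (hypothetical, not Spark's classes).
sealed trait Expression
case class Attribute(name: String) extends Expression
case class Literal(value: Int) extends Expression
case class EqualTo(left: Expression, right: Expression) extends Expression
case class And(left: Expression, right: Expression) extends Expression

// A toy filter that remembers which expression it was built from.
case class ToyFilter(column: String, value: Int, source: Expression)

// Partial translation: only column-equals-literal predicates are supported,
// mirroring how createFilter returns None for expressions it cannot push down.
def createFilter(expression: Expression): Option[ToyFilter] = expression match {
  case e @ EqualTo(Attribute(col), Literal(v)) => Some(ToyFilter(col, v, e))
  case e @ EqualTo(Literal(v), Attribute(col)) => Some(ToyFilter(col, v, e))
  case _                                       => None // unsupported: kept in Spark
}

val pushedDown = createFilter(EqualTo(Attribute("a"), Literal(1))) // defined
val keptInSpark = createFilter(And(Attribute("a"), Attribute("b"))) // None
```

Keeping the originating expression inside the filter is what later makes findExpression (below in this listing) able to decide which predicates are safe to drop from the Spark-side filter.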

  11. def createRecordFilter(filterExpressions: Seq[Expression]): UnboundRecordFilter

  12. def deserializeFilterExpressions(conf: Configuration): Seq[Expression]

    Note: Inside the Hadoop API we only have access to Configuration, not to org.apache.spark.SparkContext, so we cannot use broadcasts to convey the actual filter predicate.

  13. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  15. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  16. def findExpression(filter: CatalystFilter, expression: Expression): Option[CatalystFilter]

    Try to find the given expression in the tree of filters in order to determine whether it is safe to remove it from the higher-level filters. Note that, strictly speaking, we could stop the search whenever an expression is found that contains this expression as a subexpression (e.g., when searching for "a" and "(a or c)" is found), but we do not care about optimizations here since the filter tree is assumed to be small.

    filter
    The org.apache.spark.sql.parquet.CatalystFilter to expand and search.

    expression
    The expression to look for.

    returns
    An optional org.apache.spark.sql.parquet.CatalystFilter that contains the expression.
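The search described above can be sketched with self-contained stand-in types (all names here are hypothetical, not Spark's): walk the filter tree and return the first filter whose recorded source expression equals the one we are looking for, exactly matching rather than matching subexpressions.

```scala
// Hypothetical stand-ins for expressions and the filter tree (not Spark's classes).
sealed trait Expr
case class Attr(name: String) extends Expr
case class Eq(attr: Attr, value: Int) extends Expr
case class AndE(left: Expr, right: Expr) extends Expr

// Filters carry the expression they were generated from; inner nodes
// combine child filters, mirroring AND nodes in the filter tree.
sealed trait Filter { def source: Expr }
case class LeafFilter(source: Expr) extends Filter
case class AndFilter(left: Filter, right: Filter, source: Expr) extends Filter

// Return the first filter built from exactly this expression, searching
// the whole tree rather than stopping at enclosing expressions.
def findExpression(filter: Filter, expression: Expr): Option[Filter] =
  if (filter.source == expression) Some(filter)
  else filter match {
    case AndFilter(l, r, _) =>
      findExpression(l, expression).orElse(findExpression(r, expression))
    case _ => None
  }

val a    = Eq(Attr("a"), 1)
val b    = Eq(Attr("b"), 2)
val tree = AndFilter(LeafFilter(a), LeafFilter(b), AndE(a, b))
val hit  = findExpression(tree, a)              // matches the left leaf
val miss = findExpression(tree, Eq(Attr("c"), 3)) // None: not in the tree
```

A filter found this way corresponds to a predicate already enforced by Parquet, so the matching expression can be dropped from the higher-level Spark filter.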

  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. final def notify(): Unit

    Definition Classes
    AnyRef
  22. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  23. def serializeFilterExpressions(filters: Seq[Expression], conf: Configuration): Unit

    Note: Inside the Hadoop API we only have access to Configuration, not to org.apache.spark.SparkContext, so we cannot use broadcasts to convey the actual filter predicate.
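Because only a Configuration is available on the Hadoop side, the predicate has to travel as a string property rather than a broadcast. A rough, self-contained sketch of that round trip, using Java serialization plus Base64 over a plain key-value map as a stand-in for Configuration (the key name and all types here are made up for illustration):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}
import java.util.Base64

// Stand-in for a filter expression; it must be Serializable so it can
// cross the Configuration boundary as bytes. Hypothetical, not Spark's class.
case class Pred(column: String, value: Int) extends Serializable

val FilterKey = "toy.parquet.filter.data" // made-up key, not Spark's actual constant

// serialize: expressions -> bytes -> Base64 string stored in the "conf"
def serializeFilters(filters: Seq[Pred], conf: collection.mutable.Map[String, String]): Unit = {
  val bytes = new ByteArrayOutputStream()
  val out   = new ObjectOutputStream(bytes)
  out.writeObject(filters)
  out.close()
  conf(FilterKey) = Base64.getEncoder.encodeToString(bytes.toByteArray)
}

// deserialize: read the string back out of the "conf" and rebuild the expressions
def deserializeFilters(conf: collection.Map[String, String]): Seq[Pred] =
  conf.get(FilterKey) match {
    case None => Seq.empty // no filters were serialized into this conf
    case Some(encoded) =>
      val in = new ObjectInputStream(new ByteArrayInputStream(Base64.getDecoder.decode(encoded)))
      try in.readObject().asInstanceOf[Seq[Pred]] finally in.close()
  }

val conf = collection.mutable.Map.empty[String, String]
serializeFilters(Seq(Pred("a", 1), Pred("b", 2)), conf)
val restored = deserializeFilters(conf) // round-trips the original sequence
```

The string-valued property is the only channel a Configuration offers, which is exactly why serializeFilterExpressions and deserializeFilterExpressions exist instead of a broadcast variable.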

  24. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  25. def toString(): String

    Definition Classes
    AnyRef → Any
  26. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  27. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  28. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
