Interface BoundFunction

All Superinterfaces:
Function, Serializable
All Known Subinterfaces:
AggregateFunction<S,R>, ScalarFunction<R>

@Evolving public interface BoundFunction extends Function
Represents a function that is bound to an input type.
Since:
3.2.0
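
A minimal sketch of an implementation is shown below, using the ScalarFunction subinterface. The class name IntegerAddFunction and its behavior are illustrative only and are not part of Spark.

    import org.apache.spark.sql.catalyst.InternalRow;
    import org.apache.spark.sql.connector.catalog.functions.ScalarFunction;
    import org.apache.spark.sql.types.DataType;
    import org.apache.spark.sql.types.DataTypes;

    // Hypothetical example: integer addition bound to two IntegerType inputs.
    public class IntegerAddFunction implements ScalarFunction<Integer> {

      @Override
      public DataType[] inputTypes() {
        // Spark casts bound arguments to these types if they differ.
        return new DataType[] { DataTypes.IntegerType, DataTypes.IntegerType };
      }

      @Override
      public DataType resultType() {
        return DataTypes.IntegerType;
      }

      @Override
      public String name() {
        return "plus";
      }

      @Override
      public Integer produceResult(InternalRow input) {
        return input.getInt(0) + input.getInt(1);
      }
    }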
  • Method Summary

    Modifier and Type    Method                Description
    default String       canonicalName()       Returns the canonical name of this function, used to determine if functions are equivalent.
    DataType[]           inputTypes()          Returns the required data types of the input values to this function.
    default boolean      isDeterministic()     Returns whether this function result is deterministic.
    default boolean      isResultNullable()    Returns whether the values produced by this function may be null.
    DataType             resultType()          Returns the data type of values produced by this function.

    Methods inherited from interface org.apache.spark.sql.connector.catalog.functions.Function

    name
  • Method Details

    • inputTypes

      DataType[] inputTypes()
      Returns the required data types of the input values to this function.

      If the types returned differ from the types passed to UnboundFunction.bind(StructType), Spark will cast input values to the required data types. This allows implementations to delegate input value casting to Spark.

      Returns:
      an array of input value data types
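
      For example, a sketch of an override that delegates casting to Spark, assuming the implementation wants to operate only on longs (the LongType choice is illustrative; the method belongs inside a BoundFunction implementation such as the sketch above):

        // Declaring LongType means Spark inserts casts when the function was
        // bound to narrower integral inputs, so the implementation can read
        // long values unconditionally.
        @Override
        public DataType[] inputTypes() {
          return new DataType[] { DataTypes.LongType, DataTypes.LongType };
        }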
    • resultType

      DataType resultType()
      Returns the data type of values produced by this function.

      For example, a "plus" function may return IntegerType when it is bound to arguments that are also IntegerType.

      Returns:
      a data type for values produced by this function
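
      A sketch of such an override for the "plus" example, assuming the function was bound to IntegerType arguments:

        // Inside a BoundFunction implementation of "plus" over int arguments.
        @Override
        public DataType resultType() {
          return DataTypes.IntegerType;
        }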
    • isResultNullable

      default boolean isResultNullable()
      Returns whether the values produced by this function may be null.

      For example, a "plus" function may return false when it is bound to arguments that are always non-null, but true when either argument may be null.

      Returns:
      true if values produced by this function may be null, false otherwise
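
      A sketch of an override for the "plus" example, under the assumption that the bind step already checked the nullability of both arguments:

        @Override
        public boolean isResultNullable() {
          // For "plus", the result is null only when an input may be null; this
          // sketch assumes the bind step found both arguments non-nullable.
          return false;
        }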
    • isDeterministic

      default boolean isDeterministic()
      Returns whether this function result is deterministic.

      By default, functions are assumed to be deterministic. Functions that are not deterministic should override this method so that Spark can ensure the function runs only once for a given input.

      Returns:
      true if this function is deterministic, false otherwise
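
      A sketch of an override for a non-deterministic function, for example one that draws random values (illustrative only):

        @Override
        public boolean isDeterministic() {
          // This function produces a random value per call, so Spark must not
          // assume repeated evaluation yields the same result.
          return false;
        }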
    • canonicalName

      default String canonicalName()
      Returns the canonical name of this function, used to determine if functions are equivalent.

      The canonical name is used to determine whether two functions are the same when loaded by different catalogs. For example, the same catalog implementation may be used for two environments, "prod" and "test". Functions produced by the catalogs may be equivalent, but loaded using different names, like "test.func_name" and "prod.func_name".

      Names returned by this function should be unique and unlikely to conflict with similar functions in other catalogs. For example, many catalogs may define a "bucket" function, each with a different implementation. Adding context, like "com.mycompany.bucket(string)", is recommended to avoid unintentional collisions.

      Returns:
      a canonical name for this function
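
      A sketch of an override following that recommendation, reusing the document's illustrative namespace:

        @Override
        public String canonicalName() {
          // Qualify with a company namespace and the bound input type to avoid
          // collisions with other catalogs' "bucket" functions.
          return "com.mycompany.bucket(string)";
        }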