pyspark.sql.functions.ilike(str: ColumnOrName, pattern: ColumnOrName, escapeChar: Optional[Column] = None) → pyspark.sql.column.Column[source]

Returns true if str matches pattern with escape case-insensitively, null if any arguments are null, false otherwise. The default escape character is the ‘\’.

New in version 3.5.0.

str : Column or str

A string.

pattern : Column or str

A string. The pattern is a string which is matched literally, with exception to the following special symbols:

_ matches any one character in the input (similar to . in posix regular expressions)

% matches zero or more characters in the input (similar to .* in posix regular expressions)

Since Spark 2.0, string literals are unescaped in our SQL parser. For example, in order to match “\abc”, the pattern should be “\\abc”. When SQL config ‘spark.sql.parser.escapedStringLiterals’ is enabled, it falls back to Spark 1.6 behavior regarding string literal parsing. For example, if the config is enabled, the pattern to match “\abc” should be “\abc”.

escapeChar : Column, optional

A character added since Spark 3.0. The default escape character is the ‘\’. If an escape character precedes a special symbol or another escape character, the following character is matched literally. It is invalid to escape any other character.
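The matching rules above can be illustrated without a Spark cluster. The sketch below translates an ILIKE pattern into an equivalent Python regular expression; this is only a model of the semantics described here (Spark evaluates the pattern natively on the JVM), and the helper name ilike_to_regex is invented for illustration.

```python
import re

def ilike_to_regex(pattern: str, escape: str = "\\") -> str:
    """Model of ILIKE pattern semantics as a Python regex (illustrative only)."""
    out = []
    i = 0
    while i < len(pattern):
        ch = pattern[i]
        if ch == escape:
            # An escape may only precede _, % or another escape character;
            # the escaped character is then matched literally.
            nxt = pattern[i + 1] if i + 1 < len(pattern) else None
            if nxt in (escape, "_", "%"):
                out.append(re.escape(nxt))
                i += 2
                continue
            raise ValueError("it is invalid to escape any other character")
        elif ch == "_":
            out.append(".")    # _ matches any one character
        elif ch == "%":
            out.append(".*")   # % matches zero or more characters
        else:
            out.append(re.escape(ch))
        i += 1
    # (?i) makes the match case-insensitive, as ILIKE requires.
    return "(?is)^" + "".join(out) + "$"

# "_park" matches "Spark": _ consumes the leading "S", case-insensitively.
print(bool(re.match(ilike_to_regex("_park"), "Spark")))  # True

# With '/' as the escape character, "/%" is a literal percent sign.
print(bool(re.match(ilike_to_regex("/%SystemDrive/%//Users%", escape="/"),
                    "%SystemDrive%/Users/John")))  # True
```

Note how the second pattern mirrors the doctest below: each `/%` matches a literal `%`, `//` matches a literal `/`, and the trailing unescaped `%` matches the rest of the path.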


Examples

>>> from pyspark.sql.functions import ilike, lit
>>> df = spark.createDataFrame([("Spark", "_park")], ['a', 'b'])
>>>, df.b).alias('r')).collect()
[Row(r=True)]
>>> df = spark.createDataFrame(
...     [("%SystemDrive%/Users/John", "/%SystemDrive/%//Users%")],
...     ['a', 'b']
... )
>>>, df.b, lit('/')).alias('r')).collect()
[Row(r=True)]