public abstract class UserDefinedType<UserType> extends DataType implements scala.Serializable
This class allows a user to make their own classes more interoperable with Spark SQL; e.g., by creating a UserDefinedType for a class X, it becomes possible to create a DataFrame which has class X in the schema.

For Spark SQL to recognize UDTs, the UDT must be annotated with SQLUserDefinedType.

The conversion via serialize occurs when instantiating a DataFrame from another RDD. The conversion via deserialize occurs when reading from a DataFrame.
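For illustration, here is a minimal sketch (not taken from Spark itself) of such a class X: a hypothetical Point2D class opts in to UDT handling by naming its UDT implementation in the SQLUserDefinedType annotation. The Point2DUDT class it refers to is equally hypothetical and is sketched after the method summary below.

```java
import org.apache.spark.sql.types.SQLUserDefinedType;

// Hypothetical user class; the annotation points Spark SQL at its UDT.
@SQLUserDefinedType(udt = Point2DUDT.class)
public class Point2D implements java.io.Serializable {
  public final double x;
  public final double y;

  public Point2D(double x, double y) {
    this.x = x;
    this.y = y;
  }

  @Override
  public boolean equals(Object other) {
    if (!(other instanceof Point2D)) return false;
    Point2D p = (Point2D) other;
    return x == p.x && y == p.y;
  }

  @Override
  public int hashCode() {
    return 31 * Double.hashCode(x) + Double.hashCode(y);
  }
}
```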
| Constructor and Description |
|---|
| UserDefinedType() |
| Modifier and Type | Method and Description |
|---|---|
| String | catalogString() String representation for the type saved in external catalogs. |
| int | defaultSize() The default size of a value of this data type, used internally for size estimation. |
| abstract UserType | deserialize(Object datum) Convert a SQL datum to the user type. |
| boolean | equals(Object other) |
| int | hashCode() |
| String | pyUDT() Paired Python UDT class, if exists. |
| abstract Object | serialize(UserType obj) Convert the user type to a SQL datum. |
| String | serializedPyClass() Serialized Python UDT class, if exists. |
| String | sql() |
| abstract DataType | sqlType() Underlying storage type for this UDT. |
| abstract Class<UserType> | userClass() Class object for the UserType. |
Methods inherited from class org.apache.spark.sql.types.DataType: equalsIgnoreCaseAndNullability, equalsIgnoreNullability, equalsStructurally, equalsStructurallyByName, fromDDL, fromJson, json, parseTypeWithFallback, prettyJson, simpleString, typeName
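The sketch below shows how the abstract methods in the summary above might fit together for the hypothetical Point2D class. It stores a point as a non-nullable array of doubles; note that the ArrayData and GenericArrayData classes used for the storage representation live in Spark's internal Catalyst package, so this is only an illustrative sketch, not a prescribed implementation.

```java
import org.apache.spark.sql.catalyst.util.ArrayData;
import org.apache.spark.sql.catalyst.util.GenericArrayData;
import org.apache.spark.sql.types.DataType;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.UserDefinedType;

// Hypothetical UDT backing the Point2D class sketched earlier.
public class Point2DUDT extends UserDefinedType<Point2D> {

  // Underlying storage type: a non-nullable array of doubles.
  @Override
  public DataType sqlType() {
    return DataTypes.createArrayType(DataTypes.DoubleType, false);
  }

  // Convert the user type to a SQL datum of sqlType().
  @Override
  public Object serialize(Point2D obj) {
    return new GenericArrayData(new Object[]{obj.x, obj.y});
  }

  // Convert a SQL datum of sqlType() back to the user type.
  @Override
  public Point2D deserialize(Object datum) {
    ArrayData values = (ArrayData) datum;
    return new Point2D(values.getDouble(0), values.getDouble(1));
  }

  @Override
  public Class<Point2D> userClass() {
    return Point2D.class;
  }
}
```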
public abstract DataType sqlType()
Underlying storage type for this UDT.

public String pyUDT()
Paired Python UDT class, if exists.

public String serializedPyClass()
Serialized Python UDT class, if exists.

public abstract Object serialize(UserType obj)
Convert the user type to a SQL datum.
Parameters:
obj - (undocumented)

public abstract UserType deserialize(Object datum)
Convert a SQL datum to the user type.

public abstract Class<UserType> userClass()
Class object for the UserType.
public int defaultSize()
The default size of a value of this data type, used internally for size estimation.
Specified by:
defaultSize in class DataType

public int hashCode()
Overrides:
hashCode in class Object

public boolean equals(Object other)
Overrides:
equals in class Object

public String catalogString()
String representation for the type saved in external catalogs.
Overrides:
catalogString in class DataType
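Finally, a sketch of the round trip described in the class description, again assuming the hypothetical Point2D and Point2DUDT classes from above: serialize is applied when the DataFrame is built from external rows, and deserialize is applied when rows are read back out.

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.StructType;

public class UdtRoundTrip {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("udt-round-trip").master("local[*]").getOrCreate();

    // Schema with a single column whose type is the UDT itself.
    StructType schema = new StructType().add("point", new Point2DUDT());
    List<Row> rows = Arrays.asList(RowFactory.create(new Point2D(1.0, 2.0)));

    Dataset<Row> df = spark.createDataFrame(rows, schema);  // serialize() runs here
    Point2D p = (Point2D) df.first().get(0);                // deserialize() runs here
    System.out.println(p.x + ", " + p.y);

    spark.stop();
  }
}
```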