Package org.apache.spark.util
Class RpcUtils
Object
org.apache.spark.util.RpcUtils
Constructor Summary

RpcUtils()

Method Summary

static org.apache.spark.rpc.RpcTimeout askRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC ask operations.

static org.apache.spark.rpc.RpcTimeout INFINITE_TIMEOUT()
    Infinite timeout, used internally; no timeout configuration property controls it.

static org.apache.spark.rpc.RpcTimeout lookupRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC remote endpoint lookup.

static org.apache.spark.rpc.RpcEndpointRef makeDriverRef(String name, SparkConf conf, org.apache.spark.rpc.RpcEnv rpcEnv)
    Retrieve a RpcEndpointRef which is located in the driver via its name.

static int maxMessageSizeBytes(SparkConf conf)
    Returns the configured max message size for messages in bytes.
Constructor Details

RpcUtils
public RpcUtils()

Method Details
makeDriverRef
public static org.apache.spark.rpc.RpcEndpointRef makeDriverRef(String name, SparkConf conf, org.apache.spark.rpc.RpcEnv rpcEnv)
    Retrieve a RpcEndpointRef which is located in the driver via its name.
    Parameters:
        name - (undocumented)
        conf - (undocumented)
        rpcEnv - (undocumented)
    Returns:
        (undocumented)
askRpcTimeout
public static org.apache.spark.rpc.RpcTimeout askRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC ask operations.
lookupRpcTimeout
public static org.apache.spark.rpc.RpcTimeout lookupRpcTimeout(SparkConf conf)
    Returns the default Spark timeout to use for RPC remote endpoint lookup.
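Both timeouts above are resolved from SparkConf with a fallback chain: Spark's documented keys are spark.rpc.askTimeout and spark.rpc.lookupTimeout, each falling back to the network-wide spark.network.timeout (default 120s). A minimal sketch of that fallback lookup, using a plain Map in place of SparkConf (the helper method and class name below are illustrative, not Spark API):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the timeout-resolution behavior: an RPC-specific property is
// consulted first, then spark.network.timeout, then a built-in default.
// The Map stands in for SparkConf; the property names match Spark's
// documented configuration keys.
public class RpcTimeoutSketch {
    static String resolveTimeout(Map<String, String> conf, String key,
                                 String fallbackKey, String defaultValue) {
        if (conf.containsKey(key)) return conf.get(key);
        if (conf.containsKey(fallbackKey)) return conf.get(fallbackKey);
        return defaultValue;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.network.timeout", "240s");

        // spark.rpc.askTimeout is unset, so the network-wide timeout applies.
        System.out.println(resolveTimeout(conf, "spark.rpc.askTimeout",
                "spark.network.timeout", "120s")); // prints 240s

        conf.put("spark.rpc.askTimeout", "30s");
        System.out.println(resolveTimeout(conf, "spark.rpc.askTimeout",
                "spark.network.timeout", "120s")); // prints 30s
    }
}
```

The same pattern applies to lookupRpcTimeout with spark.rpc.lookupTimeout as the primary key.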
INFINITE_TIMEOUT
public static org.apache.spark.rpc.RpcTimeout INFINITE_TIMEOUT()
    An infinite timeout used internally, so no timeout configuration property controls it. The property name "infinite" is used only as a placeholder, and the timeout value should never be accessed, since an infinite timeout never expires.
    Returns:
        (undocumented)
maxMessageSizeBytes
public static int maxMessageSizeBytes(SparkConf conf)
    Returns the configured max message size for messages in bytes.
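The underlying configuration key, spark.rpc.message.maxSize, is expressed in MiB (Spark documents a default of 128), so a conversion to bytes is needed. A self-contained sketch of that conversion, not Spark's actual implementation (the class name and the 2047 MiB cap reflect Spark's documented limit for this setting):

```java
// Sketch of converting spark.rpc.message.maxSize (in MiB) to bytes.
// Spark documents that values above 2047 MiB are rejected, since larger
// messages would overflow the RPC framing layer.
public class MaxMessageSizeSketch {
    static final int MAX_MESSAGE_SIZE_IN_MB = 2047;

    static int maxMessageSizeBytes(int sizeInMB) {
        if (sizeInMB > MAX_MESSAGE_SIZE_IN_MB) {
            throw new IllegalArgumentException(
                "spark.rpc.message.maxSize should not be greater than "
                + MAX_MESSAGE_SIZE_IN_MB + " MB");
        }
        return sizeInMB * 1024 * 1024;
    }

    public static void main(String[] args) {
        // Documented default of 128 MiB in bytes.
        System.out.println(maxMessageSizeBytes(128)); // prints 134217728
    }
}
```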