Package org.apache.spark.resource
Class ResourceUtils
java.lang.Object
  org.apache.spark.resource.ResourceUtils
-
Constructor Summary
ResourceUtils()
-
Method Summary
static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
static String AMOUNT()
static scala.Tuple2<Object,Object> calculateAmountAndPartsForFraction(double doubleAmount)
static String DISCOVERY_SCRIPT()
static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.Seq<ExecutorResourceRequest> resourceRequest)
static final String FPGA()
static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
  Gets all allocated resource information for the input component from the input resources file and the application-level Spark configs.
static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
  Similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs.
static final String GPU()
static scala.collection.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
static scala.collection.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
static final String RESOURCE_PREFIX()
static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
static String VENDOR()
static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
static <T> scala.collection.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.Seq<T>> extract)
-
Constructor Details
-
ResourceUtils
public ResourceUtils()
-
Method Details
-
DISCOVERY_SCRIPT
public static String DISCOVERY_SCRIPT()
-
VENDOR
public static String VENDOR()
-
AMOUNT
public static String AMOUNT()
-
parseResourceRequest
public static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
-
listResourceIds
public static scala.collection.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
-
parseAllResourceRequests
public static scala.collection.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
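A minimal sketch of the config convention this method reads, assuming the component name is the config prefix (e.g. "spark.executor") and that ResourceUtils is reachable from the calling code (it is a Spark-internal utility); the script path and vendor value are illustrative:

import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf()
  .set("spark.executor.resource.gpu.amount", "2")
  .set("spark.executor.resource.gpu.discoveryScript", "/opt/spark/scripts/getGpus.sh")
  .set("spark.executor.resource.gpu.vendor", "nvidia.com")

// Collects one ResourceRequest per spark.executor.resource.<name>.amount entry.
val requests = ResourceUtils.parseAllResourceRequests(conf, "spark.executor")
-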
calculateAmountAndPartsForFraction
public static scala.Tuple2<Object,Object> calculateAmountAndPartsForFraction(double doubleAmount)
-
addTaskResourceRequests
public static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
-
parseResourceRequirements
public static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
-
executorResourceRequestToRequirement
public static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.Seq<ExecutorResourceRequest> resourceRequest)
-
resourcesMeetRequirements
public static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
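A hedged sketch of the free-versus-required check. On the Scala side the free map is keyed by resource name with integer amounts (the Javadoc view erases the value type to Object); the config values are illustrative:

import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf().set("spark.task.resource.gpu.amount", "1")
val taskReqs = ResourceUtils.parseResourceRequirements(conf, "spark.task")
val free = Map("gpu" -> 4) // resource name -> free count on this executor
val ok = ResourceUtils.resourcesMeetRequirements(free, taskReqs) // true here
-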
withResourcesJson
public static <T> scala.collection.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.Seq<T>> extract)
-
parseAllocatedFromJsonFile
public static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
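A sketch of the allocations JSON this method reads; the layout shown matches the resources file that cluster managers hand to Spark components, though the path and addresses here are illustrative:

import org.apache.spark.resource.ResourceUtils

// /tmp/resources.json:
// [{"id": {"componentName": "spark.executor", "resourceName": "gpu"},
//   "addresses": ["0", "1"]}]
val allocations = ResourceUtils.parseAllocatedFromJsonFile("/tmp/resources.json")
-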
parseAllocated
public static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
-
getOrDiscoverAllResources
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
Gets all allocated resource information for the input component from the input resources file and the application-level Spark configs. It first looks to see whether resources were explicitly specified in the resources file (this would include specified address assignments, and it is only specified by certain cluster managers), and then it looks at the Spark configs to get any others not specified in the file. Resources not explicitly set in the file require a discovery script to be run to get the addresses of the resource. It also verifies that the resource allocation meets the required amount for each resource.
- Parameters:
  sparkConf - (undocumented)
  componentName - (undocumented)
  resourcesFileOpt - (undocumented)
- Returns:
  a map from resource name to resource info
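A minimal usage sketch, assuming a single GPU configured through the standard spark.executor.resource.* configs and no resources file, so discovery falls back to the script (the script path and the "spark.executor" component name are assumptions):

import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf()
  .set("spark.executor.resource.gpu.amount", "1")
  .set("spark.executor.resource.gpu.discoveryScript", "/opt/spark/scripts/getGpus.sh")

// No resources file: the discovery script is run to find the addresses.
val resources = ResourceUtils.getOrDiscoverAllResources(conf, "spark.executor", None)
resources.foreach { case (name, info) => println(s"$name -> $info") }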
-
getOrDiscoverAllResourcesForResourceProfile
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
This function is similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs. It first looks to see whether resources were explicitly specified in the resources file (this would include specified address assignments, and it is only specified by certain cluster managers), and then it looks at the ResourceProfile to get any others not specified in the file. Resources not explicitly set in the file require a discovery script to be run to get the addresses of the resource. It also verifies that the resource allocation meets the required amount for each resource.
- Parameters:
  resourcesFileOpt - (undocumented)
  componentName - (undocumented)
  resourceProfile - (undocumented)
  sparkConf - (undocumented)
- Returns:
  a map from resource name to resource info
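The same scenario sketched with the requests carried by a ResourceProfile instead of application configs (the builder calls are from the public ResourceProfile API; the script path is illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, ResourceUtils}

val conf = new SparkConf()
val execReqs = new ExecutorResourceRequests()
  .resource("gpu", 2, "/opt/spark/scripts/getGpus.sh")
val rp = new ResourceProfileBuilder().require(execReqs).build()

val resources = ResourceUtils.getOrDiscoverAllResourcesForResourceProfile(
  None, "spark.executor", rp, conf)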
-
logResourceInfo
public static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
-
validateTaskCpusLargeEnough
public static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
-
warnOnWastedResources
public static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
-
GPU
public static final String GPU()
-
FPGA
public static final String FPGA()
-
RESOURCE_PREFIX
public static final String RESOURCE_PREFIX()
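These constants compose the resource config keys. An illustrative composition (the resulting key matches Spark's spark.<component>.resource.<name>.<suffix> convention, but treat the exact concatenation as an assumption):

import org.apache.spark.resource.ResourceUtils

// Builds "spark.executor.resource.gpu.amount" from the constants.
val gpuAmountKey = Seq("spark.executor", ResourceUtils.RESOURCE_PREFIX,
  ResourceUtils.GPU, ResourceUtils.AMOUNT).mkString(".")
-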
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
-
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
-