public class ResourceUtils
extends Object
Constructor and Description |
---|
ResourceUtils() |
Modifier and Type | Method and Description |
---|---|
static void | addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs) |
static String | AMOUNT() |
static scala.Tuple2<Object,Object> | calculateAmountAndPartsForFraction(double doubleAmount) |
static String | DISCOVERY_SCRIPT() |
static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> | executorResourceRequestToRequirement(scala.collection.Seq<ExecutorResourceRequest> resourceRequest) |
static String | FPGA() |
static scala.collection.immutable.Map<String,ResourceInformation> | getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt) Gets all allocated resource information for the input component from the input resources file and the application-level Spark configs. |
static scala.collection.immutable.Map<String,ResourceInformation> | getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf) This function is similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs. |
static String | GPU() |
static scala.collection.Seq<ResourceID> | listResourceIds(SparkConf sparkConf, String componentName) |
static void | logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources) |
static void | org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1) |
static org.slf4j.Logger | org$apache$spark$internal$Logging$$log_() |
static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> | parseAllocated(scala.Option<String> resourcesFileOpt, String componentName) |
static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> | parseAllocatedFromJsonFile(String resourcesFile) |
static scala.collection.Seq<ResourceRequest> | parseAllResourceRequests(SparkConf sparkConf, String componentName) |
static ResourceRequest | parseResourceRequest(SparkConf sparkConf, ResourceID resourceId) |
static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> | parseResourceRequirements(SparkConf sparkConf, String componentName) |
static String | RESOURCE_PREFIX() |
static boolean | resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements) |
static boolean | validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus) |
static String | VENDOR() |
static void | warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores) |
static <T> scala.collection.Seq<T> | withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.Seq<T>> extract) |
public static String DISCOVERY_SCRIPT()
public static String VENDOR()
public static String AMOUNT()
public static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
public static scala.collection.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
public static scala.collection.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
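A sketch of the config-parsing entry points, assuming executor resource configs set on the conf (the discovery script path is hypothetical):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf()
  .set("spark.executor.resource.gpu.amount", "2")
  .set("spark.executor.resource.gpu.discoveryScript", "/opt/spark/bin/getGpus.sh")

// listResourceIds finds every resource name configured for the component;
// parseAllResourceRequests resolves each into a full ResourceRequest
// (amount plus optional discovery script and vendor).
val ids = ResourceUtils.listResourceIds(conf, "spark.executor")
val requests = ResourceUtils.parseAllResourceRequests(conf, "spark.executor")
requests.foreach(r => println(s"${r.id.resourceName}: amount=${r.amount}"))
```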
public static scala.Tuple2<Object,Object> calculateAmountAndPartsForFraction(double doubleAmount)
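Task resource amounts may be fractional (e.g. spark.task.resource.gpu.amount = 0.25 lets four tasks share one GPU address). A sketch assuming calculateAmountAndPartsForFraction implements that convention by returning a whole-number amount plus the number of parts each address is split into:

```scala
import org.apache.spark.resource.ResourceUtils

// Assumption: 0.25 yields (1, 4) -- one address, shared four ways.
val (amount, parts) = ResourceUtils.calculateAmountAndPartsForFraction(0.25)
println(s"amount=$amount, parts=$parts")
```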
public static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
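A sketch of addTaskResourceRequests, which is assumed to copy spark.task.resource.* settings from the conf into a TaskResourceRequests builder:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.{ResourceUtils, TaskResourceRequests}

val conf = new SparkConf().set("spark.task.resource.gpu.amount", "1")

val treqs = new TaskResourceRequests()
ResourceUtils.addTaskResourceRequests(conf, treqs)
println(treqs.requests) // expect a "gpu" entry given the config above
```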
public static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
public static scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.Seq<ExecutorResourceRequest> resourceRequest)
public static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
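A sketch combining parseResourceRequirements with resourcesMeetRequirements, assuming the latter checks that each requirement's amount is covered by the free count for that resource name:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf().set("spark.executor.resource.gpu.amount", "2")

// Requirements derived from the conf, tested against maps of free address
// counts (resource name -> number of free addresses).
val requirements = ResourceUtils.parseResourceRequirements(conf, "spark.executor")
println(ResourceUtils.resourcesMeetRequirements(Map("gpu" -> 4), requirements)) // expected: true
println(ResourceUtils.resourcesMeetRequirements(Map("gpu" -> 1), requirements)) // expected: false
```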
public static <T> scala.collection.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.Seq<T>> extract)
public static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
public static scala.collection.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
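parseAllocatedFromJsonFile and parseAllocated read the resources file handed to a component (e.g. via spark.executor.resourcesFile). A sketch assuming the JSON layout is an array of allocations, each carrying a ResourceID and its addresses:

```scala
import java.nio.file.Files
import org.apache.spark.resource.ResourceUtils

// Assumed file format: [{"id": {"componentName": ..., "resourceName": ...},
//                        "addresses": [...]}, ...]
val json =
  """[{"id": {"componentName": "spark.executor", "resourceName": "gpu"},
    |  "addresses": ["0", "1"]}]""".stripMargin
val file = Files.createTempFile("resources", ".json")
Files.write(file, json.getBytes("UTF-8"))

val allocations = ResourceUtils.parseAllocatedFromJsonFile(file.toString)
allocations.foreach(a => println(s"${a.id.resourceName} -> ${a.addresses.mkString(",")}"))

// parseAllocated is the Option-friendly wrapper; None yields no allocations.
assert(ResourceUtils.parseAllocated(None, "spark.executor").isEmpty)
```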
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
Gets all allocated resource information for the input component from the input resources file and the application-level Spark configs.
sparkConf - (undocumented)
componentName - (undocumented)
resourcesFileOpt - (undocumented)
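A sketch of the full flow, assuming a discovery script (hypothetical path) that prints a ResourceInformation-style JSON object, e.g. {"name": "gpu", "addresses": ["0", "1"]}, as described in the Spark configuration docs:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

val conf = new SparkConf()
  .set("spark.executor.resource.gpu.amount", "2")
  .set("spark.executor.resource.gpu.discoveryScript", "/opt/spark/bin/getGpus.sh")

// With no resources file (None), each configured resource is discovered by
// running its script, and the result is checked against the requested amount.
val resources = ResourceUtils.getOrDiscoverAllResources(conf, "spark.executor", None)
resources.foreach { case (name, info) =>
  println(s"$name -> ${info.addresses.mkString(",")}")
}
ResourceUtils.logResourceInfo("spark.executor", resources)
```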
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
This function is similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs. It first checks whether resources were explicitly specified in the resources file (this can include specific address assignments, and is only supported by certain cluster managers), and then looks at the ResourceProfile for any resources not specified in the file. Resources not explicitly set in the file require a discovery script to be run to obtain their addresses. It also verifies that the resource allocation meets the required amount for each resource.
resourcesFileOpt - (undocumented)
componentName - (undocumented)
resourceProfile - (undocumented)
sparkConf - (undocumented)
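A sketch of the ResourceProfile variant, using the public ResourceProfileBuilder / ExecutorResourceRequests API to describe the resources (script path hypothetical):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, ResourceUtils}

// Ask for 4 cores and 2 GPUs per executor, discovered by the given script.
val ereqs = new ExecutorResourceRequests()
  .cores(4)
  .resource("gpu", 2, "/opt/spark/bin/getGpus.sh")
val rp = new ResourceProfileBuilder().require(ereqs).build()

// Resolve resources from the profile rather than the application-level configs.
val resources = ResourceUtils.getOrDiscoverAllResourcesForResourceProfile(
  None, "spark.executor", rp, new SparkConf())
```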
public static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
public static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
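A sketch of validateTaskCpusLargeEnough, assuming it returns whether the executor's cores can satisfy the per-task CPU request:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.resource.ResourceUtils

// 4 executor cores against a 2-CPU task request is assumed to validate.
val ok = ResourceUtils.validateTaskCpusLargeEnough(new SparkConf(), 4, 2)
println(ok)
```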
public static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
public static final String GPU()
public static final String FPGA()
public static final String RESOURCE_PREFIX()
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)