Class ResourceUtils

java.lang.Object
org.apache.spark.resource.ResourceUtils

public class ResourceUtils extends Object
  • Constructor Details

    • ResourceUtils

      public ResourceUtils()
  • Method Details

    • DISCOVERY_SCRIPT

      public static String DISCOVERY_SCRIPT()
    • VENDOR

      public static String VENDOR()
    • AMOUNT

      public static String AMOUNT()
    • parseResourceRequest

      public static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
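For context, parseResourceRequest reads configs of the form spark.&lt;component&gt;.resource.&lt;name&gt;.&lt;suffix&gt;, where the suffixes are the AMOUNT, DISCOVERY_SCRIPT, and VENDOR strings returned by the methods above. A sketch of an executor GPU request (the script path and vendor value here are illustrative placeholders):

```properties
# Request 2 GPUs per executor, discovered via a user-supplied script.
spark.executor.resource.gpu.amount=2
spark.executor.resource.gpu.discoveryScript=/opt/spark/scripts/getGpus.sh
# Vendor is only needed on some cluster managers (e.g. Kubernetes).
spark.executor.resource.gpu.vendor=nvidia.com
```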
    • listResourceIds

      public static scala.collection.immutable.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
    • parseAllResourceRequests

      public static scala.collection.immutable.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
    • calculateAmountAndPartsForFraction

      public static scala.Tuple2<Object,Object> calculateAmountAndPartsForFraction(double doubleAmount)
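calculateAmountAndPartsForFraction converts a fractional task resource amount into a whole address amount plus the number of "parts" each address is split into (so several tasks can share one address). The sketch below is a minimal Java illustration of one plausible reading of those semantics, not Spark's internal code; the real method returns a scala.Tuple2.

```java
public class FractionParts {
    // Illustrative sketch (assumed semantics, not Spark's implementation):
    // a fractional amount below 1 means one address shared by floor(1/amount)
    // tasks; an amount of 1 or more must be a whole number with 1 part.
    static long[] amountAndParts(double doubleAmount) {
        if (doubleAmount <= 0.0) {
            throw new IllegalArgumentException("amount must be positive: " + doubleAmount);
        }
        if (doubleAmount < 1.0) {
            // e.g. 0.25 of a GPU: one address, shared by 4 tasks
            return new long[] {1L, (long) Math.floor(1.0 / doubleAmount)};
        }
        if (doubleAmount % 1.0 != 0.0) {
            throw new IllegalArgumentException(
                "fractional amounts above 1 are not allowed: " + doubleAmount);
        }
        return new long[] {(long) doubleAmount, 1L};
    }

    public static void main(String[] args) {
        long[] r = amountAndParts(0.25);
        System.out.println(r[0] + "," + r[1]); // 1,4
    }
}
```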
    • addTaskResourceRequests

      public static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
    • parseResourceRequirements

      public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
    • executorResourceRequestToRequirement

      public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.immutable.Seq<ExecutorResourceRequest> resourceRequest)
    • resourcesMeetRequirements

      public static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
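resourcesMeetRequirements checks whether a set of free resources can satisfy a list of requirements. The following self-contained Java sketch illustrates the assumed semantics with plain maps (the real method takes Scala collections and ResourceRequirement objects):

```java
import java.util.Map;

public class MeetsReq {
    // Illustrative sketch (assumed semantics, not Spark's implementation):
    // every required resource name must have at least the required free amount.
    static boolean resourcesMeetRequirements(Map<String, Long> resourcesFree,
                                             Map<String, Long> requirements) {
        return requirements.entrySet().stream()
            .allMatch(req -> resourcesFree.getOrDefault(req.getKey(), 0L) >= req.getValue());
    }

    public static void main(String[] args) {
        Map<String, Long> free = Map.of("gpu", 2L, "fpga", 1L);
        System.out.println(resourcesMeetRequirements(free, Map.of("gpu", 1L))); // true
        System.out.println(resourcesMeetRequirements(free, Map.of("gpu", 4L))); // false
    }
}
```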
    • withResourcesJson

      public static <T> scala.collection.immutable.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.immutable.Seq<T>> extract)
    • parseAllocatedFromJsonFile

      public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
    • parseAllocated

      public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
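The resources file consumed by parseAllocated and parseAllocatedFromJsonFile is a JSON array of ResourceAllocation entries: each entry pairs a ResourceID (component name plus resource name) with the assigned addresses. A sketch of the expected shape (the addresses here are illustrative):

```json
[
  {
    "id": {"componentName": "spark.driver", "resourceName": "gpu"},
    "addresses": ["0", "1"]
  }
]
```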
    • getOrDiscoverAllResources

      public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
      Gets all allocated resource information for the given component from the given resources file and the application-level Spark configs. It first checks whether resources were explicitly specified in the resources file (this can include explicit address assignments, and is only supported by certain cluster managers), then looks at the Spark configs for any resources not specified in the file. Resources not explicitly set in the file require a discovery script to be run to obtain the resource's addresses. It also verifies that the resource allocation meets the required amount for each resource.
      Parameters:
      sparkConf - the SparkConf to read resource configs from
      componentName - the component to get resources for (for example, spark.driver or spark.executor)
      resourcesFileOpt - optional path to a JSON file with explicitly allocated resources
      Returns:
      a map from resource name to resource info
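A discovery script, when one is required, must print a single JSON object in the ResourceInformation format (the resource name plus the discovered addresses) to stdout. A hypothetical sketch (a real script would query the hardware, e.g. via nvidia-smi; this one returns two fixed GPU addresses):

```shell
#!/usr/bin/env bash
# Hypothetical discovery script: emit ResourceInformation-style JSON on stdout.
echo '{"name": "gpu", "addresses": ["0", "1"]}'
```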
    • getOrDiscoverAllResourcesForResourceProfile

      public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
      This function is similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs.

      It first checks whether resources were explicitly specified in the resources file (this can include explicit address assignments, and is only supported by certain cluster managers), then looks at the ResourceProfile for any resources not specified in the file. Resources not explicitly set in the file require a discovery script to be run to obtain the resource's addresses. It also verifies that the resource allocation meets the required amount for each resource.

      Parameters:
      resourcesFileOpt - optional path to a JSON file with explicitly allocated resources
      componentName - the component to get resources for (for example, spark.executor)
      resourceProfile - the ResourceProfile whose resource requests are used
      sparkConf - the SparkConf
      Returns:
      a map from resource name to resource info
    • logResourceInfo

      public static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
    • validateTaskCpusLargeEnough

      public static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
    • warnOnWastedResources

      public static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
    • GPU

      public static final String GPU()
    • FPGA

      public static final String FPGA()
    • RESOURCE_PREFIX

      public static final String RESOURCE_PREFIX()
    • org$apache$spark$internal$Logging$$log_

      public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
    • org$apache$spark$internal$Logging$$log__$eq

      public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
    • LogStringContext

      public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)