Class TaskInfo

java.lang.Object
  org.apache.spark.scheduler.TaskInfo
All Implemented Interfaces:
Cloneable

public class TaskInfo extends Object implements Cloneable
:: DeveloperApi :: Information about a running task attempt inside a TaskSet.
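TaskInfo objects are not normally constructed by user code; they arrive on scheduler events delivered to a SparkListener. A minimal sketch (the listener class name and log format are illustrative only):

import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerTaskEnd;
import org.apache.spark.scheduler.TaskInfo;

// Hypothetical listener that inspects the TaskInfo carried by each task-end event.
public class TaskInfoLoggingListener extends SparkListener {
  @Override
  public void onTaskEnd(SparkListenerTaskEnd taskEnd) {
    TaskInfo info = taskEnd.taskInfo();
    System.out.println("Task " + info.taskId()
        + " (attempt " + info.attemptNumber() + ")"
        + " on " + info.host() + "/" + info.executorId()
        + " ended with status " + info.status()
        + " after " + info.duration() + " ms");
  }
}

Such a listener would be registered with SparkContext.addSparkListener(new TaskInfoLoggingListener()).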
  • Constructor Details

    • TaskInfo

      public TaskInfo(long taskId, int index, int attemptNumber, int partitionId, long launchTime, String executorId, String host, scala.Enumeration.Value taskLocality, boolean speculative)
    • TaskInfo

      public TaskInfo(long taskId, int index, int attemptNumber, long launchTime, String executorId, String host, scala.Enumeration.Value taskLocality, boolean speculative)
      This constructor does not take a partitionId; prefer the partition-aware constructor above (see the construction sketch below). It is retained for backward compatibility with releases before Spark 3.3.
      Parameters:
      taskId - scheduler-assigned ID of this task attempt
      index - index of the task within its task set
      attemptNumber - attempt number of this task
      launchTime - time at which the task was launched
      executorId - ID of the executor running the task
      host - host on which the task is running
      taskLocality - locality level of the task
      speculative - whether this is a speculative task attempt
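      For completeness, a construction sketch using the partition-aware constructor; all argument values are illustrative, since in practice the scheduler creates TaskInfo instances, and the TaskLocality enumeration referenced is the one in org.apache.spark.scheduler:

      import org.apache.spark.scheduler.TaskInfo;
      import org.apache.spark.scheduler.TaskLocality;

      public class TaskInfoConstructionSketch {
        public static void main(String[] args) {
          TaskInfo info = new TaskInfo(
              42L,                          // taskId: scheduler-assigned ID of the attempt
              3,                            // index within the task set
              0,                            // attemptNumber
              3,                            // partitionId of the RDD partition being computed
              System.currentTimeMillis(),   // launchTime
              "exec-1",                     // executorId (illustrative value)
              "worker-node-1",              // host (illustrative value)
              TaskLocality.PROCESS_LOCAL(), // taskLocality, a scala.Enumeration.Value
              false);                       // speculative
          // id() combines index and attemptNumber ("3.0" in current releases).
          System.out.println(info.id());
        }
      }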
  • Method Details

    • accumulables

      public scala.collection.immutable.Seq<AccumulableInfo> accumulables()
      Intermediate updates to accumulables during this task. Note that it is valid for the same accumulable to be updated multiple times in a single task or for two accumulables with the same name but different IDs to exist in a task.
      Returns:
      the accumulable updates recorded so far for this task
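      A sketch of walking this Scala Seq from Java; the helper class name is hypothetical, and the TaskInfo is assumed to come from a listener event:

      import org.apache.spark.scheduler.AccumulableInfo;
      import org.apache.spark.scheduler.TaskInfo;

      public final class AccumulableDump {
        // Prints every intermediate accumulable update recorded on the task.
        public static void print(TaskInfo info) {
          scala.collection.Iterator<AccumulableInfo> it = info.accumulables().iterator();
          while (it.hasNext()) {
            AccumulableInfo acc = it.next();
            // name() and update() return scala.Option values.
            System.out.println("accumulable " + acc.id()
                + " name=" + acc.name() + " update=" + acc.update());
          }
        }
      }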
    • attemptNumber

      public int attemptNumber()
    • clone

      public TaskInfo clone()
    • duration

      public long duration()
    • executorId

      public String executorId()
    • failed

      public boolean failed()
    • finishTime

      public long finishTime()
      The time when the task has completed successfully (including the time to remotely fetch results, if necessary).
      Returns:
      the time at which the task finished successfully
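      A sketch that guards timing reads on the finished state; it assumes launchTime() and finishTime() are epoch milliseconds, as in current Spark releases, and the helper class name is hypothetical:

      import org.apache.spark.scheduler.TaskInfo;

      public final class TaskTiming {
        public static String describe(TaskInfo info) {
          if (!info.finished()) {
            // finishTime()/duration() are only meaningful once the task has completed.
            return "task " + info.id() + " still " + info.status();
          }
          long elapsedMs = info.finishTime() - info.launchTime(); // same as duration() for finished tasks
          return "task " + info.id() + " took " + elapsedMs + " ms";
        }
      }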
    • finished

      public boolean finished()
    • gettingResult

      public boolean gettingResult()
    • gettingResultTime

      public long gettingResultTime()
      The time when the task started remotely getting the result. Will not be set if the task result was sent immediately when the task finished (as opposed to sending an IndirectTaskResult and later fetching the result from the block manager).
      Returns:
      the time at which the task started remotely fetching its result, if such a fetch occurred
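      A sketch of handling both cases; the helper class name is hypothetical:

      import org.apache.spark.scheduler.TaskInfo;

      public final class ResultFetchCheck {
        public static String describe(TaskInfo info) {
          if (info.gettingResult()) {
            // A separate result-fetch phase was recorded for this task.
            return "task " + info.id() + " began fetching its result at " + info.gettingResultTime();
          }
          // The result was returned directly with the task completion; no fetch time is recorded.
          return "task " + info.id() + " returned its result directly";
        }
      }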
    • host

      public String host()
    • id

      public String id()
    • index

      public int index()
      The index of this task within its task set. Not necessarily the same as the ID of the RDD partition that the task is computing.
      Returns:
      the task's index within its task set
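      A sketch contrasting the task-set index with the RDD partition; the two can differ, for example when a stage retry re-runs only a subset of partitions (the helper class name is hypothetical):

      import org.apache.spark.scheduler.TaskInfo;

      public final class IndexVsPartition {
        public static void show(TaskInfo info) {
          // index() positions the task within its task set; partitionId() names the
          // RDD partition it computes. They often coincide but are not guaranteed to.
          System.out.println("task-set index = " + info.index()
              + ", RDD partition = " + info.partitionId());
        }
      }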
    • killed

      public boolean killed()
    • launchTime

      public long launchTime()
    • launching

      public boolean launching()
    • partitionId

      public int partitionId()
      The ID of the RDD partition computed by this task. The partition ID is always the same across attempts of the same task. This will be -1 for historical event data from before Spark 3.3, and is available for all applications since Spark 3.3.
      Returns:
      the RDD partition ID, or -1 for pre-3.3 historical data
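      A sketch of tolerating pre-3.3 event data, where the partition ID is reported as -1 (the helper class name is hypothetical):

      import org.apache.spark.scheduler.TaskInfo;

      public final class PartitionLabel {
        public static String of(TaskInfo info) {
          int pid = info.partitionId();
          // -1 indicates event data written before Spark 3.3, e.g. replayed by the history server.
          return pid >= 0 ? "partition " + pid : "partition unknown (pre-3.3 event data)";
        }
      }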
    • running

      public boolean running()
    • speculative

      public boolean speculative()
    • status

      public String status()
    • successful

      public boolean successful()
    • taskId

      public long taskId()
    • taskLocality

      public scala.Enumeration.Value taskLocality()