pyspark.TaskContext

class pyspark.TaskContext[source]

Contextual information about a task which can be read or mutated during execution. To access the TaskContext for a running task, use: TaskContext.get().

Methods

attemptNumber()

How many times this task has been attempted. The first task attempt is assigned attemptNumber = 0, and subsequent attempts have increasing attempt numbers.

cpus()

CPUs allocated to the task.

get()

Return the currently active TaskContext.

getLocalProperty(key)

Get a local property set upstream in the driver, or None if it is missing.

partitionId()

The ID of the RDD partition that is computed by this task.

resources()

Resources allocated to the task.

stageId()

The ID of the stage that this task belongs to.

taskAttemptId()

An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID).