Resource profile to associate with an RDD. A pyspark.resource.ResourceProfile
allows the user to specify executor and task resource requirements for an RDD, which
are applied when the stage that computes the RDD runs. This lets the user change the
resource requirements between stages. A ResourceProfile is meant to be immutable, so
it cannot be changed after it is built.
New in version 3.1.0.
This API is evolving.