pyspark.RDD.barrier

RDD.barrier()

Note

Experimental

Marks the current stage as a barrier stage, where Spark must launch all tasks together. If a task fails, Spark aborts the entire stage and relaunches all of its tasks, rather than restarting only the failed one. The barrier execution mode feature is experimental and handles only limited scenarios. Please read the linked SPIP and design docs to understand the limitations and future plans.

Returns

an RDDBarrier instance that provides actions within a barrier stage.

See also

Design Doc

New in version 2.4.0.