pyspark.SparkContext.range
SparkContext.range(start, end=None, step=1, numSlices=None)

Create a new RDD of int containing elements from start to end (exclusive), increased by step every element. Can be called in the same way as Python's built-in range() function: if called with a single argument, the argument is interpreted as end, and start is set to 0.
Parameters
    start – the start value
    end – the end value (exclusive)
    step – the incremental step (default: 1)
    numSlices – the number of partitions of the new RDD

Returns
    An RDD of int
Examples

>>> sc.range(5).collect()
[0, 1, 2, 3, 4]
>>> sc.range(2, 4).collect()
[2, 3]
>>> sc.range(1, 7, 2).collect()
[1, 3, 5]
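The numSlices parameter is not exercised by the doctests above. A minimal sketch of how it controls partitioning, assuming the same doctest setup where sc is a live SparkContext:

>>> rdd = sc.range(0, 10, numSlices=4)  # ask for 4 partitions
>>> rdd.getNumPartitions()
4
>>> sorted(rdd.collect())  # partitioning does not change the contents
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]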