Spark functions: collect, toArray, and collectAsMap    2015-08-20 21:07

collect, toArray

Both convert an RDD into a Scala array by bringing all of its elements back to the driver. (toArray was deprecated in favor of collect in later Spark releases; either can exhaust driver memory on a large RDD.)

collectAsMap

Similar to collect and toArray, but collectAsMap converts a key-value RDD into a Scala Map.

Note: if the RDD contains duplicate keys, only the last value seen for each key is kept in the resulting map.

scala> var z = sc.parallelize(List( ("cat",2), ("cat", 5), ("mouse", 4),("cat", 12), ("dog", 12), ("mouse", 2)), 2)
z: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[129] at parallelize at <console>:21

scala> z.collect
res44: Array[(String, Int)] = Array((cat,2), (cat,5), (mouse,4), (cat,12), (dog,12), (mouse,2))

scala> z.collectAsMap
res45: scala.collection.Map[String,Int] = Map(dog -> 12, cat -> 12, mouse -> 2)
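The "last value wins" behavior above can be sketched without a Spark cluster: plain Scala's Seq#toMap has the same semantics, and groupMap (Scala 2.13+) shows one way to keep every value per key instead (analogous to calling groupByKey on the RDD before collecting). The object and value names here are illustrative, not part of any Spark API.

```scala
// Plain-Scala analogue of collectAsMap's "last value wins" semantics.
// (Illustrative only; no SparkContext involved.)
object CollectAsMapDemo {
  def main(args: Array[String]): Unit = {
    val pairs = Seq(("cat", 2), ("cat", 5), ("mouse", 4),
                    ("cat", 12), ("dog", 12), ("mouse", 2))

    // Like collectAsMap: duplicate keys collapse, keeping the last value.
    val asMap = pairs.toMap
    println(asMap("cat"))     // 12: the last value for "cat" overwrites 2 and 5

    // To keep every value per key, group first (cf. groupByKey on an RDD).
    val grouped = pairs.groupMap(_._1)(_._2)
    println(grouped("cat"))   // List(2, 5, 12)
  }
}
```

If you need all values per key from Spark itself, collect after a groupByKey or reduceByKey rather than relying on collectAsMap's silent overwrite.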
Tags: #Spark    Posted in Spark-API