Spark Operators: RDD Action Operations (1) – first, count, reduce, collect


Keywords: Spark operators, Spark RDD actions, first, count, reduce, collect

first

def first(): T

first returns the first element of the RDD, without any ordering.

 
 
  scala> var rdd1 = sc.makeRDD(Array(("A","1"),("B","2"),("C","3")),2)
  rdd1: org.apache.spark.rdd.RDD[(String, String)] = ParallelCollectionRDD[33] at makeRDD at <console>:21

  scala> rdd1.first
  res14: (String, String) = (A,1)

  scala> var rdd1 = sc.makeRDD(Seq(10, 4, 2, 12, 3))
  rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at makeRDD at <console>:21

  scala> rdd1.first
  res8: Int = 10
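A related point worth knowing: first behaves like take(1).head, returning the first element of the first non-empty partition. A sketch in the same spark-shell style (res numbers will vary by session); note that take returns an Array, while first returns the element itself:

  scala> var rdd1 = sc.makeRDD(Seq(10, 4, 2, 12, 3))

  scala> rdd1.take(1)
  res0: Array[Int] = Array(10)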

count

def count(): Long

count returns the number of elements in the RDD.

 
 
  scala> var rdd1 = sc.makeRDD(Array(("A","1"),("B","2"),("C","3")),2)
  rdd1: org.apache.spark.rdd.RDD[(String, String)] = ParallelCollectionRDD[34] at makeRDD at <console>:21

  scala> rdd1.count
  res15: Long = 3

reduce

def reduce(f: (T, T) ⇒ T): T

reduce combines the elements of the RDD pairwise using the binary function f and returns the aggregated result. Because elements are combined in an order that depends on partitioning, f should be commutative and associative.

 
 
  scala> var rdd1 = sc.makeRDD(1 to 10,2)
  rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[36] at makeRDD at <console>:21

  scala> rdd1.reduce(_ + _)
  res18: Int = 55

  scala> var rdd2 = sc.makeRDD(Array(("A",0),("A",2),("B",1),("B",2),("C",1)))
  rdd2: org.apache.spark.rdd.RDD[(String, Int)] = ParallelCollectionRDD[38] at makeRDD at <console>:21

  scala> rdd2.reduce((x,y) => {
       | (x._1 + y._1,x._2 + y._2)
       | })
  res21: (String, Int) = (CBBAA,6)
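The (CBBAA,6) result above illustrates the ordering caveat: string concatenation is not commutative, so the String part of the tuple depends on how partitions are combined and can differ between runs, while the Int sum is always 6. A sketch of a safe use of reduce, computing a maximum (max is commutative and associative; res numbers will vary by session):

  scala> var rdd1 = sc.makeRDD(1 to 10,2)

  scala> rdd1.reduce((x, y) => if (x > y) x else y)
  res0: Int = 10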

collect

def collect(): Array[T]

collect returns all elements of the RDD as an array to the driver program.

 
 
  scala> var rdd1 = sc.makeRDD(1 to 10,2)
  rdd1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[36] at makeRDD at <console>:21

  scala> rdd1.collect
  res23: Array[Int] = Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
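One caveat: collect materializes the entire RDD in the driver's memory, so on large datasets it can cause an OutOfMemoryError; prefer take(n) when only a sample is needed. A sketch (res numbers will vary by session):

  scala> var rdd1 = sc.makeRDD(1 to 10,2)

  scala> rdd1.take(3)
  res0: Array[Int] = Array(1, 2, 3)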

For more on Spark operators, see the Spark operator series:

http://lxw1234.com/archives/2015/07/363.htm

Source (please credit when reprinting): lxw的大数据田地 » Spark Operators: RDD Action Operations (1) – first, count, reduce, collect

