
Implementing WordCount with Spark Operators (including combineByKey) - CSDN Blog

1. map + reduceByKey

```scala
sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
  .flatMap(_.split(","))
  .map((_, 1))
  .reduceByKey(_ + _)
  .collect()
```

2. countByValue instead of map + reduceByKey

```scala
val RDDfile = sparkContext.textFile("hdfs://ifeng:9
```

(excerpt truncated here)
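The excerpt breaks off just after the countByValue heading. A minimal sketch of what that approach typically looks like, assuming the same input file as the reduceByKey example above: countByValue is an action that counts each distinct element directly, so the explicit `.map((_, 1))` step is no longer needed.

```scala
// Sketch: countByValue collapses map + reduceByKey into a single action.
// Note it returns a Map[String, Long] to the driver (not an RDD),
// so it is only appropriate when the set of distinct words is small.
val counts: scala.collection.Map[String, Long] =
  sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
    .flatMap(_.split(","))
    .countByValue()
```
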