- Link: https://blog.csdn.net/weixin_39381833/article/details/108349646
- Title: Implementing WordCount with Spark Operators (WordCount via combineByKey) - CSDN blog
- Site: blog.csdn.net
- Bookmarked: 7,520 times
Article preview:

1. map + reduceByKey

```scala
sparkContext.textFile("hdfs://ifeng:9000/hdfsapi/wc.txt")
  .flatMap(_.split(","))
  .map((_, 1))
  .reduceByKey(_ + _)
  .collect()
```

2. countByValue in place of map + reduceByKey

```scala
val RDDfile = sparkContext.textFile("hdfs://ifeng:9
```

(The preview is truncated at this point.)
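The title promises a combineByKey WordCount, but the preview cuts off before that section. As a hedged sketch (not the article's own code): `combineByKey` takes three functions — `createCombiner` for the first value seen for a key, `mergeValue` to fold further values into a partition-local combiner, and `mergeCombiners` to merge combiners across partitions. For `(word, 1)` pairs all three reduce to addition. The local fold below imitates that per-key merging without a Spark cluster; the object and method names are illustrative.

```scala
object CombineByKeyWordCount {
  // The three functions Spark's combineByKey would receive for WordCount.
  val createCombiner: Int => Int = v => v          // first count seen for a key
  val mergeValue: (Int, Int) => Int = _ + _        // fold a value into a combiner
  val mergeCombiners: (Int, Int) => Int = _ + _    // merge combiners across partitions

  // In Spark this would be:
  //   rdd.flatMap(_.split(",")).map((_, 1))
  //      .combineByKey(createCombiner, mergeValue, mergeCombiners)
  // Here we apply the same per-key merging with a local fold.
  def wordCount(text: String): Map[String, Int] =
    text.split(",")
      .map(w => (w, 1))
      .foldLeft(Map.empty[String, Int]) { (acc, pair) =>
        val (w, v) = pair
        acc.get(w) match {
          case Some(c) => acc.updated(w, mergeValue(c, v))
          case None    => acc.updated(w, createCombiner(v))
        }
      }

  def main(args: Array[String]): Unit =
    println(wordCount("spark,hadoop,spark"))
}
```

Unlike `reduceByKey`, `combineByKey` lets the combiner type differ from the value type, which is why it takes a separate `createCombiner` function; for a plain count all three functions collapse to `+`.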