I have recently been watching Spark tutorial videos, and they all drive Spark with Scala. Having had little exposure to Scala, I found the example below a good entry point: approach it with a Java mindset, and focus on the functional programming style.
A Map/Reduce-style computation example
// define two pieces of text
val txt1 = "I am into Spark so much"
val txt2 = "Scala is powerful"

// the ordinary mapReduce-style pipeline
println(List(txt1, txt2).flatMap { x => x.split(" ") }.map { x => (x, 1) }.map(x => x._2).reduce((x, y) => x + y))

// first simplification
println(List(txt1, txt2).flatMap { x => x.split(" ") }.map { (_, 1) }.map { _._2 }.reduce((x, y) => x + y))

// second simplification
println(List(txt1, txt2).flatMap { x => x.split(" ") }.map { (_, 1) }.map { _._2 }.reduce(_ + _))
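The trick behind the simplifications is Scala's placeholder syntax: inside a function literal, each `_` stands for one fresh parameter, filled in positionally, so `_ + _` takes two parameters while `_ + 10` takes one. A minimal sketch (the list values here are just for illustration):

```scala
val nums = List(1, 2, 3)

// `_ + 10` desugars to `x => x + 10`: the single `_` is the one parameter
val longForm  = nums.map(x => x + 10)
val shortForm = nums.map(_ + 10)
assert(longForm == shortForm)

// `_ + _` desugars to `(x, y) => x + y`: each `_` binds a *different* parameter
val sumLong  = nums.reduce((x, y) => x + y)
val sumShort = nums.reduce(_ + _)
assert(sumLong == sumShort)

println(shortForm) // List(11, 12, 13)
println(sumShort)  // 6
```

This is why `_ + _` works for `reduce` (a two-argument function) but could not be written as a single named parameter.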
The computation, step by step:
List("I am into Spark so much", "Scala is powerful") -> .flatMap { x => x.split(" ") }
->
List(I, am, into, Spark, so, much, Scala, is, powerful) -> .map {x=>(x,1)}
->
List((I,1), (am,1), (into,1), (Spark,1), (so,1), (much,1), (Scala,1), (is,1), (powerful,1)) -> .map(x => x._2)
->
List(1, 1, 1, 1, 1, 1, 1, 1, 1) -> .reduce(_ + _)
->
9
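The trace above can be run end to end as a small script; a sketch with each stage commented (all three variants reduce to the same sum):

```scala
val txt1 = "I am into Spark so much"
val txt2 = "Scala is powerful"

// count the words across both strings, exactly as in the trace above
val total = List(txt1, txt2)
  .flatMap { x => x.split(" ") } // List(I, am, into, Spark, so, much, Scala, is, powerful)
  .map { (_, 1) }                // pair every word with a count of 1
  .map { _._2 }                  // keep only the counts
  .reduce(_ + _)                 // sum them

println(total) // 9
```

Breaking the chain into one stage per line like this makes each intermediate `List` easy to inspect while learning.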