Scala/Spark version compatibility: https://blog.csdn.net/qq_34319644/article/details/115555522
This setup uses JDK 1.8 + Spark 3.0 + Scala 2.12.
First, configure Scala 2.12:
Official download page: https://www.scala-lang.org/download/2.12.17.html
tar -zxf scala-2.12.17.tgz
vim .bashrc
export SCALA_HOME=/home/xingmo/sdk/scala
export PATH=$PATH:$SCALA_HOME/bin
source .bashrc
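To verify the installation (an optional check, not part of the original steps), run:
scala -version
It should report Scala 2.12.17.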
Install the Scala plugin in IDEA: File -> Settings -> Plugins
Add the Scala SDK as a library: File -> Project Structure -> Libraries
Scala test:
package spark.core

object Test {
  def main(args: Array[String]): Unit = {
    println("Hello Spark")
  }
}
Remember to switch the JDK version to 1.8:
See: https://blog.csdn.net/weixin_45490198/article/details/125119932
Maven dependency (the _2.12 suffix in the artifact ID matches the Scala version):
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.0.0</version>
</dependency>
</dependencies>
Configuration complete. Test code:
package spark.core.wc

import org.apache.spark.{SparkConf, SparkContext}

object Spark01_WordCount {
  def main(array: Array[String]): Unit = {
    // TODO establish the connection to the Spark framework
    val sparConf = new SparkConf().setMaster("local").setAppName("WordCount")
    val sc = new SparkContext(sparConf)
    // TODO execute the business logic
    // TODO close the connection
    sc.stop()
  }
}
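The business-logic step above is left as a TODO. For reference, here is a minimal word-count sketch of what that step could look like; the input path data/word.txt and the object name Spark01_WordCount_Sketch are illustrative assumptions, not part of the original post:

package spark.core.wc

import org.apache.spark.{SparkConf, SparkContext}

// Minimal word-count sketch; data/word.txt is a hypothetical input file
object Spark01_WordCount_Sketch {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setMaster("local").setAppName("WordCount")
    val sc = new SparkContext(sparkConf)

    // Read the input file line by line
    val lines = sc.textFile("data/word.txt")
    // Split each line into words on spaces
    val words = lines.flatMap(_.split(" "))
    // Pair each word with 1, then sum the counts per word
    val wordToCount = words.map(word => (word, 1)).reduceByKey(_ + _)
    // Bring the results back to the driver and print them
    wordToCount.collect().foreach(println)

    sc.stop()
  }
}

If the setup is correct, running this prints a (word, count) pair for each distinct word in the input file.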