Running a Spark application in IntelliJ 14.1.3

I am trying to run a Spark application, written in Scala, in IntelliJ 14.1.3. The Scala SDK is scala-sdk-2.11.6. When I execute the code, I get the following error:


Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
at akka.actor.RootActorPath.$div(ActorPath.scala:159)
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:230)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at LRParquetProcess$.main(LRParquetProcess.scala:9)
at LRParquetProcess.main(LRParquetProcess.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)


Process finished with exit code 1.

My pom.xml looks like this:


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>ParquetGeneration</groupId>
    <artifactId>ParquetGeneration</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <hadoop.version>2.7.0</hadoop.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.eclipse.jetty</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-app</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>com.typesafe.akka</groupId>
            <artifactId>akka-actor_2.10</artifactId>
            <version>2.3.11</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>
</project>

卫东


Switch to Scala 2.10; right now that will work better.
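The root cause is visible in the pom above: every Spark and Akka artifact is a _2.10 binary build, while the project's Scala SDK is 2.11.6, so code compiled against 2.10 calls a HashSet$.empty signature that no longer exists in the 2.11 library. As a minimal sketch (my illustration, not part of this answer; the scala.version and spark.version property names are my own), you could centralize the versions in the pom so everything stays on one Scala line, which would also align spark-sql, pinned to 1.2.1 in the question, with the other 1.3.1 Spark modules:

    <!-- Sketch: one property per toolchain version, reused by every dependency. -->
    <properties>
        <hadoop.version>2.7.0</hadoop.version>
        <scala.version>2.10.5</scala.version>
        <spark.version>1.3.1</spark.version>
    </properties>
    <!-- ... other dependencies as in the question ... -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <!-- was 1.2.1 in the question; aligned here with spark-core 1.3.1 -->
        <version>${spark.version}</version>
    </dependency>

Afterwards, mvn dependency:tree -Dincludes=org.scala-lang should show a single Scala version on the classpath.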

君笑尘


As noted above, you need to try 2.10.x.

Install 2.10.x and set the appropriate environment variables to use it. Since you already have a project, go to File -> Project Structure -> Global Libraries and remove 2.11.x. Then add 2.10.x: press the "+" button -> Scala SDK -> Browse, and select the folder of the 2.10.x version you installed earlier.

The required Scala version is specified in the Spark documentation: https://spark.apache.org/docs/latest/
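After switching the SDK, a quick way to confirm which scala-library actually ends up on the runtime classpath (a diagnostic sketch of my own, not part of the original answer) is to print the library's version string before constructing the SparkContext:

    // Diagnostic sketch (my addition): prints the version of the scala-library
    // jar that is loaded at runtime. If this shows 2.11.x while the Spark
    // artifacts are _2.10 builds, the NoSuchMethodError above is the expected
    // symptom of the mismatch.
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        println(scala.util.Properties.versionString)       // e.g. "version 2.10.5"
        println(scala.util.Properties.versionNumberString) // e.g. "2.10.5"
      }
    }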
