Can't create SparkSession in Spring Boot
Question
I'm trying to run Spark on Spring; the project itself runs without errors, but as soon as I try to create a session, I get this error.
java.lang.NoSuchMethodError: 'scala.collection.SeqOps scala.collection.mutable.Buffer$.empty()'
pom.xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>com.thoughtworks.paranamer</groupId>
        <artifactId>paranamer</artifactId>
        <version>2.8</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework/spring-core -->
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>6.0.5</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.3.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.13</artifactId>
        <version>3.3.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.12.15</version>
    </dependency>
</dependencies>
controller
import org.apache.spark.sql.SparkSession;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MainController {

    @GetMapping("/")
    public String index() {
        SparkSession spark = SparkSession
                .builder()
                .appName("AppSpark")
                .config("spark.master", "local")
                .getOrCreate();
        return "Hello World!";
    }
}
I tried to downgrade the versions in the pom file, but it did not help; only the error changed.
If anyone has come across the same error, I would appreciate any help.
Answer 1
Score: 1
Scala libraries are compiled against a specific Scala version, which is visible in the artifact name: spark-sql_2.13 means it is spark-sql compiled for Scala 2.13.
You cannot have multiple Scala versions in the same project.
In your case, you have:
- scala-library: 2.12
- spark-core: 2.12
- spark-sql: 2.13
Choose a single version and stick to it; 2.13 if possible, as it is more recent.
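As an illustration, here is a minimal sketch of the question's Spark dependencies aligned on Scala 2.13. The exact versions are assumptions (Spark 3.3.2 publishes _2.13 artifacts, and scala-library 2.13.8 is the release the Spark 3.3.x 2.13 builds use); verify them against the Spark version you actually deploy:

<!-- All Scala-facing artifacts must share the same Scala binary suffix (_2.13 here) -->
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.8</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.13</artifactId>
    <version>3.3.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.13</artifactId>
    <version>3.3.2</version>
</dependency>

You can then confirm that only one Scala binary version ends up on the classpath by running mvn dependency:tree and checking for any remaining _2.12 artifacts.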