Can you run a Java Spark application (desktop) on a multi-node cluster?

Question

I have implemented a script in Java that processes and transforms data using Apache Spark. I want to run this script on multiple machines (a multi-node cluster), but I cannot find any documentation on how to do this for a Java application. Has anyone here tried this before? And if it is not possible, what is the alternative, other than rewriting the code from Java to Scala?

Thank you!

Answer 1

Score: 1

If you have already written your application using the Spark libraries (RDDs, DataFrames, and so on), you just have to submit it to a Spark cluster.
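In practice this is done with the spark-submit script that ships with Spark; it works the same for a Java application as for a Scala one, since both are packaged as a jar. A sketch of such an invocation is below; the class name, master host, jar path, input/output paths, and resource sizes are placeholders you would replace with your own:

```bash
# Submit the packaged jar to a standalone Spark master.
# Host name, jar path, paths, and resource sizes are placeholders.
spark-submit \
  --class com.example.TransformJob \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --executor-memory 4G \
  --total-executor-cores 8 \
  target/transform-job-1.0.jar \
  hdfs:///data/input hdfs:///data/output
```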

Spark will then distribute the execution across the worker nodes by itself.
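The only requirement on the Java side is that the code stays cluster-agnostic; in particular, no master URL should be hard-coded, so that the --master flag passed to spark-submit takes effect. A minimal sketch of such an entry point (the class name, filter expression, and argument conventions are hypothetical):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public final class TransformJob {
    public static void main(String[] args) {
        // No .master(...) call here: the master URL is supplied by
        // spark-submit, so the same jar runs locally or on a cluster
        // without any code changes.
        SparkSession spark = SparkSession.builder()
                .appName("TransformJob")
                .getOrCreate();

        // Hypothetical transformation: read, filter, write back out.
        Dataset<Row> input = spark.read().parquet(args[0]);
        input.filter("value IS NOT NULL")
             .write()
             .parquet(args[1]);

        spark.stop();
    }
}
```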

If your question is how to launch your Spark application on the cluster, or how to configure a cluster, take a look at the Spark documentation (https://spark.apache.org/docs/latest/submitting-applications.html).
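For completeness: if you do not have a cluster manager yet, the standalone manager bundled with Spark is the simplest way to turn several machines into a cluster. A sketch, assuming Spark 3.x is installed at $SPARK_HOME on every machine and using a placeholder host name:

```bash
# On the machine chosen as the master (Spark 3.x standalone mode):
$SPARK_HOME/sbin/start-master.sh
# The master's web UI (port 8080 by default) shows the spark:// URL.

# On each worker machine, pointing at that master URL:
$SPARK_HOME/sbin/start-worker.sh spark://master-host:7077
```

The spark:// URL printed by the master is the same one you pass to spark-submit via --master.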
