How do I configure host/ip on sparkjava?
Question

If I have multiple IP addresses on my server and would like to assign each Spark app to a different IP address/host, how is this done?
Answer 1 (score: 1)
There is a method in the Spark Service class (javadoc) that sets the IP address that your Spark server listens on.
> public Service ipAddress(String ipAddress)
>
> Set the IP address that Spark should listen on. If not called the default address is '0.0.0.0'. This has to be called before any route mapping is done.
If you want these apps to run in the same JVM, it looks like you can do this by instantiating multiple Service objects. The javadoc says:
> Service represents a Spark server "session". If a user wants multiple 'Sparks' in his application the method ignite() should be statically imported and used to create instances.
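Putting the two pieces together, here is a minimal sketch of two Spark apps in one JVM, each bound to its own address via Service.ignite() and ipAddress(). The IP addresses, port, and route paths are placeholders, not anything from the original answer; note that ipAddress() must be called before any route is mapped.

```java
import spark.Service;

public class MultiHomedSpark {
    public static void main(String[] args) {
        // First app, bound to the first interface (example address is an assumption)
        Service app1 = Service.ignite();
        app1.ipAddress("192.168.1.10"); // must come before any route mapping
        app1.port(8080);
        app1.get("/hello", (req, res) -> "Hello from app1");

        // Second app in the same JVM, bound to a different interface
        Service app2 = Service.ignite();
        app2.ipAddress("192.168.1.11");
        app2.port(8080);
        app2.get("/hello", (req, res) -> "Hello from app2");
    }
}
```

Because each Service instance is independent, the two apps can even share the same port number, since they listen on different addresses.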
Answer 2 (score: 0)
Use Spark.ipAddress("my.IP.address");
Reference: javadoc
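For a single app, the static API works the same way; a minimal sketch (the address, port, and route are placeholders, and ipAddress() must again be called before any route mapping):

```java
import static spark.Spark.*;

public class SingleSparkApp {
    public static void main(String[] args) {
        ipAddress("127.0.0.1"); // assumption: bind to loopback; use your server's IP
        port(4567);             // Spark's default port, stated explicitly here
        get("/ping", (req, res) -> "pong");
    }
}
```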