Is there a way to run multiple Spark Java server instances in the same JVM? I'm using it in "plugin" software, and depending on the external environment multiple instances of my plugin may be started, which then leads to
```
java.lang.IllegalStateException: This must be done before route mapping has begun
    at spark.SparkBase.throwBeforeRouteMappingException(SparkBase.java:256)
    at spark.SparkBase.port(SparkBase.java:101)
    at com.foo.bar.a(SourceFile:59)
```
As far as I can tell, Spark is built around static fields, so I'm considering a classloader trick, or some kind of SparkServerFactory that would let me get rid of SparkBase.
Solution
Since Spark 2.5 you can use ignite():
http://sparkjava.com/news.html#spark25released
Example:

```java
public static void main(String[] args) {
    igniteFirstSpark();
    igniteSecondSpark();
}

static void igniteSecondSpark() {
    Service http = ignite();

    http.get("/basicHello", (q, a) -> "Hello from port 4567!");
}

static void igniteFirstSpark() {
    Service http = ignite()
                       .port(8080)
                       .threadPool(20);

    http.get("/configuredHello", (q, a) -> "Hello from port 8080!");
}
```
I personally initialize them like this:

```java
import spark.Service;

public static void main(String[] args) {
    Service service1 = Service.ignite().port(8080).threadPool(20);
    Service service2 = Service.ignite().port(8081).threadPool(10);
}
```
I also suggest reading about how to use those services outside your main method, which I think would be useful here.
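For the plugin scenario in the question, one way to move a `Service` out of `main` is to let each plugin instance own its own `Service`. The sketch below assumes this structure; the `PluginServer` class, its constructor parameter, and the `/status` route are hypothetical names, not part of the Spark API.

```java
import spark.Service;

// A minimal sketch: each plugin instance owns one embedded server.
// Class and route names here are illustrative, not from the Spark docs.
public class PluginServer {
    private final Service http;

    public PluginServer(int port) {
        // Service.ignite() creates an independent Spark instance,
        // so several plugins can coexist in one JVM on different ports.
        http = Service.ignite().port(port);
        http.get("/status", (req, res) -> "plugin on port " + port);
    }

    public void shutdown() {
        // stop() shuts down only this instance's server,
        // leaving other plugins' servers running.
        http.stop();
    }
}
```

A plugin host could then create one `PluginServer` per plugin and call `shutdown()` when the plugin is unloaded, instead of touching Spark's static API at all.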