How to trigger Spark Java Web Framework request programmatically
Question
Is there a way for Spark Framework to trigger a request programmatically? Say we have:
http.get("/hello/:route_param") { "Hello Spark!" }
How could it be called, with the proper path, query, body etc., something like:
http.call(
url = "/hello/alex?a=b",
body = "{ value: 20 }"
) // => "Hello Spark!"
P.S. I need this in order to add a batch /batch route, so it will be possible to call it with a list of other routes and parameters and get back a list of results.
Answer 1
Score: 1
If by "programmatically" you mean "without firing up an HTTP server and performing an HTTP request", then I think the answer is no. Spark does not provide such a capability out of the box.
There are a few possible workarounds:
- You could start Spark in your application and fire an HTTP request. An example of this approach is in an integration test I wrote for my Spark add-on; a sketch of the idea is shown after this list.
- Use MockRunner's Servlet module to create a (fake) HttpServletRequest. Use RequestResponseFactory.create(HttpServletRequest) to turn that into a Spark Request. Similarly for responses. Refactor the Spark Route to be a separate class, and invoke it using the Request and Response variables that you just created; see the second sketch below.
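A minimal sketch of the first workaround, assuming Spark's default port 4567 and the JDK 11+ HttpClient; the class name and the route registered here are illustrative, not part of the original answer:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import static spark.Spark.awaitInitialization;
import static spark.Spark.get;
import static spark.Spark.stop;

public class InProcessCallExample {
    public static void main(String[] args) throws Exception {
        // Register the route and wait until the embedded server has started.
        get("/hello/:route_param", (req, res) -> "Hello Spark!");
        awaitInitialization();

        // Call the route over HTTP from the same JVM (Spark listens on 4567 by default).
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:4567/hello/alex?a=b"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body()); // => "Hello Spark!"
        stop();
    }
}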
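And a sketch of the MockRunner approach. It assumes MockRunner's MockHttpServletRequest/MockHttpServletResponse with their setRequestURI/setupAddParameter setters, and Spark's RequestResponseFactory.create overloads for both request and response; check these against the library versions you use. HelloRoute is a hypothetical class holding the extracted route logic:

import com.mockrunner.mock.web.MockHttpServletRequest;
import com.mockrunner.mock.web.MockHttpServletResponse;

import spark.Request;
import spark.RequestResponseFactory;
import spark.Response;
import spark.Route;

public class DirectRouteCallExample {

    // The route logic pulled out of the http.get(...) registration into its own class.
    static class HelloRoute implements Route {
        @Override
        public Object handle(Request request, Response response) {
            return "Hello Spark!";
        }
    }

    public static void main(String[] args) throws Exception {
        // Describe the call we want to simulate on a fake servlet request.
        MockHttpServletRequest servletRequest = new MockHttpServletRequest();
        servletRequest.setMethod("GET");
        servletRequest.setRequestURI("/hello/alex");
        servletRequest.setupAddParameter("a", "b");
        MockHttpServletResponse servletResponse = new MockHttpServletResponse();

        // Wrap the mocks in Spark's own Request/Response types.
        Request request = RequestResponseFactory.create(servletRequest);
        Response response = RequestResponseFactory.create(servletResponse);

        // Invoke the route directly, without any HTTP server.
        Object result = new HelloRoute().handle(request, response);
        System.out.println(result); // => "Hello Spark!"
    }
}

Note that with this approach route parameters such as :route_param are not resolved automatically, because no route matching takes place; the route only sees whatever you put on the fake servlet request.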
Comments