REST requests from multiple clients: collect requests on a per-second basis, process them in bulk, and send a response back to each client

Question
We have a web server that receives hundreds of client requests every second.

- Collect all REST requests from clients on a per-second basis.
- For the collected requests, prepare a bulk request to Elasticsearch and get the response.
- Parse the Elasticsearch response and prepare an individual response for each REST client.
- Send each response back to its REST client.

I can take care of the Elasticsearch bulk request; I need help creating a REST API that solves the problem stated above.

Currently, the server makes a single Elasticsearch call for each client request, which is very expensive, and I need a solution to avoid that.
Answer 1

Score: 1
Well, there are multiple possible solutions, but the general approach is the same: store requests in a thread-safe manner; schedule a job on a separate thread, at the desired rate, that bulk-loads the stored data; and notify the request handlers in some way when the bulk load is complete.

In more detail, it could be done as follows:

- For each request, create a CompletableFuture and put it, along with the received data, into a thread-safe collection. For example, ArrayBlockingQueue<DataAndFutureContainer> or ConcurrentHashMap<DataToLoad, CompletableFuture>.
- Schedule a job at the desired rate via @Scheduled (if using Spring) or Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate().
- That job should retrieve all data available at the moment (for example via ArrayBlockingQueue.poll() or ConcurrentHashMap.remove()), do the bulk load (or whatever else you need), and fulfil each related CompletableFuture via .complete() with its part of the bulk-load result.
- The request handler, which created the future in step 1, waits on it via future.get() and thus gets the result of the bulk load.
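The steps above can be sketched with plain JDK classes. This is a minimal illustration, not a production implementation: the class and type names (RequestBatcher, Pending) are made up for this example, and the "bulk load" is faked by uppercasing each payload in place of a real Elasticsearch bulk call.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

class RequestBatcher {

    // Each pending request carries its payload and the future
    // that its handler thread is waiting on (step 1).
    static final class Pending {
        final String data;
        final CompletableFuture<String> future = new CompletableFuture<>();
        Pending(String data) { this.data = data; }
    }

    private final BlockingQueue<Pending> queue = new ArrayBlockingQueue<>(10_000);
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    RequestBatcher(long periodMillis) {
        // Step 2: drain and bulk-process the queue at a fixed rate.
        scheduler.scheduleAtFixedRate(this::flush, periodMillis, periodMillis,
                TimeUnit.MILLISECONDS);
    }

    // Called from each request-handling thread.
    String handle(String data) throws Exception {
        Pending p = new Pending(data);
        queue.put(p);
        // Step 4: block until the scheduled job completes this future.
        return p.future.get(5, TimeUnit.SECONDS);
    }

    // Step 3: take everything queued so far, "bulk load" it, then
    // complete each request's future with its own slice of the result.
    private void flush() {
        List<Pending> batch = new ArrayList<>();
        queue.drainTo(batch);
        for (Pending p : batch) {
            // Stand-in for mapping one Elasticsearch bulk response
            // item back to the request that produced it.
            p.future.complete(p.data.toUpperCase());
        }
    }

    void shutdown() { scheduler.shutdownNow(); }
}
```

Note that handle() blocks the calling thread until the next flush, so the batching period directly adds to every request's latency; pick it to match how long your clients can wait.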
Also, some optimisations are available depending on your framework. For example, with Spring you could return a DeferredResult from the request handler and use it instead of the CompletableFuture. If you are using Spring WebFlux, you could return Mono.fromFuture().

That's a brief description of one approach; be careful with the details.