How does Uvicorn / FastAPI handle concurrency with 1 worker and a synchronous endpoint?

# Question

I am trying to understand the behavior of Uvicorn. I have created a sample FastAPI app which mainly sleeps for 5 seconds.
```python
import time
from datetime import datetime

from fastapi import FastAPI

app = FastAPI()

counter = 0

@app.get("/")
def root():
    global counter
    counter = counter + 1
    my_id = counter
    print(f'I ({my_id}) am feeling sleepy')
    time.sleep(5)
    print(f'I ({my_id}) am done sleeping')
    return {}
```
I called my app with the following Apache Bench command:

```
ab -n 5 -c 5 http://127.0.0.1:8000/
```
Output:

```
I (1) am feeling sleepy -- 0s
I (1) am done sleeping -- 5s
I (2) am feeling sleepy -- 5s
I (3) am feeling sleepy -- 5s
I (4) am feeling sleepy -- 5s
I (5) am feeling sleepy -- 5s
I (2) am done sleeping -- 10s
I (4) am done sleeping -- 10s
I (3) am done sleeping -- 10s
I (5) am done sleeping -- 10s
```
Why are the requests running concurrently? I ran the app as:

```
uvicorn main:app --workers 1
```
Please note that I did not use the `async` keyword, so to me everything should be completely synchronous.
From the FastAPI docs:
> When you declare a path operation function with normal def instead of
> async def, it is run in an external threadpool that is then awaited,
> instead of being called directly (as it would block the server).
Where is this threadpool? Since I am using `time.sleep`, I thought the only available worker would be completely blocked.
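For comparison, here is a minimal, hypothetical sketch (not part of the app above) of the same endpoint declared with `async def` while keeping the blocking `time.sleep`. Since `async def` path operations are awaited directly on the event loop, this variant really would serialize the requests:

```python
import time

from fastapi import FastAPI

app = FastAPI()

# Hypothetical variant, for comparison only: the same blocking sleep inside
# an `async def` endpoint. The coroutine runs directly on the event loop, so
# time.sleep(5) blocks the whole server and five concurrent requests would
# finish one after another (roughly 25 seconds in total).
@app.get("/")
async def root():
    time.sleep(5)
    return {}
```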
# Answer 1

**Score**: 3
FastAPI uses Starlette, which uses AnyIO behind the scenes. By default, a thread pool of 40 threads is provided to handle synchronous requests; this thread pool is the magic behind the concurrent execution of synchronous requests.
This can be configured:
```python
from anyio.lowlevel import RunVar
from anyio import CapacityLimiter
from fastapi import FastAPI

app = FastAPI()

@app.on_event("startup")
def startup():
    print("start")
    # Shrink the default AnyIO thread pool so at most 2 sync requests run at once
    RunVar("_default_thread_limiter").set(CapacityLimiter(2))
```
Source: https://github.com/tiangolo/fastapi/issues/4221#issuecomment-982260467
Kudos to @MatsLindh.
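To make the thread pool visible, here is a small sketch (my addition, not from the linked issue), assuming anyio ≥ 3 where `anyio.to_thread.current_default_thread_limiter()` is available. It prints the pool's capacity at startup and the worker thread that handles each request:

```python
import threading
import time

import anyio.to_thread
from fastapi import FastAPI

app = FastAPI()

@app.on_event("startup")
async def show_pool_size():
    # The default CapacityLimiter caps how many `def` endpoints can run
    # in worker threads at the same time (40 tokens by default).
    limiter = anyio.to_thread.current_default_thread_limiter()
    print(f'thread pool size: {limiter.total_tokens}')

@app.get("/")
def root():
    # A plain `def` endpoint is dispatched to AnyIO's worker thread pool,
    # so each concurrent request is handled on its own worker thread.
    print(f'handled by {threading.current_thread().name}')
    time.sleep(5)
    return {}
```

Hitting this with the same `ab -n 5 -c 5` run should print five different thread names, which is why the sleeps overlap even with a single Uvicorn worker process.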