Our Server can't handle more than 20 requests/second
Question
So after 3 months of hard work developing and switching the company API from PHP to Go, I found out that our Go server can't handle more than 20 req/second.
So basically how our API works (a rough sketch follows this list):
- takes in a request
- validates the request
- fetches the data from the DB using MySQL
- puts the data in a map
- sends it back to the client in JSON format
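Roughly, each handler looks something like this (a minimal sketch only; the handler name, table, and columns below are placeholders, not our real code):

```go
package api

import (
	"database/sql"
	"encoding/json"
	"net/http"
)

var db *sql.DB // opened elsewhere via the go-sql-driver/mysql driver

func salesReportHandler(w http.ResponseWriter, r *http.Request) {
	// Take in the request and validate it (placeholder check).
	from := r.URL.Query().Get("from")
	if from == "" {
		http.Error(w, "missing 'from' parameter", http.StatusBadRequest)
		return
	}

	// Fetch the data from the DB using MySQL.
	rows, err := db.Query("SELECT id, total FROM sales WHERE created_at >= ?", from)
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer rows.Close()

	// Put the data in a map (one map per row, keyed by column name).
	var report []map[string]interface{}
	for rows.Next() {
		var id int64
		var total float64
		if err := rows.Scan(&id, &total); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		report = append(report, map[string]interface{}{"id": id, "total": total})
	}

	// Send it back to the client in JSON format.
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(report)
}
```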
So after writing about 30 APIs, I decided to take it for a spin and see how it performs under a load test.
Test 1: ab -n 1 -c 1 http://localhost:8000/sales/report. The results are "Time per request: 72.623 [ms] (mean)".
Test 2: ab -n 100 -c 100 http://localhost:8000/sales/report. The results are "Time per request: 4548.155 [ms] (mean)" (no MySQL errors).
How did the number suddenly spike from 72.623 to 4548 ms in the second test? We expect thousands of requests per day, so I need to solve this issue before we finally release it. I was surprised when I saw the numbers; I couldn't believe it. I know Go can do much better.
So basic info about the server and settings (a sketch of the setup follows this list):
- Using Go 1.5
- 16GB RAM
- GOMAXPROCS is using all 8 cores
- db.SetMaxIdleConns(1000)
- db.SetMaxOpenConns(1000) (also made sure we are using a pool of connections)
- Connecting to MySQL through a unix socket
- System is running under Ubuntu
External libraries that we are using:
- github.com/go-sql-driver/mysql
- github.com/gorilla/mux
- github.com/elgs/gosqljson
Any ideas what might be causing this? I took a look at this post, but it didn't help; as I mentioned above, I never got any MySQL error. Thank you in advance for any help you can provide.
Answer 1
Score: 3
Your post doesn't have enough information to address why your program is not performing as you expect, but I think this question alone is worth an answer:
> How did the number suddenly spike from 72.623 to 4548 ms in the second test?
In your first test, you did one single request (`-n 1`). In your second test, you did 100 requests in flight simultaneously (`-c 100 -n 100`).
You mention that your program communicates with an external database, so your program has to wait for that resource to respond. Do you understand how your database performs when you send it 1,000 requests simultaneously? You made no mention of this. Go can certainly handle many hundreds of concurrent requests a second without breaking a sweat, but it depends on what you're doing and how you're doing it. If your program can't complete requests as fast as they are coming in, they will pile up, leading to high latency.
Neither of those tests you told us about is useful for understanding how your server performs under "normal" circumstances - which you said would be "thousands of requests per day" (which isn't very specific, but I'll take it to mean "a few a second"). Then it would be much more interesting to look at `-c 4 -n 1000`, or something that exercises the server over a longer period of time, with a number of concurrent requests more like what you expect to get.
Answer 2
Score: 0
I'm not familiar with the gosqljson package, but you say your "query by itself is not really complicated" and you're running it against "a well structured DB table," so I'd suggest dropping gosqljson, binding your query results to structs, and then marshalling those structs to JSON. That should be faster and incur less memory thrashing than using a map[string]interface{} for everything.
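Something along these lines, for example (a sketch only; the table and column names are made up, just to show the shape of the approach):

```go
package api

import (
	"database/sql"
	"encoding/json"
	"net/http"
)

// SaleRow mirrors the columns returned by the (hypothetical) report query.
type SaleRow struct {
	ID    int64   `json:"id"`
	Total float64 `json:"total"`
}

func salesReport(db *sql.DB, w http.ResponseWriter) {
	rows, err := db.Query("SELECT id, total FROM sales")
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer rows.Close()

	// Scan straight into typed structs instead of map[string]interface{}:
	// a fixed schema and no per-row map allocation.
	report := make([]SaleRow, 0, 64)
	for rows.Next() {
		var s SaleRow
		if err := rows.Scan(&s.ID, &s.Total); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		report = append(report, s)
	}

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(report)
}
```

Because the fields are concrete types, encoding/json can use its cached struct metadata rather than inspecting an interface{} value for every cell, and you skip allocating a map per row.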
But I don't think gosqljson could possibly be that slow, so it's likely not the main culprit.
The way you're doing your second benchmark is not helpful.
> Test 2: ab -n 100 -c 100 http://localhost:8000/sales/report
That's not testing how fast you can handle concurrent requests so much as it's testing how fast you can make connections to MySQL. You're only doing 100 queries and using 100 requests, which means Go is probably spending most of its time making up to 100 connections to MySQL. Go probably doesn't even have time to reuse any of the db connections, considering all the other stuff it's doing to satisfy each query, and then, boom, the test is over. You would need to set the max connections to something like 50 and run 10,000 queries to see how long concurrent requests take once a pool of db connections is already established; right now you're basically testing how long it takes Go to build up a pool of db connections.
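For example, something like this before running a longer benchmark of around 10,000 requests (a sketch only; the numbers are illustrative, not tuned values, and openPool is a hypothetical helper):

```go
package api

import (
	"database/sql"

	_ "github.com/go-sql-driver/mysql"
)

// openPool opens MySQL with a bounded pool so that a long benchmark run
// measures query throughput rather than connection setup.
func openPool(dsn string) (*sql.DB, error) {
	db, err := sql.Open("mysql", dsn)
	if err != nil {
		return nil, err
	}
	db.SetMaxOpenConns(50) // cap in-flight MySQL connections
	db.SetMaxIdleConns(50) // keep established connections around for reuse
	return db, nil
}
```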