Time response for HTTP GET request when using goroutines

Question

I have a simple program that prints the GET response time for each URL listed in a text file (url_list.txt).

When the requests are fired sequentially, the returned times correspond to the expected response times of the individual URLs.

However, when the same code is executed concurrently, the returned response times are typically higher than expected.

It seems that the time_start I capture before http.Get(url) is called is not the time when the request is actually sent. I guess the execution of http.Get(url) is queued to some extent.

Is there a better way to capture URL response times when using goroutines?

Here is my code:

Sequential requests:

package main

import (
	"fmt"
	"net/http"
	"io/ioutil"
	"time"
	"strings"
)

func get_resp_time(url string) {
	time_start := time.Now()
	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(time.Since(time_start), url)
}

func main() {
	content, _ := ioutil.ReadFile("url_list.txt")
	urls := strings.Split(string(content), "\n")

	for _, url := range urls {
		get_resp_time(url)
		//go get_resp_time(url)
	}

	//time.Sleep(20 * time.Second)
}

Concurrent requests:

package main

import (
	"fmt"
	"net/http"
	"io/ioutil"
	"time"
	"strings"
)

func get_resp_time(url string) {
	time_start := time.Now()
	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(time.Since(time_start), url)
}

func main() {
	content, _ := ioutil.ReadFile("url_list.txt")
	urls := strings.Split(string(content), "\n")

	for _, url := range urls {
		//get_resp_time(url)
		go get_resp_time(url)
	}

	time.Sleep(20 * time.Second)
}

Answer 1

Score: 7

You are starting all of the requests at once. If there are thousands of URLs in the file, you are starting thousands of goroutines at the same time. This may work, but it can also produce errors about running out of sockets or file handles. I'd recommend starting only a limited number of fetches at a time, as in the code below.

This should help with the timing as well: with only a bounded number of requests in flight, the individual measurements are no longer inflated by hundreds of downloads competing for the same local bandwidth, sockets, and CPU.

package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"net/http"
	"strings"
	"sync"
	"time"
)

func get_resp_time(url string) {
	time_start := time.Now()
	resp, err := http.Get(url)
	if err != nil {
		log.Printf("Error fetching: %v", err)
		return // resp is nil when err != nil; returning avoids a nil pointer dereference below
	}
	defer resp.Body.Close()
	fmt.Println(time.Since(time_start), url)
}

func main() {
	content, _ := ioutil.ReadFile("url_list.txt")
	urls := strings.Split(string(content), "\n")

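	// Cap how many fetches run at any one time.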
	const workers = 25

	wg := new(sync.WaitGroup)
	in := make(chan string, 2*workers)

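	// Start a fixed pool of workers; each one pulls URLs from the channel until it is closed.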
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for url := range in {
				get_resp_time(url)
			}
		}()
	}

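	// Feed the URLs to the workers, skipping blank lines, then close the channel so the workers exit.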
	for _, url := range urls {
		if url != "" {
			in <- url
		}
	}
	close(in)
	wg.Wait()
}
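
For reference, the programs above read url_list.txt from the working directory and expect one URL per line; blank lines are skipped only in the worker version, via the `if url != ""` check. A placeholder file (these URLs are just examples) could look like:

http://example.com/
http://example.org/
http://example.net/

Each result line is the elapsed time followed by the URL, e.g. something like `187.234112ms http://example.com/`.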

Hope this helps!
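
A side note on the question's concurrent version: if the goal is only to wait for every goroutine to finish instead of sleeping for a fixed 20 seconds, a sync.WaitGroup alone is enough. This is just a minimal sketch; it still launches every request at once, so the timing caveat above still applies:

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"strings"
	"sync"
	"time"
)

func get_resp_time(url string) {
	time_start := time.Now()
	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("error fetching", url, err)
		return // do not touch resp when the request failed
	}
	defer resp.Body.Close()
	fmt.Println(time.Since(time_start), url)
}

func main() {
	content, _ := ioutil.ReadFile("url_list.txt")
	urls := strings.Split(string(content), "\n")

	var wg sync.WaitGroup
	for _, url := range urls {
		if url == "" {
			continue // skip blank lines from the trailing newline
		}
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			get_resp_time(u)
		}(url) // pass url as an argument so each goroutine gets its own copy
	}
	wg.Wait() // block until every fetch has reported its time
}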

