Any better way to keep track of goroutine responses?

Question

I'm trying to get my head around goroutines. I've created a simple program that performs the same search in parallel across multiple search engines. At the moment, to keep track of the number of responses, I count how many I've received. It seems a bit amateur though.

Is there a better way of knowing when I've received a response from all of the goroutines in the following code?

package main

import (
	"fmt"
	"net/http"
	"log"
)

type Query struct {
	url    string
	status string
}

func search(url string, out chan Query) {
	fmt.Printf("Fetching URL %s\n", url)
	resp, err := http.Get(url)

	if err != nil {
		log.Fatal(err)
	}

	defer resp.Body.Close()

	out <- Query{url, resp.Status}
}

func main() {
	searchTerm := "carrot"

	fmt.Println("Hello world! Searching for ", searchTerm)

	searchEngines := []string{
		"http://www.bing.co.uk/?q=",
		"http://www.google.co.uk/?q=",
		"http://www.yahoo.co.uk/?q="}

	out := make(chan Query)

	for i := 0; i < len(searchEngines); i++ {
		go search(searchEngines[i]+searchTerm, out)
	}

	progress := 0

	for {
		// is there a better way of doing this step?
		if progress >= len(searchEngines) {
			break
		}
		fmt.Println("Polling...")
		query := <-out
		fmt.Printf("Status from %s was %s\n", query.url, query.status)
		progress++
	}
}

Answer 1

Score: 12

Use sync.WaitGroup; there is an example in the package documentation:

searchEngines := []string{
	"http://www.bing.co.uk/?q=",
	"http://www.google.co.uk/?q=",
	"http://www.yahoo.co.uk/?q="}
var wg sync.WaitGroup
out := make(chan Query)

for i := 0; i < len(searchEngines); i++ {
	wg.Add(1)
	go func(url string) {
		defer wg.Done()
		fmt.Printf("Fetching URL %s\n", url)
		resp, err := http.Get(url)

		if err != nil {
			log.Fatal(err)
		}

		defer resp.Body.Close()

		// out is unbuffered, so something must be receiving from it
		// for this send (and hence wg.Wait) to complete; see the
		// complete sketch below.
		out <- Query{url, resp.Status}
	}(searchEngines[i] + searchTerm)
}
wg.Wait()
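
One caveat when dropping this snippet into the original main: out is unbuffered, so every send blocks until something receives, and wg.Wait() on its own would deadlock. A common pattern is to close the channel from a separate goroutine once the WaitGroup is done, and let main range over the results. Below is a minimal runnable sketch along those lines, reusing the Query type from the question; the only liberty taken is logging and returning on a failed request instead of calling log.Fatal, so one unreachable engine does not abort the whole run.

package main

import (
	"fmt"
	"log"
	"net/http"
	"sync"
)

type Query struct {
	url    string
	status string
}

func main() {
	searchTerm := "carrot"

	searchEngines := []string{
		"http://www.bing.co.uk/?q=",
		"http://www.google.co.uk/?q=",
		"http://www.yahoo.co.uk/?q="}

	var wg sync.WaitGroup
	out := make(chan Query)

	for _, engine := range searchEngines {
		wg.Add(1)
		go func(url string) {
			defer wg.Done()
			fmt.Printf("Fetching URL %s\n", url)
			resp, err := http.Get(url)
			if err != nil {
				log.Println(err) // log and carry on rather than kill the other queries
				return
			}
			defer resp.Body.Close()
			out <- Query{url, resp.Status}
		}(engine + searchTerm)
	}

	// Close out once every worker has finished, so the range loop below terminates.
	go func() {
		wg.Wait()
		close(out)
	}()

	for query := range out {
		fmt.Printf("Status from %s was %s\n", query.url, query.status)
	}
}

Ranging over out until it is closed replaces the manual progress counter, and the WaitGroup guarantees the channel is closed only after every goroutine has delivered its result.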
