Reading from a StdoutPipe() in Go freezes
Question
I am trying to read from the Stdout of a command, but once every (approx.) 50 times it freezes.
```go
func runProcess(process *exec.Cmd) (string, string, error) {
	var stdout strings.Builder
	var stderr string
	process := exec.Command(programPath, params...)
	go func() {
		pipe, err := process.StderrPipe()
		if err != nil {
			return
		}
		buf, err := io.ReadAll(pipe)
		if err != nil {
			log.Warn("Error reading stderr: %v", err)
		}
		stderr = string(buf)
	}()
	pipe, err := process.StdoutPipe()
	if err = process.Start(); err != nil {
		return "", "", err
	}
	buf := make([]byte, 1024)
	read, err := pipe.Read(buf) // it reads correctly from the pipe
	for err == nil && read > 0 {
		_, err = stdout.Write(buf[:read])
		read, err = pipe.Read(buf) // this is where it stalls
	}
	if err = process.Wait(); err != nil {
		return stdout.String(), stderr, err
	}
	return stdout.String(), stderr, nil
}
```
I've tried to use `stdout, err := io.ReadAll(pipe)` to read everything at once instead of reading chunks, but I get the same behaviour.
The program that is called seems to be executed successfully. Its logfile is created and it is complete. Plus, the first time I read from the pipe (before the loop), all the output is there. But inside the loop, when `.Read()` is called for the second time and should return an EOF (the output is smaller than 1024 bytes), it freezes.
Answer 1
Score: 7
There are many race conditions in this code. In general, if you create a goroutine, there should be some kind of synchronization, such as a `chan`, `sync.Mutex`, `sync.WaitGroup`, or an atomic operation.
Fix the race conditions:

- Call `StderrPipe()` before calling `Start()`. The code does not do this.
- Wait for the goroutine to finish before returning.
The race condition could corrupt the `exec.Cmd` structure... which could mean that it leaks a pipe, which would explain why `Read()` hangs (because a write end of the pipe wasn't closed).
As a rule of thumb, always fix race conditions. Consider them to be high-priority bugs.
Here is a sketch of how you could write it without race conditions:
```go
func runProcess(process *exec.Cmd) (stdout, stderr string, err error) {
	outPipe, err := process.StdoutPipe()
	if err != nil {
		return "", "", err
	}
	// Call StderrPipe BEFORE Start().
	// Easy way to do it: outside the goroutine.
	errPipe, err := process.StderrPipe()
	if err != nil {
		return "", "", err
	}
	// Start process.
	if err := process.Start(); err != nil {
		return "", "", err
	}
	// Read stderr in goroutine.
	var wg sync.WaitGroup
	var stderrErr error
	wg.Add(1)
	go func() {
		defer wg.Done()
		data, err := io.ReadAll(errPipe)
		if err != nil {
			stderrErr = err
		} else {
			stderr = string(data)
		}
	}()
	// Read stdout in main thread.
	data, stdoutErr := io.ReadAll(outPipe)
	// Wait until we are done reading stderr.
	wg.Wait()
	// Wait for process to finish.
	if err := process.Wait(); err != nil {
		return "", "", err
	}
	// Handle error from reading stdout.
	if stdoutErr != nil {
		return "", "", stdoutErr
	}
	// Handle error from reading stderr.
	if stderrErr != nil {
		return "", "", stderrErr
	}
	stdout = string(data)
	return stdout, stderr, nil
}
```
Much Simpler Code

All of this is done by the `os/exec` package automatically. You can use any `io.Writer` for `Stdout` and `Stderr`; you are not limited to `*os.File`.
```go
func runProcess(process *exec.Cmd) (stdout, stderr string, err error) {
	var stdoutbuf, stderrbuf bytes.Buffer
	process.Stdout = &stdoutbuf
	process.Stderr = &stderrbuf
	if err := process.Run(); err != nil {
		return "", "", err
	}
	return stdoutbuf.String(), stderrbuf.String(), nil
}
```