Go - deadlocked processes causing memory leaks

Hi all,

Is this a good community to ask a concurrency question specific to Go?

There is an interesting post on StackOverflow asking why memory leaks can occur when a service process doesn't complete due to timeout.

The question is here: http://stackoverflow.com/questions/29323560/bug-detect-go-channels-with-select

func Read(url string, timeout time.Duration) (res *Response) {
    ch := make(chan *Response)
    go func() {
        time.Sleep(time.Millisecond * 300)
        ch <- Get(url)
    }()
    select {
    case res = <-ch:
    case <-time.After(timeout):
        res = &Response{"Gateway timeout\n", 504}
    }
    return
}
Because the channel is unbuffered, the send blocks until a receiver is ready, so the secondary goroutine waits and occupies some memory. When the response is consumed, everything is cleaned up correctly. But when a timeout occurs instead, the select has already moved on: the secondary goroutine is left blocked on its send forever, and that memory is never recovered.

There is a simple proposed solution: use a buffered channel of capacity one. The sending goroutine then never blocks and terminates promptly. The receiving goroutine either consumes the buffered message or ignores it; either way the goroutine exits, its memory is garbage-collected, and no leak occurs.

I have a question: are there other practical solutions to this puzzle that allow unbuffered channels to be used without leaking the blocked goroutine as described?


PS as an aside, it has been my observation that actor systems (in the Erlang and Akka style) can't suffer from this kind of deadlock because they don't allow blocking sends. But do they therefore suffer from subtle memory leaks instead? I think they quite likely do.