Generator Concurrency Pattern in Go: A Comprehensive Guide
In our previous post, we explored and visualized the basics of goroutines and channels, the building blocks of Go’s concurrency. Now, let’s look at how these primitives combine to form powerful patterns that solve real-world problems.
In this post we'll cover the Generator pattern and try to visualize it. So gear up, because we'll be hands-on throughout the process.
Generator Pattern
A generator is like a fountain that continuously produces values that we can consume whenever needed.
In Go, it’s a function that produces a stream of values and sends them through a channel, allowing other parts of our program to receive these values on demand.
Let’s look at an example:
package main

import "fmt"

// generateNumbers creates a generator that produces numbers from 1 to max
func generateNumbers(max int) chan int {
    out := make(chan int)

    // Launch a goroutine to generate numbers
    go func() {
        // Important: Always close the channel when done
        defer close(out)
        for i := 1; i <= max; i++ {
            out <- i
        }
    }()

    return out
}

func main() {
    // Create a generator that produces numbers 1-5
    numbers := generateNumbers(5)

    // Receive values from the generator
    for num := range numbers {
        fmt.Println("Received:", num)
    }
}
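Running this program, the output is deterministic: each send blocks until main receives it, so the numbers arrive in order and the loop ends once the channel is closed.

Received: 1
Received: 2
Received: 3
Received: 4
Received: 5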
In this example, our generator function does three key things:
- Creates a channel to send values
- Launches a goroutine to generate values
- Returns the channel immediately for consumers to use
Why Use Generators?
- Separate value production from consumption
- Generate values on-demand (lazy evaluation)
- Can represent infinite sequences without consuming infinite memory (see the sketch after this list)
- Allow concurrent production and consumption of values
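Here's a minimal sketch of that infinite-sequence point; generateSquares is a made-up example, not part of the code above. It produces perfect squares forever, yet memory use stays constant because each value is only computed when the consumer is ready to receive it.

package main

import "fmt"

// generateSquares produces an endless stream of perfect squares.
func generateSquares() chan int {
    out := make(chan int)
    go func() {
        // No close here: the stream never ends on its own.
        for i := 1; ; i++ {
            out <- i * i // blocks until the consumer asks for the next value
        }
    }()
    return out
}

func main() {
    squares := generateSquares()
    // Take only the first five values; the rest are never computed.
    for i := 0; i < 5; i++ {
        fmt.Println(<-squares)
    }
    // The generator goroutine stays blocked on its next send, which is fine
    // here because the program exits when main returns.
}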
Real-world Use Case
Reading large files line by line:
func generateLines(filename string) chan string {
    out := make(chan string)
    go func() {
        defer close(out)

        file, err := os.Open(filename)
        if err != nil {
            // Note: the error is silently dropped here; see "Not Handling Errors" below
            return
        }
        defer file.Close()

        scanner := bufio.NewScanner(file)
        for scanner.Scan() {
            out <- scanner.Text()
        }
    }()
    return out
}
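Consuming it looks just like the numbers example: range over the channel until the file runs out of lines. The "access.log" path below is only a placeholder.

for line := range generateLines("access.log") {
    fmt.Println(line)
}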
You might be wondering what sets this apart: we can generate a sequence of data, or read a file line by line, without goroutines too. Let's try to visualize both cases:
Without goroutines
Here you have to wait for everything to be ready before you can start processing.
// Traditional approach
func getNumbers(max int) []int {
    numbers := make([]int, max)
    for i := 1; i <= max; i++ {
        numbers[i-1] = i
        // Imagine some heavy computation here
        time.Sleep(100 * time.Millisecond)
    }
    return numbers
}
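At the call site, nothing can happen until the whole slice is ready; with five values at 100ms each, that's roughly half a second of waiting before the first number is even touched:

numbers := getNumbers(5) // blocks here for ~500ms while the slice is built
for _, n := range numbers {
    fmt.Println("Processing:", n) // only starts after everything is generated
}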
With goroutines
You can start processing the data while the data is still being generated.
// Generator approach
func generateNumbers(max int) chan int {
    out := make(chan int)
    go func() {
        defer close(out)
        for i := 1; i <= max; i++ {
            out <- i
            // Same heavy computation
            time.Sleep(100 * time.Millisecond)
        }
    }()
    return out
}
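The call site barely changes, but the behavior does: the first value is available almost immediately, and later values stream in roughly every 100ms while we're busy processing the current one.

for num := range generateNumbers(5) {
    // Generation and processing now overlap instead of running back to back.
    fmt.Println("Processing:", num)
}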
Key Benefits of the Generator Pattern:
- Non-Blocking Execution: Generation and processing happen simultaneously
- Memory Efficiency: Values are generated and processed one at a time, so the whole data set never has to sit in memory at once, avoiding memory spikes
- Infinite Sequences: Can generate infinite sequences without memory issues
- Backpressure Handling: If your consumer is slow, the generator naturally slows down (due to channel blocking), preventing memory overload.
// Generator naturally handles slow consumers (backpressure handling)
for line := range generateLines(bigFile) {
    // Take our time processing each line
    // Generator won't overwhelm us with data
    processSlowly(line)
}
Common Pitfalls and Solutions
- Forgetting to Close Channels
// Wrong ❌
func badGenerator() chan int {
    out := make(chan int)
    go func() {
        for i := 1; i <= 5; i++ {
            out <- i
        }
        // Channel never closed!
    }()
    return out
}

// Right ✅
func goodGenerator() chan int {
    out := make(chan int)
    go func() {
        defer close(out) // Always close when done
        for i := 1; i <= 5; i++ {
            out <- i
        }
    }()
    return out
}
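To see why the close matters: a range loop only terminates when the channel is closed, so the bad version delivers all five values and then leaves the consumer waiting forever.

for n := range badGenerator() {
    fmt.Println(n)
}
// After the 5th value this loop never exits. If main has nothing else running,
// the program dies with: fatal error: all goroutines are asleep - deadlock!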
- Not Handling Errors
// Better approach with error handling
func generateWithErrors() (chan int, chan error) {
    out := make(chan int)
    errc := make(chan error, 1) // Buffered channel for error

    go func() {
        defer close(out)
        defer close(errc)
        for i := 1; i <= 5; i++ {
            if i == 3 {
                errc <- fmt.Errorf("error at number 3")
                return
            }
            out <- i
        }
    }()

    return out, errc
}
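Here's a minimal sketch of one way to consume this pair of channels: drain the value channel first, then check the error channel. Because errc is buffered and closed by the generator, the final receive yields either the sent error or nil.

out, errc := generateWithErrors()
for n := range out {
    fmt.Println("Received:", n)
}
if err := <-errc; err != nil {
    fmt.Println("generator stopped early:", err)
}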
- Resource Leaks: When using generators with resources (like files), ensure proper cleanup:
func generateFromFile(filename string) chan string {
    out := make(chan string)
    go func() {
        defer close(out)

        file, err := os.Open(filename)
        if err != nil {
            return
        }
        defer file.Close() // Important: Close file when done

        scanner := bufio.NewScanner(file)
        for scanner.Scan() {
            out <- scanner.Text()
        }
    }()
    return out
}
That wraps up our deep dive into the Generator pattern! Coming up next, we’ll explore the Pipeline concurrency pattern, where we’ll see how to chain our generators together to build powerful data processing flows.
If you found this post helpful, have any questions, or want to share your own experiences with generators — I’d love to hear from you in the comments below. Your insights and questions help make these explanations even better for everyone.
Stay tuned for more Go concurrency patterns! 🚀