
Concurrency basics: goroutines change the default shape

Build intuition for goroutines and channels from a Python async and threading background.

By the end of this lesson you can

  • Start concurrent work with goroutines when appropriate
  • Use channels or another clear coordination mechanism
  • Account for error handling instead of ignoring failures

Overview

If you come from Python, concurrency often means threads, multiprocessing, or `asyncio`. Go gives you a lighter-weight default and a standard way to coordinate work.

In Python, you often choose between async code for I/O-heavy tasks and threads or processes when concurrency pressure grows.

In Go, the common pattern is to launch a goroutine for concurrent work and use channels or synchronization primitives to coordinate results.

Why this difference matters

The mental shift is that concurrency stops being a specialized subsystem and becomes part of ordinary application design.

Python

result = await fetch_user(user_id)

Go

ch := make(chan User)
go func() {
    user, _ := fetchUser(userID) // error discarded for brevity; real code must handle it
    ch <- user
}()
result := <-ch
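The snippet above drops the error from fetchUser, which the lesson's own objectives warn against. A minimal error-aware sketch, with stub definitions of User and fetchUser standing in for the lesson's hypothetical API:

```go
package main

import "fmt"

// User and fetchUser are stand-ins for the lesson's hypothetical API.
type User struct {
	ID   int
	Name string
}

func fetchUser(id int) (User, error) {
	return User{ID: id, Name: fmt.Sprintf("user-%d", id)}, nil
}

// userResult bundles the value with its error so neither is dropped
// when crossing the channel.
type userResult struct {
	user User
	err  error
}

func main() {
	userID := 42
	ch := make(chan userResult, 1) // buffered: the send succeeds even if the receiver bails out

	go func() {
		user, err := fetchUser(userID)
		ch <- userResult{user: user, err: err}
	}()

	res := <-ch
	if res.err != nil {
		fmt.Println("fetch failed:", res.err)
		return
	}
	fmt.Println("got", res.user.Name)
}
```

Bundling the value and error into one struct keeps the channel as the single coordination point, mirroring Go's usual `value, err` return shape.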

Deeper comparison

Python version

async def load_dashboard(user_id):
    profile_task = asyncio.create_task(fetch_profile(user_id))
    stats_task = asyncio.create_task(fetch_stats(user_id))
    profile = await profile_task
    stats = await stats_task
    return profile, stats

Go version

func loadDashboard(userID int) (Profile, Stats, error) {
    // Buffered channels (capacity 1) let each goroutine's send complete even
    // if we return early on the other call's error; with unbuffered channels,
    // an early return would strand the other goroutine on its send forever.
    profileCh := make(chan struct {
        profile Profile
        err     error
    }, 1)
    statsCh := make(chan struct {
        stats Stats
        err   error
    }, 1)

    go func() {
        profile, err := fetchProfile(userID)
        profileCh <- struct {
            profile Profile
            err     error
        }{profile: profile, err: err}
    }()

    go func() {
        stats, err := fetchStats(userID)
        statsCh <- struct {
            stats Stats
            err   error
        }{stats: stats, err: err}
    }()

    profileResult := <-profileCh
    if profileResult.err != nil {
        return Profile{}, Stats{}, profileResult.err
    }

    statsResult := <-statsCh
    if statsResult.err != nil {
        return Profile{}, Stats{}, statsResult.err
    }

    return profileResult.profile, statsResult.stats, nil
}
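The channel-based version above is explicit but verbose. An alternative sketch using sync.WaitGroup: each goroutine writes to its own variables (so no mutex is needed), and Wait guarantees both have finished before any error check. The Profile, Stats, and fetcher stubs are assumptions standing in for the lesson's hypothetical API.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// Stand-ins for the lesson's hypothetical types and fetchers.
type Profile struct{ Name string }
type Stats struct{ Posts int }

func fetchProfile(userID int) (Profile, error) {
	return Profile{Name: fmt.Sprintf("user-%d", userID)}, nil
}

func fetchStats(userID int) (Stats, error) {
	if userID < 0 {
		return Stats{}, errors.New("invalid user")
	}
	return Stats{Posts: 10}, nil
}

// loadDashboard runs both fetches concurrently. Each goroutine writes only
// to its own variables, so there is no data race, and wg.Wait ensures both
// goroutines are done before we read the results: no leaks, no pending sends.
func loadDashboard(userID int) (Profile, Stats, error) {
	var (
		wg         sync.WaitGroup
		profile    Profile
		profileErr error
		stats      Stats
		statsErr   error
	)

	wg.Add(2)
	go func() {
		defer wg.Done()
		profile, profileErr = fetchProfile(userID)
	}()
	go func() {
		defer wg.Done()
		stats, statsErr = fetchStats(userID)
	}()
	wg.Wait()

	if profileErr != nil {
		return Profile{}, Stats{}, profileErr
	}
	if statsErr != nil {
		return Profile{}, Stats{}, statsErr
	}
	return profile, stats, nil
}

func main() {
	p, s, err := loadDashboard(42)
	fmt.Println(p.Name, s.Posts, err)
}
```

In production code, `golang.org/x/sync/errgroup` packages this exact pattern (spawn, wait, first error wins) behind a small API.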

Reflect

When does Go concurrency feel simpler than Python async, and when does it demand more discipline from you?

What a strong answer notices

A strong answer mentions that starting concurrent work is lightweight, but coordinating results and avoiding leaks requires deliberate structure.

Rewrite

Translate this Python async shape into a Go-style concurrent design.

Rewrite this Python

async def fetch_pair(user_id):
    user = await fetch_user(user_id)
    posts = await fetch_posts(user_id)
    return user, posts

What good looks like

  • Starts concurrent work with goroutines when appropriate
  • Uses channels or another clear coordination mechanism
  • Accounts for error handling instead of ignoring failures
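One possible shape for this rewrite, shown as a hedged sketch: the two fetches are independent (both take only the user ID), so they can run concurrently, with buffered channels so an early error return cannot strand a goroutine. The stub types and fetchers are assumptions.

```go
package main

import "fmt"

// Stand-ins for the exercise's hypothetical API.
type User struct{ ID int }
type Post struct{ Title string }

func fetchUser(id int) (User, error)    { return User{ID: id}, nil }
func fetchPosts(id int) ([]Post, error) { return []Post{{Title: "hello"}}, nil }

type userResult struct {
	user User
	err  error
}
type postsResult struct {
	posts []Post
	err   error
}

func fetchPair(userID int) (User, []Post, error) {
	// Capacity 1 means each goroutine's send completes even if we return early.
	userCh := make(chan userResult, 1)
	postsCh := make(chan postsResult, 1)

	go func() {
		u, err := fetchUser(userID)
		userCh <- userResult{u, err}
	}()
	go func() {
		p, err := fetchPosts(userID)
		postsCh <- postsResult{p, err}
	}()

	// Receive both results before checking errors so neither goroutine
	// is left holding an unread value.
	ur := <-userCh
	pr := <-postsCh
	if ur.err != nil {
		return User{}, nil, ur.err
	}
	if pr.err != nil {
		return User{}, nil, pr.err
	}
	return ur.user, pr.posts, nil
}

func main() {
	u, posts, err := fetchPair(7)
	fmt.Println(u.ID, len(posts), err)
}
```

Note the design change from the Python original: the sequential `await`s become two goroutines running at once, which is only valid because fetch_posts does not depend on the fetched user.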

Practice

Design a Go function that loads a user profile and a billing summary concurrently, then returns a combined response only when both succeed.

Success criteria

  • Makes the coordination strategy explicit
  • Describes what happens if one concurrent call fails
  • Keeps the final response assembly separate from the concurrent fetch steps

Common mistakes

  • Adding goroutines before the sequential design is clear.
  • Ignoring cancellation, cleanup, or error propagation when concurrent work fans out.
  • Treating channels like magic queues instead of explicit coordination tools.

Takeaways

  • Concurrency in Go stops being a specialized subsystem and becomes part of ordinary application design.
  • Starting concurrent work is lightweight, but coordinating results and avoiding leaks requires deliberate structure.
  • Make the coordination strategy explicit, including what happens when one concurrent call fails.