Why treating AI as a co-pilot can transform your coding sessions
Unlock AI's real value with practical tips to reduce friction and accelerate builds.
Burr Sutter, senior director of developer experience at Red Hat, sat down with the All Things Open team to share how platform engineering and AI copilots can restore developer flow, shorten feedback loops, and make day-to-day work actually enjoyable again.
Read more: Deep dive into the Model Context Protocol
Burr opens with a story about wasted hours and stalled momentum, explaining how slow access to simple resources, like a Kafka broker or a test database, signals to engineers that their time is not valued. He’s seen teams wait weeks for basic access, and he’s watched developers quietly take time off because they knew they’d be stuck waiting. Burr’s point is simple: fix the developer experience by removing friction, automating common paths, and giving engineers fast, safe access to the tools they need.
On AI, Burr is optimistic but pragmatic, calling models useful co-pilots rather than replacements. He’s used ChatGPT, Cursor, Cline, Claude Sonnet, and other assistants to retrain himself across stacks and to speed up routine work. The pattern he describes is familiar: ask the model for a starting point, try the code, feed the error back, and repeat. That loop accelerates learning and productivity, but Burr warns that the output rarely ships as-is, so developers must verify, refine, and shape the results to meet real business requirements.
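To make that loop concrete, here is a minimal sketch of the ask-try-feed-back cycle. The OpenAI Python client, the gpt-4o model name, the prompt, and the three-attempt cap are all illustrative assumptions rather than Burr's actual setup; any coding assistant or chat API slots into the same pattern, and a person still reviews whatever comes out.

```python
# Illustrative sketch of the ask -> try -> feed back the error -> repeat loop.
# The OpenAI client, model name, and prompt below are assumptions for this example,
# not Burr's actual tooling; swap in whichever assistant or API you use.
import subprocess
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def ask(messages):
    """Send the conversation so far and return the model's reply text."""
    resp = client.chat.completions.create(model="gpt-4o", messages=messages)
    return resp.choices[0].message.content


messages = [{
    "role": "user",
    "content": ("Write a Python script that reads data.csv and prints the row count. "
                "Reply with raw Python only, no Markdown fences."),
}]

for attempt in range(3):  # cap the loop; the final result still needs human review
    code = ask(messages)
    result = subprocess.run(["python", "-c", code], capture_output=True, text=True)
    if result.returncode == 0:
        print(result.stdout)
        break
    # Feed the error back and ask for a fix: the heart of the iterate step.
    messages.append({"role": "assistant", "content": code})
    messages.append({"role": "user",
                     "content": f"That failed with:\n{result.stderr}\nPlease fix the script."})
```

The value is in the tight feedback loop, not in any particular model: each failed run becomes the next prompt, and the developer stays in charge of deciding when the result is actually good enough to keep.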
Burr’s advice for learning and tool choice is architectural, not tribal. Start by understanding application structure and patterns, then let AI help you explore languages and frameworks, whether Java, Python, Node.js, Spring Boot, or Prisma. He also highlights practical tools that accelerate the work, like Cline paired with a Claude 3.7 Sonnet model for scaffolding example apps, and MCP (Model Context Protocol) servers for connecting models to external tools and data. Finally, Burr urges developers to stay plugged into conferences and open source communities, because the best shortcuts come from shared practice and real human conversations.
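For readers who haven’t tried MCP yet, the sketch below shows roughly what a small server looks like using the Model Context Protocol’s Python SDK and its FastMCP helper. The server name and the single tool are invented for illustration, and the quickstart-style API shown here is an assumption about the SDK rather than anything from Burr’s talk.

```python
# A minimal, hypothetical MCP server exposing one tool, written against the
# Model Context Protocol Python SDK's FastMCP helper. The server name and the
# tool are invented for illustration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")


@mcp.tool()
def row_count(path: str) -> int:
    """Count the data rows in a local CSV file (excluding the header)."""
    with open(path, encoding="utf-8") as f:
        return sum(1 for _ in f) - 1


if __name__ == "__main__":
    # Serve over stdio so an MCP-aware assistant can discover and call the tool.
    mcp.run()
```

Once a server like this is running, an MCP-aware assistant such as Cline can list the tool and call it mid-conversation, which is the kind of model-context workflow Burr points to.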
Key takeaways
- Remove friction with platform engineering, automate common requests, and give engineers fast access to resources.
- Use AI as a co-pilot, not a replacement: iterate on model outputs and always verify them against business needs.
- Learn architecture first, then apply AI to explore languages and frameworks, and stay active in open source communities and conferences.
Conclusion
Burr’s message is practical: invest in platform engineering to speed up teams, experiment with AI copilots to accelerate learning, and keep human judgment central to shipping reliable software. Do the platform work, use AI to jump-start tasks, always validate what the models produce, and keep showing up in the community to learn and share.
More from We Love Open Source
- What is prompt engineering?
- Deep dive into the Model Context Protocol
- How Acorn Labs is rethinking AI adoption with Obot
- Why AI agents are the future of web navigation
- 3 key metrics for reliable LLM performance
The opinions expressed on this website are those of each author, not of the author's employer or All Things Open/We Love Open Source.