We ❤️ Open Source

A community education resource

Stop guessing, start measuring developer engagement

Creating checkpoints that reveal exactly where your onboarding fails.

Most companies create developer content and documentation but struggle to measure whether any of it actually works. In this episode, Jono Bacon, CEO of Stateshift, joins the We Love Open Source podcast to share why developer engagement requires a systematic approach built around measurable gates, and how AI is revealing nuances in onboarding experiences that would have been impossible to detect before.


Jono breaks developer onboarding down into specific gates that track real behavior, not vanity metrics:

  • Gate one: website visits and signups.
  • Gate two: a single API call, proof someone actually did something.
  • Gate three: 10 API calls, meaning they completed a quickstart guide.
  • Gate four: 100 API calls, indicating production testing.
  • Gate five: 500 API calls, signaling active engagement.

This systematic breakdown identifies exactly where people drop off, but historically you needed deep expertise to analyze that data. You had to know how to ask the right questions.
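The gate model above can be sketched in a few lines of code. The gate names and API-call thresholds come from the episode; the function names, data shapes, and funnel report are illustrative assumptions, not Stateshift's actual implementation.

```python
# Thresholds from the episode, checked highest first.
# These tuples and the functions below are an illustrative sketch.
GATES = [
    (5, 500),   # Gate 5: 500+ API calls  -> active engagement
    (4, 100),   # Gate 4: 100+ API calls  -> production testing
    (3, 10),    # Gate 3: 10+ API calls   -> completed a quickstart
    (2, 1),     # Gate 2: 1+ API call     -> proof they did something
]

def gate_reached(signed_up: bool, api_calls: int) -> int:
    """Return the highest gate a developer has passed (0 = none)."""
    for gate, threshold in GATES:
        if api_calls >= threshold:
            return gate
    return 1 if signed_up else 0  # Gate 1: visited the site and signed up

def drop_off_report(cohort):
    """Count developers stalled at each gate to see where the funnel leaks."""
    counts = {gate: 0 for gate in range(6)}
    for signed_up, api_calls in cohort:
        counts[gate_reached(signed_up, api_calls)] += 1
    return counts
```

For example, a cohort where most signups never make a first API call would show a large count at gate one, pointing directly at a broken getting-started experience rather than a weak product.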

AI changed that equation. Jono’s team loads specific data into AI tools and discovers nuances they would never detect manually. In one example, AI identified that a company’s social media messaging was slightly off, something the team wouldn’t have caught otherwise. This democratizes insights that previously required expensive consultants. Jono also uses tools like Peec to track how often brand names and content appear in LLM outputs, running daily prompts to monitor visibility. Their recognition was struggling, but tracking that metric helped focus their content strategy and drive results up.

The biggest mistake organizations make is not measuring at all. Companies attend conferences like All Things Open but can’t quantify the value because they don’t track whether developers actually use their products afterward. Without systematic measurement, there’s no way to improve. Jono’s advice: embrace iteration and get comfortable with failure, because failure delivers good information. Like stand-up comedians who bomb repeatedly in small clubs before perfecting their Netflix special, product teams need to iterate constantly to figure out what works.


Key takeaways

  • Break onboarding into measurable gates. Track specific behaviors like first API call, completing quickstart, production testing, and active usage to identify exactly where people drop off.
  • AI democratizes engagement insights. Loading onboarding data into AI tools reveals nuances that would be impossible to detect manually or would require expensive consultants.
  • Measure or you won’t improve. Companies can’t quantify impact without systematic measurement across the entire developer journey.

Developer engagement isn’t mysterious; it just requires breaking the experience into measurable steps and using the right tools to understand what’s working. Jono’s systematic approach turns developer onboarding from guesswork into a process you can actually optimize.

The opinions expressed on this website are those of each author, not of the author's employer or All Things Open/We Love Open Source.

