We ❤️ Open Source

A community education resource


Measuring what really works in open source

Looking beyond stars and forks to understand what keeps contributors engaged.

Open source communities often celebrate visible success: spikes in stars, forks, and mentions across social platforms. These are encouraging signs, but they only describe what can be seen from above the surface. The real health of a project depends on what happens underneath: the unmeasured behaviors, perceptions, and patterns that explain why contributors stay, drift away, or disengage quietly.

Projects that last are the ones that learn to read both levels at once.


Three truths about contributors

Before measuring success, it helps to understand how contributors actually think and behave. These truths describe the patterns that shape how developers read, respond, and decide whether to keep engaging with a project.

1. Contributors decide how much attention to pay

Developers are not passive readers of documentation. They skim, skip, and make assumptions based on what feels familiar. When a README or setup guide resembles something they have already used, they may gloss over key details or ignore context they assume they know. The problem is rarely a lack of knowledge; rather, it is misplaced confidence.

If a contributor’s first experience feels inconsistent or fails unexpectedly, the blame lands on the project, not the skipped step. Clear visual cues, concise context, and strong “read this first” guidance can prevent these misunderstandings. Effective documentation does not just inform; it directs attention to what truly matters.

2. Contributors use resources with specific intent

People approach documentation in one of two modes: to do or to learn.

Someone troubleshooting an error wants a short, focused answer: commands that work, configurations that solve the problem, and code that runs as written. Someone trying to understand a system wants to know why it works that way, what tradeoffs exist, and how it fits into the broader stack.

Friction occurs when a project mixes these two needs together. A “doer” gets frustrated by paragraphs of philosophy; a “learner” leaves confused by copy-paste instructions without reasoning. The most successful projects separate quick-action guides from deeper explanations and signal which type of content a reader is looking at before they start.

3. Contributors interpret information through their own experience

Every developer reads through the lens of their own stack, background, and habits. The same phrase can carry entirely different meanings depending on who reads it. “Production ready,” for example, may sound like “ready to use immediately” to one person and “rigid and hard to customize” to another.

Documentation is not just about accuracy; it is about framing. Anticipating multiple interpretations and clarifying intent helps align expectations across diverse contributors. Projects that make implicit assumptions explicit tend to build stronger understanding and trust.


The audience insight iceberg

Most projects measure what is easy to count. Few measure what truly matters.

The iceberg model provides a simple way to think about this gap.

Figure: the audience insight iceberg, with visible insights above the water, traditional implicit data submerged just below the surface, and hidden user insights at the bottom. Visual aid from Catchy (consulted: Ranade, N. (2024), Understanding the Hidden Users for Content Strategy, Technical Communication, Vol. 71, No. 3, August 2024).

Visible insights

The visible “tip of the iceberg” represents everything that can be tracked publicly: stars, forks, pull requests, release cadence, and traffic to documentation. These are indicators of attention. They show that people are looking, but they do not explain behavior.

Traditional analytics

Below that surface sits the layer of behavioral signals that hint at friction: repeated questions in issues, recurring comments in PRs, patterns in documentation drop-offs, or confusion that shows up across community channels. These signals tell you where contributors struggle or lose momentum, but they may still mask the true explanation of the behavior.
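This middle layer can be measured with very little tooling. As one sketch, repeated questions in an issue tracker can be surfaced by grouping titles on their significant keywords; the titles, stopword list, and threshold below are illustrative, not a prescribed method:

```python
import re
from collections import Counter

# Illustrative stopword list; tune for your project's vocabulary
STOPWORDS = {"the", "a", "an", "to", "in", "on", "for", "with", "how", "is", "of"}

def normalize(title):
    """Reduce an issue title to a bag of lowercase keywords."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return frozenset(w for w in words if w not in STOPWORDS)

def repeated_questions(titles, min_count=2):
    """Flag keyword signatures that recur across issue titles."""
    counts = Counter(normalize(t) for t in titles)
    return {sig: n for sig, n in counts.items() if n >= min_count and sig}

# Hypothetical titles, as might be pulled from an issue tracker
titles = [
    "How to configure the license header?",
    "Configure license header",
    "License header configuration is unclear",
    "Build fails on Windows",
]
```

Here `repeated_questions(titles)` groups the first two titles together, flagging "configure license header" as a recurring question, which is exactly the kind of friction signal this layer exists to catch.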

Hidden insights

The deepest layer of the “iceberg” is the one that few teams measure directly. It includes the unspoken feedback that shapes perception: abandoned pull requests, untracked workarounds shared in chat groups, unanswered posts in forums, or quiet disengagement that never becomes visible in metrics. This is where the real story of a project lives.

Looking beyond the surface matters because this bottom layer explains why contributors act as they do. Surface data shows what is happening; hidden signals show why.


Case study: Uncovering the real drivers of engagement with Meta OS

Problem

When Meta’s open source ecosystem expanded with projects such as PyTorch, React Native, RocksDB, and FAIR tools, the surface numbers looked strong. Mentions were increasing, stars and forks were climbing, and launches attracted wide attention. By those measures, adoption appeared steady and healthy.

Yet beneath that growth, something was off. Community participation was uneven. Some projects sustained energy and others saw contributors lose interest after a few interactions. The visible data did not explain why.

Bringing hidden insights to the surface

To find a solution, Catchy gathered the "hidden" data needed to understand the deeper layers of developer sentiment and engagement. The analysis combined data from public code repositories, technical forums, and social channels to identify patterns that traditional metrics missed. Rather than simply counting mentions, the goal was to analyze tone, repetition, and trust.

At the surface, the data confirmed high awareness and enthusiasm. In the middle layer, however, patterns began to emerge: developers were asking the same questions about core setup steps and licensing. Many of those questions came from experienced contributors, suggesting that context was missing, not knowledge.

Looking deeper still, the analysis uncovered abandoned issues and pull requests around framework-specific integrations and production deployments. Off-platform discussions revealed a perception that feedback was not being acknowledged. Some contributors expressed doubt that Meta would maintain long-term support for these projects.

Key findings leading to a solution

This was the hidden signal that explained the disconnect. The challenge was not awareness but confidence. Developers trusted the technology but were uncertain about the commitment behind it.

Once the issue was visible, the response became clear. Meta’s teams clarified licensing and contribution policies, expanded documentation around integration pain points, and began highlighting community contributions publicly. They also surfaced off-platform solutions and credited the developers who shared them, showing that the company was paying attention.

The change in tone was immediate. Conversations shifted from skepticism to collaboration, and engagement across multiple projects stabilized. Visibility had created excitement, but listening to what lay beneath it created trust.

Five actions to strengthen your own project

Translating insight into action is what makes this approach valuable. These simple, repeatable habits help maintainers turn signals from their community into meaningful improvements that build trust and long-term engagement.

  • Treat stars and forks as a pulse, not a verdict. They measure visibility, not health.
  • Watch for repeated questions and patterns. Repetition points directly to friction.
  • Ask why, not just how many. Each drop-off or stalled contribution has a reason.
  • Bring hidden fixes into the open. Turn community workarounds into documented solutions and acknowledge their authors.
  • Check the temperature regularly. A light monthly or quarterly review of community sentiment can reveal early warning signs long before participation declines.
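The last habit, checking the temperature, can start out very simple. The keyword lists below are illustrative placeholders; a real review would use a proper sentiment model or human reading, but even a crude tally makes month-over-month drift visible:

```python
import re
from collections import Counter

# Illustrative keyword lists; swap in vocabulary drawn from your own community
POSITIVE = {"thanks", "works", "great", "fixed", "helpful", "merged"}
NEGATIVE = {"broken", "stale", "abandoned", "ignored", "confusing", "unanswered"}

def temperature(comments):
    """Tally rough positive/negative signals across a batch of community comments."""
    tally = Counter(positive=0, negative=0)
    for comment in comments:
        words = set(re.findall(r"[a-z]+", comment.lower()))
        tally["positive"] += len(words & POSITIVE)
        tally["negative"] += len(words & NEGATIVE)
    return tally

# Hypothetical comments from a month of issues, PRs, and forum threads
comments = [
    "Thanks, the fix works great",
    "This issue has been ignored and feels abandoned",
    "Docs are confusing around licensing",
]
```

Run on a month of comments, a rising negative count is an early warning worth investigating; the absolute numbers matter far less than the trend between reviews.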

The takeaway

Open source success depends on more than visibility. Healthy projects listen to the signals that cannot be easily counted: trust, clarity, and shared ownership. Surface metrics show that people are looking. Hidden insights show that they care. When a project pays attention to both, it builds not only better software but a stronger community around it.


Works consulted:

  • Redish, J. (1993). Understanding Readers. In C. Barnum and S. Carliner (Eds.), Techniques for Technical Communicators (pp. 15-41). New York: Macmillan.
  • Ranade, N. (2024). Understanding the Hidden Users for Content Strategy. Technical Communication, Vol. 71, No. 3, August 2024.

About the Author

Senior Developer Marketing Strategist, Catchy

Andrew Gordon is a Senior Developer Marketing Strategist at Catchy, and an adjunct professor of professional and technical communications at Carnegie Mellon University and the Heinz College of Information Systems and Public Policy.


