Scrum Master Toolbox Podcast: Agile storytelling from the trenches

BONUS Measure and Visualize Software Improvement for Actionable Results | Mooly Beeri



Global Agile Summit Preview: How to Measure and Visualize Software Improvement for Actionable Results with Mooly Beeri

In this BONUS Global Agile Summit preview episode, we explore how to effectively measure and visualize the continuous improvement journey in technology organizations. Mooly Beeri shares his data-driven approach that helps software teams identify where to focus their improvement efforts and how to quantify their progress over time. We discuss practical examples from major organizations like Philips and Aptiv, revealing how visualization creates an internal language of improvement that empowers teams while giving leadership the insights needed to make strategic decisions.

Visualizing Software Development Effectiveness

"We visualize the entire SDLC end-to-end. All the aspects... we have a grading of each step in the SDLC. It starts with a focus on understanding what needs to be done better."

Mooly shares how his approach at Philips helped create visibility across a diverse organization built from numerous acquisitions with different technologies and development cultures. The challenge was helping management understand the status of software craftsmanship across the company. His solution was developing a heat map visualization that examines the entire software development lifecycle (SDLC) - from requirements gathering through deployment and support - with an effectiveness index for each stage. This creates an at-a-glance view where management can quickly identify which teams need support in specific areas like automation, code reviews, or CI/CD processes.
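As a rough illustration of what such a heat map could look like in data form, here is a minimal Python sketch. The phase names, team names, and the 1-to-5 grading scale are illustrative assumptions for demonstration only, not the actual effectiveness index used at Philips.

# Illustrative sketch only: SDLC phases, teams, and the 1-5 grading scale are
# assumptions for demonstration, not the real index described in the episode.
SDLC_PHASES = ["requirements", "design", "coding", "code_review",
               "ci_cd", "testing", "deployment", "support"]

# Hypothetical effectiveness scores (1 = needs attention, 5 = strong practice)
scores = {
    "Team Alpha": {"requirements": 4, "design": 3, "coding": 4, "code_review": 2,
                   "ci_cd": 1, "testing": 3, "deployment": 2, "support": 4},
    "Team Beta":  {"requirements": 2, "design": 2, "coding": 3, "code_review": 4,
                   "ci_cd": 4, "testing": 5, "deployment": 4, "support": 3},
}

def render_heat_map(scores, threshold=2):
    """Print a text heat map and flag phases at or below the threshold."""
    header = " " * 12 + " ".join(f"{p[:6]:>6}" for p in SDLC_PHASES)
    print(header)
    for team, row in scores.items():
        cells = " ".join(f"{row[p]:>6}" for p in SDLC_PHASES)
        print(f"{team:<12}{cells}")
        weak = [p for p in SDLC_PHASES if row[p] <= threshold]
        if weak:
            print(f"{'':<12}-> focus areas: {', '.join(weak)}")

render_heat_map(scores)

Even in this toy form, the at-a-glance quality is visible: a manager scanning the grid can see that the hypothetical "Team Alpha" needs help with CI/CD and code reviews without reading a single status report.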

This visualization becomes a powerful internal language for improvement discussions, allowing focused investment decisions instead of relying on intuition or which team has the most persuasive argument. The framework creates alignment while empowering teams to determine their own improvement paths.

Measuring What Matters: The Code Review Example

"We often hear 'we have to do code reviews, of course we do them,' but when we talk about 'how well are they done?', the answer comes 'I don't know, we haven't measured it.'"

When one team wanted to double the time invested in code reviews based on conference recommendations, Mooly helped them develop a meaningful measurement approach. They created the concept of "code review escapes" - defects that could have been caught with better code reviews but weren't. By gathering the team to evaluate a sample of defects after each iteration, they could calculate what percentage "escaped" the code review process.

This measurement allowed the team to determine if doubling review time actually improved outcomes. If the escape rate remained at 30%, the investment wasn't helping. If it dropped to 20%, they could calculate a benefit ratio. This approach has been expanded to measure "escapes" in requirements, design, architecture, and other SDLC phases, enabling teams to consciously decide where improvement efforts would yield the greatest returns.
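To make the arithmetic concrete, here is a minimal Python sketch of how an escape rate and a benefit ratio could be computed from a sampled set of defects. The field names, sample data, and the benefit-ratio formula are illustrative assumptions, not the exact method described in the episode.

# Minimal sketch of the "escape rate" idea, assuming a simple defect sample
# per iteration; field names and the benefit-ratio formula are illustrative.

def escape_rate(defects):
    """Share of sampled defects the team judged catchable in code review."""
    escapes = sum(1 for d in defects if d["catchable_in_review"])
    return escapes / len(defects)

def benefit_ratio(rate_before, rate_after, extra_review_hours):
    """Improvement in escape rate per extra hour invested in reviews."""
    return (rate_before - rate_after) / extra_review_hours

# Hypothetical iteration samples: before and after doubling review time
before = [{"id": i, "catchable_in_review": i % 10 < 3} for i in range(20)]  # ~30% escapes
after  = [{"id": i, "catchable_in_review": i % 10 < 2} for i in range(20)]  # ~20% escapes

r_before, r_after = escape_rate(before), escape_rate(after)
print(f"escape rate before: {r_before:.0%}, after: {r_after:.0%}")
print(f"benefit per extra review hour: {benefit_ratio(r_before, r_after, 10):.3f}")

With these made-up numbers, a drop from 30% to 20% escapes after ten extra review hours gives a small but quantifiable return, which the team can then weigh against other candidate improvements instead of guessing.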

Balancing Team Autonomy with Organizational Alignment

"Our model focuses on giving teams many options on how to improve, not just one like from top-down improvements. We want to focus the teams on improving on what matters the most."

Mooly contrasts his approach with traditional top-down improvement mandates, sharing a story from Microsoft where a VP mandated increasing unit test coverage from 70% to 80% across all teams regardless of their specific needs. Instead, his framework establishes an agreed, overall definition of effectiveness while giving teams the flexibility to choose their own improvement path.

Like athletes at different fitness levels, teams with lower effectiveness have many paths to improvement, while high-performing teams have fewer options. This creates a win-win scenario where teams define their own improvement strategy based on their context, while management can still see quantifiable progress in overall organizational effectiveness.

Adapting to Different Industry Contexts

"TIP: Keep the model of evaluation flexible enough to adapt to a team's context."

While working across healthcare, automotive, and other industries, Mooly found that despite surface differences, all software teams face similar fundamental challenges throughout the development lifecycle. His effectiveness framework was born in the diverse Philips environment, where teams built everything from espresso machine firmware to hospital management systems and MRI scanners.

The framework maintains flexibility by letting teams define what's critical in their specific context. For example, when measuring dynamic analysis, teams define which runtime components are most important to monitor. For teams releasing once every four years (like medical equipment), continuous integration means something very different than for teams deploying daily updates. The framework adapts to these realities while still providing meaningful measurements.

Taking the First Step Toward Measured Improvement

"Try to quantify the investment, by defining where to improve by how much. We encourage the team to measure effectiveness of whatever the practices are they need to improve."

For leaders looking to implement a more measured approach to improvement, Mooly recommends starting by focusing teams on one simple question: how will we know if our improvement efforts are actually working? Rather than following trends or implementing changes without feedback mechanisms, establish concrete metrics that demonstrate progress and help calculate return on investment.

The key insight is that most teams already value continuous improvement but struggle with prioritization and knowing when they've invested enough in one area. By creating a quantifiable framework, teams can make more conscious decisions about where to focus their limited improvement resources and demonstrate their progress to leadership in a language everyone understands.

About Mooly Beeri

Mooly Beeri is a software transformation expert with nearly 30 years of industry experience. As founder and CEO of BetterSoftware.dev, he developed a practical, visual approach to measuring improvement in technology organizations such as Microsoft, Philips, and Aptiv. His data-driven approach helps organizations visualize and optimize their entire software development lifecycle through measurable improvements.

You can link with Mooly Beeri on LinkedIn and visit Mooly Beeri’s website.
