The Complete Guide to Measuring Team and Technical Agility

 

Before writing this article, we were curious how often teams measure their agility (if ever). We ran an informal poll on LinkedIn, and the results were fascinating.

Assessing your team’s agility is a crucial step toward continuous improvement. After all, you can’t get where you want to go if you don’t know where you are.

But you probably have questions: How do you measure a team’s agility? Who should do it and when? What happens with the data you collect, and what should you do afterwards?

We’re here to answer these questions. Use the following sections to guide you:

  • What is team and technical agility?
  • What is the team and technical agility assessment?
  • Assessment tips, including before, during, and after you assess
  • Team and technical agility assessment resources

These sections include a video showing where to find the team and technical agility assessment in SAFe® Studio and what the assessment looks like.

What Is Team and Technical Agility?

Agile Teams, Built-In Quality, and Teams of Agile Teams graphic from Framework article
The three dimensions of team and technical agility

Before jumping into the assessment, it’s important to understand team and technical agility. This will help determine if you want to run the assessment and which areas may be most beneficial for your team. 

Team and technical agility is a team’s ability to deliver solutions that meet customers’ needs. It’s one of the seven business agility core competencies. 

Team and technical agility contains three parts:

  • Agile teams
  • Teams of Agile teams
  • Built-in quality

Agile teams

As the basic building block of an Agile Release Train (ART), the Agile team is responsible for:

  • Connecting with the customer
  • Planning the work
  • Delivering value
  • Getting feedback
  • Improving relentlessly

They’re the ones on the ground bringing the product roadmap to life. They must also plan, commit, and improve together to execute in unison. 

Teams of Agile teams

An ART is where Agile teams work together to deliver solutions. The ART has the same responsibilities as the Agile team but on a larger scale. The ART also plans, commits, executes, and improves together. 

Built-in quality

Since Agile teams and ARTs are responsible for building products and delivering value, they must follow built-in quality practices. These practices apply during development and the review process. 

As we state in the Framework article: “Built-in quality is even more critical for large solutions, as the cumulative effect of even minor defects and wrong assumptions may create unacceptable consequences.”

It’s important to consider all three areas when assessing your team’s agility.

What Is the Team and Technical Agility Assessment?

Team and Technical Agility Assessment results screenshot

The team and technical agility assessment is a review tool that measures your team’s agility through a comprehensive survey and set of recommendations. 

However, there’s more to it than that. We’ll review the information you need to fully understand what you learn from this assessment and how to access it.

Each question in the assessment asks team members to rate statements about their teams on the following scale:

  • True
  • More True than False
  • Neither False nor True
  • More False than True
  • False
  • Not Applicable

The assessment answer options

What information can I get from the team and technical agility assessment?

The team and technical agility assessment helps teams identify areas for improvement, highlight strengths worth celebrating, and establish a baseline for measuring future progress. It asks questions like the following about how your team operates:

  • Do team members have cross-functional skills? 
  • Do you have a dedicated Product Owner (PO)?
  • How are teams of teams organized in your ARTs? 
  • Do you use technical practices like test-driven development and peer review? 
  • How does your team tackle technical debt?

For facilitators, including Scrum Masters/Team Coaches (SM/TC), the team and technical agility assessment is a great way to create space for team reflection beyond a typical retrospective. It can also increase engagement and buy-in for the team to take on actionable improvement items.

Once the assessment is complete, the team receives the results broken down by each category of team and technical agility.

Team and Technical Agility Assessment results (aggregate view)

When you click on a category, the results break into three sub-categories to drill down even further into the responses.

Team and Technical Agility results (drilled down view of Agile Teams category)

In addition to the responses, you receive key strengths: the statements with the highest average scores and the lowest deviation between team members.

Assessment results showing statements with the highest scores and highest amount of agreement

Conversely, you also get key opportunities: the statements with the lowest average scores and the highest deviation between team members, highlighting areas where more focus is needed.

Screenshot of areas of improvement in assessment results
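To make these calculations concrete, here is a minimal sketch of how strengths and opportunities could be derived from raw responses. The statements, scores, and ranking logic are our own illustrative assumptions, not the assessment’s actual implementation; it simply ranks statements by mean score and by how much team members agree (standard deviation).

```python
from statistics import mean, stdev

# Hypothetical responses: statement -> one score per team member
# (using the 1 = false ... 5 = true scale described later in this article)
responses = {
    "Teams execute standard iteration events": [5, 4, 1, 5, 2],
    "We have a dedicated Product Owner": [5, 5, 4, 5, 5],
    "Technical debt is addressed every iteration": [2, 1, 2, 3, 1],
}

# Key strengths: highest average score, lowest deviation (strong and shared)
strengths = sorted(responses, key=lambda s: (-mean(responses[s]), stdev(responses[s])))

# Key opportunities: lowest average score, highest deviation (weak or contested)
opportunities = sorted(responses, key=lambda s: (mean(responses[s]), -stdev(responses[s])))

print("Top strength:", strengths[0])
print("Top opportunity:", opportunities[0])
```

Note how a wide spread between a 1 and a 5 on the same statement is itself a signal worth discussing, as the iteration-events story later in this article shows.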

The assessment will include growth recommendations based on your team’s results. These are suggested next steps to help your team improve in the statements and areas where it scored lowest.

Screenshot of a growth recommendation example from the TTA assessment

How do I access the team and technical agility assessment?

You can access the team and technical agility assessment in SAFe® Studio. Use the following steps:

  1. Log into SAFe® Studio.
  2. Navigate to the My SAFe Assessments page under “Practice” in the main navigation bar on the left side of the homepage. 
  3. Click the Learn More button under Comparative Agility, our Measure and Grow Partner. The team and technical agility assessment runs through their platform. 
  4. Click on the Click Here to Get Started button.
  5. From there, you’ll land on the Comparative Agility website. If you want to create an account to save your progress and assessment data, you may do so. If you’d like to skip to the assessment, click on Start Survey in the bottom right of the screen. 
  6. Select Team and Technical Agility Assessment.
  7. Click Continue in the pop-up that appears. 
  8. The assessment will then start in a new tab. 

See each of these steps in action in this video.

Team and Technical Agility Assessment Best Practices

To ensure you get the best results from the team and technical agility assessment, we’ve compiled recommended actions before, during, and after the assessment.

Assessment Best Practices

Before facilitating the team and technical agility assessment

Being intentional about how you set up the assessment with your team will give you results you can work with after the assessment.

Who should run the assessment

Running assessments can be tricky for a few reasons. 

  • Teams might feel defensive about being “measured” 
  • Self-reported data isn’t always objective or accurate 
  • Emotions and framing can impact the results 

That’s why SAFe recommends an SM/TC or other trained facilitator run the assessment. An SM/TC, SPC, or Agile coach can help ensure teams understand their performance and know where to focus their improvement efforts.

When to run the assessment

It’s never too early or too late to know where you stand. Running the assessment for your team when starting with an Agile transformation will help you target the areas where you most need to improve, but you can assess team performance anytime. 

As for how frequently you should run it, it’s probably more valuable to do it on a cadence—either once a PI or once a year, depending on the team’s goals and interests. There’s a lot of motivation in seeing how you grow and progress as a team, and it’s easier to celebrate wins demonstrated through documented change over time.

How to prepare to run the assessment

Before you start the team and technical agility assessment, define your team’s shared purpose. This will help you generate buy-in and excitement. If the team feels like they’re just completing the assessment because the SM/TC said so, it won’t be successful. They must see value in it for them as individuals and as a team. 

Some questions we like to ask to set this purpose include: 

  • What do we want it to feel like to be part of this team two PIs from now?
  • How will our work lives be improved when we check in one year from now?

If you’re completing the assessment as a team, we like to kick it off with a meeting invitation that includes a draft agenda. Sending this ahead of time gives everyone a chance to prepare. You can keep the agenda loose so you have the flexibility to spend more or less time discussing particular areas, depending on how your team chooses to engage with each question.

If you’re completing the assessment asynchronously, send out a deadline for team members to complete it. Then send a meeting invitation for reviewing the results as a team.

Facilitating the team and technical agility assessment

Now it’s time to complete the assessment. Here are a couple of tips to consider when facilitating it for your team.

Running the assessment

Ways to run the assessment graphic

There are two ways you can approach running this assessment. Each has a different value. Choose the option based on your team’s culture. 

Option one is to have team members take the assessment individually and then discuss the results as a group. You can do this one of two ways: have team members complete the assessment asynchronously by a certain date and review the results together later, or set a time for teammates to take the assessment simultaneously and discuss the results immediately afterwards.

Option two is to discuss the assessment questions as a team and agree on the group’s answers.

When we ran this assessment, we had team members complete it individually so we could focus our time together on reviewing results and selecting actions. If you run it asynchronously, be available to answer team members’ questions before you review the results together.

Keeping the assessment anonymous

Keeping the answers anonymous is helpful if you want more accurate results. We like to be clear upfront that the assessment will be anonymous so that team members can feel confident about being honest in their answers. 

For example, with our teams, we not only explained the confidentiality of individuals’ answers but also demonstrated in real time how the tool works so that the process would feel open and transparent. We also clarified that we would not be using the data to compare teams to each other, or for any purpose other than to gain a shared understanding of where we are and to select improvement items based on the team’s stated goals.

However, if you choose to complete the assessment as a team and decide on each answer together, answering anonymously isn’t possible. Choose the option you think works best for your team’s culture.

After facilitating the team and technical agility assessment

The main point of running the team and technical agility assessment is to get the information it provides. What you do with this information determines its impact on your team.

What to do with the assessment results

Once you’ve completed the assessment using one of the two approaches,

  • Review the sections one by one
  • Show the aggregate results
  • Allow the team to notice its top strengths and areas for improvement
  • As facilitator, don’t tell the team what you think; guide their discussion

We learned in the assessment how much we disagreed on some items. For example, even with a statement as simple as “Teams execute standard iteration events,” some team members scored us a five (out of five) while others scored us a one. 

We treated every score as valid and sought to understand why some team members scored high and others low, just like we do when estimating the size of a user story. 

This discussion led to:

  • Knowing where to improve
  • Uncovering different perspectives
  • Showing how we were doing as a team
  • Prompting rich conversations
  • Encouraging meaningful progress

We know it can be challenging to give and receive feedback, especially when it focuses on areas to improve. Here are a few ways to make conversations about the assessment results productive with your team.

How to review assessment results graphic

Using the assessment to improve

With your assessment results in hand, it’s time to take actions that help you improve. 

For each dimension of the team and technical agility assessment, SAFe provides growth recommendations to help teams focus on the areas that matter most and prioritize their next steps. 

Growth recommendations are helpful because they’re bite-sized actions that break down the overall area of improvement. They’re easy to fit into the PI without overloading capacity.

Examples of growth recommendations:

Example 1:

  • As a SM/TC, watch the How to Run an Effective Backlog Refinement Workshop video with the team.
  • Discuss the importance of refining the backlog to ensure upcoming work is well-defined and there is no work outside the backlog. 
  • Schedule backlog refinement on a cadence.

Example 2: 

  • As a team, use the Identifying Key Stakeholders Collaborate template and answer the following questions:
    • Who is the customer of our work? (This could be internal or external customers.)
    • Who is affected by our work?
    • Who provides key inputs or influences the goals of our work?
    • Whose feedback do we need to progress the work?
  • Maintain a list of key stakeholders.

Example 3:

  • As a team, collect metrics to understand the current situation. Include the total number of tests, the frequency each test is run, test coverage, the time required to build the Solution and execute the tests, the percentage of automated tests, and the number of defects. Additionally, quantify the manual testing effort each Iteration and during a significant new release.
  • Present and discuss these metrics with the key stakeholders, highlighting how the lack of automation impacts quality and time to market.
  • Create a plan for increasing the amount of test automation.

Here are some actions you should take once you’ve completed the assessment: 

  • Review the team growth recommendations together to generate ideas
  • Select your preferred actions (you can use dot voting or WSJF calculations for this, as in the sketch after this list; SAFe® Studio has ready-made templates you can use)
  • Capture your team’s next steps in writing: “Our team decided to do X, Y, and Z.” 
  • Follow through on your actions so that you’re connecting them to the desired outcome
  • Review your progress at the beginning of iteration retrospectives
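If you want to try WSJF on your action items, here is a minimal sketch. The formula is SAFe’s (Cost of Delay divided by Job Size, where Cost of Delay is the sum of user-business value, time criticality, and risk reduction/opportunity enablement), but the items and relative scores below are made-up examples:

```python
from dataclasses import dataclass

@dataclass
class ActionItem:
    name: str
    business_value: int    # relative user-business value (e.g., modified Fibonacci)
    time_criticality: int
    risk_opportunity: int  # risk reduction / opportunity enablement
    job_size: int          # relative effort

    @property
    def wsjf(self) -> float:
        # SAFe WSJF = Cost of Delay / Job Size
        cost_of_delay = self.business_value + self.time_criticality + self.risk_opportunity
        return cost_of_delay / self.job_size

# Hypothetical improvement actions scored by the team
items = [
    ActionItem("Schedule backlog refinement on a cadence", 8, 5, 3, 2),
    ActionItem("Increase test automation coverage", 13, 8, 8, 13),
    ActionItem("Map key stakeholders", 5, 3, 2, 1),
]

# Highest WSJF first: do these sooner
for item in sorted(items, key=lambda i: i.wsjf, reverse=True):
    print(f"{item.wsjf:5.2f}  {item.name}")
```

Relative scoring is the point here: the team compares items against each other rather than estimating absolute value, which keeps the exercise quick enough to fit inside a retrospective.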

Finally, you’ll want to use these actions to set a focus for the team throughout the PI. Then check in with Business Owners at PI planning on how these improvements have helped the organization progress toward its goals.

Tip: Simultaneously addressing all focus areas may be tempting, but you want to limit your WIP. 

To do this, pick one focus area based on the results. You can add the remaining focus areas to the backlog to begin working on once you’ve addressed the first one.

Ways to prioritize action items: WSJF, Team vote, Timeliness, Ease, Team capacity

Feeling overwhelmed by the action items for your team? Try breaking them down into bite-size tasks to make it easier on capacity while still making progress.

These are some examples.

Bite-size action item examples

Team and Technical Agility Assessment Resources

Here are some additional resources to consider when assessing your team’s agility. 

About the authors


Lieschen Gargano is a Release Train Engineer and conflict guru, thanks in part to her master’s degree in conflict resolution. As the RTE for the development value stream at Scaled Agile, Inc., Lieschen loves cultivating new ideas and approaches to Agile to keep things fresh and engaging. She also has a passion for developing practices for happy teams of teams across the full business value stream.

Sam is a certified SAFe® 6 Practice Consultant (SPC) and serves as the SM/TC for several teams at Scaled Agile. His recent career highlights include entertaining the crowd as the co-host of the 2019, 2020, and 2021 Global SAFe® Summits. A native of Columbia, South Carolina, Sam lives in Kailua, Hawaii, where he enjoys CrossFit and Olympic weightlifting.

How to Measure Team Performance: A Scrum Master Q+A

Assessing your team’s agility is an important step on the path to continuous improvement. After all, you can’t get where you want to go if you don’t know where you are. But you probably have questions: How do you measure a team’s agility, anyway? Who should do it, and when? What happens with the data you collect, and what should you do afterwards?

To bring you the answers, we interviewed two of our experienced scrum masters, Lieschen Gargano and Sam Ervin. Keep reading to learn their recommendations for successfully running a Team and Technical Agility Assessment.

Q: How does SAFe help teams measure their agility, and why should I care? 

Measure and Grow is the Scaled Agile Framework’s approach to evaluating agility and determining what actions to take next. Measure and Grow assessment tools and recommended actions help organizations and teams reflect on where they are and know how to improve.

The SAFe® Business Agility Assessment measures an organization’s overall agility across seven core competencies: team and technical agility, agile product delivery, enterprise solution delivery, Lean portfolio management, Lean-agile leadership, organizational agility, and continuous learning culture. 

Business Agility Assessment graphic

The SAFe Core Competency Assessments measure each of these core competencies on a deeper level. For example, the Team and Technical Agility (TTA) Core Competency Assessment helps teams identify areas for improvement, highlight strengths worth celebrating, and baseline performance for measuring future growth. It asks questions about how your team operates. Do team members have cross-functional skills? Do you have a dedicated PO? How are teams of teams organized in your agile release trains (ARTs)? Do you use technical practices like test-driven development and peer review? How does your team tackle technical debt?

For facilitators, including scrum masters, the Team and Technical Agility Assessment is a great way to create space for team reflection beyond a typical retrospective. It can also increase engagement and buy-in for the team to take on actionable improvement items.

Q: Who should run a Team and Technical Agility Assessment? 

Running assessments can be tricky. Teams might feel defensive about being “measured.” Self-reported data isn’t always objective or accurate. Emotions and framing can impact the results. That’s why SAFe recommends that a scrum master or other trained facilitator run the assessment. A scrum master, SPC, or agile coach can help ensure that teams understand their performance and know where to focus their improvement efforts. 

Q: When should I do this assessment?

It’s never too early or too late to know where you stand. Running the assessment for your team when you’re first getting started with an agile transformation will help you target the areas where you most need to improve, but you can assess team performance at any time. 

As for how frequently you should run it … it’s probably more valuable to do it on a cadence—either once a PI or once a year, depending on the team’s goals and appetite for it. There’s a lot of energy in seeing how you grow and progress as a team, and it’s easier to celebrate wins that are demonstrated through documented change over time than through general sentiment.

Q: Okay, how do I prepare for and run it?

The agility assessment tools are available free to SAFe members and customers at the Measure and Grow page on the SAFe Community Platform. There you can choose from tools created for us by our partners, AgilityHealth and Comparative Agility.

Before you start the Team and Technical Agility Assessment, define your team’s shared purpose. This will help you generate buy-in and excitement. If the team feels like they’re just doing the assessment because the scrum master said so, it won’t be successful. They have to see value in it for them, both as individuals and as a team. 

Some questions we like to ask to set this purpose include, “What do we want it to feel like to be part of this team, two PIs from now?” And, “How will our work lives be improved when we check in one year from now?”

There are two ways you can approach running this assessment. Option #1 is to have team members take the assessment individually, and then get together to discuss their results as a group. Option #2 is to discuss the assessment questions together and come to a consensus on the group’s answers.

When we’ve run this assessment, we’ve had team members do it individually so we could focus our time together on review and actions. If you do decide to run it asynchronously, it’s important as a facilitator to be available to team members in case they have questions before you review your answers as a team.

Q: What else should I keep in mind?

We like to kick off the assessment with a meeting invitation that includes a draft agenda. Sending this ahead of time gives everyone a chance to prepare. You can keep the agenda loose so you have flexibility to spend more or less time discussing particular areas, depending on how your team chooses to engage with each question.

Q: Is the assessment anonymous? 

Keeping the answers anonymous is really helpful if you want to get more accurate results. We like to be very clear upfront that the assessment will be anonymous, so that team members can feel confident about being honest in their answers. 

For example, with our teams, we not only explained the confidentiality of individuals’ answers but demonstrated in real-time how the tool itself works so that the process would feel open and transparent. We also made it clear that we would not be using the data to compare teams to each other, or for any purpose other than to gain a shared understanding of where we are and to select improvement items based on the team’s stated goals.

Q: Then what? What do I do with the results?

Once you’ve completed the assessment using one of the two approaches, you’ll want to review the sections one by one, showing the aggregate results and allowing the team to notice their top strengths and top areas for improvement. Your job as facilitator is NOT to tell them what you think based on the results; it’s to help guide the team’s own discussion as they explore the answers. This yields much more effective outcomes!

One thing we learned in doing the assessment was how much we disagreed on some things. For example, even with a statement as simple as, “Teams execute standard iteration events,” some team members scored us a five (out of five) while others scored us a one. We treated every score as valid and sought to understand why some team members scored high and others low, just like we do when estimating the size of a user story. During this conversation, we learned an important fact. The product owner thought the iteration was executed in a standard way because she was the one executing it. But team members gave that statement a low score because they weren’t included in much of the decision-making. There was no shared understanding of what “standard iteration events” meant to the team.

This prompted a conversation about why the team wasn’t always included in how the iteration was executed. We talked about the challenge of aligning schedules to share responsibility for decision-making in meetings. And we talked about the impact of team members not having the opportunity to contribute.

As a result, the assessment did more than help us see where we needed to improve; it showed us where we had completely different perspectives about how we were doing. It prompted rich conversations that led to meaningful progress.

Q: Okay, I ran the assessment; now what? What are the next steps?

With your assessment results in hand, it’s now time to take actions that help you improve. For each dimension of the Team and Technical Agility Assessment, SAFe provides growth recommendations to help teams focus on the areas that matter most and prioritize their next steps. You should: 

  • Review the team growth recommendations together to generate ideas
  • Select your preferred actions (you can use dot voting or WSJF calculations for this; SAFe® Collaborate has ready-made templates you can use)
  • Capture your team’s next steps in writing: “Our team decided to do X, Y, and Z.” 
  • Follow through on your actions, so that you’re connecting them to the desired outcome
  • Check in on your progress at the beginning of iteration retrospectives

Finally, you’ll want to use these actions to set a focus for the team throughout the PI, and check in with business owners at PI planning on how these improvements have helped the organization make progress toward its goals.

Q: I’m ready! How do I get started? 

Fantastic. Just visit the Measure and Grow page at the SAFe Community Platform to choose your assessment tool. While you’re there, you can watch the video for tips or download the Measure and Grow Toolkit for play-by-play guidance. As you’re running the assessment, use the SAFe Collaborate templates to guide the discussion and identify actions and next steps. 

Have fun!

About the authors


Lieschen is a product owner and former scrum master at Scaled Agile. She’s also an agile coach and conflict guru—thanks in part to her master’s degree in conflict resolution. Lieschen loves cultivating new ideas and approaches to agile to keep things fresh and exciting. And she’s passionate about developing best practices for happy teams to deliver value in both development and non-technical environments. Fun fact? “I’m the only person I know of who’s been a scrum master and a scrum-half on a rugby team.”

Sam is a certified SAFe® 5.0 Program Consultant (SPC) and serves as the scrum master for several teams at Scaled Agile. His recent career highlights include entertaining the crowd as the co-host of the 2019 and 2020 Global SAFe® Summits. A native of Columbia, South Carolina, Sam lives in Denver, CO, where he enjoys CrossFit and Olympic weightlifting.


8 Patterns to Set Up Your Measure and Grow Program for Success

We all know that any time you start something new in an organization it takes time to make it stick, and if teams and leaders find value, they will work to keep a program flourishing. The same is true when you implement a Measure and Grow Program within your organization. It takes planning and effort to get it started, but the rewards will definitely outweigh the efforts in the end.

At AgilityHealth®, our Strategists work with organizations every day to help them set up Measure and Grow programs that will succeed based on their individual needs. Through their experiences, they have noticed some consistent patterns across our customers, both commercial and government, for- and non-profit. Understanding these patterns can help you set up a program that’s right for your organization.

Before we jump into the patterns, let’s review what a Measure and Grow program is. Simply stated, it’s how you will measure your progress toward business agility. When we look at how Enterprise Business Agility was defined by Sally Elatta, AgilityHealth Founder, and Evan Leybourne, Founder of the Business Agility Institute, you can see why this is important.

The ability to adapt to change, learn and pivot, deliver at speed, and thrive in a competitive market.

Sally Elatta, CEO AgilityHealth and Evan Leybourn, Founder, Business Agility Institute

We need to maintain our competitive edge, and in the process, make sure that healthy teams remain a priority—especially as we start to identify common patterns across teams.

Patterns

  1. Define how you will measure success.

Bertrand Duperrin said, “Tell me how you will measure me, and I will tell you how I will behave.” This is true of our teams, our team members, and our leaders. After the success criteria have been defined, allow the team members to measure themselves in a safe environment where they can be open and honest about their maturity with a neutral facilitator. The process of acting on the data is very powerful for teams.

  2. Provide a way to help teams grow after you measure them.

“Measurement without action is worthless data.” (Thanks, Sally, for another great bit of wisdom.) When you set up your Measure and Grow program, make sure it includes a way for teams to learn and mature.

Some of the common ones we see are:

  • Dojo teams—high-performing teams paired with new or immature teams to help them learn
  • Pre-defined learning paths for teams using instructor-led or virtual learning
  • Intentional learning options for teams through Communities of Practice or other options
  • Pairing/Mentorship/Accountability Partners

  3. Tie the results to the goals.

“Why are we taking the time to do this?” This is a common question that teams and leaders ask when we are starting Measure and Grow programs. They feel that the time reserved for an Inspect and Adapt session might be better spent tying up those last few story points or test cases, when in reality there is a corporate objective to mature the teams. Be sure to share these kinds of goals with your teams and managers so they understand that this is important to the organization.

  4. Provide a maturity roadmap that takes the subjectivity out of the questions.

We all have an idea of what “good” looks like, but without a shared understanding of “good”, my “good” might be a 3, my teammate’s might be a 4, someone else’s might be a 2, and so on. When you share a common maturity roadmap to provide context for your assessment, your results will be less subjective.

  5. Measure at multiple levels so that you can correlate the results.

When we just look at maturity from the team perspective, we get one view of an organization. When we look at maturity from the leadership and stakeholder perspectives, we get another view. When we look at both together—the sandwich model—we get a three-dimensional view and can start to surmise cause and effect. This gives a clearer picture of how an organization is performing.

  6. Minimize competing priorities and platforms.

Almost all teams, regardless of organization, share that there are too many systems, too many priorities, too many everything (except maybe pizza slices …). Be sure to schedule your measurement and retrospective time when the team is taking a natural break in their work. Teams should take the time to do a strategic retrospective on how they are working together at the end of every PI during their Inspect and Adapt, so use that time wisely.

  7. Engage the leaders in the process.

When this becomes a “we” exercise and not a “you” exercise, a sense of trust is built between the teams and their leaders. Inevitably, the teams are going to ask the leaders for assistance in removing obstacles. If the leaders are on board from the start, expect these requests, and start removing obstacles, it creates an atmosphere of psychological safety where teams can be honest about what they need and leaders can be honest about what they expect.

  8. Remember, this is all change, and change takes time.

Roy T. Bennett said, “Change begins at the end of your comfort zone.” It takes time, perseverance, and some uncomfortable conversations to change an organization and help it to grow. But in the end, it’s worth doing.

Get Started

Setting up a Measure and Grow program isn’t without its struggles, but for the organizations and teams that put the time and effort into doing it right, the rewards far outweigh the work that goes into it. If you would like to chat with us about what it would take to set up your Measure and Grow program, we’re ready to help.

About Trisha Hall


Trisha has been part of AgilityHealth’s Nebraska-based leadership team since 2014. As VP of Enterprise Solutions, she taps into her 25 years of experience to help organizations bring Business Agility to their companies and help corporate leaders build healthy, high-performing teams. Find Trisha on LinkedIn.


How 90 Teams Used Measure and Grow to Improve Performance by 134 Percent

This post is part of an ongoing blog series where Scaled Agile Partners share stories from the field about using Measure and Grow assessments with customers to evaluate progress and identify improvement opportunities.

One of our large financial services clients needed immediate help. It was struggling to meet customer demands and industry regulations and needed to align business priorities to capacity before it was outplayed by competitors. The company thought the answer would be to invest in business agility practices. But so far, that strategy didn’t seem to be paying off.

Teams were in constant flux and the ongoing change was causing unstable, unpredictable performance. The leading question was, “How can we get more output from existing capacity?”

Among the client’s key challenges:

  • No visibility into common patterns across teams
  • Inspect-and-adapt data was stuck in PowerPoint and Excel
  • Output expectations didn’t match current capacity
  • Teams weren’t delivering outcomes aligned to business value

Getting a baseline on team health 

We introduced the AgilityHealth® TeamHealth Radar Assessment to the continuous improvement leadership team, and it decided to pilot the assessment across the portfolio. Within a few weeks after launching the assessment, the organization got a comprehensive readout. It identified the top areas of improvement and key roadblocks for 90+ teams. 

These baseline results showed a lack of a backlog, not to mention a lack of clarity around the near-term roadmap. Teams were committing to work that wasn’t attached to any initiatives and the work wasn’t well-defined. Dependencies and impediments weren’t being managed. And the top areas of improvement matched data collected during inspect and adapt exercises over the previous two years. Even though the organization had previously identified these issues, nothing had been done to resolve them, as leaders did not trust the data until it came from the voice of the teams via AgilityHealth.

The ROI of slowing down to speed up

Equipped with this knowledge, leaders took the time to slow down and ensure teams had what they needed to perform their jobs efficiently. Leaders also developed a better understanding of where they needed to step in to help the teams. The organization re-focused efforts on building a sufficient backlog, aligned with a roadmap, so teams could identify dependencies earlier in the development lifecycle. 

This intentional slow-down drove a return on investment in less than a year and $6M in cost savings—equivalent in productivity to the work of five extra teams—while generating an additional $25M in value for the company.

By leveraging the results of the AgilityHealth assessment, leaders now had the data they needed to take action:

  • A repeatable process for collecting and measuring continuous improvement efforts at the end of every Program Increment (PI)
  • Clear understanding of where teams stood in their Agile journey and next steps for maturity
  • Comprehensive baseline assessment results showing where individual team members thought improvement was needed, both from leaders and within their teams

What’s next

An enterprise transformation doesn’t stop with the first round of assessments. Like other Fortune 500 companies, this client plans to continue scaling growth and maturity across the enterprise, increasing momentum and building on what it’s learned.

The company plans to introduce the AgilityHealth assessment for individual roles, so it can measure role maturity and accelerate the development of Agile skills across defined competencies. It will continue to balance technical capacity with an emphasis on maintaining stable, cross-functional teams since these performance metrics correlate to shipping products that delight customers and grow the business. And to better facilitate “structural agility” (creating and tracking Agile team structures that support business outcomes), it will focus on ensuring the integrity of its data.

Get started

You too can leverage AgilityHealth’s Insights Dashboard to get an overall view of your organization’s Agile maturity: baseline where you are now, discover how to improve, and get to where you want to be tomorrow. Get started by logging into the SAFe Community and visiting the Measure and Grow page.

About Sally


Sally is a thought leader in the Agile and business agility space. She’s passionate about accelerating the enterprise business agility journey by measuring what matters at every level and building strong leaders and strong teams. She is an executive advisor to many Fortune 500 companies and a frequent keynote speaker. Learn more about AgilityHealth here.


AgilityHealth Insights: What We Learned from Teams to Improve Performance – Agility Planning

This post is part of an ongoing blog series where Scaled Agile Partners share stories from the field about using Measure and Grow assessments with customers to evaluate progress and identify improvement opportunities.

At AgilityHealth®, our team has always believed there’s a correlation between qualitative metrics (defined by maturity) and quantitative metrics (defined by performance or flow). A few years ago, we moved to gather both qualitative and quantitative data. Once we felt we had a sufficient amount to explore, we partnered with the University of Nebraska’s Center for Applied Psychological Services to review the data through our AgilityHealth platform. The main question we wanted to answer was: What are the top competencies driving teams to higher performance? 

Before we jump into the data, let’s start by reviewing what metrics make up “performance.” Below are the five quantitative metrics that form the Performance Dimension within the TeamHealth® radar: 

  • Time-to-market 
  • Quality
  • Predictable Delivery
  • Responsiveness (cycle time)
  • Value Delivered

During the team assessment, we ask the team and the product owner about their happiness and their confidence in their ability to meet the current goals. We consider these leading indicators for performance, so we were curious to see what drives the qualitative metrics of Confidence and Happiness as well. 

Methodology 

We analyzed both quantitative and qualitative data from teams surveyed between November 2018 and April 2021. There were 146 companies representing a total of 4,616 teams (some of which took the assessment more than once), which equates to more than 46,000 individual survey responses.

We used stepwise regression to explore and identify the top five drivers for each outcome. Stepwise regression is one approach to building a model that identifies the most predictive set of competencies for a desired outcome.
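To make “stepwise regression” concrete for readers who haven’t used it, here is a minimal sketch of one common variant, greedy forward selection, run on synthetic data. This is our illustration, not the methodology the University of Nebraska team applied to the AgilityHealth data; every name and number in it is invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def forward_stepwise(X, y, max_drivers=5):
    """Greedily add the competency that most improves cross-validated R^2,
    stopping once max_drivers have been selected."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_drivers:
        scores = {
            j: cross_val_score(LinearRegression(), X[:, selected + [j]], y, cv=5).mean()
            for j in remaining
        }
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected  # column indices of the top drivers, in the order added

# Synthetic data: rows = team observations, columns = competency scores
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))
y = 2.0 * X[:, 3] + 1.0 * X[:, 7] + rng.normal(size=500)  # a performance metric

print(forward_stepwise(X, y))  # columns 3 and 7 should surface first
```

The order in which drivers enter the model is what produces the ranked lists described below.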

The results of our analysis identified the top five input drivers for each of the performance metrics in the TeamHealth assessment, along with the corresponding “weight” of each driver. We also uncovered the top five drivers of Confidence and Happiness for teams and product owners. These drivers are the best predictors for the corresponding metrics. All drivers are statistically significant and ranked in order for each metric.

By focusing on increasing these top five predictors, teams should see the highest gain on their performance metrics. 

Results

After analyzing the top drivers for each of the performance metrics, we noticed that a few kept showing up as repeat drivers across performance.

Repeat drivers across the performance metrics (AgilityHealth graphic)

When analyzing the drivers for Confidence and Happiness, we found these additional predictors:

Top drivers of Confidence and Happiness (AgilityHealth graphic)

We know from experience that shorter iterations, better planning and estimating, and T-shaped skills all lead to better performance—but we now have data to prove it. It was a welcome surprise to see self-organization and creativity take center stage, as it did in our analysis. We’ve always coached managers to empower teams to solve problems, but for the first time, we have the data to back it up. 

Recommendations

Pulling these patterns together, it’s clear that if a team wants to impact its performance in an efficient way, it should focus on weekly iterations, T-shaped team members, effective planning and estimating, enabling creativity and self-organization, role clarity, and right-sizing and skilling. Teams that invested in these drivers saw a 37 percent performance improvement over teams that didn’t. So when in doubt, start here!

We’re excited to share that you can now see the drivers for each competency inside the AgilityHealth platform. We hope it helps you make informed decisions about where to invest your time and effort to improve your performance.

Visit the AgilityHealth page on the SAFe® Community Platform to learn more about these assessment tools and get started!

About Sally


Sally is a thought leader in the Agile and business agility space. She’s passionate about accelerating the enterprise business agility journey by measuring what matters at every level and building strong leaders and strong teams. She is an executive advisor to many Fortune 500 companies and a frequent keynote speaker. Learn more about AgilityHealth at https://www.agilityhealthradar.com.


How Do We Measure Feelings? – SAFe Transformation

This post is part of an ongoing blog series where Scaled Agile Partners share stories from the field about using Measure and Grow assessments with customers to evaluate progress and identify improvement opportunities.

As business environments feature increasing rates of change and uncertainty, agile ways of working are becoming the dominant way of operating around the globe. The reason for this dominance is not that agile is necessarily the “best” way of working (agile, by definition, embraces the idea that you don’t know what you don’t know) but because businesses have found agile better-suited to addressing today’s challenges. Detailed three-year plans, extensive Gantt charts, and work breakdown structures simply have less relevance in today’s world. Agile, with its emphasis on fast learning and experimentation, has proven itself to be more appropriate for today’s unpredictable business environment.

Agility Requires Data You Can Trust

Whereas a plan-driven approach requires an extensive analysis phase, today’s context demands frequent access to high-quality data and information to facilitate quick course correction and validation. One of these critical sources of data is targeted assessments. The purpose of any assessment is to gather information. And the quality of the information collected is a direct result of the quality of the assessment. 

Think of an assessment as a measuring tool. If we were studying a physical object, we might use measuring devices to assess its length, height, mass, and so on. Scientists have developed sophisticated definitions of many of these physical characteristics so we can have a shared understanding of them.

However, people—especially groups of people—are not quite so straightforward to measure: particularly if we’re talking about their attitudes and feelings. It’s not really possible to directly measure concepts like culture and teamwork in the same way we can measure mass or length. Instead, we have to look to the discipline of psychometrics—the field of study dedicated to the construction and validation of assessment instruments—to assist us in measuring these complex topics.

Survey researchers often refer to an assessment or questionnaire as an “instrument,” because the purpose is to measure. We measure to learn, and we learn to apply our knowledge in pursuit of improvement. This is one reason why assessment is such an integral part of the educational system. Properly designed, assessments can be a powerful tool to help us validate our approach, understand our strengths, and identify areas of opportunity.

Ensuring Quality is Built into the Assessment

Since meaningful information is so critical to fast inspection and adaptation, it’s important to use high-quality assessments. After all, if we’re going to leverage insights from the assessments to inform our strategy and guide our decisions, we need to be confident we can trust the data.

How do we know that an assessment instrument is measuring what it purports to? By taking care when designing the assessment tool, and then using data to provide evidence of both its validity (accuracy) and reliability (precision). Here’s how we ensure quality is built into our assessment.

Step 1: Prototype

All survey instrument development starts with a measurement framework. When Comparative Agility partnered with SAFe® to design the new Business Agility assessment, subject matter experts leveraged their experience from the original Business Agility survey to explore enhancements. 

The original Business Agility survey had generated a variety of important insights and proved to be incredibly popular among SAFe customers. But one area of potential improvement was the language used in the assessment itself. Customers wanted to leverage a proven SAFe survey to understand an organization’s current state, without first requiring the organization to have gone through comprehensive training. With the former Business Agility survey, this proved difficult, since the survey instrument often referred to SAFe-specific topics that many had not been exposed to yet.

To address this issue, subject matter experts (SPCTs, SAFe Fellows) teamed up with data scientists from Comparative Agility to craft SAFe survey items that would be meaningful at the start of a SAFe implementation, while avoiding terms that would require prior knowledge. This work resulted in a prototype survey or “minimum viable product.” 

Step 2: Test and Validate

Once the new Business Agility survey instrument was developed, we released it to beta and began to collect data. Several people in the SPCT community were asked to participate in a pilot. In follow-up interviews, respondents were asked about their experience with the survey. Together with respondents, the survey design team, and additional subject matter experts, we examined the results. (We also received external feedback from a Gartner researcher to help improve the nomenclature of some of the survey items.) Only once the team was satisfied with the reliability and validity of the beta survey instrument was it ready for production.

Step 3: Deploy and Monitor

Even after the Business Agility survey instrument reaches the production phase, the data science team at Comparative Agility and Scaled Agile continuously monitor the assessment for data consistency. A rigorous change management process ensures that any tweaks made to survey language, post-deployment, are tested to ensure they don’t negatively impact the accuracy.

Integrating Flow and Outcomes

Although validated assessments are a critical component of a data-driven approach to continuous improvement, they’re not sufficient. To gain a holistic perspective and complete the feedback loop, it’s also important to measure Flow and Outcomes.

Flow

Flow metrics express how efficient an organization is at delivering value. When operating in complex environments characterized by uncertainty and volatility, flow metrics help organizations identify performance across the end-to-end value stream, so you can identify impediments to agility. A more comprehensive overview of Flow metrics can be found in the SAFe knowledge article, Metrics.
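As a toy illustration of what two of these metrics look like in practice, here is a sketch computing flow time and flow velocity from work-item start and completion dates. The items and reporting window are invented; real tooling would pull these from your work management system.

```python
from datetime import date

# Hypothetical work items: (started, completed)
items = [
    (date(2024, 1, 2), date(2024, 1, 9)),
    (date(2024, 1, 3), date(2024, 1, 5)),
    (date(2024, 1, 8), date(2024, 1, 22)),
]

# Flow time: elapsed days from start to completion for each item
flow_times = [(done - start).days for start, done in items]
print("Average flow time (days):", sum(flow_times) / len(flow_times))

# Flow velocity: number of items completed within a reporting window
window_start, window_end = date(2024, 1, 1), date(2024, 1, 31)
velocity = sum(window_start <= done <= window_end for _, done in items)
print("Flow velocity (items completed):", velocity)
```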

Outcomes

Flow metrics may help us deliver quickly and effectively, but without understanding whether we’re delivering value to our customers, we risk simply “delivering crap faster.” Outcome metrics address this challenge by ensuring that we’re creating meaningful value for the end-customer and delivering business benefits. Examples of outcome metrics include revenue impact, customer retention, NPS scores, and Mean Time to Resolution (MTTR).

Embracing a Culture of Data-Driven, Continuous Improvement

It’s important to note that although data and insights help inform our strategy and guide our decisions, to make change stick and ultimately to drive sustainable cultural change, we need to appreciate that data is a means to an end.

That is, data—even though it’s validated, statistically significant, and of high quality—should be viewed not as a source of answers, but rather as a means to ask better questions and uncover new insights in our interactions with people. By having data guide us in our conversations, interactions, and how we define hypotheses, we can drive a culture of inquiry and continuous improvement. 

Just like when a survey helps us better understand how we feel, the assessment provides us with an opportunity to interact in a more meaningful way and increase our understanding. The data itself is not the goal but a way to help us learn faster, adapt quicker, and remove impediments to agility.

Start Improving with Your Own Data

As 17 software industry professionals noted some twenty years ago at a resort in Snowbird, Utah, becoming more agile is about “individuals and interactions over processes and tools.” 

To start your own journey of data-driven, continuous improvement today, activate your free Comparative Agility account in the Measure & Grow area of the SAFe Community Platform.

About Matthew


Matthew Haubrich is the Director of Data Science at Comparative Agility. Passionate about discovering the story behind the data, Matt has more than 25 years of experience in data analytics, survey research, and assessment design. Matt is a frequent speaker at numerous national and international conferences and brings a broad perspective of analytics from both public and private sectors.


Honest Assessments Achieve Real Insights

In this post, I share my experience of running a series of Measure and Grow assessments at a government agency in the UK I’m working with—including the experiments that we decided to run and our learnings during the SAFe transformation process.

The last year has been a voyage of discovery for all of us at Radtac. First, we had to figure out how to deliver training online and still make it an immersive learning experience. Then, we needed to figure out how to do PI Planning online with completely dispersed teams. Once that was sorted, we entered a whole new world of ongoing, remote consulting that included how to run effective Measure and Grow assessments.

The agency has already established and runs 15 Agile Release Trains (ARTs). We agreed that we wouldn’t run assessments for all 15 ARTs at once because we wanted to start small and test the process first. Therefore, we picked four ARTs to pilot the assessments, and undertook only the Team and Technical Agility and Agile Product Delivery assessments.

Pre-assessment Details

What was really important was that each ART we had selected had an agility assessment pre-briefing where we set the context with the following key messages:

  1. This is NOT a competition between the ARTs to see who had the best assessment.
  2. The assessments will support the LACE in identifying the strengths and development areas across the ARTs.
  3. The results will be presented to leadership in an aggregated form. Each ART will see only their results; no individual ART results will be shared with other ARTs.
  4. The results will identify where leadership can remove impediments that the teams face.
  5. We need an honest assessment to achieve real insight into where leadership and the LACE can help the teams.

In addition, prior to the assessments, we asked the ARTs to:

  1. Briefly review the assessment questions.
  2. Prioritise attendance with core team members with a cross-section of their team.

Conducting the Assessment

The assessment was facilitated by external consultants to provide some challenge to the responses. We allotted 120 minutes for both the Team and Technical Agility and Agile Product Delivery assessments, but most ARTs completed them within 90 minutes. We used Microsoft Teams as our communication tool and Mentimeter.com (Menti) to poll the responses.

Each Menti page had five to six questions that the team members were asked to score on a scale of 1 to 5, with 1 being false, 3 being neither false nor true, and 5 being true. To avoid groupthink, we didn’t show the results until all members had scored all the questions. Because Menti shows a distribution of scores, where there was a range in the scoring, we explored the extremes and asked team members to explain why one of them thought it was a 1 while another thought it was a 5. On the rare occasion that there was any misunderstanding, we ran the poll again for that set of questions.

Some results from the Team and Technical Agility poll.

What we found after the first assessment was that there was still a lot of SAFe® terminology that people didn’t understand. (Based on this and similar feedback, Scaled Agile recently updated its Business Agility assessment with simpler, clearer terminology. This is helpful for organizations that want to use it before everyone has been trained or even before they’ve decided to adopt SAFe.) So, for the next assessment, we created a glossary of definitions, and before each set of questions was scored, we reminded the team of key terminology definitions.

The other learning was that for some of the questions, team members didn’t have relevant experience and therefore scored a 1 (false), which distorted the assessment. Going forward, we asked team members to skip a question if they had no experience with it. We also took a short break between the assessments. And of course, no workshop would be complete without a feedback session at the end, which helped us improve each time we completed the assessments.

Here is a quote from one of the ARTs:

“As a group, we found the Agile Assessment a really useful exercise to take part in. Ultimately, it’s given our supporting team areas to focus on and allowed us to pinpoint areas where we can drive improvements. The distributed scores for each question are where we saw a great deal of value and highlighted differences in opinion between roles. This was made more impactful by having a split of engineers and supporting team roles in the session. The main challenge we had about the session was how we interpreted the questions differently. To overcome this, we had a discussion about each question before starting the scoring, and although this made the process a little longer, it was valuable in ensuring we all had the same understanding.”

Post-assessment Findings

We shared each ART’s individual results with its team members so that they could consider what they as an ART could improve themselves. As a LACE, we aggregated the results and looked for trends across the four ARTs. Here’s what we presented to the leadership team:

  1. Observations—what did we see across the ARTs?
  2. Insights—what are the consequences of these observations?
  3. Proposed actions—what do we need to do as a LACE and leadership team? We used the Growth Recommendations to provide some inspiration for the actions.

We then made a commitment to the teams that we would provide feedback from the leadership presentations.

Next Steps

We need to run the assessments across the other 11 ARTs and then repeat the assessments every two to three Program Increments.

You can get started with Measure and Grow, including the updated Business Agility assessment and tools on the SAFe® Community Platform.

About Darren


Darren is a director at Radtac, a global agile consulting business based in London that was recently acquired by Cprime. As an SPCT and SAFe® Fellow, Darren is an active agile practitioner and consultant who frequently delivers certified SAFe courses. Darren also serves as treasurer of BCS Kent Branch and co-authored the BCS book, Agile Foundations—Principles, Practices and Frameworks.
