How Do We Measure Feelings? – SAFe Transformation

This post is part of an ongoing blog series where Scaled Agile Partners share stories from the field about using Measure and Grow assessments with customers to evaluate progress and identify improvement opportunities.

As business environments feature increasing rates of change and uncertainty, agile ways of working are becoming the dominant way of operating around the globe. The reason for this dominance is not that agile is necessarily the “best” way of working (agile, by definition, embraces the idea that you don’t know what you don’t know) but because businesses have found agile better-suited to addressing today’s challenges. Detailed three-year plans, extensive Gantt charts, and work breakdown structures simply have less relevance in today’s world. Agile, with its emphasis on fast learning and experimentation, has proven itself to be more appropriate for today’s unpredictable business environment.

Agility Requires Data You Can Trust

Whereas a plan-driven approach requires an extensive analysis phase, today’s context demands frequent access to high-quality data and information to facilitate quick course correction and validation. One of these critical sources of data is targeted assessments. The purpose of any assessment is to gather information. And the quality of the information collected is a direct result of the quality of the assessment. 

Think of an assessment as a measuring tool. If we were studying a physical object, we might use measuring devices to assess its length, height, mass, and so on. Scientists have developed sophisticated definitions of many of these physical characteristics so we can have a shared understanding of them.

However, people—especially groups of people—are not quite so straightforward to measure: particularly if we’re talking about their attitudes and feelings. It’s not really possible to directly measure concepts like culture and teamwork in the same way we can measure mass or length. Instead, we have to look to the discipline of psychometrics—the field of study dedicated to the construction and validation of assessment instruments—to assist us in measuring these complex topics.

Survey researchers often refer to an assessment or questionnaire as an “instrument,” because the purpose is to measure. We measure to learn, and we learn to apply our knowledge in pursuit of improvement. This is one reason why assessment is such an integral part of the educational system. Properly designed, assessments can be a powerful tool to help us validate our approach, understand our strengths, and identify areas of opportunity.

Ensuring Quality is Built into the Assessment

Since meaningful information is so critical to fast inspection and adaptation, it’s important to use high-quality assessments. After all, if we’re going to leverage insights from the assessments to inform our strategy and guide our decisions, we need to be confident we can trust the data.

How do we know that an assessment instrument is measuring what it purports to measure? The answer is to use care when designing the assessment tool, and then to use data to provide evidence of both its validity (accuracy) and reliability (precision). Here’s how we ensure quality is built into our assessment.
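
As a concrete illustration of what “reliability” means in practice, here is a minimal sketch (in Python, using made-up responses) of Cronbach’s alpha, a widely used internal-consistency statistic in psychometrics. The data, item wording, and interpretation threshold are illustrative assumptions only, not the actual statistics or criteria behind the Business Agility assessment.

```python
def cronbach_alpha(responses):
    """responses: one list of item scores per respondent (all the same length)."""
    k = len(responses[0])  # number of items

    def variance(values):  # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_variances = [variance([r[i] for r in responses]) for i in range(k)]
    total_variance = variance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Hypothetical 1-5 answers from four respondents to three related survey items
sample = [
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
]
print(f"Cronbach's alpha: {cronbach_alpha(sample):.2f}")
# A common rule of thumb treats values around 0.7 or higher as acceptable consistency.
```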

Step 1: Prototype

All survey instrument development starts with a measurement framework. When Comparative Agility partnered with SAFe® to design the new Business Agility assessment, subject matter experts leveraged their experience from the original Business Agility survey to explore enhancements. 

The original Business Agility survey had generated a variety of important insights and proved to be incredibly popular among SAFe customers. But one area of potential improvement was the language used in the assessment itself. Customers wanted to leverage a proven SAFe survey to understand an organization’s current state, without first requiring the organization to have gone through comprehensive training. With the former Business Agility survey, this proved difficult, since the survey instrument often referred to SAFe-specific topics that many had not been exposed to yet.

To address this issue, subject matter experts (SPCTs, SAFe Fellows) teamed up with data scientists from Comparative Agility to craft SAFe survey items that would be meaningful at the start of a SAFe implementation, while avoiding terms that would require prior knowledge. This work resulted in a prototype survey or “minimum viable product.” 

Step 2: Test and Validate

Once the new Business Agility survey instrument was developed, we released it to beta and began to collect data. Several people in the SPCT community were asked to participate in a pilot. In follow-up interviews, respondents were asked about their experience with the survey. Together with respondents, the survey design team, and additional subject matter experts, we examined the results. (We also received external feedback from a Gartner researcher to help improve the nomenclature of some of the survey items.) Only once the team is satisfied with the reliability and validity of the beta survey instrument is it ready for production.

Step 3: Deploy and Monitor

Even after the Business Agility survey instrument reaches the production phase, the data science teams at Comparative Agility and Scaled Agile continuously monitor the assessment for data consistency. A rigorous change management process ensures that any tweaks made to survey language post-deployment are tested to confirm they don’t negatively impact its accuracy.

Integrating Flow and Outcomes
Although validated assessments are a critical component of a data-driven approach to continuous improvement, they’re not sufficient. To gain a holistic perspective and complete the feedback loop, it’s also important to measure Flow and Outcomes. 

Flow
Flow metrics express how efficient an organization is at delivering value. When operating in complex environments characterized by uncertainty and volatility, flow metrics help organizations measure performance across the end-to-end value stream so they can identify impediments to agility. A more comprehensive overview of Flow metrics can be found in the SAFe knowledge article, Metrics.
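
To make the idea concrete, here is a minimal sketch (in Python, with hypothetical work items and dates) of how two common flow measures, flow time and throughput, might be calculated from start and completion timestamps. It is an illustration only, not the SAFe-prescribed calculation or tooling.

```python
from datetime import date

# Hypothetical work-item records: (id, started, completed)
items = [
    ("F-101", date(2021, 3, 1), date(2021, 3, 18)),
    ("F-102", date(2021, 3, 4), date(2021, 3, 12)),
    ("F-103", date(2021, 3, 8), date(2021, 3, 29)),
]

# Flow time: elapsed days from start to completion for each item
flow_times = [(done - started).days for _, started, done in items]
avg_flow_time = sum(flow_times) / len(flow_times)

# Throughput: completed items per unit of time (here, a 30-day window)
window_days = 30
throughput = len(items) / window_days

print(f"Average flow time: {avg_flow_time:.1f} days")
print(f"Throughput: {throughput:.2f} items/day")
```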

Outcomes
Flow metrics may help us deliver quickly and effectively, but without understanding whether we’re delivering value to our customers, we risk simply “delivering crap faster.” Outcome metrics address this challenge by ensuring that we’re creating meaningful value for the end customer and delivering business benefits. Examples of outcome metrics include revenue impact, customer retention, Net Promoter Score (NPS), and Mean Time to Resolution (MTTR). 

Embracing a Culture of Data-Driven, Continuous Improvement

It’s important to note that although data and insights help inform our strategy and guide our decisions, to make change stick and ultimately to drive sustainable cultural change, we need to appreciate that data is a means to an end.

That is, data—even though it’s validated, statistically significant, and of high quality—should be viewed not as a source of answers, but rather as a means to ask better questions and uncover new insights in our interactions with people. By having data guide us in our conversations, interactions, and how we define hypotheses, we can drive a culture of inquiry and continuous improvement. 

Just as a survey helps us better understand how we feel, an assessment provides us with an opportunity to interact in a more meaningful way and increase our understanding. The data itself is not the goal but a way to help us learn faster, adapt more quickly, and remove impediments to agility.

Start Improving with Your Own Data

As 17 software industry professionals noted some twenty years ago at a resort in Snowbird, Utah, becoming more agile is about “individuals and interactions over processes and tools.” 

To start your own journey of data-driven, continuous improvement today, activate your free Comparative Agility account in the Measure & Grow area of the SAFe Community Platform.

About Matthew

Matthew Haubrich is the Director of Data Science at Comparative Agility. Passionate about discovering the story behind the data, Matt has more than 25 years of experience in data analytics, survey research, and assessment design. Matt is a frequent speaker at numerous national and international conferences and brings a broad perspective of analytics from both public and private sectors.


Honest Assessments Achieve Real Insights

In this post, I share my experience of running a series of Measure and Grow assessments at a government agency in the UK I’m working with—including the experiments that we decided to run and our learnings during the SAFe transformation process.

The last year has been a voyage of discovery for all of us at Radtac. First, we had to figure out how to deliver training online and still make it an immersive learning experience. Then, we needed to figure out how to do PI Planning online with completely dispersed teams. Once that was sorted, we entered a whole new world of ongoing, remote consulting that included how to run effective Measure and Grow assessments.

The agency has already established and runs 15 Agile Release Trains (ARTs). We agreed that we wouldn’t run assessments for all 15 ARTs because we wanted to start small and test the process first. Therefore, we picked four ARTs to pilot the assessments, undertaking only the Team and Technical Agility and Agile Product Delivery assessments.

Pre-assessment Details

It was really important that each ART we selected had an agility assessment pre-briefing where we set the context with the following key messages:

  1. This is NOT a competition between the ARTs to see who has the best assessment.
  2. The assessments will support the LACE in identifying the strengths and development areas across the ARTs.
  3. The results will be presented to leadership in an aggregated form. Each ART will see only their results; no individual ART results will be shared with other ARTs.
  4. The results will identify where leadership can remove impediments that the teams face.
  5. We need an honest assessment to achieve real insight into where leadership and the LACE can help the teams.

In addition, prior to the assessments, we asked the ARTs to:

  1. Briefly review the assessment questions.
  2. Prioritise attendance for core team members, ensuring a cross-section of the team.

Conducting the Assessment

The assessment was facilitated by external consultants to provide some challenge to the responses. We allotted 120 minutes for both the Team and Technical Agility and Agile Product Delivery assessments, but most ARTs completed them within 90 minutes. We used Microsoft Teams as our communication tool and Mentimeter.com (Menti) to poll the responses.

Each Menti page had five to six questions that the team members were asked to score on a scale of 1 to 5, with 1 being false, 3 being neither false nor true, and 5 being true. To avoid groupthink, we didn’t show the results until every member had scored all of the questions. Because Menti shows a distribution of scores, where there was a range in the scoring, we explored the extremes and asked team members to explain why they thought it was a 1 while others thought it was a 5. On the rare occasion that there was any misunderstanding, we ran the poll again for that set of questions.

Some results from the Team and Technical Agility poll.

What we found after the first assessment was that there was still a lot of SAFe® terminology that people didn’t understand. (Based on this and similar feedback, Scaled Agile recently updated its Business Agility assessment with simpler, clearer terminology. This is helpful for organizations that want to use it before everyone has been trained or even before they’ve decided to adopt SAFe.) So, for the next assessment, we created a glossary of definitions, and before the teams scored each set of questions, we reminded them of some of the key terminology definitions.

The other learning was that for some of the questions, team members didn’t have relevant experience and therefore scored a 1 (false), which distorted the assessment. Going forward, we asked team members to skip a question if they had no experience with it. We also took a short break between the assessments. And of course, no workshop would be complete without a feedback session at the end, which helped us improve each time we completed the assessments.
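
For readers who want to reproduce this kind of aggregation outside a polling tool, here is a minimal sketch (in Python, with invented questions and scores) of how we treated the results: averaging the 1-to-5 scores per question, counting “no experience” answers as skipped rather than as a 1 (false), and flagging a wide spread of scores as a prompt to explore the extremes.

```python
# Hypothetical raw poll responses; None means the team member skipped (no experience)
responses = {
    "We build quality in with automated testing": [5, 4, None, 3, 5],
    "We demo integrated work as a full ART every iteration": [2, None, None, 4, 3],
}

for question, scores in responses.items():
    answered = [s for s in scores if s is not None]  # exclude skipped answers
    average = sum(answered) / len(answered)
    spread = max(answered) - min(answered)  # a large spread invites discussion of the extremes
    print(f"{question}: avg {average:.1f} from {len(answered)} answers (spread {spread})")
```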

Here is a quote from one of the ARTs:

“As a group, we found the Agile Assessment a really useful exercise to take part in. Ultimately, it’s given our supporting team areas to focus on and allowed us to pinpoint areas where we can drive improvements. The distributed scores for each question are where we saw a great deal of value and highlighted differences in opinion between roles. This was made more impactful by having a split of engineers and supporting team roles in the session. The main challenge we had about the session was how we interpreted the questions differently. To overcome this, we had a discussion about each question before starting the scoring, and although this made the process a little longer, it was valuable in ensuring we all had the same understanding.”

Post-assessment Findings

We shared each ART’s results with its team members so that they could consider what they, as an ART, could improve themselves. As a LACE, we aggregated the results and looked for trends across the four ARTs. Here’s what we presented to the leadership team:

  1. Observations—what did we see across the ARTs?
  2. Insights—what are the consequences of these observations?
  3. Proposed actions—what do we need to do as a LACE and leadership team? We used the Growth Recommendations to provide some inspiration for the actions.

We then made a commitment to the teams that we would provide feedback from the leadership presentations.

Next Steps

We need to run the assessments across the other 11 ARTs and then repeat the assessments every two to three Program Increments.

You can get started with Measure and Grow, including the updated Business Agility assessment and tools on the SAFe® Community Platform.

About Darren

Darren is a director at Radtac, a global agile consulting business based in London that was recently acquired by Cprime. As an SPCT and SAFe® Fellow, Darren is an active agile practitioner and consultant who frequently delivers certified SAFe courses. Darren also serves as treasurer of BCS Kent Branch and co-authored the BCS book, Agile Foundations—Principles, Practices and Frameworks.


Creating Your PI Backlog Content – Agility Planning

Glenn Smith and Darren Wilmshurst with Radtac, a Scaled Agile Partner, co-wrote this blog post. 

At the conclusion of Program Increment (PI) Planning, we’re always reminded of something one of our colleagues says. There’s much to celebrate because we’ve created a set of committed plans. But first we have to complete a retrospective of the PI Planning event (cue groans from everyone in the room) and “start preparing tomorrow” for the next PI (more groans).

Moreover, the critical path for any PI Planning is the creation of the content, suitably refined and prioritized. Without this, we can’t do any planning! But what does this look like in practice? 

This blog post is aimed at coaches who need to think about the content preparation for the next PI. By that we mean SAFe® Program Consultants (SPCs) supporting the Agile Release Train (ART) and Release Train Engineers (RTEs). But more importantly, Product Management (PM) and System Architects (SA) need to create, refine, prioritize, and socialize the content supported by Product Owners (POs) and Business Owners (BOs). We will explore each of these roles in turn during the course of this post. 

The traditional siloed hierarchy of organizations can engender a ‘this isn’t my job’ attitude. Yet many people and roles need to work together to create a compelling backlog that delivers economic benefits and satisfies your customers.

The visual model below is a high-level view of the intensity of the preparation activity for each of these roles. It isn’t meant to represent the number of hours. That is, high intensity does not mean 100 percent of your time; we just expect more time spent on preparation while recognizing that there will be other things to be done.

Preparation intensity for specific roles.

You will also notice that there is a significant spike in preparation during the Innovation and Planning (IP) Sprint for PM, BOs, POs, and the Teams. This is when PI Planning happens.

Product Management and System Architect

PM and the SA will follow a similar pattern to each other, as their roles are two sides of the same coin—one focused on the outward market and the other technically oriented. They will collaborate and work closely to make sure their respective needs are met and the right balance of work is correctly scheduled.

Crafting backlog items for an ART, whether they are Business Features or Enabler Features, follows a pattern of Creating, Refining, Prioritising, and Socialising. While overly simplistic, each step could map to one of the first four iterations of a PI. In the first half of the PI, expect PM and the SA to be looking to the future. This will include looking at upcoming Epics, decomposing existing Epics, and reviewing the ART roadmap and associated Architecture Runway.

A common pattern is to see poorly defined Features with weak benefit hypothesis statements and acceptance criteria. The effort it takes to do this well shouldn’t be overlooked: the work involved isn’t just writing Features down in your Agile Lifecycle Management tooling, but also working with BOs, talking to a wider stakeholder cohort (including customers), and reviewing market trends. As PM and the SA improve their own understanding of the value proposition and scope, people on the ART can more easily deliver against it. Through the PI, their effort tapers as other cohorts take the backlog content and prepare for PI Planning.

Business Owners

BOs are key stakeholders and critical friends of the ART. As such, they gradually experience an increasing demand on their time to support creating backlog content throughout the PI—with the most involvement happening during PI Planning. As a cohort, BOs are available when needed by the likes of PM, and actively participate in the System Demos every iteration. Here, they not only get to see the progress of delivery but give feedback to help PM and the POs inspect and adapt the backlog.

We recommend that prioritization be a ‘little and often’ affair. And as it is a team sport, BOs must attend these sessions (these are the little spikes on the BO line in the model).

Product Owners

In a scaled environment, POs serve both the team and the train. In the initial periods of the PI, as you might expect, the PO has a team execution focus and also needs to support PM with Feature creation and refinement. As the content starts to get into better shape for the upcoming PI Planning, PO involvement increases, but with a shift in focus to Feature elaboration and decomposition into draft user stories that they later socialize with the team.

The Team

Through most of the PI, the team is execution-focused, although on hand for those ad hoc, short whiteboard sessions with PM, SAs, and POs. Larger demands on the team’s time should be scheduled like any other demand on their time—after all, work is work! This will be done through the use of exploration enablers in a future PI, or spikes and innovation activities that occur during the IP iteration. Either way, the outcome is gaining knowledge that reduces the uncertainty of future work.

The team’s involvement, however, peaks during the IP iteration when the POs socialize the upcoming backlog of work—the Features and the draft stories they have created. It is during the preparation for PI Planning that the team takes time to understand what is needed and answer the questions that require an “I’ll look in the code” investigation.

Release Train Engineer and Scrum Master

Hey wait, you didn’t forget about the RTE and Scrum Master (SM), did you? Surely they are just facilitators, we hear you say, what do they have to do with backlog items? But let’s think about this. As facilitators at the train or team level, they are key advocates for the improvement of flow and value delivery. Therefore, it is not unreasonable to expect them to create improvement items that require effort from the teams during the PI. And we know that anything that requires effort from the teams should be planned accordingly.

The items that the RTE and SM will bring to the table for inclusion will likely come from team retrospectives, the Inspect and Adapt problem-solving workshop, or from insight gained from activities like the SAFe® DevOps course.

Creating Content During PI Planning

During each PI Planning session, PM presents the vision, which highlights the proposed features of the solution, along with any relevant upcoming Milestones. While some may feel that at this point in the proceedings the content creation is over for PM, there is actually still work to do. During the planning, there will likely be scope negotiations and prioritization calls needed as the teams get deeper into understanding and scheduling in their breakout sessions.

Similarly, the BOs have a role in adaptive content creation too. Beyond providing the business context in the briefings, they will work with the team to discuss the needed outcomes from the work. And they’ll support PM and the SAs in adapting the scope from what was originally crafted—because tradeoffs need to be made during planning. Discussions with the teams during the assignment of Business Value could influence what gets produced in the upcoming PI too.

While the POs and the Teams need to sequence and plan their stories to maximize economic results, there will almost certainly be variability of scope that will need to be accommodated as new information emerges. This will involve further elaboration, negotiation, planning, and reworking of the content during PI Planning.

In addition, the model shouldn’t be followed religiously, but used to identify who needs to be involved, when, and how much focus the different roles on the train need to spend to make this happen. While putting an emphasis on the quality of the backlog items is going to help your ART, it alone won’t fix your delivery problems; it will, however, act as a key enabler.

It is important to give a government health warning at this stage: context is king! While we have given our view on the preparation activities and their intensity, your context will provide a different reflection. In fact, when creating this post, we each had a slightly different approach to prioritization based on our respective experiences. Neither is right or wrong; each is a reflection of the clients we have worked with. So please treat the model we have created as a ‘mental model’ and something you can use with your trains to frame a discussion.

The pattern, while broadly accurate, will change in some situations, particularly if you are preparing for a train launch and this is your first PI. Here, the cadence may be condensed and more focused, but this will be guided by the quality of the backlog content you already have.

A final thought, and back to our colleague who says that “PI Planning starts tomorrow.” So does PI execution. There’s no point in having teams commit to the plans they have created and then not executing on them. Otherwise, what was the point of PI Planning in the first place?

If we’ve piqued your interest, check out this post about changing a feature in the middle of the PI. It’s a question we always get asked when we teach the Implementing SAFe® class.

About Glenn

Glenn Smith is a SAFe Program Consultant Trainer (SPCT), SPC, and RTE working for Radtac as a consultant and trainer out of the UK. He is a techie at heart, now with a people and process focus, supporting organizations globally to improve how they operate in a Lean-Agile way. You will find him regularly speaking at conferences and writing about his experiences to share his knowledge.


We’re Giving More Than a Donation for Pride Month – Agility Leadership

I wanted to share a learning moment my colleagues and I at Scaled Agile had recently. June is Pride Month, and some employees requested that we modify our logo to include the rainbow. This request led to an internal debate about whether altering our logo was a trivial act or a meaningful symbol.

People raised valid points. “Others are doing it. Why aren’t we showing our support?” and, “We don’t do enough externally to support the Lesbian, Gay, Bisexual, Transgender, Queer, Intersex, Asexual (LGBTQIA+) community, so changing our logo feels like an empty gesture.” Ultimately, we decided not to modify our logo but instead encourage an employee-driven campaign that the company could share on its social channels.

Personally, I saw the request to alter our logo as a non-issue. Here’s why: As an openly gay male executive at Scaled Agile, I lead one of our largest global regions. No one has ever questioned my capabilities, and I’ve always felt accepted. As a leader here, I have opportunities all the time to lead by example. And I consistently get feedback from employees that they appreciate my approach. People who know me professionally and personally know I don’t have a “work Brendan” that’s different from my “personal Brendan.” My customers know this too. I’ve always been proud of this, and I feel totally supported in this regard at Scaled Agile. 

Scaled Agile participates in Pledge 1% Colorado, and every year we donate a significant part of our time and profits to lots of good causes. While we haven’t yet focused on the LGBTQIA+ community, we do give back to many other underrepresented communities through volunteering and donations. Few companies of our size have matched our commitment to giving back. Our company was founded by a strong team, and we’ve never wavered in our support for the gay community.

Early on in Scaled Agile’s existence, we chose to hire the best talent. And we ended up with a large and enthusiastic LGBTQIA+ employee base. I’m here to say that you can find a place to hang your rainbow hat here with us. Fostering a welcoming workplace where LGBTQIA+ people feel safe, supported, and trusted is giving back, and it’s worth getting loud about. I’m fortunate that I’ve always found these qualities in my employers; I vet them in that regard. Providing an environment where LGBTQIA+ people can grow their skills in a welcoming way is worth more than any donation we could make to an LGBTQIA+ organization. 

Many young LGBTQIA+ people struggle and wonder whether they’ll have a safe future. Showing them that we can thrive and choose whatever career path we want is very important to me. There are LGBTQIA+ adults who go to work every day living a tale of two selves: they are fearful, and rightfully so. When people are forced to hide who they are, they miss out on the right to be their authentic selves, and out of preservation, they show up as a different self. I’ve seen the pain this causes. I’m committed to continuing to play a strategic role in growing this company so that more people can enjoy a safe, fun, and respectful workplace. As a member of the LGBTQIA+ community, I think this is the best giveback we can provide.

I’m a big proponent of providing donations to communities in need. You hope your money goes to the right people at the right time for the right reasons. And you trust that the organization is using your funds wisely. But controlling your contribution to the LGBTQIA+ community by hiring us, no questions asked, and providing us with an amazing, supportive team of colleagues and customers, elicits a tremendous feeling of pride in me.

It can be risky for leaders like me to pen posts like this because they’ll stick with you forever. But leading by example means being vulnerable. We should celebrate who we all are together as well as the fact that our company is having a big impact by offering more than just words or donations. I’ll participate in developing our more concrete LGBTQIA-focused initiatives, and in the meantime, we’ll keep on giving.

About Brendan Walsh

As an active member of the Colorado tech startup community, Brendan has enjoyed growing some of the most successful Colorado-based companies for 25 years and counting. He lives in Denver, along with his partner of 16 years, Aaron. The two have had the privilege of living abroad for several years and always looked forward to bringing their life experiences back to Colorado. Their four-legged, rescued son, Rex, rules the house—just to be clear.


Three Steps to Prepare for a Successful Value Stream Workshop – SAFe Transformation

The Value Stream and Agile Release Train (ART) identification workshop is one of the most critical steps in generating meaningful results from your SAFe transformation. That’s because it enables you to respond faster to customer needs by organizing around value. This workshop can also be one of the hardest steps. It’s complex and politically charged, so organizations often skip or mismanage it.

A savvy change agent would invest in the organizational and cultural readiness to improve the chances of its success. Attempting to shortcut or breeze through change readiness would be the same as putting your foot on the brake at the same time you’re trying to accelerate. Get this workshop right, and you’ll be well on your way to a successful SAFe implementation.

Why Is It So Difficult? 

Aside from the complex mechanics of identifying your value streams, there is also a people component that adds to the challenge. Leaders are often misaligned about the implications of the workshop, and it can be tough to get the right participants to attend. For example, a people leader could soon realize that ARTs may be organized in a way that crosses multiple reporting relationships, raising concerns about their direct reports joining ARTs that don’t report to them.

In reflecting on my battle scars from the field, I’ve distilled my advice to three steps to prepare the organization for a successful workshop.

Step 1: Engage the right participants

The Value Stream and ART identification workshop can only be effective and valuable if the right audience is present and engaged. This is the first step to ensure the outcome of the workshop solves for the whole system and breaks through organizational silos.

“… and if you can’t come, send no one.” —W. Edwards Deming

The required attendees will fall into four broad categories:

  • Executives and leaders with the authority required to form ARTs that cut across silos.
  • Business owners and stakeholders who can speak to the operational activities of the business, including ones with security and compliance concerns.
  • Technical design authorities and development managers who can identify impacted systems and are responsible for the people who are working on them.
  • Lean-Agile Center of Excellence and change agents supporting the SAFe implementation and facilitating the workshop.

Use some guiding questions to identify the right audience for the workshop within your organization. Are the participants empowered to make organizational decisions? Do the participants represent the whole value stream? Is the number of attendees within a reasonable range to make effective decisions?

Step 2: Build leadership support and pre-align expectations

To support engagement and address potential resistance, I recommend performing a series of interactions with leaders in advance of the workshop. In such interactions, the change agent would socialize a crisp and compelling case for change in the organization, supporting the “why” behind running the workshop.

The change agent needs to be prepared to address leader trepidation about the possibility of having their reporting-line personnel on ARTs that they don’t fully own.  Most compelling is a data-based case made by performing value-stream mapping with real project data to expose the delays in value delivery due to organizational handoffs. 

Interaction opportunities can include one-on-one empathy interviews, attending staff meetings, internal focus groups, and overview sessions open to all workshop participants. 

I highly advise setting expectations with leaders in advance of the workshop. This will help them understand the workshop implications, help identify potential misalignment or resistance, and coach them in how to signal support for the workshop purpose.  

The following are useful expectations to set with the participants in advance to help shape how they view the upcoming workshop:

  • Allow the designs to emerge during the session. This is meant as a collaborative workshop.
  • Expect to be active and on your feet during the session, actively contributing to the designs.
  • Be present and free up your schedule for the duration of the workshop as key organizational decisions are being made.
  • Alleviate the anxiety of broad, big-bang change by clarifying that they get to influence the implementation plan and timing to launch the ARTs.
  • Address the misconception about organizational change by explaining that ARTs are “virtual” organizations, and that reporting lines need not be disrupted.

Step 3: Prepare the workshop facilitators

A successful Value Stream and ART identification workshop will have a main facilitator, ideally someone with experience running this workshop. Additionally, you’ll need a facilitator, typically an SPC, for every group of six to eight attendees. Prior to the workshop date, schedule several facilitator meetings to prepare and align everyone on the game plan. This will go a long way in helping your facilitators project competence and confidence during the workshop. Discuss the inherent challenges and potential resistance, and how the facilitators can best handle such moments. Share insights on change readiness based on the leadership interactions and empathy interviews. Finally, prepare a shared communication backchannel for facilitators, and build in sync points during the event to ensure alignment across the groups.

While these simple steps and readiness recommendations don’t necessarily guarantee a successful workshop, they’re a great starting point. You’ll still need to understand the mechanics of identifying value streams. This is what Adam will cover in the next post in our value stream series. Look for it next week.

In the meantime, check out the new Organize Around Value page on the SAFe Community Platform.

About Deema Dajani

Deema Dajani is a Certified SAFe® Program Consultant Trainer (SPCT).
Drawing on her successful startup background and an MBA from Northwestern University’s Kellogg School of Management, Deema helps large enterprises thrive through successful Agile transformations. Deema is passionate about organizing Agile communities for good and helped co-found the Women in Agile nonprofit. She’s also a frequent speaker at Agile conferences and most recently contributed to a book on business agility.


The Power of Informal Learning Networks

I can remember the exact moment when I went from being a transactional learner to a lifelong learner. I was in a meeting with my leader at that time, checking in on how things were going. “Just keep doing what you’re doing,” was his response. I don’t know if any of you have heard those six words in a corporate setting, but they were life-changing for me in terms of learning.

When I heard those six words, my immediate thought was that I didn’t want to. It felt like my learning journey was about to stall. With that in mind, I started to think long and hard about what I wanted to pursue next in my career, and what I needed to learn in order to get there. Knowing that there were no current opportunities for formal, external training, I had to find another way to continue my learning journey.

Through these reflections, I realized that I didn’t always have to attend external training or a conference to keep learning. Don’t get me wrong, I’m grateful for all of the events I’ve had the opportunity to attend, all the times I shared what I learned with my colleagues, and how doing that helped me deepen my learning.

My aha moment came when I started to think about how I could learn from other associates in my enterprise and share what I learned with them. What I didn’t expect was that while learning from others, I uncovered a wealth of knowledge and experience in my own enterprise that was way beyond my expectations. And here I was, just starting to tap into it!

My first learning network

It was a typical cold and snowy day in January in Chicago when I started my first conversation around creating an informal learning network. What happened as a result forever changed how I approached learning. Another Agile coach in a completely different business unit and geographic location reached out to me to inquire about some of the workshops that I was creating and facilitating. Throughout our conversation, he shared some of the amazing things he was doing to coach his Lean-Agile transformation, and connected me with some other coaches and trainers in the organization. The more we collaborated, the more we learned from and with each other, and the more excited we were to start additional learning networks within and across our business units.

Fast forward more than three years and a move to another company, and I’m still part of a number of informal learning networks with many of my colleagues from that organization. Every time we learn something new that we feel would be beneficial to the others in the network, we share it. And we learn more every time we share in these moments.

What is a learning network?

If you were to research the words “learning network” via books or an online search, you might come up empty. There isn’t much out there on the topic. In fact, I was excited one day to see “learning network” listed in the index of one of my learning books. But it pointed me toward networks in general, which wasn’t helpful. Not long after that I was telling a colleague about one of my informal learning collaborations and I called it a learning network. It just seemed like the right way to describe it.

So, here’s my personal definition: A learning network is a community of people with a passion for learning and growing. Often, these are formal gatherings; you’ve probably been a part of one at some point in your career. Now, let’s extend that definition to an informal learning network where a community of people catalyze learning in and through others across and beyond their enterprises. I drafted this broad definition based on my own personal experiences reading books and articles, watching videos, and through lots of conversations with colleagues around the world.

Now that we’ve got a working definition, let’s dive into exactly what comprises an informal learning network.

Characteristics of informal learning networks

The best way I’ve found to describe these learning networks is to share the questions people in these networks are curious about. So, here’s my synthesis of a lot of research around how we share what we learn across enterprises.

And here’s something else I’ve learned about informal learning networks that grow over time. The most important skill you need to improve as a learner is to start asking questions like:

  • How do I learn faster?
  • What will I do about it? (This happens when you realize you want to learn something and no one in your network has that skill.)
  • What more can I be doing?
  • What can I change?
  • How do I sharpen my skills in this area?

Learning networks are successful in part because of some informal assumptions: an open-door policy (everyone is welcome), no rules, and no planned start or perceived end.

Sharing and reflecting

There is a flow to a typical conversation where people share and then reflect.

I know what you’re thinking: “How do people in these networks do their day-to-day work and still have time for these network activities?” 

Engaging and spending time within these networks is not a time-consuming effort that is separate from current initiatives. Rather, it complements and enhances current delivery. Imagine that you are interested in working on a specific feature, yet don’t have all of the knowledge and experience needed to accomplish it. Rather than pursuing something else to work on, you become curious about who in your network, or enterprise, may have the skill you need and would be open to offering you the opportunity to learn from them. This is one of the best ways to create learning organizations and extend them across an entire enterprise to create a continuous learning culture.

Your personal learning journey

I believe learning within an enterprise takes on many forms, shapes, and sizes. I believe that the learning networks I’ll be introducing to you in this blog series are the best-kept secrets in enterprises today. And I also believe that each and every one of you, as change leaders, is best positioned to tap into these networks to create a continuous learning culture.

So, my challenge for you is to start thinking about your own personal learning journey and how these learning networks can help you along the way.

Continue your personal learning journey by reading the second post in my series about how learning networks emerge, the third post about how to uncover those networks, and my final post about connecting your learning networks to SAFe.

About Audrey Boydston

Audrey Boydston is a senior consultant at Scaled Agile and an experienced SPCT, Lean-Agile coach, trainer, and facilitator. Her work focuses on continuous learning, building fundamentals, re-orienting around principles, and helping clients—from senior executives to developers—build networks and communities that support their transformations.


Aligning Global Teams Through Agile Program Management: A Case Study – Agile Transformation

Like many organizations, Planview operates globally, with headquarters in Austin, Texas, and offices in Stockholm and Bangalore. About two years ago, we launched a company-wide initiative to rewire our organization and embrace Agile ways of working—not just in product and R&D, but across every department and team, starting with marketing. We developed three go-to-market (GTM) teams, whose goals and objectives centered around building marketing campaigns to create a pipeline for sales. Each team aligned to a different buyer group, with members from product, marketing, and sales.

The challenge: integrating international teams in our Agile transformation

Like many organizations, we struggled to align and execute our marketing programs across our international teams, defaulting to “North-America-first efforts” that other regions were then left to replicate. As we built out these new groups, we considered how to best include our five-person team of regionally aligned field and demand marketers in Europe, the Middle East, and Africa (EMEA).

At the beginning of our Agile transformation, the EMEA marketers were often misaligned and disconnected from big-picture plans. The EMEA teams were running different campaigns from those in North America. Before forming cross-functional GTM teams, the EMEA team had to individually meet with the different functions in marketing, product marketing, and other departments. The extra complications of time zones and cultures also made it difficult to get things done and stay on strategy.

With team members feeling disconnected, we at Planview suffered from lower-impact campaigns and less-than-ideal demand generation. To succeed in our Agile transformation journey, it was critical to properly align the international team through an integrated Agile program management strategy.

The approach: forming and integrating the EMEA team into Agile program management

While the three GTM teams had dedicated cross-functional members representing demand generation, content strategy, and product marketing, it was clear that assigning an EMEA team member to each of these teams wouldn’t solve the problem. Each EMEA marketer is organized by region and language, not by GTM Agile Release Train (ART), so we needed to develop our own EMEA Agile program that would meet the challenges and achieve the needed international alignment.

Working with our Chief Marketing Officer and other stakeholders, we determined that we would continue to align our EMEA team by region/language. Now that the GTM teams were formed (with each team having all the necessary people to deliver end-to-end value), the EMEA team could meet with each team in the context of the prioritized strategic initiatives. Drawing on our local expertise, we could weigh the campaigns from the three GTM teams against each other to determine which would drive the most pipeline and impact in each region. This structure enabled EMEA marketers to opt into GTM campaigns that were regionally impactful, instead of creating standalone campaigns. This approach has been a success. At our last PI planning event, EMEA progressed from just replicating campaigns into co-planning and co-creating the campaigns that were of local interest and fit.

By including the distributed teams in Agile program management, we achieved better alignment as a global marketing team; gave our EMEA marketers the opportunity to leverage fully supported, regionally impactful campaigns; and ultimately, achieved better results for our demand generation campaigns.

Learning 1: When starting the process of shifting to an Agile approach, there is an advantage in letting the GTM teams form, storm, and norm before involving the EMEA team. That delay allows the EMEA team to finish up previously committed (sales-agreed-upon) deliverables. It also gives the team and the sales stakeholders time to observe and see the benefits of Agile GTM teams without feeling that they aren’t getting the support they were expecting.

The practice: virtual, inclusive PI planning

Our model continues to evolve in a positive way. We’ve now been through five PI planning events and have transitioned from a “one EMEA representative” approach to including our full marketing team in a truly global planning event.

What does a global planning event look like in practice?

When our EMEA team started to participate in PI planning, we had one representative join to understand the process and feed the critical milestones into the team’s plans. We then matured to the full team joining remotely, which meant that we needed to create a system that would enable inclusive planning across continents.

We created a process of “continuous planning.” First, our global team would plan “together,” from Austin and virtually via web conferencing for EMEA. Our EMEA teams would log off during the evenings in their time zones, and the US team would continue to plan with recorded readouts. The next morning, while the US teams were offline, the EMEA teams would listen to the readouts, adapt plans accordingly, and provide their own readouts on changes made once the team was back together during mutual business hours. While tricky at first, this process ensured that everyone was engaged and that all teams’ contributions were heard and considered. Most recently, we’ve conducted fully virtual planning in mutual time zones.

Learning 2: The gradual inclusion in PI planning meant the GTM teams were already well-established and well-versed in the process. The maturity of the teams and the process helped a lot in the inclusion of the international team.

The results: greater alignment, faster time-to-market, better campaigns

The impact of our EMEA Agile program can be broken down into three main categories: alignment, time, and utilization.

The collaboration between the EMEA and GTM teams has created significantly stronger connection and alignment, evidenced by both the improvement in campaign quality and our working practices. Our teams have increased visibility into shared and separate work and developed a better understanding of how decisions impact overarching shared goals.

Our Agile ceremonies, combined with the use of Planview LeanKit, have served as a catalyst and a framework to bring us closer together. Communication is easier, more frequent, and more productive, as everyone is aligned to the same goals and plans and has visibility into each other’s progress, needs, and capacity. The greater team can now make conscious trade-offs based on mutual priorities, which enables the EMEA team to focus on the right things and deemphasize asks that are not aligned to the goals. EMEA marketers feel more involved and have an important seat at the table. That is both motivating and effective.

Learning 3: Ceremonies and visual planning tools are absolutely necessary, but only really benefit teams with the right enablement and coaching. To this day we still meet weekly with our Agile coach to refine our LeanKit board and discuss WIP limits, sizing, retros, etc.

From a time-to-market perspective, we’ve seen substantial improvements. Before aligning EMEA to the GTM teams, there were delays in deploying campaigns because EMEA would “find out” about campaigns rather than being part of them from the beginning. Now, the team can give early input and feedback on how a campaign could be adapted to provide the most impact for EMEA, then roll it out more quickly. As a concrete example, we have reduced the time for campaign tactics to go live from three months to three weeks.

The volume and quality of campaigns and campaign materials has increased significantly as well. In the past, the EMEA team often made do with the materials (especially translated materials) that were available, not the assets that were ideal. There were campaign ideas that we could not realize due to a lack of localized material. Without dedicated resources for EMEA, the team had to share creative and translation services with North American providers, who often needed to prioritize programs led by corporate/North America.

Now that EMEA has full visibility into the North American programs, they know what kind of material is in development. They give input on what is needed to execute campaigns in global markets and when delivery will happen. That means EMEA campaigns can begin at almost the same time as the North American ones, and their marketers can prepare for when translated assets and other materials will be available.

Overall, by transforming our EMEA Agile program, the region went from running one or two campaigns each PI to running five campaigns per PI. EMEA marketing went from approximately four to six new localized assets/materials per year to 18 to 20. We added three translated, campaign-specific landing pages per language. And, most importantly, we’re beginning to see direct indications of pipeline improvements.

Agile program management can be challenging with international, distributed teams. By integrating our global team members into our planning processes from the beginning of our Agile transformation, we’ve been able to achieve measurable benefits across the marketing organization.

About Verena Bergfors

Verena is the Marketing Director for Planview’s EMEA markets. She’s from Germany but moved to Sweden around 10 years ago and has been with Planview for over four years. Prior to living in Sweden, she worked in Shanghai for seven years where she held positions in marketing and sales. Verena’s true passion is languages and she enjoys working on diverse international teams.


Use WSJF to Inspire a Successful SAFe® Adoption – Agile for Business

By definition, Weighted Shortest Job First (WSJF) is a prioritization model used to sequence jobs to produce maximum economic benefit. WSJF relies on the Cost of Delay and job size to determine an item’s priority. Think of the Cost of Delay as the price you pay for not delivering a feature to the end user in a timely manner. For instance, if you know a competitor is also working on an initiative similar to yours, you can acknowledge the risk of losing customers if the experience you deliver pales in comparison.
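
For illustration, here is a minimal sketch of the WSJF calculation applied to a few hypothetical features. In SAFe, the Cost of Delay is typically estimated as the sum of three relative scores (user-business value, time criticality, and risk reduction/opportunity enablement), and WSJF is that sum divided by job size; the feature names and numbers below are invented.

```python
# Hypothetical features scored in relative units (e.g., modified Fibonacci)
features = [
    # name,                   business value, time criticality, risk/opportunity, job size
    ("Large platform rework", 13,             8,                5,                20),
    ("Small UX fix",           8,             8,                3,                 3),
    ("Reporting enhancement",  5,             3,                8,                 8),
]

def wsjf(business_value, time_criticality, risk_opportunity, job_size):
    cost_of_delay = business_value + time_criticality + risk_opportunity
    return cost_of_delay / job_size

ranked = sorted(features, key=lambda f: wsjf(*f[1:]), reverse=True)
for name, *scores in ranked:
    print(f"{name}: WSJF = {wsjf(*scores):.2f}")
# Note how the small UX fix outranks the much larger platform rework.
```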

I like to refer to WSJF as a tool that helps you take the emotion and politics out of a decision and rely on facts instead. WSJF allows us to take an economic view and not be swayed by the loudest complainer (aka squeaky wheel) or the person with the longest title in the room.

I’m sure we can all relate to being in a prioritization meeting, whether before, during, or after a SAFe® adoption, where people demand that their feature be the top priority. But what they can’t clearly explain is why they want it, why that feature is important to the business, end user, or buyer, and how it aligns with the organization’s purpose. Going into the WSJF exercise, participants often assume that the biggest, most-needed items will find their way to the top of the priority list, and they’re surprised by which features actually get selected. Remember, in Agile, we like to show value quickly. So, WSJF also helps participants identify features that could be too large to ever get to the top, forcing them to break down the work into more manageable batches.

Here’s an example from a retail company I worked with. The company’s top priority at the time was a single-sign-on (SSO) integration feature that was considered critical to improving the user experience. SSO was all everyone was talking about. So, after going through the WSJF exercise, a marketing executive was surprised that aspects of their SSO integration weren’t at the top of the list. The conversation surrounding this—which, by the way, involved the squeaky wheel and the person with the longest title—enabled participants to break the work down into smaller batches. Everyone involved in the discussion got the context they needed to see that by changing the scope of the work, teams could provide incremental value to customers more quickly. We then went back through the WSJF exercise with the smaller batches of work, some of which moved to the top of the priority list while others moved further down.

Going through this exercise gave participants the context and information to explain:

  • Why and when items were being delivered
  • How customers would be delighted with ongoing improvements versus one large release in the future

Having those key stakeholders in the room allowed us to work through the tough conversations and gain alignment more quickly. That’s not to say the conversations were any easier. But showing how the larger batches of work could be broken down into small batches provided proper context based on end-user value and faster delivery.

In the end, WSJF doesn’t only help an organization deliver the most value in the shortest amount of time; it also fosters decentralized decision-making. This requires your RTEs or Product Managers to be steadfast in their approach to ensure trust and belief in the process. When members of the team see leadership supporting this new approach, even when that leader’s feature doesn’t land at the top, it goes a long way in building the trust and culture needed to inspire a successful SAFe adoption.

About Elizabeth Wilson

For more than a decade, Elizabeth has successfully led technology projects, and her recent experiences have focused on connected products. As an SPC, she’s highly versed in Agile methodology practices, including SAFe, and leverages that expertise to help companies gain more visibility, achieve faster development cycles, and improve predictability. With a wealth of practical, hands-on experience, Elizabeth brings a unique perspective and contextual stories to guide organizations through their Agile journey.

The SAFe® Coach

Coaching appears in the Scaled Agile Framework® (SAFe®), but there isn’t one place where we define the SAFe coach. According to our recent internal survey of 2,500 SAFe Program Consultants (SPCs), over 70 percent are actively engaged in coaching SAFe implementations. In general, a SAFe coach is a servant leader, someone who can facilitate both change and collaboration at scale. A SAFe coach embodies the attributes of our Lean-Agile mindset, as well as a learning and growth attitude, to lead by example while continuously fostering positive change.

A servant leader

There are many roles or labels a coach could hold within a SAFe transformation, including Scrum Master, Release Train Engineer, and SPC. Regardless of the roles and functions associated with a coach, coaching takes place throughout all of SAFe’s core competencies. And all of the competencies share one characteristic: they guide organizations toward better ways of working so that we can compete and thrive in the digital age by quickly responding to market changes and emerging opportunities with innovative business solutions.

Within the competencies, servant leadership is unique. It’s a behavior designed to continuously serve the teams, enable product delivery, and benefit the overall enterprise. Using active listening and the collective mindset and principles of SAFe, a coach acting as a servant leader becomes more aware of, and more connected to, the people within their organization. This approach brings neutrality to the enterprise, so people feel safe voicing their thoughts and opinions and can collaborate to realize the benefits of shared understanding, innovation, learning, and growth. By embodying servant leadership, a SAFe coach can help the organization increase SAFe’s effectiveness and relentlessly expand collaboration, coordination, knowledge transfer, and consistent information flow.

A facilitator of change

SAFe’s dual operating system enables efficiency, stability, and the speed of innovation. Another key benefit of this system is the ability to evolve the social structure organized around value: the Agile Release Train (ART). This is brilliant. As Chuck Pezeshki writes in The Power of Empathetic Leadership in an Evolving World, “How you set up your social structure is THE critical factor in how knowledge and synergies in design will be created. Using Conway’s Law, one can predict a priori what the functional form of a design will be. It matters who talks to who.”

By organizing around value and creating the social structure around the ART, we design a social structure that encourages knowledge sharing and helps us use empathy and design thinking to innovate around the value we’re creating for customers. Pezeshki further states, “Inside the social structure, empathy is the dynamic that creates synergies in design. While empathy is always valuable, even within the simplest social structures—people that connect are much more likely to transfer correct information to each other—it is essential in creative enterprises.” In our world of ever-evolving complexity, creating social structures can help us transfer information that is accurate, reduce risk, and continuously increase the speed of delivering value. 

Coaching these social structures, or networks, is part of SAFe’s approach to the dual operating system. These networks take the form of virtual organizations such as ARTs or Solution Trains. Within these virtual organizations, coaches need personal agency, grounded in their own values and behaviors, so they can coach empathetically and address change quickly. Coaches are empowered to serve the social structure by using these synergies, creating knowledge flow, and facilitating positive growth and change.

A facilitator of collaboration at scale

According to Jean Tabaka, author of Collaboration Explained: Facilitation Skills for Software Project Leaders, there’s an intangible component of team and organizational power fostered by collaboration. That component brings out the best in people and, in turn, the value the enterprise delivers to its customers.

As agilists, it may seem obvious that collaboration is a key component of building software and systems. It is, after all, called out in the third value of the Agile Manifesto: customer collaboration over contract negotiation. If your leaders ask you, as a coach, why this is so important, there’s a deeper understanding of the why reinforced in Jean’s book, and one with decades of history in Lean, highlighted in Ikujiro Nonaka’s book The Knowledge-Creating Company. Nonaka calls out tacit knowledge: the valuable, highly subjective insights and intuitions that live in people’s heads and are difficult to capture and share.

Collaboration is what helps make that knowledge transferable, or explicit, within your enterprise. Organizations also benefit from converting tacit knowledge into explicit knowledge because it offers a way to express the inexpressible. Building software and systems at scale is obviously complex; no one person has all the knowledge. It’s inherently shared across people, value streams, and the enterprise. It takes thousands, if not tens of thousands, of people to build today’s computers, cars, aircraft, and satellites.

This evolution of knowledge sharing helps you grow your business through a culture of understanding and alignment with your company’s overall strategic goals. The expanded mission of SAFe 5.0 is to enable the business agility that enterprises need to compete and thrive in the digital age. Facilitating tacit knowledge is critical, and it is supported through the behaviors and mindset of all the SAFe competencies. It’s also measured through the latest Measure and Grow assessments of how well the enterprise is progressing toward overall business agility.

The SAFe coach in the enterprise needs to find ways to coach and continuously improve collaboration. Sharing that tacit knowledge through SAFe events is one of the most powerful ways to allow people to continually innovate and gain the knowledge and collective mindset to integrate and deliver the highest level of value. 

Coaches need coaches, too

A SAFe coach needs an extensive toolbox to evolve and accelerate their organization’s SAFe implementation. There are many tools to support coaches in learning how to be servant leaders and facilitate change and collaboration. None of this comes easily. Can you imagine intuitively knowing how to do all of this in your first SAFe coaching gig? Coaches need coaches, too. We learn from others, and they learn from us. If we model the SAFe coaching behavior we’d like to see evolve, it becomes a self-reinforcing learning experience that enables coaches to help each other, cultures to evolve, and people to be happy and heard. All of this can help us truly become the relentless learning organization that solves some of the world’s largest problems. We could all use a little of that right now.

Here’s where you can learn more about evolving your SAFe coaching expertise: 

Stay tuned for upcoming blog posts on SAFe coaching.

About Jennifer Fawcett

Jennifer is a retired, empathetic Lean and Agile leader, practitioner, coach, speaker, and consultant. A SAFe Fellow, she has contributed to and helped develop SAFe content and courseware. Her passion and focus have been on delivering value in the workplace and creating communities and culture through effective product management, product ownership, executive portfolio coaching, and leadership. She has provided dedicated service in these areas to technology companies for over 35 years.

Shared Objectives and Collaborative Sense-Making: Key to Success – SAFe Best Practices

Welcome to the third post in our series about SAFe best practices for creating a healthy relationship between product owners (POs) and product managers (PMs), one that helps achieve business agility and drive product success. You can check out the previous post here.

In this post, we’ll dive into examples of how you might find yourself in the feature factory described in our first post. Plus, we’ll offer some thoughts about how to get back to strong PO/PM relationships and focus on delivering value.

Scenario One: Who are you talking to?

Picture this: You’re a PM at a company that’s designing a new app. In the spirit of customer centricity, you’re actively gathering feedback. You’re regularly talking to a couple of hyper-engaged customers from Company X. It’s a large company, and you’ve got a strong relationship with one of its internal champions who’s easy to get in touch with. During one of these customer feedback sessions, a developer on your team joins the call, too. Afterwards, while you’re confident things are headed in the right direction, your developer wonders out loud why the customer thinks feature A is great if she really hasn’t used it yet.

Contacting the same customer for feedback on every new thing your company is working on isn’t the best approach. Why? If you’re not careful, you might end up treating her as representative of all your other customers with the same job title. That’s likely not the case, so you should also be talking to customers at different companies with different needs for whatever it is you’re building. Another thing to think about: if it’s just you talking to the same customer all the time, you’ll often believe that your organization is always building the right thing. Inviting other people in your organization to collaborate with you on those customer calls might uncover a different perspective, as your developer did in the previous scenario. Two or three perspectives in the room add up to more than the sum of the individual viewpoints.

Scenario Two: What are you measuring?

Picture this: Your organization developed a page on a website and is seeing 20-percent user adoption on that page. As the PM, you think that’s successful because you’re hitting a key performance indicator (KPI) showing that 20 percent of people logging in are using the page. But your PO isn’t convinced, because the metric reflects the same handful of people logging in repeatedly, not 20 percent of overall users, which is how they interpreted the KPI of “20-percent adoption.” To address the data conflict, you and the PO look at the feature to see what the details of the KPI were. Turns out there aren’t any details, nor is there any mention of baseline metrics. So neither of you knows whether the page was successful, whether you should pivot or persevere, or what to compare the data to. And the team’s efforts turned into a feature factory because the work was really about getting features out the door rather than achieving the goals behind them.
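
The disagreement in this scenario comes down to what the numerator and denominator of “adoption” actually are. Here’s a rough sketch, using made-up login data, of the two readings the PM and PO each had in mind; the event list, user IDs, and percentages are purely illustrative.

```python
# Hypothetical login events for one month: (user_id, visited_new_page)
events = [
    ("u1", True), ("u1", True), ("u1", True), ("u1", True),
    ("u2", True), ("u3", False), ("u4", False), ("u5", False),
    ("u6", False), ("u7", False), ("u8", False), ("u9", False),
    ("u10", False), ("u2", True), ("u1", True), ("u2", True),
    ("u11", False), ("u12", False), ("u13", False), ("u14", False),
]

# Reading 1 (the PM's): share of logins (sessions) that touched the page
session_adoption = sum(1 for _, visited in events if visited) / len(events)

# Reading 2 (the PO's): share of distinct users who ever used the page
all_users = {user for user, _ in events}
page_users = {user for user, visited in events if visited}
user_adoption = len(page_users) / len(all_users)

print(f"Sessions touching the page: {session_adoption:.0%}")  # 40% in this made-up data
print(f"Distinct users using it:    {user_adoption:.0%}")     # ~14%: the same handful of people
```

Same raw data, two very different stories, which is exactly why the KPI needs its definition and baseline written down up front.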

It seems obvious that PMs and POs need to agree on what measurements translate to a successful outcome, and how they’ll be tracked and interpreted. But we often skip that part, assuming it will all be clear when the time comes. In practice, that assumption is exactly what leads to data conflicts. Aligning on metrics is hard work. You may not even know exactly how to measure success yet, and you might have to slow down before you speed up, but agreement up front is critical.

Get smart 

The same applies to determining the goal of the work and the value to the customer using SMART objectives. Many of us are familiar with these. But really, how often do you and the team take the time to get alignment and a clear, shared understanding of all the details of your objective? Is it specific, measurable, achievable, realistic, and time-bound (SMART)? Or is it just specific but not measurable?

And remember, it’s OK to fail, as long as you’re learning and applying what you learn to improve. The learning part is only possible in a culture that allows for failure, for example, when you’re not hitting the metrics. It’s a culture where people don’t feel the need to massage the data or avoid committing to a measure from the beginning. Failing is part of the innovation process. If the culture doesn’t allow for that, you’ll get a culture of people who skip that step on purpose to make it look like they’re successful.

The trap of the feature factory is easy to fall into. I hope you now have a clear path to:

  • Improve how you collect and perceive customer feedback
  • Write clearer KPIs with baseline metrics
  • Clearly define and align on SMART goals across teams

Armed with this information, you can better recognize the trap, and use your PO/PM relationship to stay out of it. 

Check back soon for another post in our PO/PM success series.

About Lieschen Gargano Quilling

Lieschen Gargano is an Agile coach and conflict guru—thanks in part to her master’s degree in conflict resolution. As the scrum master for the marketing team at Scaled Agile, Lieschen loves cultivating new ideas and approaches to Agile to keep things fresh and exciting. She also has a passion for developing best practices for happy teams to deliver value in both development and non-technical environments. Fun fact? “I’m the only person I know of who’s been a scrum master and a scrum half on a rugby team.”
