A conversation with Anne Fisher Ravens and Mistilyn Gebow
What was your customer’s challenge? How was it affecting people in the organization?
AFR: Our customer is a global retailer that wanted to improve coordination and visibility across its product innovation efforts. Each of its global regions has its own product and Research and Development (R&D) teams, and each region develops its own product offerings. The teams did not have consistent insight into what other regions were doing, which led to duplicative product innovation, an inconsistent customer experience from one region to the next, and missed opportunities to leverage economies of scale.
Leadership wanted more information sharing and coordination across the global teams without introducing unnecessary constraints or slowing innovation. They engaged Point B to design and develop a framework that would provide the right level of visibility and coordination across these teams.
Is this customer’s need for human-centered change common in their industry? If so, how would you describe what’s happening in their industry, and what it means for the future?
MG: This company excels at bringing a human-centered approach to its customer experience. They invest a lot in understanding customers’ needs. But, like many organizations, they have an opportunity to build this approach more fully into their own internal operations. The team leading this initiative is generally in a position of providing guidance, but this was a newer area for them and required building new understanding. They wanted this new framework to benefit the people who’d be using it every day. We responded by putting those future users at the center of the change in everything we did—in our recommendations, design and testing.
How did you go about developing a framework that works for teams around the world?
AFR: For the overall project, we engaged 50+ stakeholders across North America, Asia, Europe and Latin America, spanning six core product categories. We took an iterative approach to designing, prototyping and testing the framework, the process touchpoints, the digital channels, and all the resources that enable this new level of visibility and coordination.
After getting a few rounds of feedback on presentations and PowerPoint decks, we said, “Let’s test this. Let’s bring it to life in a simulation that allows us to be sure we’ve got it right before we launch.” It’s one thing for people to review a proposed design in a slide deck—there’s value in that—but they can’t really understand what the true impact is going to be in their daily work or in the product development cycle without going a step further.
How did you design the simulation to get the feedback you needed?
MG: We knew that an end-to-end product development cycle would run 18+ months, and we wanted to test the full cycle. So we designed a simulation with low-fidelity prototypes of each resource that’s part of the new process—including a self-serve portal, a mock dashboard that reports on product development updates, and mock email communications with weekly headlines on products as they move from one stage of the development cycle to the next.
To accommodate different time zones, we held two simulation sessions for a dozen people each. Participants joined from each region and core team, including product development, R&D and supply chain reps. Around the world, they sat at their computers as we walked them through a two-hour, time-lapse reveal of our prototype for the 18-month product development cycle. Each time a new activity in the cycle came up, participants could play their roles as if the system were live. This gave them the chance to really experience the change. They were able to give feedback on multiple dimensions—from the tactical (“Yes, introducing this step at this phase of development makes sense”) to the emotional (“This is more work for me, but I see the value in doing it and would adopt it”).
Running a simulation was a new experience for our customer and the lead team. At first, they couldn’t quite envision how we could test a cycle that spans 18+ months. But through the trust we built, along with their commitment to delivering results, we agreed it was worth taking a little more time and effort upfront to engage stakeholders in a more dynamic way.
What did you learn? Did any “aha” moments of discovery come out of this simulation?
MG: We got immediate feedback on a number of important tactical things, including the sequencing of events, regional role differences, and nuances in language such as naming conventions.
But the big “aha” discovery was that the process design shared too much information between teams too soon. We had gone into the design phase hearing that people wanted more information, and that more information sharing would be better. So we built the information sharing to be robust, with more frequent updates and documentation. But once people went through the simulation and saw what that would actually look like, we heard, “Whoa, that’s too much information, and some of it’s too soon. The concept needs to be at least in ‘this stage’ of development before I want to hear about it. Otherwise, it risks becoming noise and I might tune it out and then miss something really important.” This was a key takeaway. If we had gone ahead with what we were hearing initially, or hadn’t pressure-tested the true need, we might have overwhelmed the very people we were trying to help.
AFR: Another win was that participants had their own moments of discovery. We heard the words, “I get it now. I get how this is going to work.” Participants went away understanding the process and the benefits that make it worth their effort to adopt. Senior-level team members and other influencers were able to go back to their teams and tell them, “This isn’t as big a lift as we thought,” and “This is actually going to deliver value.” We had stakeholders all over the world who were willing and prepared to be champions of launch activities for their teams.
How did your ability to empathize and understand shape your approach to solving this challenge?
MG: We began by tuning into the lead team’s own empathy. One of the key feelings that came across was that they truly wanted to design for the product and R&D teams in a way that reflected their real-life, day-to-day ways of working. We felt this called for a more interactive design and development process, so we proposed the simulation. The simulation was a chance to move beyond PowerPoint reviews to more of a conversation.
Responding is part of empathy—that sense of “I’m listening, I’m learning—and I’m responding.” We took the learnings, and in the next iteration, we demonstrated how we had applied them. The vision was to create an enduring, virtuous cycle in product development: “I am helping, here’s how, here’s who;” and conversely, “I am being helped, and here’s how.”
How did your ability to empathize and understand lead to results? What were those results?
AFR: We developed a way to tune into users’ needs, validated the approach of this new global framework, and brought stakeholders along every step of the way so there was buy-in and enthusiasm. Together, we got it right on day one.
Good design is empathic; it needs its future users to be part of the solution. The test and pilot process was an opportunity for the lead team to build confidence and trust in the solution among the teams. Skeptics became advocates, executive leadership took a role in the launch, and teams were ready to adopt this new process on day one. We were able to introduce a new experience and tool set they can use for future efforts, too. By sharing knowledge and having more opportunities to discuss customer insights, teams are equipped to continuously learn and introduce products that will truly delight customers. The company has even expanded the process to include other categories beyond the initial launch.
What might have been missed if this customer had underestimated the human dimension of this challenge—and the solution?
MG: If the process had launched with steps out of sync or with information overload, it would have created confusion and eroded confidence. If it hadn’t been right from day one, it might have been abandoned, because there’s intense competition for the mindshare of these employees and teams. By making the process real and involving the people who will be using it, we gained deeper understanding together. Our simulation process dramatically reduced the room for error and built stakeholder confidence, which is why there was so much excitement and momentum at launch.
If you could leave readers with just one key message about the power of human-centered change, what would it be?
AFR: It might feel like it takes more time and effort to engage people in an iterative approach to design and development—one that puts them at the heart of the process. But testing upfront saves tremendous time in the long run, because you get it right the first time.
You can’t design for people without designing with people. They need to be at the center, not the sidelines, of changes that affect them.