Tedde van Gelderen

Founder & President

Experience Thinking Maturity: 5 Strategies to De-risk the Experience

How teams figure out their product requirements has followed a distinct pattern over the years, in my experience. Often, when I find myself in a sales situation, the conversation arrives at how the organization's team designed, and before that figured out, what the product should be or do. In short, their requirements gathering approach. To describe that approach I usually talk about a combination of these steps:

Dreaming

This situation occurs when an executive or the CEO has a gut feeling that the product could really benefit from 'X', whether that's a function, some content or a new market. The prevailing sentiment is that because the higher-ups have been in the business for decades, and given their position in the company, their intuition and gut feel are basis enough to invest in the design and construction of this new 'X'. I call that the 'Dream' stage of design.

It certainly helps to dream, and there may well be validity to this feeling of 'this is it'. Usually these ideas are based on what the execs heard in conversations with clients and influential people at customer organizations, and have been mulled over for some time before they're brought forward. Many great product ideas came about this way, and if they survive the later stages, all kudos to the people who had them. They were the ones with the vision.

[Image: Experience Thinking Maturity model]

The flipside to this approach is that many teams, more than I'd like to see, stop asking questions. When I ask 'Why did you do this project?', 'What are the goals?', 'Do you know what success looks like?', the only answer I hear is 'It came from the top.'

When probed further, the team has many questions about the why and how of the project, but no one asked the execs or, even better, raised their hand and said 'If I don't understand why we're doing this, I'm not inclined to waste my time on it', or something to that effect. I do draw a distinction between understanding and agreeing. It's not always a matter of agreeing with the direction, though that would be a bonus, but rather of having some reasoning as to why you're doing this project and what the product / service goals are (from a UX perspective, alongside the business, content and technical angles).

Assuming

This is the stage where more heads are better than one. The team is tasked to improve the user experience or add feature 'X' and happily dives in. Many hours are spent brainstorming, reasoning and debating whether a person would do ABC and really needs feature X to do it well; in that specific use case, our feature X needs to look and behave like this or that to make it work. It's the 'Assume' stage of design.

The good thing about this stage is that you explore many options that could enhance the experience. By not going with a single, sometimes narrow vision, you increase the chances that the team and other internal stakeholders will come up with more and better ideas on how to make this the best experience possible. In fact, it's likely the most favoured design stage of all. It is here where we spend the majority of our design time and get the most satisfaction. Why? We can demonstrate to ourselves and our peers our prowess in problem solving, show empathy for the user and come up with innovative ideas to solve the problem. We are not hindered much by external input or other data that could change the thinking happening within the team. All good things for the decision-making process and the feeling of making progress.

Ultimately we run out of time here, as there is no natural end to coming up with alternative usage scenarios or edge cases. When time and/or budget does run out, we build what we have designed to that point. That is not a bad thing in itself, unless in the course of all this we start to wonder: did we design the right product? And how do we really know? That thinking leads inevitably to the next stage.

Comparing

Once fear, uncertainty and doubt get a foothold in the team about whether we actually designed the right thing, attention turns at some point in the design process to comparing the offering with both the marketplace and the competition. In the digital space, uptake and performance are often measured by web analytics: what is the traffic, and what are the most popular pages? This is compelling information from a numbers perspective, but it gives little insight into why people visit these pages specifically and what their tasks and information needs are at each step of their scenarios and journeys. Precisely the kind of information that often guides good UX design.

In analyzing the market, meaning also gets gleaned from the different market shares each organization holds: the larger the share, the better the product/service must be. This leads to the second area that is often analyzed: the competition. This analysis aims to inform the requirements, and what to do with the experience, by looking at market leaders in your own space or in other markets / locations. The reasoning is that as long as we follow what others do, we'll be ok.

As a consultant I've had the baffling experience of working with two government organizations in one year, where one team was looking at the other team's design and wanted to follow it 'because they are a larger entity, so they must know what they are doing'. Earlier that same year, I'd had conversations with the larger team about how poorly informed their own design was, and about their own realization that they had some way to go before reaching a better experience.

Thinking it's safe to copy your competitor is quite understandable in a market where the leader is looked up to for inspiration: you want to follow the leader. But it doesn't help innovation-oriented thinking, and that is the dilemma here: wanting to be innovative while following the leader and not risking anything bad. In many cases you can't have both.

Asking

Now we're getting somewhere. Asking the customer is simply a great approach! Teams who include this step in their experience design process gain some very deep insights into why a customer buys their product or signs up for the service. It works best with the intent to really understand the product's value: what makes the customer pay attention, why is this the best solution for the problem at hand, and what makes the organization the best fit for this customer? The answers to these questions are all very helpful inputs into what features and content should be part of the product or service. If you don't get someone interested in the product, then it doesn't matter how good the UX is in the end; you didn't get past the first hurdle.

This is why it's not surprising that many teams spend considerable time and money on learning more about the customer. So much so that, in many organizations, there is a sharp drop-off in the attention and budget devoted to the product/service use phase. Many organizations think that focusing on getting customers excited and promoting the product's value will result in overall product success. As if there is no life after the sale.

It is no surprise that you see this phenomenon in UX research as well. Many studies, including ours, show that the emotional response increases during the buying phases and decreases after the actual purchase or sign-up. Many companies fail to 'extend the wow' and instead score increasingly lower during the use phase, unless the support group does a great job; in that case the perceptions are definitely more positive in the later phases of the experience lifecycle.

Including customer understanding in the design creation process is definitely a good thing: it informs priorities, and at minimum this insight gives a better sense of what would work in the marketplace. It answers the question: why would I buy? Clearly helpful in steering product design, but not the only piece we should rely on.

Observing

The 5th step that drives great experiences is often the most difficult to include in the design process: observing users using the product/service in their 'natural environment', i.e. the context of use. What this approach adds to the previous steps is a largely unfiltered view of how end users interact with the product/service. It gives us clear insight into what the user actually uses, how that relates to other activities they do, and how the environment can play an important role in how features and content are consumed and used. The rich picture that emerges from observation makes it much easier to determine critical requirements and priorities, and the proof of why is right there, for all to see (if you have videotaped your observation, that is). It complements the approaches described before.

Why is it so hard to include this step during requirements gathering and design? I've often asked myself that question, as it seems such a logical thing to do. If you want to know how your product is used, just watch the user! Seems simple. The reality is that teams often decide to gather the information they need in other ways, resorting to surveys, support call logs, co-design sessions with the customer, client demos and focus groups, to name a few. Many of these seem perfectly reasonable at first glance, and they are, but for different purposes. Here is why. To understand as fully as possible how users use (and not why; see the Asking section above for that), there is no substitute for observing the user in their environment. People are notoriously poor at describing what they do with a product/service if you only ask them in a survey, focus group or co-design session. You'll get an answer, but it's colored by the users' perception of themselves, by socially accepted attitudes, and by blue-sky 'what-if' wondering that is just plain fun. Once you see the data from these techniques, where you ask rather than observe, it's often hard to separate those colors from the black and white that observation can give you.

This doesn't mean you don't ask; asking is in this list of strategies, after all. The danger lies in ONLY asking and not observing as well. Many product requirements are based on asking internal stakeholders, execs and potential customers, and feature prioritization is based mainly on that. The assumption is that these people can stand in for end users and predict what they do, think and feel. A shaky assumption, to say the least. Certainly not the kind of approach to base multi-million dollar investments on.

Intentional Experiences

With these 5 strategies described, I hope it becomes clear that many organizations use only a subset. The conversation is not really about which strategy is 'better', but rather that if you don't apply them, you leave strategies on the table that are available to sufficiently de-risk the experience.

Anyone who has worked on larger projects that failed in the marketplace simply because the right requirements became clear too late knows this all too well. It's often a conversation about spending now or later, because you will spend if the requirements didn't hit the mark: on fixes to a product already in market, on higher support costs, or on an outright failed product. And I'm sure that was not what we intended.


Tedde infuses Akendi, its services and methodology with his strong belief that customer and user experience design must go beyond a singular product interface, service or content. It should be deeply rooted in an organization's research and design processes and culture, and ultimately be reflected in its products and services. A graduate in Cognitive Ergonomics from Radboud University, the Netherlands, Tedde has more than two decades of experience in experience research, usability testing and experience design in both the public and private sectors. Prior to founding Akendi, Tedde was a founding partner of Maskery & Associates in 2001. He has worked for companies including Nortel Networks, KPMG Management Consulting and Philips Design.

