
How to get answers quickly and avoid features that flop


Lessons from the one-question surveys of Instagram, LinkedIn & Trainline

One of the most common things I hear from founders is:

We keep launching features that have no impact. They just flop and we don’t know why.

This is what’s known as a feature factory, something Marty Cagan covers in his book Inspired.

It’s where teams are in a constant state of busyness, launching feature after feature. Features that often end up having no impact on core product metrics.

One way to stop this is to test assumptions before building something. To uncover what we’re silently assuming when we think of a ‘good’ idea to build. And to test whether these things are actually true to de-risk the idea.

However, user research can be tricky for a number of reasons. One of those is response rates, i.e. getting people to actually give you feedback. The average response rate across all surveys, for instance, is a measly 5–30%.

The key question to answer is: how do you make sure your questions get answered? How do you engage the lower-intent cohorts who don’t want to speak to you, or fill out long forms?

Enter: one-question surveys.

I first learned about these from Teresa Torres, Product Discovery Coach and author of Continuous Discovery Habits. I went on her Assumption Testing course last year and loved it (highly recommend her book and courses).

I learned that one-question surveys are used to test assumptions. The benefit is that they’re simple and embedded in the user experience. These two things lead to higher response rates than a survey sent via email.

They’re also quick to launch and quick to collect data (if they’re put in the right place). You can get lots of responses within a short time due to those higher response rates.

Other common one-question surveys include: exit surveys, ‘where did you hear about us?’ (WDYHAU) surveys, net promoter score (NPS) surveys, brand perception surveys and employee pulse surveys.

There are a few key rules about one-question surveys that can help boost response rates and get you accurate feedback:

- The question needs to be simple (if someone needs to read it twice, that’s bad)
- They ask about actual behaviour or opinions (no ‘could you’ or ‘would you’ statements)
- Ideally, they need to be embedded in the user experience (if you ask somewhere that is too far removed you may impact your results)

They’re so subtle that users don’t even think twice about answering them.

So subtle that they’re hard to spot.
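To make the ‘embedded in the user experience’ rule a bit more concrete, here’s what a module like this could look like in code. It’s a minimal sketch in TypeScript/React under my own assumptions: the component name, props and copy are hypothetical, not how LinkedIn, Instagram or Trainline actually build theirs.

```tsx
// Hypothetical in-feed one-question survey card (not any company's real implementation).
import React, { useState } from "react";

type Answer = "Disagree" | "Neutral" | "Agree";

interface OneQuestionSurveyProps {
  statement: string;                  // e.g. "The content on my feed is valuable"
  onAnswer: (answer: Answer) => void; // send the response to your survey/analytics backend
  onDismiss: () => void;              // always give people an easy way out
}

export function OneQuestionSurvey({ statement, onAnswer, onDismiss }: OneQuestionSurveyProps) {
  const [answered, setAnswered] = useState(false);

  // Once someone answers, thank them and get out of the way.
  if (answered) {
    return <p role="status">Thanks for your feedback 🙌</p>;
  }

  return (
    <div role="group" aria-label="One-question survey">
      {/* A short framing line so people know the question is for them */}
      <p>What’s your view?</p>
      <p>{statement}</p>
      {(["Disagree", "Neutral", "Agree"] as Answer[]).map((answer) => (
        <button
          key={answer}
          onClick={() => {
            onAnswer(answer);
            setAnswered(true);
          }}
        >
          {answer}
        </button>
      ))}
      <button onClick={onDismiss} aria-label="Dismiss survey">✕</button>
    </div>
  );
}
```

The point isn’t the markup; it’s that one statement, labelled answers, an easy exit and an instant thank-you are the whole surface area. Anything more and it stops being a one-question survey.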

However, I have spotted three interesting and different examples from leading tech companies: LinkedIn, Instagram and Trainline.

We’ll run through them to look at how companies can research a range of hypotheses and assumptions quickly, from brand, to feed relevance, to environmental sustainability.

Let’s go 🔎

LinkedIn’s feed relevance research

First up, LinkedIn.

I was on the LinkedIn feed the other afternoon and almost scrolled past a verrrry subtle module with the statement:

The content on my feed is valuable
Disagree — Neutral — Agree

The question was wedged between two posts, and quite hard to see on dark mode. It stuck out slightly with the bold copy in a larger font than a normal feed post, and a white LinkedIn logo contrasting against the black.

No doubt this is minimal in its execution. So minimal, in fact, that perhaps the team went too far on the lack of context.

Typically these questions start with a phrase that helps the user see this is a question for them, like:

- To what extent do you agree with the statement
- What’s your view on…
- What are your thoughts on the statement

Or, a direct question:

- Are you interested in this content?
- Are you a software engineer?
- How easy do you find saving files?

With LinkedIn’s question, perhaps the low word count works better for them. But I do wonder whether some context is missing.

By nature, one-question surveys should be a no-brainer. But this one took a little more brain power from me.

When I read “The content on my feed is valuable”, my internal monologue starts:

Well, some is valuable.
But not all.
Depends who I see.
Hmm…

Then, when I have to choose agree, neutral or disagree, my thoughts are neither one nor the other. I agree to some extent, but I would 100% agree if the question were phrased:

“The majority of the content on the feed is valuable”

Or

“I get value from the LinkedIn feed”

Or if I were able to select ‘somewhat agree’. The fact that only 3 of the 5 responses are labelled, and there are no extremes of ‘100% agree’ or ‘100% disagree’, makes it harder to choose.

So, some learnings here:

- Make sure people can see your survey (in both light and dark mode)
- Make sure the question is direct, with some context
- Make sure that you appropriately label the options

Then, you’re more likely to produce valid results.

Next, onto a personalisation question I happened across on Instagram.

Instagram’s ad relevance

Next, in the evening I was scrolling on Instagram — for a bit of escapism.

Wedged between two feed posts — much like LinkedIn — I saw a question in a grey module:

Want more relevant ads?
Tell us what you’re into and we can show you more of what you like
Shoes, clothing, public services, social media, home and garden, health and wellness, bedding and linen [show more interests]

My first thought: What is a public service ad? 🥲

My second thought (as a product growth person, not as an Instagram user): I wonder whether they actually use this to personalise, or whether it’s research for something else.

In any case, what I liked here versus LinkedIn is:

- It’s a little easier to see: the grey module doesn’t look like a post, so it interrupts the mental model of the feed, perhaps leading to higher response rates
- It’s concise: there’s enough context in there to motivate me with the why
- It’s direct: the question ‘want more X’ always works, if the X is relevant

What I wasn’t so hot on:

- The categories themselves were so broad, none of them felt very ‘me’
- I didn’t understand a couple of them (public service ads…)
- There was overlap in the categories: bedding and linen is technically part of home and garden…

I’d have loved to see the categories positioned as desires instead, something like ‘serve me ads for things that:’

- Make me healthier
- Make me more stylish
- Make me fitter
- Are for my pet

Perhaps that would have made me choose them. In actual fact, I ignored this one-question survey. Not because the categories were sub-par, but for one crucial reason:

I don’t want to buy more things.

Therefore I’m not going to help Instagram serve me relevant ads. I also feel like serving me relevant ads is Instagram’s job.

The truth is: I’d prefer to be served fewer ads, save more money and live happier with less consumerism.

Cheers to that 🍻

Trainline’s environmental friendliness question

Next, a curveball question.

I love these: a question you’re not expecting.

I booked a ticket to the airport the other day using my favourite train booking app: Trainline. Trainline is a public company in Europe which generated £327.1 million in revenue in 2023, up 74% YoY due to net growth in ticket sales.

One of those sales, by me.

So there I was, having just purchased my ticket, when my screen greyed out and I saw a little white module below:

What’s your view?

Trainline actively helps me find the most environmentally friendly travel options 🌳
Strongly disagree 1, 2, 3, 4, 5 strongly agree

Now, I love this one for a few reasons:

- It is super easy to see
- It looks clean
- It is easy to exit
- It adds context
- It is easy to answer

So answer it I did.

And I put 1.

Strongly disagree.

Why?

Because never in the UI have I noticed anything about being environmentally friendly.

I’m curious here as to the purpose of the survey. It’s either:

- A brand survey, to see what they come across as
- An assumption test where they want to try to improve how eco they appear, but want to see the baseline first (sketched below)
- Something else (lmk in the comments if you have any ideas)
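If it is an assumption test, ‘see the baseline first’ usually just means turning the raw 1 to 5 answers into a number you can try to move later. Here’s a rough sketch of what that could look like; the responses and the top-2-box cut-off are made up for illustration, not Trainline’s data or methodology.

```ts
// Hypothetical aggregation of 1–5 Likert answers into a baseline metric.
type LikertResponse = 1 | 2 | 3 | 4 | 5;

function likertBaseline(responses: LikertResponse[]) {
  const total = responses.length;
  const agree = responses.filter((r) => r >= 4).length; // "top-2-box": the 4s and 5s
  const mean = responses.reduce((sum, r) => sum + r, 0) / total;
  return { total, agreeShare: agree / total, mean };
}

// Made-up example responses (including a few 1s like mine)
console.log(likertBaseline([1, 2, 4, 5, 3, 1, 4, 2, 5, 3]));
// → { total: 10, agreeShare: 0.4, mean: 3 }
```

Run the same question again after shipping something greener in the UI, and you can see whether the agree share actually moves.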

I love the thank you message too. The brand voice comes through really nicely:

Best. Feedback. Ever 🙌

A phrase that works even if you put a score of 1, like I did…

For me, this question was the quickest and easiest to answer out of the three examples here. It took me no double takes, no second guesses.

I also reckon this UI led to the highest response rate of the three. It is more in-your-face, bright and bold, versus the more subtle versions from LinkedIn and Instagram. Given the volume of users all three have, though, I doubt any of them struggle with response numbers.

In summary, how to run great 1Q surveys: lessons from LinkedIn, Insta & Trainline

Lesson 1: Make it visible — no hiding it in the UI. And make sure you check how it looks on dark mode to ensure it is accessible.

Lesson 2: Give context — do people know the question is for them? Is there a short framing phrase, or a direct question, so they instantly understand what you’re asking?

Lesson 3: Make the question a no-brainer — do your responses cover all the options? Are the responses labelled? Does your question use language that customers use themselves?

Lesson 4: Thank genuinely — you could add the ‘why’ or just a nice lil’ note thanking people for their time.

