No research is easy, of course. But it's usually straightforward to get immediate feedback on the way a web page or product looks or feels. Soliciting feedback on copy, and finding out what works and what doesn't, is much slower.
Traditionally, most design teams approve copy like this: an editor makes sure it follows the tone of voice, then the team A/B tests two different versions to see which performs better. Sometimes a brand team gets involved.

There's nothing wrong with that per se. But it doesn't necessarily give you specific feedback on which types of copy work, and why or why not. A/B testing copy gives you a binary outcome: one version works better than the other. It can miss blind spots where copy confuses users, or only does a merely adequate job.
You might even miss places where that copy is actively damaging your brand or user experience.
That's where user testing language comes into play. Instead of being an afterthought, copy questions and tasks should be designed in from the very beginning, as an intrinsic part of your testing process.
This is how you can do it.
The right words at the right time
One of the problems with A/B testing copy is that, while you get an overall result, you don't get specific feedback on what works and what doesn't.
For basic landing pages with very little content, like a form, this isn't necessarily a big problem. But when you start building pages designed to show detail and explain complex topics, like a product page or a feature page, understanding what phrasing works and why becomes much more crucial.
“Not all types of user testing are created equal.”
Yet it's also difficult. Users will often tell you explicitly why they don't like something on a visual or interactive level, but most people aren't language experts. They can't tell you why they don't understand something.
So they'll generally say things like, "I don't get it," or "I don't understand what this feature does." That's vague, but extremely valuable: it tells you your copy isn't doing its job.
Now, if you had just run an A/B test? You'd never get that specific feedback. For many designers, copy testing isn't a priority in usability testing. They focus on the visuals, on whether the CTAs are in the right place, but ignore the copy that would make users want to click that CTA in the first place.
That’s a mistake.