Illustration of a digital survey.

User Experience (UX) Surveys: The Ultimate Guide

by Mads Soegaard | 78 min read

Imagine you’re a business owner who wants to know what’s working and what’s not on your website—and where you need improvements. Sure, there are plenty of research methods you can try—user interviews, usability tests, A/B testing—but read on to see how a user experience (UX) survey helps you gather valuable insights and pinpoint the areas you’ll need to work on to get a site—or any other digital product or service, for that matter—into tip-top shape!

See how UX surveys can offer actionable insights, presenting quantitative and qualitative data that inform decisions.

Video transcript
  1. 00:00:00 --> 00:00:31

    Looking at when to use surveys relative to  the product or service lifecycle? Well, you might have an existing solution and for that you  may well want to consider a survey after every *major release*; or perhaps on a *calendar basis*, every quarter you might run a customer satisfaction or a user satisfaction survey. And that allows you to keep a pulse on things to know how your offerings are faring

  2. 00:00:31 --> 00:01:01

    in your users' or customers' eyes. If you have specific ideas for improvements, you can also ask users about that, ask customers to tell you the sorts of things your competitors are doing that they like or perhaps ask them about *sticking points* in your current solutions: what is it that they find problematic, where would they like to see improvements? And, of course, in those kinds of areas, you are talking more about open-ended. But certainly, if you've got a list of things that your  competitors are doing, it's very easy to ask people

  3. 00:01:01 --> 00:01:33

    whether they would be interested or would find  those particular features useful. For new solutions, you often end up using quantitative research, which is what a survey is, to what we call "triangulate" – get extra data about – back up – qualitative research that you've done. So, you might have gone out and done some contextual inquiry and you might have some really exciting ideas about new product directions, and you want to make sure that that makes sense for the majority of your customer base.

  4. 00:01:33 --> 00:02:01

    You wouldn't just go out on the strength of a dozen interviews and launch a new product or major revisions to a product  or service. So, the use of a survey is really almost essential in those kinds of cases. *Alternatives* – I mentioned contextual inquiry. The great thing about contextual inquiry is that it's grounded. We go out and speak to real people about real situations in a fairly – what's called – *ethnographic way*.

  5. 00:02:01 --> 00:02:30

    So, we're trying to do it in their own settings, where they  would be using this product or service. And a contextual inquiry is *extremely exploratory*. So, if you start hearing about certain ideas on a regular basis, you can start asking *more* about that and try to expand the scope of your inquiries to cover these new concepts and find out a lot more about the product or service that you should be providing, as opposed to the one that you perhaps currently are or were planning to.

  6. 00:02:30 --> 00:03:00

    *Semi-structured interviews* – well, these are a really important part of most qualitative research and, in fact, are used in contextual inquiry as well, but they aren't necessarily as well grounded. We don't necessarily go out into the user's environment to do those, but one of the attractions there is that we can start off in both of these examples – contextual inquiry and semi-structured interviews – start off with a collection of initial questions and then explore from those.

  7. 00:03:00 --> 00:03:30

    So, we might have only a short list of topics that we definitely wanted to cover and we'll let the conversation ramble into interesting connected areas – not just ramble in general, by the way; "interesting connected areas" is an important part of that. You want to make sure that you're still within the focus of your inquiry, of your research. *Card sorting* – it's really good for early research for finding the relationships between concepts. We've got concepts on cards, and we ask people to sort those cards

  8. 00:03:30 --> 00:04:04

    into groups, either of their own creation, so they're allowed to make the groups up themselves – that's called an *open sort* – or a *closed sort*, where we provide the groups and we want to see if people agree with where they're putting things; and in between, of course, those two is something that I call a *hybrid sort*. It has different names. And there are other early-testing tools, which we do talk about elsewhere. Those are, I should say, *tree sorting* or *tree testing* and *first-click testing*, where we're trying out very specific things; we  give users a goal, and we try to see how they

  9. 00:04:04 --> 00:04:31

    address that goal with the solutions that we're thinking of providing. So, in the case of tree sorting, it's actually *menu testing*. So, the tree is the menu, and we say, "Where would you find this?" / "How would you do this on this site?" and you show them  the menus a step at a time. And there is no site yet. There's just a listing of the menu items  in a step-by-step progression. So, they're shown the top-level menus, they're shown the second-level menus, etc., as they navigate through.

  10. 00:04:31 --> 00:05:01

    So, it's really easy to do and you get some really good hard data out of that. And, similarly, with first-click testing, you might have just wireframes or really early prototypes; it can even be sketches, and you ask people to try to achieve a goal with these designs. You record where they click and how they try to achieve that. The first click is actually the most interesting part of that: Where do they focus their attention initially when trying to achieve those goals? So, these are all alternatives to asking people about things.

  11. 00:05:01 --> 00:05:32

    And, of course, in these latter cases we're talking about seeing people do things  rather than asking them their opinions, which is a much more reliable way of getting data – not that surveys are entirely unreliable; that's not the case – but first-hand information about what people  do rather than what they talk about doing is much safer. And this is a pretty typical field-working experience. The guy on the right has a PDA or phone.

  12. 00:05:32 --> 00:05:51

    Hopefully, it's a multiple-choice questionnaire  he's asking because it's really very hard to make notes on a device like that, but this is the kind of situation where you can direct the questioning according to how the participant is answering. So, this is an alternative to surveys.


What are UX Surveys?

UX surveys—or user experience surveys—help you gather information about users’ feelings, thoughts, and behaviors related to a product or service they encounter, and they form part of the broader field of usability research. They focus on understanding how users interact with a system, application, or website, and they’re helpful sources of quantitative and qualitative data that can give you actionable insights to, in turn, inform decisions that boost customer satisfaction as you create a more user-centered design.

Types of UX Surveys

© Interaction Design Foundation, CC BY-SA 4.0

1. Customer Effort Score Surveys (CES)

CES surveys assess how simple it is for customers to complete tasks with what your company offers—a score that tells you whether using your product or getting help from your service team was a breeze or a struggle for the customer. People appreciate straightforward questions, and you, dear UX researcher, will be glad of the straightforward feedback—and the less time you need to spend analyzing it, the better. Note that ease of experience can be more revealing than overall satisfaction, and experts use the Customer Effort Score as a reliable data source.

For instance, after a customer service interaction, the question could be:

"How easy was it to resolve your issue with our customer support?"

  • Very Difficult

  • Difficult

  • Moderate

  • Easy

  • Very Easy 
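To turn answers like these into a single score, one common approach is to map each option to a number and average the results. Here's a minimal Python sketch; the 1-5 mapping is an illustrative assumption, since CES scales vary between teams:

```python
# Sketch: scoring a Customer Effort Score (CES) survey.
# The label-to-number mapping is an illustrative assumption;
# CES scales vary in practice (1-5, 1-7, reversed, etc.).

CES_SCALE = {
    "Very Difficult": 1,
    "Difficult": 2,
    "Moderate": 3,
    "Easy": 4,
    "Very Easy": 5,
}

def ces_score(responses):
    """Average effort score across responses (higher = easier)."""
    return sum(CES_SCALE[r] for r in responses) / len(responses)

responses = ["Easy", "Very Easy", "Easy", "Easy"]  # sample data
print(ces_score(responses))  # 4.25
```

A rising average over successive surveys suggests your product or support is getting easier to use; a falling one flags friction worth investigating.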

2. Customer Satisfaction Surveys (CSAT)

A CSAT survey measures how happy customers are with your company—the main question is, “How satisfied are you with our service?” Answers range from 1, meaning “very dissatisfied,” to 5, “very satisfied.”

CSAT surveys focus on individual interactions—things like purchasing or using customer support—and they use numeric scales to track satisfaction levels over time. These surveys help you understand what your customers’ needs are like—and pinpoint issues with your products or services. They allow you to categorize customers based on their satisfaction levels, too, and that helps with targeted improvements.
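Tracking those numeric scales often boils down to one percentage. A common convention (an assumption here, as exact cutoffs vary by team) counts ratings of 4 and 5 as "satisfied" and reports their share of all responses:

```python
# Sketch: computing a CSAT score from 1-5 ratings.
# Counting 4 and 5 as "satisfied" is a common convention,
# assumed here for illustration; cutoffs vary by team.

def csat_score(ratings):
    """Percentage of respondents who answered 4 or 5."""
    satisfied = sum(1 for r in ratings if r >= 4)
    return 100 * satisfied / len(ratings)

ratings = [5, 4, 3, 5, 2, 4, 5, 1]  # illustrative sample data
print(csat_score(ratings))  # 62.5
```

Computing this per interaction type (purchase, support call, and so on) gives you the per-touchpoint satisfaction tracking described above.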

3. Net Promoter Score Surveys (NPS)

NPS surveys are simple and quick since they’ve got just one question: “On a scale from 0 to 10, how likely are you to recommend this product/company to a friend or colleague?” From the score, you can segment respondents into one of three categories:

First are the Promoters (Score 9-10)—your biggest fans, the ones most likely to recommend your product. Second are the Passives (Score 7-8)—these folks find your product or service satisfactory enough, sure, but the loyalty isn’t there, so watch out: they could switch to competitors with ease. Third are the Detractors (Score 0-6)—and speaking of “watch out,” they’re unhappy customers who—in the golden age of feedback, with everybody able to write about anything—could harm your brand through negative word-of-mouth.

To get a good big-picture view, you can calculate the NPS score by subtracting the Detractors’ percentage from the Promoters’—it’ll give you a good snapshot of customer loyalty and areas for improvement.
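The segmentation and subtraction above fit in a few lines of Python (the scores list is illustrative sample data):

```python
# Sketch: segmenting respondents and computing a Net Promoter Score.
# NPS = % Promoters (9-10) minus % Detractors (0-6); Passives (7-8)
# count toward the total but toward neither percentage.

def nps(scores):
    """Return the NPS for a list of 0-10 ratings."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Illustrative sample: 4 promoters, 3 passives, 3 detractors.
scores = [10, 9, 8, 7, 6, 10, 3, 9, 7, 5]
print(nps(scores))  # 10.0
```

The result ranges from -100 (all detractors) to +100 (all promoters), which is why even a modestly positive number is a useful signal of loyalty.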

4. Close-ended Questions for Quantitative Research

Well-designed, closed-ended questions are nice and easy to answer—they’re ones where users pick from predefined options like checkboxes, scales, or radio buttons. Surveys like these are well suited to collecting data—for example, exit surveys asking users about their shopping experience—and the answers provide actionable data, like customer preferences or common problems. You can “plug” the insights you get into your redesign efforts.

Get more insights on quantitative research in this course on Data-driven Design.

You may ask,

"How satisfied are you with our delivery speed?" 

The options could be:

  • Very Satisfied

  • Satisfied

  • Neutral

  • Dissatisfied

  • Very Dissatisfied

For closed-ended questions, users don’t need to type out their thoughts—they just pick the option that best describes their feelings. It’s a “win-win”: quick for the user and easy for the company to analyze.
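Because the options are predefined, that analysis can be as simple as tallying how often each one was chosen. A minimal Python sketch, with illustrative sample responses:

```python
# Sketch: tallying closed-ended survey responses for quick analysis.
# The responses list is illustrative sample data.
from collections import Counter

responses = [
    "Satisfied", "Very Satisfied", "Neutral",
    "Satisfied", "Dissatisfied", "Satisfied",
]

counts = Counter(responses)
for option, n in counts.most_common():
    print(f"{option}: {n} ({100 * n / len(responses):.0f}%)")
# First line printed: "Satisfied: 3 (50%)"
```

The same tally feeds directly into charts or dashboards, which is exactly what makes closed-ended data so convenient to work with.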

5. Open-ended Questions for Qualitative User Research

While closed-ended questions offer fixed options for quick responses (a great convenience), open-ended questions allow for more detailed, free-form answers. They ask for written responses and so dig deeper into how users feel—and what they expect from a brand and its product or service. It may take more time to analyze the responses you get from this type of survey, but they’re worth the “work” because they offer nuanced insights.

For example, ask a question like “What feature do you wish we had?” and what you get back can lead to ideas for product enhancements that meet users’ needs far better.

When and Why Should You Conduct a UX Survey?

Conducting a UX survey is a strategic decision to understand various aspects of user interaction with a product or service. Here are vital scenarios and reasons for implementing them:

1. To Evaluate Features and Make Enhancements

You may find UX surveys better suited to assessing existing products than ones you intend to develop, and they can gather insights on how well your target audience receives a feature or service. Feedback from such surveys can be a great help in guiding adjustments or additions to your product—if customers think an existing feature could do better on the functionality side, you can leverage that valuable data to bring your product more in line with user needs.

2. To Identify Pain Points

If you’re going to create a user-friendly experience, it’s vital that you spot pain points and design to alleviate them. UX surveys provide direct feedback from users about what’s troubling them—including issues you’re unaware of that make the customer experience less enjoyable or less efficient.

For example, users might point out that they find your checkout process too complicated or that they’ve got trouble finding specific information on your website. Treat insights of this kind like gold; they give you specific areas to focus your improvement efforts on and inform UX strategy decisions that will improve how users interact with your product or service. Addressing these issues fixes problems and shows users you value and act upon their feedback.

3. To Assess Customer Satisfaction

Customer satisfaction is crucial for any business—and dissatisfaction can quickly show up in public feedback. A well-timed UX survey can gauge how well you meet customer expectations after a critical interaction—such as a purchase or customer service call—and there’s value to be had in both the “good” and the “bad.” Positive feedback helps identify what’s working well, while negative feedback highlights issues that need attention.

4. To Evaluate Customer Loyalty

For sure, long-term success hinges on customer loyalty, and NPS surveys—a type of UX survey—help gauge this. When you identify promoters, passives, and detractors, it can help you tailor customer retention and referral strategies, and a dip in loyalty scores is an alert to dig deeper into potential issues and find out how you can solve them.  

5. To Map the User Journey

Journey mapping visually represents a user’s interactions with your product or service. It tracks the entire experience—from the first touchpoint to the final interaction—and a well-designed UX survey can provide insights at multiple stages of this journey to measure things like ease of use. Are customers finding it simple to navigate from one section of your website to another?

Video transcript
  1. 00:00:00 --> 00:00:33

    You know, as human beings we have the strength of  visual reasoning to understand and perceive things visually, and it's just an incredible tool to bring  people together at actually a pretty low cost. If you think about what it takes  to perceive and understand a broad, rich journey, that's a *book*, right? And so, you have to find ways to articulate the customer journey

  2. 00:00:33 --> 00:01:00

    and contact with your company and product  in a way that's succinct and in a way that people can grasp whether they're a designer  or whether they have time to read a big book or not. So, it's a tool that is almost like  a democratization of information in a way. And so, it both spreads the information  and brings people together, and helps you make decisions and perceive things that  might otherwise not be known or understood.

CSAT surveys are great for checking satisfaction at critical touchpoints—like purchase or support—and they fill you in on vital details about fixes to make. Open-ended questions can bring qualitative insights into why users make specific choices, and the answers can fill gaps in the journey map that analytics data might not.

6. To Help during Major Transitions or Updates

If you’re planning a significant change, such as a rebrand or major update, a UX survey is invaluable for assessing customer sentiment and expectations before you roll out the differences. With survey data in hand, you can make adjustments that align with customer needs—and reduce the risk of a negative backlash searing your brand.

7. To Make Continuous Improvements

It may sound like a drag, but the need for improvement never stops. Regular UX surveys create a feedback loop to help you track user sentiment and performance metrics—and they’re powerful tools that allow for ongoing adjustments based on real-world usage.

Video transcript
  1. 00:00:00 --> 00:00:33

    Software grew up in sort of a project world where *business stakeholders defined* a whole big chunk of work to do; we wrote product requirements; the team built it; we learned way late in the process nobody wanted it. And so, we've seen people move more towards an *Agile mindset* or a *continuous improvement mindset* where we're looking at: How do we work in smaller chunks? How do we get feedback more continuously? So, continuous discovery is really just: How do we continuously *infuse  our decisions* about what we're building

  2. 00:00:33 --> 00:01:02

    with customer input? — instead of thinking about it as a project approach where you do some research up front and then you rely on that research for the rest of the project. I think this switch from project to continuous is one of the hardest ones. So, even when people – let's say they start to adopt Scrum as their sort of Agile methodology and they work in a two-week sprint; really all they're doing, what most teams do, is they make the mistake of

  3. 00:01:02 --> 00:01:31

    'Oh, well, I just take my old Waterfall process and I jam it into a two-week cycle, and so I'm still defining upfront two weeks' worth of work and then my  engineers build it and then I do a little bit of research while they're building,  and I define the next two-week iteration.' The challenge with that is that when we do our  research in a project basis even if it's just a two-week project, is that we do the research on the *big questions*, but we don't do the research on all the little teeny tiny questions that come up as we're building,

  4. 00:01:31 --> 00:02:03

    like, 'What do we call this button?' or 'How do we expose this in the interface?'  or 'How should the data model work?' And the example that I give for this is I feel like anybody on the planet could look at their mobile phone and find a dozen apps that they were excited about, which is why they downloaded it. So, they got the big idea of the app right, but then they never used it, because they *tried to* and they got all those little details wrong, and the app didn't quite work as promised. So, a more continuous discovery process is

  5. 00:02:03 --> 00:02:14

    we have to answer those *big questions* like 'What should we be building?'; we also have to answer those *little questions* as we're building so that we get all the little details  right and people actually use our products.

For example, if you notice a slight dip in satisfaction scores for your app’s usability, you can investigate and make adjustments before it flares up into a big issue.

6 UX Survey Best Practices From Experts

Visual representation of 6 UX survey best practices from experts.

© Interaction Design Foundation, CC BY-SA 4.0

1. Make it Quick

Time is precious for pretty much everyone—so show your respondents you value theirs (long surveys can be a real turn-off). A quick and concise survey ensures that the participant stays engaged—so do focus on just the essential questions and leave out any unnecessary ones.

Steps You Can Take

  • Limit your survey to 5-10 essential questions.

  • Use clear and concise language.

  • Preview the survey with a friend or colleague to get feedback on length and clarity.

2. Keep It Relevant

It’s vital to keep your survey questions relevant—on-point questions will get you valuable data in return. If questions stray off-topic, they risk irritating or baffling participants. Keep questions focused to ensure you get the insights your goals require.

Steps You Can Take

  • Define your target audience and goals before you write any questions—and avoid generic questions that don’t relate to the product or service.

  • Focus on specific user experiences that align with your objectives.

  • Provide “not applicable” (N/A) or “don’t know” options for all closed questions.

3. Avoid Bias

Bias—often an unfortunate “byproduct” of being human—can distort the results and lead to misguided conclusions. The objective framing of questions helps you collect unbiased responses, and some of the common biases include:

  • Question order bias: Affects responses based on the sequence of questions.

  • Confirmation bias: Where you just ask questions that affirm what you already believe.

  • Primacy bias: People choose the first options that come up.

  • Recency bias: People are more influenced by their last experience.

  • Hindsight bias: Respondents say events were foreseeable.

  • Assumption bias: Assumes respondents know certain information.

  • Clustering bias: People see patterns where none exist.

Steps You Can Take

  • Avoid leading questions, and use neutral language that doesn’t put any slant or spin on what you’re asking.

  • Consider asking an expert to review your questions to weed out potential bias.

  • Test the survey on a small group of “users” before launching it to see how unbiased it is.

4. Mix Up Your Question Types

While multiple-choice and rating scales excel at gathering numerical data—or more quantitative research data—open-ended questions offer rich, qualitative insights. Get the right blend and you can get a more comprehensive view of customer sentiment.

Steps You Can Take

  • Use a mixture of question types according to the information you need—open-ended questions for in-depth insights and multiple-choice ones for quick feedback.

  • Think about using scale questions to gauge user satisfaction or preferences.

5. Ensure Accessibility

Accessible design is a big deal in any case, and making your survey accessible helps you capture a wide range of perspectives. If you create a survey that’s accessible to everyone—including people with disabilities—you’ll get a more complete and diverse set of insights to gear your decision-making around.

Steps You Can Take

  • Use easy-to-read fonts and adequate color contrast, and make sure you’ve got alternative text for images.

  • Make sure users can navigate the survey using keyboard controls (not everyone uses a mouse).

  • Test the survey’s accessibility features—and, on the subject of keeping things accessible, avoid complex layouts and matrix-style questions.

Watch our video on accessibility to learn more about why it’s so important.

Video transcript
  1. 00:00:00 --> 00:00:30

    Accessibility ensures that digital products, websites, applications, services and other interactive interfaces are designed and developed to be easy to use and understand by people with disabilities. There are 1.85 billion folks around the world who live with a disability, or might live with more than one, and who are navigating the world through assistive technology or other augmentations that assist with their interactions with the world around them. Meaning folks who live with disability, but also their caretakers,

  2. 00:00:30 --> 00:01:01

    their loved ones, their friends. All of this relates to the purchasing power of this community. Disability isn't a stagnant thing. We all have our life cycle. As you age, things change, your eyesight adjusts. All of these relate to disability. Designing accessibility is also designing for your future self. People with disabilities want beautiful designs as well. They want a slick interface. They want it to be smooth and an enjoyable experience. And so if you feel like

  3. 00:01:01 --> 00:01:30

    your design has gotten worse after you've included accessibility, it's time to start actually iterating and think, How do I actually make this an enjoyable interface to interact with while also making sure it sets expectations and actually gives people the amount of information they need, in a way that they can digest it just as everyone else wants to digest that information? For screen reader users, a lot of it boils down to making sure you're always labeling

  4. 00:01:30 --> 00:02:02

    your interactive elements, whether it be buttons, links, slider components. Just making sure that you're giving enough information that people know how to interact with your website, with your design, with whatever that interaction looks like. Also, dark mode is something that came out of this community. So if you're someone who leverages that quite frequently. Font is a huge kind of aspect to think about in your design. A thin font that meets color contrast

  5. 00:02:02 --> 00:02:20

    can still be a really poor readability experience because of that pixelation aspect or because of how your eye actually perceives the text. What are some tangible things you can start doing to help this user group? Create inclusive and user-friendly experiences for all individuals.

See the W3C’s Web Content Accessibility Guidelines (WCAG) for more details.

6. Maintain Privacy

Make participants’ privacy a priority—it’s critical if they’re going to trust your brand. When people feel confident that their data is safe, they’ll be more ready to engage fully with your survey—and a nifty “bonus”: a strong privacy policy doesn’t just meet legal standards and boost participation rates; it enriches the quality of your insights, too.

Steps You Can Take

  • State your privacy policy clearly at the start of the survey.

  • Use secure platforms for conducting the survey.

  • Assure participants that their responses will remain confidential—and honor that.

  • Last—but not least—put sensitive or personal questions towards the end.

The Ultimate Guide to Conducting a UX Survey

A guide to conducting a UX survey in 8 steps.

© Interaction Design Foundation, CC BY-SA 4.0

Step 1: Define Your Objectives

Defining clear objectives sets the stage for a successful UX survey—it’s the start of your roadmap, and it helps you pin down the key insights you’re after. To zero in on what you’re aiming to discover, consider these questions:

  • What is the main goal? Decide whether you want to measure user satisfaction or focus on something else.

  • Which user behaviors are relevant? Is the survey targeting frequent users, new users, or both?

  • What are the key metrics? Do you want to look at completion rates, time spent, or other indicators?

  • New feature opinions: Are you seeking input on newly rolled-out features?

  • Pain points: Are you trying to identify user frustrations and roadblocks?

It’s vital to get clarity in your objectives, as it’ll guide every subsequent step and ensure the results align with your project goals. Well-defined goals will help you streamline the survey’s structure and craft relevant questions. A sharper focus has an added benefit: it will make the data you collect easier to analyze later on.

Step 2: Identify Your Target Audience

You need to know whose opinions you want—and how to write questions they can relate and respond to—so it’s more than a little important to identify your target audience. There are several reasons:

  • Product awareness: Gauge how much your audience knows about your product, as this will shape the depth and detail of questions.

  • Interests: Understand what topics engage your audience, and you can use that insight to make questions interesting.

  • Language: A professional audience may understand industry jargon, but a general audience may well not—so choose words with care.

  • Region: Geography can affect preferences and opinions, so do localize questions if you need to. 

Step 3: Craft Engaging Questions for the Questionnaire

There’s an art to writing good survey questions—what you ask is the heart of your survey, so engaging, clear, and unbiased questions will bring back the insights you need from respondents. Your questions have to captivate users’ interest and guide them through the survey—so don’t bog them down with boring, irrelevant, or ambiguous wording. Treat your survey as a design in itself, and give it great UX and a “seamless experience”!

Video transcript
  1. 00:00:00 --> 00:00:30

    We're going to be talking  about writing good questions. The quality of the questions, along with the quality of the respondents, is really key. So, you have to have questions which people understand, that they really can address *unambiguously*; they don't have to sit wondering what it is you meant by that, and to do it pretty quickly, too – they need to be *short*.

  2. 00:00:30 --> 00:01:00

    So, we have a long list of various points you should take into account. And it starts with the issue of *shortness and ambiguity*. So, we want things to be as *short as possible*. We want to use short questions, short words. We want to be *very specific*, and we want to be *very unambiguous*. So, you don't ask vague questions. If you want to know about whether somebody has done something, you need to specify over what period. A lot of questionnaires these days I noticed ask,

  3. 00:01:00 --> 00:01:32

    'What did you do yesterday?' or 'Did you watch this yesterday? Did you do that yesterday?' And you could of course change that period as required, but it's very unrealistic to expect people to either mind-read if you're not being at all clear about when you mean, or to remember very far back. And that changes with the topic, of course. Somebody might remember when they were married from a long time ago, but the last time they visited a coffee shop – you might not get away with more than a couple to four weeks, for example.

  4. 00:01:32 --> 00:02:02

    Do use *common terms* and  provide *explanations* where you're using anything that's a little bit unusual. And common terms in  English are generally fairly short. So, use short words. Short words appear more frequently, are used more frequently. The number of syllables is actually a good indicator; the more syllables, the less frequent the word is used in English. And with a lot of interactive survey tools, you are able to provide explanations.

  5. 00:02:02 --> 00:02:31

    If nothing else works, then just put something parenthetically after the question to explain what it is you mean by that particular term. Start the questionnaire or the survey with *easy, uncontroversial questions*. If you're going to go on to be asking  sensitive questions or things that people might get slightly aggravated or upset about,  then try to leave that as late as possible. You want people to get engaged; you want them to feel comfortable, and to an extent trust you.

  6. 00:02:31 --> 00:03:01

    And starting with controversial questions or in some way irritating questions is not a good way to do that. Do *use words to label all points on a scale*. I've got a terrible question that I've made up about Brexit. But I've labeled every single point. Now, this used to be for me a debatable issue, that I suggested that perhaps we should just label  the two ends and number the points in between, but I've seen good evidence and firm advice that you should label all of the points in between the two.

  7. 00:03:01 --> 00:03:32

    Certainly on seven – this is a seven-point response – five or seven is fairly typical. And in those cases, you would expect to see words being used there. Address only *one aspect of a question at a time*. *Avoid compounds* like: 'I found the website quick and easy to use,' because there is the possibility that somebody found it easy to use but not quick, or quick but not easy. So, stop that from being a problem by asking those questions separately. Just like in navigation design, *do not use overlapping categories or ranges*.

  8. 00:03:32 --> 00:04:00

    People need to be very clear about where they should be clicking; so, you want '20-49', '50-64,' '65 and over', rather than '20-50', '50-65', '65 and over' because you can see in that latter example that there are two places  where people, if they were exactly 50 or exactly 65 would not be clear about where you actually want them to answer. And *do not ask leading questions* if you want honest responses,

  9. 00:04:00 --> 00:04:32

    or any responses at all, for that matter. Certainly, it's during various elections that have taken place over the last couple of years, I have seen numerous alleged surveys come round where clearly it is not a questionnaire or a survey at all; it is a party political statement, and I just stop  as soon as I realize that that is the case. And I believe a lot of other people do as well.  So, this example on 'How bad an idea is Brexit?' is an example of a leading or loaded question because of course some people think that Brexit is a perfectly good idea

  10. 00:04:32 --> 00:05:05

    and we should not be talking about it as a bad idea in their eyes. So, this is something that you would not do.  In fact, there are quite a few things in that particular example you should not do, so just  treat that as a bad example... *Avoid question grids*; I know that they  are extremely popular, but if you can avoid them, they are worth avoiding because  they are intimidating, certainly with the large question grids, people turn over to them online typically, and they immediately get put off. It looks very complicated. It is quite complicated.

  11. 00:05:05 --> 00:05:30

    If they were on the brink of not completing your questionnaire, that probably has pushed them over the edge. And if you're unlucky and are working with a service provider for your questionnaire, your survey tool, that doesn't support converting these into individual questions, then you'll find that they don't actually work on mobile phones. And that certainly does happen from time to time,  and of course we can expect more people to be

  12. 00:05:30 --> 00:06:03

    using mobile phones as their primary internet  tool; so, that would be a bad plan. Really the best way of approaching these is to ask the questions separately. And this is actually quite a bit more likely in the problem domain that we're talking about, which is user experience, rather than, say for example, market research where you might have a list of 10 or 20 coffee shops down the left-hand  side and the frequency of use across the top. That's still not easy for respondents, but it's  perhaps a little bit easier than asking them

  13. 00:06:03 --> 00:06:34

    deep, meaningful questions like this  particular one about Barack Obama. *Ranking questions* are those where you're asked  to re-order the responses. And the *problem* with them is twofold. One is that it's actually not very easy as a participant to do this, that you have to think about, 'Well, which of these is my preference? Which is my second preferred, third preferred?' etc. And it's off-putting to them; it's time-consuming; it's hard to do in some cases,

  14. 00:06:34 --> 00:07:00

    particularly on a mobile platform, but it might be  quite technically challenging, just the dexterity required, and it's also off-putting; you're making people think really hard about something where the detailed answers are not all that important.  What does it matter to you in a list of five whether somebody lists something  fourth or fifth? It's not first; it's not second; so, do you really care that  much about it?

  15. 00:07:00 --> 00:07:34

    It turns out to be very easy and almost equivalent. I have not seen a paper saying  that they're equivalent, by the way, but certainly when I've done it, I've not been disappointed. Just ask people to choose their *favorite*. Or, if you've got a long list, maybe their favorite  top 'n', where 'n' might be 2, 3, 4 ... And you're not talking about order, then; you're just talking about which is the most favorite or which are the two most favorite. And when you come to  analyze those, the analysis is very much simpler because it is just the items with the biggest  numbers are the most popular.

  16. 00:07:34 --> 00:08:02

    And that was the second point with the whole ranking thing – is that you have to do a *weighted analysis* in order to get sensible results from the ranking, which means  taking into account where in the list these things appeared; whereas if you're just asking people  to choose their favorite, it is very much easier and it's just slightly more complicated for their favorite two or their favorite three. And most survey tools will let you set a *maximum and minimum number of responses* for this kind of question.

  17. 00:08:02 --> 00:08:32

    So, if you really insist on having two choices from a  long list, then it could complain to participants that they've not selected two or preferably  a maximum is the better way of doing that so people cannot choose the full number if they  really don't care that much. So, that is something I would just strongly recommend you avoid altogether. It's really of no particular benefit. It looks fun when you're looking at it in the tool and  maybe on the screen yourself.

  18. 00:08:32 --> 00:09:00

    But when you're talking about lots of respondents getting to  it and dealing with it, it's not fun for them. Open-ended questions do allow some flexibility,  and of course they're not in any way going to replace an interviewer who can dig a lot more  deeply and try to interpret or understand what participants are saying. They have got their use, though, open-ended questions; the set of possible answers is unknown

  19. 00:09:00 --> 00:09:30

    or very large, is the main reason for doing that. So, if you've got a list of items and there are other possibilities, you will have an 'other' response, and under the 'other' response you will have a text box for people to fill in. Perhaps you want to know the underlying cause of a response – 'Why did you rate us like that?'; 'What was the main thing?' And that's certainly very commonly done and a perfectly good use of open-ended responses. Or you need to allow participants to express *unanticipated concerns*.

  20. 00:09:30 --> 00:10:00

    So, 'Is there anything that we could have done to have prevented this or to make you happier?' – that kind of question. And these two examples from SurveyMonkey: the top one is the final question in most SurveyMonkey templates. It's open-ended. And that's recommended for almost all surveys, that you give people a chance to comment about either your organization or the questionnaire or just general comments that they might have. 'Is there anything else you'd like to add?' is the kind of question you would ask there.

  21. 00:10:00 --> 00:10:30

    The bottom one is actually from their market research  template, and it has a lot of open-ended questions in it, but it's one of the very few SurveyMonkey templates that actually has more than one or two open-ended questions; so, you really can  largely – and perhaps should try largely – to stick to multiple choice questions. An alternative – and you may see this yourself as a respondent to various questionnaires – an alternative to extensive open-ended questions is to invite survey participants to take part

  22. 00:10:30 --> 00:11:00

    in an *online or telephone interview*. So, you might ask them a few questions, and they might express some reservations  about certain aspects of your product or service. And if they did that, you might offer them the  chance to be interviewed at depth, either by telephone or through some online collaboration tool. And of course that becomes, then, a primarily qualitative approach, and you would only do this with a relatively small number of participants,

  23. 00:11:00 --> 00:11:33

    mostly because it will be moderately time-consuming; you could expect it to take at least half an hour, and you will need a qualified  interviewer to do that. Do make sure that if you're using this approach that the interviewer does have access to the respondent's initial  questionnaire so they're not repeating themselves  or are not at all aware of the participant's background or complaint. *Semi-structured interviewing* is the most appropriate technique in most cases. So, this would be a follow-up set of questions and then just the interviewer exploring

  24. 00:11:33 --> 00:11:38

    some of the responses that the participant has given already.

Use different question types—multiple-choice questions for quick feedback, open-ended questions for deeper insights. Use simple language, avoid jargon, and ensure each question serves a clear purpose. Be mindful of potential biases—they can creep in without you noticing—and keep the questions neutral.
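One concrete question-design pitfall—the overlapping answer ranges mentioned in the video above—can even be checked programmatically. Here's a minimal sketch (the age brackets are illustrative, not a recommendation):

```python
def matches(age, ranges):
    """Labels of every bucket that contains `age`."""
    return [label for label, lo, hi in ranges
            if lo <= age and (hi is None or age <= hi)]

# Good: every age maps to at most one bucket.
good = [("20-49", 20, 49), ("50-64", 50, 64), ("65 and over", 65, None)]

# Bad: 50 and 65 each fall into two buckets—respondents can't tell where to click.
bad = [("20-50", 20, 50), ("50-65", 50, 65), ("65 and over", 65, None)]

print([a for a in range(20, 100) if len(matches(a, bad)) > 1])   # → [50, 65]
print([a for a in range(20, 100) if len(matches(a, good)) > 1])  # → []
```

A quick check like this catches ambiguous boundaries before any respondent ever sees them.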

Step 4: Select a Tool For the UX Research Survey

It’s vital to pick the right tool for your UX survey if you want effective data collection and analysis. Google Forms offers a quick way to get started with UX surveys—and here’s why:

  • Ease of use: Google Forms is user-friendly, and even if you’re not tech-savvy, you can create a survey quickly.

  • Customization: It offers various themes and allows question branching based on prior answers.

  • Integration: Google Forms integrates with other Google services like Google Sheets for real-time data tracking.

  • Free: For basic features, it’s free of charge.

  • Data analysis: It offers basic analytics like pie charts and bar graphs for quick insights.

You can also use specialized UX research tools like SurveyMonkey for more advanced features. Whichever you go with, consider your objectives and your target audience’s needs, then choose the tool that best serves them.

Step 5: Pilot the Survey

Pilot testing is a valuable step for refining your UX survey: it gives you a chance to uncover unforeseen issues with the survey design, questions, or technology—things you really wouldn’t notice otherwise. Recruit a small number of participants to test the survey.

You can ask internal team members for help or contact professionals via LinkedIn; use this test run to understand their experience and make the necessary adjustments. This can make the difference between a good survey and a great one—and, as stated before, treat the survey like you would a design. A good pilot test irons out the kinks and ensures a smoother experience for your primary audience.

Step 6: Launch the Survey

Well done—it’s launch time! But launching the survey is more than just making it live: you’ll want to choose the right channels, timing, and even incentives to get the best results. Promoting the survey ensures it reaches your intended audience and encourages participation. Their time is valuable, remember, so consider the time of day, day of the week, and platform that align with your audience. Plan every aspect of the launch to maximize participation—then it’s “All systems go!”.

Step 7: Analyze and Interpret the Results

The results are in—now comes the “fun” part! Data analysis transforms raw data into valuable insights, so be sure to use analytical tools to sort, filter, and interpret the data in the context of your objectives. Look for patterns and correlations—sure—but also keep an eye out for unexpected discoveries, as they may well pop up.

Your interpretation of the data should lead to actionable insights that guide you (and your teams) to make substantial product or service improvements—this step transforms the effort of surveying into real value for your project.
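As a minimal sketch of what “sort, filter, and interpret” can look like in practice—with made-up responses and field names standing in for your survey tool’s export—you might summarize a rating question and a goal-completion question like this:

```python
from collections import Counter
from statistics import mean

# Hypothetical export from a survey tool: one dict per respondent.
responses = [
    {"ease": 5, "goal_met": True},
    {"ease": 4, "goal_met": True},
    {"ease": 2, "goal_met": False},
    {"ease": 3, "goal_met": True},
    {"ease": 1, "goal_met": False},
]

ease = [r["ease"] for r in responses]
print("Mean ease rating:", mean(ease))                 # central tendency
print("Distribution:", sorted(Counter(ease).items()))  # watch for polarization
print("Goal completion rate:",
      sum(r["goal_met"] for r in responses) / len(responses))
```

Even this much reveals patterns a raw spreadsheet hides—for example, a middling mean can mask a polarized distribution, which is exactly the kind of unexpected discovery worth chasing down.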

Step 8: Share Insights and Implement Changes

Last—but not least—share your findings and implement changes to complete the process. Create comprehensive reports and engage stakeholders with the insights to get them on board and on the same page. The reports need to be clear so the right ideas get communicated—many stakeholders will need the data “translated” for them. Sharing fosters a shared understanding and sets the stage for informed decisions that bring the right fixes and improvements to where they’re needed.

Plan and iterate on improvements based on the insights you get, and use what you learn for continuous enhancement. Keep your radar on for insights that can help you tweak your survey’s own user experience, so it keeps doing its job and bringing back information you can use.

The 20 Best User Experience Survey Questions

These questions form a comprehensive framework for understanding various aspects of the user experience—just be sure to use only a few of these to keep response rates high. 

  1. “How did you find our website/app?”

This question helps assess the effectiveness of your marketing channels and shows you where people first encounter your brand. While Google Analytics reveals traffic from specific sources like AdWords or Facebook, it can’t tell you how direct visitors heard about you—asking them fills that gap. Know this and you can fine-tune your marketing strategy.

  2. “What was your primary goal in visiting our site today? Did you achieve it?”

Yes, it’s technically two questions, but together they focus on why users visit and whether the site meets their needs—helping you identify gaps in content or functionality.

  3. “How easy was it to navigate our site?”

This question gauges how usable your website is—you’re on the right track if people find it easy to navigate (well done!). If not, it’s a red flag: your site’s layout or functionality may need tweaks, which you’ll need to look into with due care and attention.

  4. “What features did you use most?”

This question puts it out there as to which parts of your product or service are most valuable to customers. If the majority say they often use a specific feature, then that’s a pivotal strength to highlight in marketing.

  5. “Were there any features that were unclear or hard to use?”

This question zeroes in on potential weak spots in your product design or functionality—and can leave some things wide open for “attack,” but for good reason. If a feature keeps getting labeled as confusing or complicated, yes, it needs improvement.

  6. “How would you rate your overall experience?”

This question provides a general impression of user satisfaction—the truth’s often in a number or a word or two.

  7. “What would you change about our website or app?”

This question invites suggestions for improving your digital solution—and this “suggestion box” moment gives users a voice in the development process. If they care as much as they should and something has come up, you should hear about it.

  8. “How likely are you to recommend our product to a friend or colleague?”

Recommendations measure customer satisfaction and loyalty, and it’s common for pop-up surveys to use this question based on a widely used metric called the Net Promoter Score (NPS). A high likelihood to recommend means that—sure—customers are happy and likely to become brand advocates.
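The NPS formula itself is simple: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6), with passives (7–8) ignored. A minimal sketch in Python (the ratings are made up):

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) don't count."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

ratings = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]  # hypothetical survey responses
print(nps(ratings))  # → 30 (50% promoters − 20% detractors)
```

The score ranges from −100 (all detractors) to +100 (all promoters), which is why a single number can track loyalty across releases.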

  9. “What other products or services would you like us to offer?”

This question taps into unmet customer needs and wants—another golden “suggestion box” moment! Responses about this can expose gaps in your current products—or services—and even go beyond feature gaps and inspire new products themselves. Trust in your users and customers, as some of them may know what they’d like to see and can voice it well—and if they didn’t care, they wouldn’t write anything, right?

  10. “Did you encounter any technical issues?”

Technical issues—like bugs, error messages, or crashes—can affect customer satisfaction for the worse (and sometimes to the point they're steamed at you), so you’ll need to hear about it so you can change things (for the better!).

  11. “What is your preferred payment/delivery method?”

It may seem trivial, but some customers will only buy if their preferred payment method is available (and will go to a competitor that offers it). Money talks, so you’ll need to understand which payment options resonate with your target audience and see about offering what they want.

  12. “What is your preferred method of contact for support?”

This question asks how customers prefer to reach out for help—many people will only want to be emailed, phoned, texted, or what have you. Understand this and you can optimize your brand’s customer service channels.

  13. “How would you describe our product in one sentence?”

This question aims to capture a concise customer impression of your product, and one-sentence descriptions can reveal key strengths or weaknesses. Note that it’s “one sentence” and not “one word”—the latter might open up a confusing “can of worms,” and even one-word praise wouldn’t shed light on the all-important “why” factor.

  14. “How does our product compare to similar ones in the market?”

This question seeks to understand your product’s competitive edge or shortcomings and has the brand standing beside its rivals for “inspection time.” Responses can tell you where you excel or lag behind other market players and give you insights on how to up your game or make those strengths even more accessible.

  15. “Were our support resources (FAQs, live chat) helpful?”

You need to understand the effectiveness of your customer support tools—like FAQs and live chat. If most people find these resources helpful, great: that validates your support strategy. If not, it’s a cue to improve these areas, be it better Frequently Asked Questions, better live-support training, or what have you. Support is a big—no, a huge—deal, and it can determine how much your brand is worth to the customers who contact it for help, so make sure you offer assistance that benefits your customers and respects both them and their time.

  16. “How could our product better meet your needs in the future?”

This question may be about the future, but it seizes on the present moment by sending the message to customers that your brand values them and what they think. Whether it’s about adding new features or refining existing ones, the feedback can help “big-time” with roadmap planning. If multiple customers highlight the same issue (like with pricing), that’s a vital sign that needs attention—before they “jump ship” to a competitor who’s cheaper. 

  17. “How did you find the speed of the site?”

This question evaluates how your site speed impacts user satisfaction—and, of course, slow loading frustrates users and may even lead them to abandon the site. If multiple people report this issue, you’ll need to get something done for optimization—and remember most users access digital products on mobile devices with mobile UX needs.

  18. “What language options would you prefer for our website/app?”

This question identifies the language preferences of your user base—not everyone will be adept or comfortable with English, for example. If a sizeable portion prefers another language, it makes sense to offer that option—be it Spanish, Russian, Chinese, etc. (though we might leave out Esperanto). Another “perk” of offering text in other tongues: it won’t just broaden your reach but will make your platform more inclusive as well.

  19. “Would you like a follow-up from our team regarding your feedback?”

This question is another one that shows you respect your respondents’ time—and they’ll appreciate the courtesy. Get a “yes” from them and it pretty much means the respondent is engaged and open to dialogue, which you can take as a sign of higher loyalty or interest. But a “no” means they provided feedback, all right, but aren’t looking for a discussion—so just leave it at that and leave them alone.

  20. “Would you be interested in future updates or newsletters?”

This one gets an easy binary response on how interested customers are in staying connected with your brand, with a “yes” meaning a “happy camper.” Wouldn’t it be wonderful if they were all satisfied customers who’re more likely to engage with future offerings—but, reality check, there are going to be “no’s” too. A “no” could suggest they’re not fully satisfied, or not interested in long-term engagement, so take it at face value and respect their wishes.

What are Some Great Free UX Survey Templates?

  1. Client Feedback Form

Find out what clients think about your business—use this form to gather thoughts on customer service and more. Adapt the template to focus on specific aspects of customer interaction.

  2. NPS-Enhanced Software Survey

Experts have made this ready-to-use template to improve your software’s Net Promoter Score (NPS). Use it to gather critical insights to elevate your product.

  3. Basic NPS Inquiry Template

It’s easy to gauge customer loyalty with this template, where customers rate their likelihood of recommending you from 0 to 10—and you can adapt the template to explore additional areas.

  4. Support Team Feedback Form

Assess the performance of your customer service team—and adapt the survey to delve into aspects which you’re particularly interested in.

  5. Quick Response Customer Survey

Send this brief survey out to understand customer perceptions—it encourages customers to elaborate on their answers, and you can adjust it to fit your needs too.

  6. Product Feedback Survey

Use this template to collect comments on your products—and it’s nice and handy as it aims to identify issues and suggest resolutions.

  7. Snapshot Product Assessment

Collect rapid feedback on your products—use this form to get concise, actionable comments from customers.

  8. Comprehensive Client Feedback Form

Capture detailed information on how your customers feel about your products and services—handy for not just pinpointing specific areas for improvement but getting a good view of your brand’s “status” with them, too.

Final Thoughts—And The Take Away

UX surveys are powerful tools when you’ve got the right survey type, the right question wording, and the right mindset for engaging the people who’ll answer them. It takes everything from defining objectives to crafting engaging questions, ensuring accessibility, analyzing results, and implementing changes, but the work is worth it to zero in on the best survey possible. From the get-go, it’s vital to have clear objectives keeping the survey on course—when you understand what you want to achieve, you set the foundation for success and guide every subsequent step.

Another vital thing to bear in mind is to ask relevant, engaging questions: ones that show you value the time of the people answering them—clear, interesting, and unbiased questions that cover various facets of the user experience. It’s a winning formula for capturing genuine feedback and insights.

Last—but not least—your survey is a design in itself that needs to satisfy its users and respect them as individuals with opinions. So, make sure you ramp up the UX of it and keep the “magic” alive so they can look on your brand as one that speaks to—and advocates for—them, and is one that’s interested in retaining them as loyal customers long into a fruitful future.



Cite according to academic standards

Simply copy and paste the text below into your bibliographic reference list, onto your blog, or anywhere else. You can also just hyperlink to this article.

Soegaard, M. (2023, October 6). User Experience (UX) Surveys: The Ultimate Guide. Interaction Design Foundation - IxDF.

New to UX Design? We're Giving You a Free eBook!

The Basics of User Experience Design

Download our free ebook “The Basics of User Experience Design” to learn about core concepts of UX design.

In 9 chapters, we'll cover: conducting user interviews, design thinking, interaction design, mobile UX design, usability, UX research, and many more!

