Usability Testing Questions

 

How many of us have gone into usability testing sessions completely unprepared for what we were going to test or had participants struggle with unclear, ambiguous tasks? All of us.

Now we can start from a better foundation with AI assistants. Creating usability test scripts is one of the most effective ways I have used the application to get started. Leveraging the AI's vast training data, you can quickly generate well-crafted scenarios and tasks instead of spending hours writing them from scratch.

Using the tool helps you craft testing scenarios that get to the heart of user interactions and pain points, all while saving you precious time.

Most of the teams I have managed shared these duties with product managers. The process was always time-consuming and often lacked alignment; now, not so much. You can draft the scripts together, collaborating on the prompts.

Here are tips for writing usability test scripts. Most of this is focused on B2B, but you can adapt it for B2C by changing the focus.

Start with the Domain

Begin by declaring the domain you want to use for the usability test. I recommend starting with a specific category.

It’s a good idea to set the context of your research so that you can refine it with deeper analysis and track your journey. You can also specify how the answer is presented, such as the number of results or the output format.

We’ll start with our example: customer relationship management.

Prompt

Create 10 usability testing questions about customer relationship management systems.

Add Features

That returned an interesting list, but it wasn’t very targeted, so go one level deeper by focusing on a feature.

This focus allows you to dig deeper into the user’s actual experiences and pain points. This will uncover detailed insights about what works well and what doesn’t, which helps with prioritization and pattern matching on specific needs.

Let’s focus on one goal for the feature set: managing multiple accounts. This is a typical use case for customer relationship management.

For extra credit, you can add the requirements and personas as additional resources. I did this on another feature where I included scenarios and user stories used to write the specifications, and the questions were much more targeted. You can try it yourself.

Prompt without Document

Create ten usability testing questions about managing multiple accounts using customer relationship management systems.

Prompt with Document

Create ten usability testing questions about managing multiple accounts using customer relationship management systems. Use the included documents as a reference for specifications.

Add a User Persona

Now let’s go to the next step: adding a user persona.

User personas ensure that your questions are targeted and relevant, uncovering insights that might be overlooked with a generic approach of looking at all users in the system.

By tailoring questions to specific user personas, you can gather more actionable feedback during user interviews because you’ve given the assistant a specific role to work from.

For the record, I’m calling them user personas intentionally because they don’t take the place of talking to users; they just help formulate questions you may want to ask so you can refine them. You can also upload one of the user personas your team already focuses on. I won’t cover this here, but it is something you can play with at this point to inform your questions.

We’ll add business development representatives for this example.

Prompt

Create ten usability testing questions about managing multiple accounts as a business development representative using customer relationship management systems.

Add Tasks

Focusing on specific user personas with a specific goal is even better because it ensures the insights gathered are deeply relevant to specific tasks the user may need to complete.

The more specific you get about the tasks you add, the more it helps narrow down the request. Homing in can uncover nuanced details that might otherwise be overlooked.

However, sometimes you don’t want to get too specific because you want to explore. I like keeping it open myself because then it might catch something I missed, and it’s a great way to learn how to craft better prompts.

We’ll add a business development representative with the task of managing multiple accounts to generate qualified leads.

Prompt

Create ten usability testing questions about managing multiple accounts as a business development representative focusing on generating more qualified leads using customer relationship management systems.
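Notice that every prompt in this section is built by layering the same ingredients: domain, feature, persona, and task. If you end up generating many variants, a small script can compose the prompts for you. Here is a minimal sketch, assuming a simple string template; the function name and wording rules are illustrative, not a standard:

```python
def build_prompt(domain, feature=None, persona=None, task=None, count="ten"):
    """Compose a usability-testing prompt, adding whichever layers are set."""
    parts = [f"Create {count} usability testing questions"]
    if feature:
        parts.append(f"about {feature}")
        if persona:
            parts.append(f"as a {persona}")
        if task:
            parts.append(f"focusing on {task}")
        parts.append(f"using {domain} systems.")
    else:
        # Domain-only prompt, matching the first example in this article.
        parts.append(f"about {domain} systems.")
    return " ".join(parts)

# Composes the same text as the final prompt above.
prompt = build_prompt(
    domain="customer relationship management",
    feature="managing multiple accounts",
    persona="business development representative",
    task="generating more qualified leads",
)
print(prompt)
```

The same function reproduces each earlier step by leaving the later arguments out, which makes it easy to compare how the output changes as you add layers.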

Test the Questions

This doesn’t replace users, but it gives the questions a sniff test to see if they make sense. This is crucial because it helps ensure clarity and comprehensibility for participants. Providing context and detail makes the questions easier to understand and answer accurately. I love these baselines because I can imagine what to expect during the test. The answers don’t have to be right; they just have to give me a sense of where the session is going to go.

This can reveal potential ambiguities, leading to improved question design. Additionally, if you edit the draft questions, you can paste them back in and ask for the answers using a different prompt.

Sample Questions Prompt

Create ten usability testing questions about managing multiple accounts as a business development representative focusing on generating more qualified leads using customer relationship management systems. Give three detailed examples of answers to each question.

Sample Answers Prompt

Create three sample answers about managing multiple accounts as a business development representative focusing on generating more qualified leads using customer relationship management systems from the entered content.

Test with a User

Now that you have your questions, do a dry run with someone in your organization to make any edits or clarifications, just like you would do with a set of questions you created.

This helps identify and resolve potential issues, leading to smooth execution during the actual test. This is always needed to confirm that all elements work as intended. This preparation improves the quality of feedback, leading to more valuable usability insights.

Prompt Results

Custom GPTs