Do you have to sit next to the person to get the most "direct observation"?

February 26, 2009

Many of our usability lab rental customers ask us whether they can or should moderate usability tests sitting next to the user.

The most common questions include:

1. Can I sit next to the user during usability testing?

Yes, absolutely, though there is no methodological rule requiring you to sit next to the user during the test. The reason you would sit next to them is to provide more intimate, "hands on" moderation. In other words, to make the user feel more comfortable.

2. Do I have to sit next to the user during the usability test?

Not necessarily. Our usability labs are designed so that you do not have to sit next to the user if you choose not to. This is really a moderator preference. We find it easier to use our intercom system to communicate with the user, leaving them to work alone on their tasks. There is less chance of moderator stress or participant bias this way.

3. What benefit is there from being next to the user vs. being behind the one-way mirror?

The one-way mirror (or a separate observation room, another common configuration in our usability labs) is valuable because it gives moderators space to log notes or observe, comment, and do their own thinking out loud! On the flip side, the benefit of sitting chair-side is increased intimacy, or "bedside manner".

It depends on the kind of test and the type of user you are dealing with. When I first started moderating usability tests over ten years ago, I almost always sat next to the user. I thought it allowed for better moderator observation. The more usability testing I conducted, the more awkward I found it to take notes or observe so close to the user.

These days I find it easier to moderate using our usability lab's intercom system, or to create a link with walkie-talkies or a conference call bridge. Sometimes, if I feel the user is anxious, I will sit chair-side. The bottom line is that there is no rule; it is up to you and depends on the usability lab setup you have.

Happy Usability Testing!
Frank Spillers, MS (Usability Consultant)

5 Things You Should Never Say or Do to Users (during usability tests)

January 23, 2008

How you ask a question during usability testing can color your data. At Experience Dynamics, it's important to us that we get *clean* data and that our usability testing for our clients runs smoothly. The subtleties of moderation are easily missed and usually come only with years of practice.

Here are 5 things you should never say or do to users during usability tests. I teach them regularly in my usability training courses...

1)  Praise

Example: "Good job! You're doing great!" (often in response to a user asking, "How am I doing?")

Why is this bad?

Giving users praise sets up an unhealthy relationship between the researcher and the subject. If the user makes a mistake, will you be there to tell them they are doing poorly? Will you provide "therapy", telling the user it's okay and consoling them?

I have witnessed colleagues do this. I did it once and realized I was stuck when the user got angry and insisted I tell them whether they were right. To do so would have embarrassed them; I may as well have said, "You are a stupid user, don't worry."

The problem with praise during testing is that it violates one of the principles of solid usability testing: there is no right or wrong, and the user is not there to make you happy by "doing a good job".

Best Practice: Maintain a neutral interaction with the user. The user does not need to know whether they are making a mistake, or whether they succeed or fail; that's for you to observe, not the user!

2) Feature Like/Dislike

Example: "Do you like this feature?"

Why is this bad?

If you ask users what they like or dislike, you have turned the usability test into a focus group. Focus groups elicit opinions; usability tests elicit behaviors. Margaret Mead once said, "what people say and what they actually do are two very different things". If you ask people what they like, you'll miss how they would actually use the product once they take it home.

Best Practice: Give users familiar tasks to perform and watch them! If users really hate a feature they will vocalize it. (Note: usability testing uses a verbalization technique called the "Think Aloud" protocol).

3) Asking about 'Ease of Use'

Example: "Is this easy to use?"

Why is this bad?

It is really difficult to gauge ease of use from a questionnaire, partly for the reason mentioned in #2 above, and partly because ease of use is relative. Humans are highly flexible and will internalize difficulty with machines, often blaming themselves. What's easy for some is mind-boggling for others.

How is it that all these years major software manufacturers have given us ease of engineering instead of ease of use?

Best Practice: Again, watch users, don't ask them. Remember ease of use is not the only usability metric that counts. More on usability metrics in another post.

4) Asking about expectations

Example: "Is this what you were expecting to be on this page?"

Why is this bad?

I once accompanied a usability lab rental customer on site; my role was to observe and act as technical support. The client was a *major* ad and interactive agency that was conducting usability testing for its *major* financial services client. The financial services client was present, but had no idea that a "worst practices" usability test was being delivered by the agency! The facilitator sat with each user and asked on every screen, "Is this what you were expecting here?"... and each user said, "I guess so, I don't know."

Best Practice: Let users vocalize their expectations by walking through your site or web application with the industry-standard "Think Aloud" method. You do not need to ask about expectations; users will tell you what they think should happen 90% of the time (either verbally or, through their behavior, non-verbally).

5) Giving Instruction

Example: "Click on that button, scroll down, look at that in the top corner"

Why is this bad?

When a user is lost or confused, common sense tells us to help them. Forget about it! This is one of the cardinal usability testing rules I stress in my usability testing training with corporate teams. If you instruct or direct the user, then, as with praise, they will rely on you as a crutch when they need help again.

Another thing we have realized, across the more than 55 live usability reviews our Portland User Interface Special Interest Group has conducted since 2001: let users go off track if they need to; their confusion will teach you something about their expectations and problem-solving strategies.

Best Practice: Offer instruction only when you are consciously moderating and feel it is safe to "reel the user back in" (usually I leave users on an off-track path for as long as 5-10 minutes).

Happy Usability Testing!
Frank Spillers, MS (Usability Consultant)

If you enjoyed this article and you are interested in refreshing your usability testing skills, you might check out my exclusive Usability Testing Skills refresher web seminar...


Join the Portland User Interface SIG, the group meets online and is open to anyone with an interest in learning more about usability!