Websites/Mozilla.org/One Mozilla/Documentation/Checklist-UXR: Difference between revisions

From MozillaWiki



Revision as of 04:31, 21 February 2012

Overview: All websites have problems (some are just prettier to look at while you're growling). User research helps us find and fix usability problems on our websites. We love fixing problems and creating user happiness. Here are some of the tactics that the Websites team uses.

Analytics

At Mozilla we use Webtrends to track website analytics. The analytics that Webtrends gathers tells us what people are doing on our websites, but not necessarily why they're doing it.

Important Metrics:

  • Bounce rate - useful for identifying pages that might not be useful or user flows that might be broken
  • Page views - useful for identifying high-performing content & user flow entry points
  • Time spent - the amount of time a user spends on the website
  • Referrals - where users are coming from
    • # of desktop downloads
    • # of mobile downloads
    • # of social connections
    • # of backlinks

(there are many more!)
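To make the bounce-rate metric above concrete, here is a minimal sketch of how it can be computed from raw session data. The session records and field names are invented for illustration; real numbers would come from Webtrends reports rather than hand-rolled code like this.

```python
# Hypothetical sketch: bounce rate per entry page from a list of sessions.
# Bounce rate = single-page sessions / total sessions entering on that page.
from collections import defaultdict

def bounce_rate_by_entry_page(sessions):
    entries = defaultdict(int)   # sessions entering on each page
    bounces = defaultdict(int)   # of those, sessions that viewed only that page
    for session in sessions:
        entry = session["pages"][0]
        entries[entry] += 1
        if len(session["pages"]) == 1:
            bounces[entry] += 1
    return {page: bounces[page] / entries[page] for page in entries}

# Invented example data (paths only suggestive of mozilla.org):
sessions = [
    {"pages": ["/firefox", "/firefox/download"]},
    {"pages": ["/firefox"]},
    {"pages": ["/firefox"]},
    {"pages": ["/about", "/firefox"]},
]
print(bounce_rate_by_entry_page(sessions))
# /firefox was the entry page 3 times and bounced twice; /about never bounced
```

A high bounce rate on an entry page is the analytics signal; the "why" still has to come from the testing tactics described below.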

Surveys

  • Pre-conversion survey - setting the baseline
  • Post-conversion survey - measuring the effectiveness of the tactics

Laura Forrest is Mozilla's resident whiz at this; read her post on the brand awareness survey she ran in 2010-2011 here
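A minimal sketch of the pre/post-conversion comparison: take the baseline from the pre-conversion survey, re-run the question after the tactic ships, and look at the lift. All the response counts and the survey question here are invented for illustration.

```python
# Hypothetical sketch: estimating the lift between a pre-conversion baseline
# survey and a post-conversion follow-up. Counts are made up.

def positive_share(responses):
    """Share of respondents answering 'yes' (e.g. 'Would you recommend Firefox?')."""
    return responses["yes"] / (responses["yes"] + responses["no"])

baseline = {"yes": 180, "no": 220}   # pre-conversion survey (sets the baseline)
followup = {"yes": 260, "no": 140}   # post-conversion survey (measures the tactic)

lift = positive_share(followup) - positive_share(baseline)
print(f"baseline {positive_share(baseline):.0%}, "
      f"follow-up {positive_share(followup):.0%}, lift {lift:+.0%}")
```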

Human Testers

3-5 human testers will provide enough of a data set.

  • recruiting testers is a lot of work and doesn't always meet the needs of an agile development team
  • more than 3-5 and you'll never get to the end of your notes, we've tried!

Action Items:

  • Make a list of 5 tasks people need to be able to do
  • Create a list of short scenarios for how people will perform those tasks

Example: /firefox Product Site

  • Task: Download Firefox for Mobile to your Android device
  • Scenario: Go to the Android Marketplace and search for the version of Firefox that you want to download
    • note: users can get to the marketplace from Google desktop as well as their device's browser, or the app store on their device (this made it fun to test)

3rd Party Services

  • usertesting.com
    • the challenge with usertesting.com and other similar services is that the users are highly motivated to do the tasks "right." Users receive ratings on their performance and in general it's not unusual that a tester wants to "get an A."
    • we use usertesting.com to do fast testing on important changes we make to our conversion funnels, we also use it to learn more about how users interact on our competitor sites.
  • Gazehawk - Eye tracking is the process of measuring either the point of gaze (“where we are looking”) or the motion of an eye relative to the head. Gazehawk uses software to track eye positions and movement around areas of interest (AOIs) on webpages.

In-house User Testing

Use a very simple setup with Silverback

Users: Both Fans and Outliers are useful

Fans

  • It might seem contradictory to bring in fans, but people who aren't part of the target audience will run into problems that regular users won't, which creates false positives
  • Similarly, domain knowledge isn't always as useful as you think it might be. In the case of /firefox, we're working with 4 flavors of Desktop and Mobile downloads (nightly, aurora, beta, GA). Terms that you might think users would be familiar with, like "download," turn out to be more loaded lately

Outliers

  • People who fall outside of the target audience can be useful too. Often, they can point out things that real users can't. It's useful to have at least one articulate outlier in the mix.

Post Test

Lead with three usability problems

After each tester completes the test, list the three most serious usability problems:

User #1:

  • 1.) . . .
  • 2.) . . .
  • 3.) . . .

User #2:

  • 1.) . . .
  • 2.) . . .
  • 3.) . . .

User #3:

  • 1.) . . .
  • 2.) . . .
  • 3.) . . .

Fix the Worst, First
Two factors inform severity:

  • a lot of people will experience the problem (e.g. it hinders download transactions)
  • it causes a serious problem for users who experience it v. a minor inconvenience
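Those two factors, reach and impact, can be combined into a rough severity score for triage. A minimal sketch, with an invented problem list, invented scores, and a made-up 1-3 impact scale (the wiki doesn't prescribe a scoring formula; this is just one way to operationalize "fix the worst, first"):

```python
# Hypothetical sketch: rank observed problems by severity = reach * impact.
# reach  = share of testers who hit the problem (we test with 3-5 people)
# impact = 1 (minor inconvenience) .. 3 (serious problem / blocker)
TOTAL_TESTERS = 3

problems = [
    {"name": "download button hidden on mobile",        "testers_hit": 3, "impact": 3},
    {"name": "confusing channel names (aurora vs beta)", "testers_hit": 2, "impact": 2},
    {"name": "footer link misaligned",                   "testers_hit": 1, "impact": 1},
]

def severity(problem):
    reach = problem["testers_hit"] / TOTAL_TESTERS
    return reach * problem["impact"]

for p in sorted(problems, key=severity, reverse=True):
    print(f"{severity(p):.2f}  {p['name']}")
```

The top of the sorted list is what goes into the 1-4 week fix list in the report below.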

Simple Report Keeps it Simple

If you encounter one or both of those situations, you need to fix the problem fast. Write up a report:

  • What was tested
  • List of tasks the participants did
  • List of problems that will be fixed in the next 1-4 weeks as a result of what you observed

For the /firefox Product Site, Laura Forrest hosts lunch and learns for the team (basically movies and popcorn!). You might also include someone from your Support team, a developer, a designer, and people from product and product marketing.