Engagement/Mozilla.org Durable Team/Testing Playbook
Revision as of 22:55, 17 March 2016
- Testing: Why
  - Why test?
    - Testing gives us data to optimize the user experience, leading to increased conversion (downloads, Accounts sign-ups, newsletter sign-ups, etc.) supporting key performance indicators.
  - Test tracker: https://docs.google.com/spreadsheets/d/1BSAR6EJj_lToGNNNp3k9RjRTlGcVrpRfZvEw9OwH_88/edit#gid=0
- Planning. Define the following in Bugzilla/Google Docs linked from the wiki page:
  - Hypothesis
  - Test plan
  - Measurement requirements
- Implementation
  - Choose testing tool(s)
    - What tool do we use to split traffic?
      - Optimizely offers the most detailed targeting options
      - Custom JS keeps the page weight lighter and doesn't depend on third-party tools
      - GA
    - What tool do we use to run the test?
      - When do we use GA?
        - More control over the code changes
        - More complex changes in design and page functionality
        - Pages that change based on information in the browser (e.g. the Welcome page changes based on whether your browser is set as the default)
        - Segmenting results
        - Multiple pages
      - When do we use Optimizely?
        - Simple changes
          - Copy: testing a lot of different versions
          - Design: basic changes
        - Can use Optimizely for directing traffic to any page
        - Basic user-agent targeting
  - Review
    - Checklist for reviewing the Optimizely set-up: https://gist.github.com/jpetto/30396fbfdd62794d8e02
      - Does the test look and work as expected on the demo server?
      - Are the correct measurements being reported in GA?
- Reporting
  - Tests run in Optimizely: use simple Optimizely reports
  - Tests run in GA: work with the Analytics team to pull/build more complex reports
- Next steps
  - Review results
    - Newsletter conversion: https://datastudio.google.com/#/reporting/0B6voOaUZL-jwcGg1ZVZvSUJ4dUU
    - Participation tasks: https://docs.google.com/presentation/d/15izvYKdGkCdczRu1jjjsid2KbNqAiezknH7AnQeDwto/edit?ts=56e33125#slide=id.p
  - Deploy winning tests globally with the L10N team
  - Define additional hypotheses and tests based on test data
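
The "custom JS" traffic-split option under Implementation can be sketched roughly as below. This is a minimal illustration with hypothetical names (`hashString`, `assignVariation`), not mozilla.org's actual implementation: it buckets a visitor deterministically from a stable visitor ID, so the same visitor always sees the same variation on repeat visits.

```javascript
// Hypothetical sketch of a client-side traffic split (not the team's
// actual code). A stable visitor ID (e.g. from a first-party cookie) is
// hashed so that bucketing is deterministic per visitor.
function hashString(str) {
  // Simple 32-bit FNV-1a hash -- fine for bucketing, not for cryptography.
  let hash = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    hash ^= str.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0; // force unsigned 32-bit
}

function assignVariation(visitorId, variations) {
  // Bucket the visitor into one of the variations with equal weight.
  return variations[hashString(visitorId) % variations.length];
}

// Example: split visitors between the control and a new download button.
const variation = assignVariation('visitor-abc123', ['control', 'new-button']);
```

Because the assignment is a pure function of the visitor ID, no state beyond the ID itself needs to be stored, which keeps the page weight light compared with loading a third-party tool.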
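
The "basic user-agent" targeting mentioned under "When do we use Optimizely?" amounts to a simple check on `navigator.userAgent`. A minimal sketch, assuming Firefox is the browser being targeted (the function name and the exact rule are illustrative only):

```javascript
// Hypothetical example of basic user-agent targeting. Firefox UAs contain
// "Firefox/<version>", but some Gecko browsers (e.g. SeaMonkey) include
// that token too, so a basic check excludes them explicitly.
function isFirefox(userAgent) {
  return /Firefox\/\d+/.test(userAgent) && !/seamonkey/i.test(userAgent);
}

// In the page, the test would branch on: isFirefox(navigator.userAgent)
```

User-agent sniffing is brittle (strings change across releases), which is why this kind of check is only suitable for the "basic" targeting described above.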
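
For the review question "Are the correct measurements being reported in GA?" and for segmenting results by variation, one common pattern (assumed here, not confirmed as the team's set-up) is to send a non-interaction event tagged with the variation via Universal Analytics (analytics.js), with the variation also stored in a custom dimension. The helper name and the dimension index are assumptions:

```javascript
// Hedged sketch: report which variation a visitor saw so GA results can be
// segmented per variation. Assumes Universal Analytics (analytics.js) and a
// variation custom dimension at index 1 -- adjust both to your property.
function buildVariationHit(variation) {
  return {
    hitType: 'event',        // standard analytics.js event hit
    eventCategory: 'ab-test',
    eventAction: 'assigned',
    eventLabel: variation,
    dimension1: variation,   // custom dimension for segmenting reports
    nonInteraction: true,    // don't affect bounce rate
  };
}

// In the page: ga('send', buildVariationHit('new-button'));
```

During review, this is what you would verify on the demo server: the event fires once per assignment, and the custom dimension carries the expected variation name.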