QA/Browser Technologies/2011-12-01

* Mobile
** Mobile Developers are looking for additional testing of recently landed patches ({{bug|627842}}) that improve font size readability on Firefox for Android. Please file bugs if pages you view show odd-looking font inflation (size) issues.
** Continue to encourage and audit testing, and [https://docs.google.com/spreadsheet/ccc?key=0AocUyLHteCtSdHQ5Q2tIZVhMT3NNY0lPYzhHT2MyZXc&hl=en_US#gid=0 track feature coverage testing here.]


=== Automation (martijn, John) ===
* Functional Frameworks
** Have plans to look into the following frameworks:
*** Marionette
*** Robocop
*** SL4A (http://code.google.com/p/android-scripting): not a test framework as such, but it can be fitted to our needs; it takes care of the Android API side work that has so far been done with native Java apps (see the sketch below)
*** MonkeyRunner/Testdroid recorder http://testdroid.com/product/testdroid-recorder
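As a starting point for the SL4A idea above, here is a minimal, untested sketch (assumptions: the SL4A app and its Python interpreter are installed on the device, and the facade method names match the installed SL4A version) of pulling the kind of Android API data we currently collect with native Java apps:

<pre>
# Minimal SL4A sketch: read battery data through the Android facade instead
# of a native Java helper app. Method and field names are taken from the SL4A
# battery facade and should be verified against the installed version.
import time
import android  # SL4A's Python facade module (available on-device)

droid = android.Android()

droid.batteryStartMonitoring()
time.sleep(1)  # give the monitor a moment to receive the first broadcast
data = droid.readBatteryData().result  # dict; expected keys include 'level', 'status'
droid.batteryStopMonitoring()

print('battery level: %s' % data.get('level'))
print('battery status: %s' % data.get('status'))

# Surface the result on the device so a tester can eyeball it.
droid.makeToast('Battery level: %s' % data.get('level'))
</pre>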
=== Specialized (naoki) ===
* CrashKill :
== Sync ==
=== Client (Tracy)===
* no client trains
* been working on test cases and testing tips videos
 
=== Server (James/John)===
* James
** A fairly quiet two weeks, with only a couple of maintenance releases that needed to be tested in Production.
** Got John ramped up on all things Sync Server


== Test pilot (tracy) ==
* new study for broken/incompatible add-ons released.


== BrowserID (james) ==
* One very large release in Beta over the last two weeks:
** Train 14: {{bug|703596}} - QA and deploy BrowserID train-2011.11.17 to production
** (Yes, 14 releases already!)
* All QA, Dev, and Ops work is now focused on launching the Ops environments: Dev, Stage, Test/CI, and Production
* Details: https://wiki.mozilla.org/QA/BrowserID/OPsBuildOut


== Pancake (Naoki) ==


== WebAPI (John) ==
* Project Status:
** Currently landed: Battery, Camera, IndexedDB, Vibrator, SMS
** Demoed these to the media, with a blog post
** Need to add EXIF support to the camera app
** Need to add upgrade scenarios to IndexedDB
** Need to create test pages: Sensor/Accelerometer
** Camera landing on Windows soon
* Testing
** need to add a coverage matrix, test plans, and test cases for all APIs except Battery (see the smoke-check sketch after this list)
* App support
** Have Battery APP
** Have Info App
** Need Geolocation/Accelerometer APP
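As one possible shape for the per-API test cases mentioned above, here is a hedged sketch of a smoke check driven through the Marionette Python client (one of the frameworks listed under Automation). The client import path and the possibly moz-prefixed property names are assumptions and need to be verified against current builds:

<pre>
# Sketch: feature-detect a few recently landed WebAPIs by executing script in
# a browser that was started with Marionette enabled (listening on port 2828).
from marionette_driver.marionette import Marionette

APIS = {
    'battery':   "return !!(navigator.battery || navigator.mozBattery);",
    'vibrator':  "return !!(navigator.vibrate || navigator.mozVibrate);",
    'sms':       "return !!navigator.mozSms;",
    'indexeddb': "return !!(window.indexedDB || window.mozIndexedDB);",
}

def main():
    client = Marionette(host='localhost', port=2828)
    client.start_session()
    try:
        for name, script in sorted(APIS.items()):
            present = client.execute_script(script)
            print('%-10s %s' % (name, 'present' if present else 'MISSING'))
    finally:
        client.delete_session()

if __name__ == '__main__':
    main()
</pre>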


= Round Table =
== Raw Notes ==
<pre>
Crowdbot dropped :  
BHAG : getting out the door
new zippity working; pulling together the new zippity
- clarity of questions
Not sure where tony is on the other items.
- Big Hairy Audacious Goals
jbonacci finished his dashboard; link in the past action items section.
- want to be able to define the direction
- what are we trying to accomplish as a QA org
- not just the weekly goals
- How does a BHAG differ from team goal?
- Let's group them up
- Products
- Automation
- Sync services : automation
- data to prove it's not ready
- Tool for webkit CSS spidering
- drive cross platform, cross browser automation testing on these products (Browser ID)
- How can we make this a QA goal? - segrata(?)/pyramid
- not going to write tests for every app, but maybe a template, or samples of tests
- Need more information
- Communication
- Let's have communication with Dev
- what's in our current process that can be improved?
- Fennec : There's a couple of broken things in the communication
- triaging bugs
- equal amount of say : go/no go , proving with data, etc.
- communication w/ keeping up on test case, test case management
- leading the edge, rather than trailing the edge
- how is that different; how can we do better?
- Bugzilla study work, Dev gets assigned with bugs, -> assigning bugs to QA until they are signed off
- Improve by bug workflow and process communication : dev should pass the bugs to QA
- better management of flags and follow up.
- better recommendations (submit tests w/ patches); unit tests


BHAG:
tracy :
- Community
mozmill and tps, we can nearly automate all test cases.
- how can we get productivity, excitement, etc. from our community
need another person or way more time to get this done.
- test day : we don't want them to disappear after one day
mobile :
- how do we get it better?
automation solutions
- localized test days around the region/world
engagement / Community
- Here's a schedule of test days around the world
browser id :  
- moderate their own
automation solutions
- huge goals
engagement / Community
- every quarter we can break it down based on regions and helping them out
pancake :  
- quarterly goals will get broken down from the BHAGs
automation solutions
- webkit CSS testing
engagement / Community
- we want to drive automated tools that people can easily contribute
( http://testdroid.com/testdroid/547/testdroid-recorder-1-1-11-is-out )
- define some sort of metric to raise awareness per product.  each team to define metrics
- community dashboard
- device anywhere, fennec work, etc. : limiting
- continue to make what contributive to proactive members like gabriella
- Target community members and mentor them into community leaders
- motivation techniques


Need to get audience for Services side
- who the community is  
- Manual testing
- how to reach out to them
- test plans and test days, but it's starting with blank page
- it's a little different from client
- maybe have templates or something
- have global templates or guidelines for creating testplans and testcases [tools]
- how do we write a test plan
- standardize metrics that we collect and results and what data needs to be...
- metrics team already doing this
- AMO, webteam doing metrics already
- certain metrics for each of the products to display to the rest of the world?
What are the number of bugs, what location of test cases, how many run, for each webapp?
- dashboard
- code coverage
Cross team cross identify
Firefox + Mobile now
Next year is going to be different playing game:
- OWAs
- What are our goals?
- OWAs is a glue to a lot of projects
- Joint bug tracking systems
- test environment
- how can we have a SLA?
- we need QA : where do we start?  What's our service agreement
- what framework, services, equipment, schedule, specs, plans, etc.
- quality should be driven from QA not the other way around and an afterthought
- should be testable before having the patch
- some of it is changing the perception amongst the company


Services need more technical community that's interested in bouncing ideas in hacking and testing which may be tricky
- small business, and organizations that are picking up OWA and Browser ID
Come up with some ideas for single or small groups through various sites that specialize in server side testing
- need to share resources with noncommunity, nonmozilla emp
Trying to reach a community, not sure of
- this will change how we do QA; new audiences businesses
trying to think of what's already running in terms of conferences or community with server ops maybe?
- pick up a secondary
hacking community : hacking for security privacy issues.
- how do we coordinate testing efforts?
 
- as other people pick up the product
Again trying to find the people would be hard
- web app, store the apps that go in the store
 
- should be a similar model
For Browser Tech, not sure.
- landing amo, review process, etc. upgrades
- hard to accomplish each of these goals due to youth of projects (for mobile, sync/browser id, pancake)
- browser ID?
- one of the goals for next quarter is to get pancake out
- cross browser will be a big difference
 
- mozilla backed product has to be useful everywhere in web
Anything for process?
- webkit QA
- chance to reset
- figure out the processes : risk areas, etc.
- BHAG for mobile: website compatibility
- rotational program for each person for a round robin [ that way we don't get tester fatigue; coverage ]
- webkit css
- newsletter for each group
- structure doesn't work well on these sites; for cross browser
 
- can we write some tools to spider the sites?
- newsletter : cross training, bi weekly.  based on btech meeting?  should we have it through out.
- to throw them out to community and report bugs?
- try to take as little time as possible to spend on the newsletter
- https://wiki.mozilla.org/Mobile/Evangelism
 
- website compatibility
- do we need a SLA?  Need to formulate structure for SLA : response to testing ?
- work with evangelism, SUMO, tool to crawl websites and detect webkit css
- Some kind of more formalized process with the dev
- spread the testing to community
 
- Marketing
Mobile :
- Robocop would help out with this.
- exploratory in the beginning, manual verifications of nonautomated tests, remaining 10 %
- Crowdtest : martijn Crowdbot ported to native UI?  Need agreement from Martijn
- Zippity ported already
- Something around Devices?
- Performance : automation, compare against other browser.
- getting community with devices that we don't have ; community device owner
- getting devices to contributors; people that have devices that we don't have.
- Resources : move 90 % automation?  -> verify the automations, migrate tests litmus to automation, write litmus to automation, maintenance, increase in waverly help
- if we have some slack we can shift people to identify testing, browser id, etc.
 
Services:
- QA is running a fully integrated Test system, consisting of VMs and physical hardware
- sync server environments, mix of VMs and physical hardware that's already happening
- could be a scaling thing, in terms of how to cover this?
- automation picture is bigger?
- cut thing?  need tony to check out
- rest of these seems like processes, which most of them we already starting to do.
- he wants us to be doing; we have various test environments:
- requesting QA have their own environment for testing
- borrowed stuff from Ops team
- 2012 : make our own sync environment : browser ID environment rather than staging environment
- No server side automation.
- nightly unit testing; that's only nightly?  per train?  not sure if it's nightly testing?
- will soon. on the sync server side it is happening Tarak has it set up
- Browser ID ones are javascript
- sync server side it's python
- sync is not moving to pyramid
- sync is all javascript?  TPS stuff; some kind of nightly test that's being done... disregard.  it's client side.
 
BHAG:
Need a clearer idea what BHAG is
- context: for the next year
=> 2 big goals : automation, and community
Since browser ID has started, most of the stuff has been running on VM.  They are creating full facing environments for release
for staging, production, dev.  Work in progress.  All 4 environments will be up and running by end of Nov.  (est)
=> end of Q4 go live
- Automation : cross platform
- browser id people are having some discussions; AMO, and other teams
- need automation for cross product, cross browser
- selenium 2
- drive cross platform, cross browser automation testing
- have some ideas on how we see ; sense server side testing
- SL4A scripting language
- get stuff from platform API (Android) to get information that we need, need to be careful of privacy
- when we know what information that we want to get, we could potentially get it.
- by working with an automation for frameworks
- need to maintain the system
- Robotium : SL4A python
- http://testdroid.com/testdroid/547/testdroid-recorder-1-1-11-is-out
- Tracy: concern for automation, keeping up with automation?  regression testing?  automation?  Sync?
- mozmill + TPS should be able to automate manual testing that we come up with
- if it's just on Tracy, it's a BHAG on top of everything else that Tracy
- push Tracy to do it for a Quarter or hire someone else
- need framework, what has been done, etc.
- more automation for back features
- w/ TPS it has a listener, so the test cases should be in continuous integration frameworks
- Mobile : BHAG
- automation that's a big outstanding goal
- (moved stuff)
- martijn : crowdbot - getting that for native ui
- crowdsource tools that makes it simple for folks to run for cross-platform products
- results/data/reporting driven
- lower the barrier of entry, compelling and fun
- give them a reason for them to join
- a fun, interesting project
- gaming, social interaction?
- combine them?
- automation of recording and pushing scripts to review for community
- Performance process tools
- Eideticker : recording : need to investigate this BHAG
- Having a simpler way for bug submissions on phones
- something that captures the data, screenshots, bugs, etc.
- how to submit bugs


How do we track the community
- BHAG form and context underneath it
=> we have one active community member for mobile team
- need the context for what areas that are being filled in
=> sync no progress as of yet.
- the quarterly goals would be the last step
action item: => sync newsletter to be discussed between james/oki
- large goals ideas
- pinpointed specifics
Status : read wiki
- cross-browser specifics
- browser id : business QA as well
Round Table : fennec table to make sure that sync and fennec stay in sync.
Are they bundling as two separate applications?
Not sure how it's going to go.
For the release of the first version of native Fennec, sync is a blocker if it isn't there.
- it might not be a hard blocker for having sync there.
- idea is tighter knit strategy for this.
</pre>
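The raw notes above repeatedly float a tool to spider sites and detect WebKit-only CSS for the website compatibility work. Below is a minimal, untested sketch of what such a crawler could look like; the site list and the stylesheet regex are placeholders, and a real tool would also need to handle inline styles, robots.txt, and rate limiting:

<pre>
# Sketch (Python 2, stdlib only): fetch each site's linked stylesheets and
# flag -webkit- prefixed CSS properties.
import re
import urllib2
import urlparse

SITES = ['http://example.com/']  # placeholder list of sites to audit

LINK_RE = re.compile(r'<link[^>]+href=["\']([^"\']+\.css[^"\']*)["\']', re.I)
WEBKIT_RE = re.compile(r'-webkit-[a-z-]+', re.I)

def fetch(url):
    return urllib2.urlopen(url, timeout=30).read()

def audit(site):
    html = fetch(site)
    for href in LINK_RE.findall(html):
        css_url = urlparse.urljoin(site, href)
        hits = sorted(set(WEBKIT_RE.findall(fetch(css_url))))
        if hits:
            print('%s uses: %s' % (css_url, ', '.join(hits)))

if __name__ == '__main__':
    for site in SITES:
        audit(site)
</pre>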
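The notes also mention Selenium 2 as a candidate for the cross-platform, cross-browser automation goal. Here is a small illustrative sketch of the same check run in more than one browser; the target URL and expected title are placeholders, and each browser's driver has to be installed locally:

<pre>
# Sketch: run one WebDriver check against several browsers and report results.
from selenium import webdriver

TARGET = 'https://example.com/'       # placeholder: page under test
EXPECTED_TITLE_FRAGMENT = 'Example'   # placeholder expectation

def drivers():
    yield 'firefox', webdriver.Firefox()
    yield 'chrome', webdriver.Chrome()

def check(name, driver):
    try:
        driver.get(TARGET)
        ok = EXPECTED_TITLE_FRAGMENT in driver.title
        print('%s: %s (title=%r)' % (name, 'PASS' if ok else 'FAIL', driver.title))
    finally:
        driver.quit()

if __name__ == '__main__':
    for name, driver in drivers():
        check(name, driver)
</pre>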