Compatibility/Meetings/2018-12-work-week

== Minutes ==
Scribed by Everyone!
(Mostly scribed by Denschub)


=== OKR 2019H1 ( 👹 ) ===


* '''mike''': (Going through all the things we did in 2018Q4).
* '''mike''': We still have stuff in progress. We are working on Blipz v2. Rozanna wants to try to get development of version 3 staffed in Innovations. We are working on metrics stuff. I would love to have numbers for needstriage, sitewait, etc.
* '''karl''': it should not be very complicated, because we can just read each milestone's total number of issues (see the sketch at the end of this section).
* '''mike''': Track new top sites. We would start manually, applying the formula by hand. If we decide the numbers are interesting, we can build software for it. We need to prepare a report at least once, and probably one update.
* '''karl''': after the prototype is done, we can decide what we would do with it and how much resource it takes.
* '''mike''': do you have time to finish the anonymous reporting channel?
* '''karl''': yes. 2019Q1 for evaluation.
* '''mike''': We need to figure out if we need to do more enterprise stuff/testing.
* '''karl''': most of the time, we didn't have strong usability issues.
* '''mike''': patch process. More work.
* '''mike''': About tinker tester
* '''karl''': too complex for me in the current situation.
* '''mike''': it's worth investing the time in tooling so we are more effective.
* '''mike''': figuring out how to detect duplicate issues. I don't know if we do this or not.
* '''dennis''': what eric started to do.
* '''mike''': it could be helpful to have the results.
* '''karl''': (disagree :) It's more complicated.)
* '''adam''': it could be an outreachy project
* '''mike''': site snapshots (saving all resources)
* '''karl''': good idea for the data, but bad idea for privacy. Probably would need to be researched before starting development. Survey on our current reporting.
* '''karl''': it could be very useful to have probes detecting some known issues (marfeel, fastclick, etc.) and adding probes little by little.
* '''adam''': https://github.com/mozilla/OpenWPM
* '''mike''': Do we keep pushing on Google Tier1 testing? Do we think it's useful?
* '''karl''': we need something to capture the work surrounding performance improvements to webcompat.com
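
A minimal sketch of the milestone-counting idea above: web-bugs tracks issue states (needstriage, needsdiagnosis, sitewait, etc.) as GitHub milestones, so the totals can be read straight from the milestones API. The script is illustrative, not existing tooling.
<syntaxhighlight lang="python">
# Hypothetical sketch: read per-milestone open-issue totals for
# webcompat/web-bugs from the GitHub milestones API.
import requests

MILESTONES_API = "https://api.github.com/repos/webcompat/web-bugs/milestones"

def milestone_counts():
    """Return {milestone title: open issue count}."""
    resp = requests.get(MILESTONES_API, params={"state": "open", "per_page": 100})
    resp.raise_for_status()
    return {m["title"]: m["open_issues"] for m in resp.json()}

if __name__ == "__main__":
    for title, count in sorted(milestone_counts().items()):
        print(f"{title}: {count}")
</syntaxhighlight>
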
=== OKR 2019H1 Ideas Bank ( 👹 ) ===
* More graphing/dashboarding/visualization work - webcompat metrics work.
** graphs for different stages of Web Bugs (sitewait, etc)
** integrate triage dashboard for Softvision folks (and contributors)
** recording different milestones on the server side / in the DB (see the sketch after this list)
*** time estimate: 1 day
** front-end refactor
*** time estimate: 1 day
** front-end implementation
*** time estimate: 1 week
** database stuff + refactoring
*** time estimate: 1 week
* Track top sites by assigning a score based on the site's priority and the number of reports, … (webcompat metric)
** time estimate: 2 months
* Evaluate the communication channel for anonymous reporting (follow up work to Q4 stuff)
** time estimate: 1 day
* Finish building the intervention rollout process for GoFaster
** 1 month
* <code>about:compat</code> VAPORWARE
** we don’t know.
* address Tinker Tester papercuts (improvements: evaluate what is useful, what we can improve, …), a.k.a. build Dennis's stuff.
** time estimate: depends on scope. 1 week.
* Tinker Tester Chrome port
** time estimate: 1 week.
* Duplicate Web Bug reports, or more generally a way to correlate issues more quickly/easily (GitHub beta?) (naive sketch after this list)
** time estimate: NOBODY KNOWS
* Research if Site Snapshots would be useful
** time estimate: 2 days
* console.log step 2 - Make it useful / usable
** structure the log (sketch after this list)
*** time estimate: 2 days
** UI (front end)
*** time estimate: 2 weeks
** privacy policy updates:
*** time estimate: 1 day
* Leverage one telemetry probe, for example tracking when FastClick is used (or detecting it at report time)
** time estimate: 2 months
* Switching to Python 3 for webcompat.com
** time estimate: 1 week
* Upgrade nginx
** time estimate: 1 week
* Participating in Lighthouse and/or webhint.io
* Being more public about the core platform issues that get fixed as a result of webcompat issues (karl)
** regular effort, 1 hour a week.
* Performance improvements (Outreachy project) supported by the team
** time estimate: 3 months
* migration of labels on web-bugs issues; we need to write software for that (rename sketch after this list). Make sure new issues are labelled correctly.
** time estimate: 2 weeks
* Analysis of Marfeel websites to identify the type of issues, for either fixing through site patching or contacting them (detection sketch after this list)
** time estimate: 2 months
* Creating a FastClick sitepatch with a whitelist of sites after investigating the issue.
** time estimate: 1 month
* Twitter bot for automatically posting a comment on the issue with the notice, after people have tweeted a link (sketch after this list)
** time estimate: 1 week
* platform gap metrics work
** time estimate: 1 month
* follow up on tier 1 report card with improvements to the reporter
** time estimate: 2 weeks
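
A rough sketch of the "recording different milestones on the server side / in the DB" item; the table and function names are invented for illustration.
<syntaxhighlight lang="python">
# Hypothetical sketch: log each milestone transition with a timestamp so
# per-stage graphs can be computed later. Schema and names are invented.
import sqlite3
from datetime import datetime, timezone

def init_db(path="milestones.db"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS milestone_events (
        issue_number INTEGER,
        milestone TEXT,     -- e.g. needstriage, needsdiagnosis, sitewait
        changed_at TEXT     -- ISO 8601 timestamp
    )""")
    return db

def record_transition(db, issue_number, milestone):
    db.execute("INSERT INTO milestone_events VALUES (?, ?, ?)",
               (issue_number, milestone, datetime.now(timezone.utc).isoformat()))
    db.commit()
</syntaxhighlight>
With events stored this way, time spent in each stage falls out of diffing consecutive changed_at values per issue.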
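
For the duplicate-correlation item, a deliberately naive sketch that leans on the web-bugs title convention ("example.com - short description"); anything smarter (similarity scoring, GitHub's beta features) would replace this.
<syntaxhighlight lang="python">
# Hypothetical sketch: bucket reports by the hostname prefix in the issue
# title to surface likely duplicates.
from collections import defaultdict

def likely_duplicates(issues):
    """issues: iterable of (number, title) pairs.
    Returns {host: [issue numbers]} for hosts reported more than once."""
    buckets = defaultdict(list)
    for number, title in issues:
        host = title.split(" - ")[0].strip().lower()
        buckets[host].append(number)
    return {host: nums for host, nums in buckets.items() if len(nums) > 1}
</syntaxhighlight>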
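
For the "structure the log" piece of console.log step 2, a sketch of normalizing raw console entries into records a front end can filter; the input field names are assumptions about what the reporter captures, not its real format.
<syntaxhighlight lang="python">
# Hypothetical sketch: normalize raw console entries into structured
# records. Input field names are assumed, not the reporter's real ones.
def structure_log(entries):
    structured = []
    for entry in entries:
        line, _, column = (entry.get("pos") or "0:0").partition(":")
        structured.append({
            "level": entry.get("level", "log"),   # log / warn / error
            "message": entry.get("message", ""),
            "source": entry.get("uri", ""),       # script that logged it
            "line": int(line or 0),
            "column": int(column or 0),
        })
    return structured
</syntaxhighlight>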
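
For the label-migration item, renaming via GitHub's labels API does most of the work, since GitHub keeps a renamed label attached to its issues; the mapping below is purely illustrative.
<syntaxhighlight lang="python">
# Hypothetical sketch: rename old labels to a new scheme on web-bugs.
# The mapping is invented; the PATCH endpoint is GitHub's label-update API.
import requests

LABELS_API = "https://api.github.com/repos/webcompat/web-bugs/labels"
MAPPING = {"status-needscontact": "needscontact"}  # illustrative only

def migrate_labels(token):
    headers = {"Authorization": f"token {token}"}
    for old, new in MAPPING.items():
        resp = requests.patch(f"{LABELS_API}/{old}",
                              headers=headers, json={"new_name": new})
        resp.raise_for_status()
</syntaxhighlight>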
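
For the Marfeel analysis, a first pass could just fetch each reported URL with a mobile UA and flag pages that pull in Marfeel resources; the UA string and the substring heuristic are assumptions.
<syntaxhighlight lang="python">
# Hypothetical sketch: flag reported sites that load Marfeel. The UA and
# the heuristic are assumptions, good enough for a first pass.
import requests

MOBILE_UA = ("Mozilla/5.0 (Android 9; Mobile; rv:64.0) "
             "Gecko/64.0 Firefox/64.0")

def uses_marfeel(url):
    resp = requests.get(url, headers={"User-Agent": MOBILE_UA}, timeout=10)
    return "marfeel" in resp.text.lower()

if __name__ == "__main__":
    print(uses_marfeel("https://example.com"))  # placeholder URL
</syntaxhighlight>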
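
For the Twitter-bot item, the GitHub half is simple; a sketch of posting the notice once a tweet linking to a web-bugs issue has been spotted (the notice text and token handling are placeholders).
<syntaxhighlight lang="python">
# Hypothetical sketch: given a tweet that links to a web-bugs issue,
# post a stock notice as a comment on that issue.
import re
import requests

ISSUE_LINK = re.compile(r"webcompat\.com/issues/(\d+)")
NOTICE = "This issue was mentioned on Twitter: {tweet_url}"  # placeholder

def comment_from_tweet(tweet_text, tweet_url, token):
    match = ISSUE_LINK.search(tweet_text)
    if match is None:
        return
    api = ("https://api.github.com/repos/webcompat/web-bugs"
           f"/issues/{match.group(1)}/comments")
    resp = requests.post(api, headers={"Authorization": f"token {token}"},
                         json={"body": NOTICE.format(tweet_url=tweet_url)})
    resp.raise_for_status()
</syntaxhighlight>
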
=== Vision and Alignment 2019 ( 👹 ) ===
* '''Mike''': Platform will fix the bugs we deem as most important for webcompat. We need to be more efficient with Web Bugs to make that better.
* '''Adam''': Outreach is working fine IMO. I skip issues marked as Minor, and we probably need to be more aggressive about re-pinging.
* '''Karl''': Outreach is very slow (and expensive), and sometimes we may have better ways to fix sites (like Site Interventions). Outreach gets things done and the web bugs fixed, but it's really slow.
* '''Mike''': I agree that Site Interventions are important. We are working on a process to roll them out. These Interventions are not the long-term solution, but a way to reach our goal in the short term.
* '''Mike''': How's diagnosis going? Is it too much? What can we do better?
* '''Karl''': We still have too many bugs reaching the needsdiagnosis queue that don't actually need diagnosis. We need to be better at triage, and sometimes we should just close an issue if it doesn't look relevant enough.
* '''Mike''': I agree that for the next 6 to 12 months, we need to focus on the most important stuff. Karl (and Tom): Microsoft previously closed such issues; they acknowledged that the problems exist, but said they won't be working on them.
* '''Karl''': We also need to build tooling/make tooling better.
(not scribing individual people, but group contents)
* Diagnosis is a lot of work. Karl does amazing prediagnosis work, but his pings are complex and take a lot of time to act on.
* Maybe we should not ping people directly, but rather work in buckets, so that people can focus on that.
* Backlog pressure
* We may be too perfectionistic. It’s not bad to just throw an issue to the site’s developers without having the final diagnosis. If we have a general idea on what the issue may be, we could do outreach.
* We could be more active in pinging other engineers to have them help us out. It might be more efficient if people working on affected components help out.
* Let’s make a label to assign to web bugs that we cannot resolve, so The Management(tm) can triage those and help out.
* Some combination of self direction and management might be useful for diagnosis
* Diagnosis context switching (getting up to speed, finding the groove) vs. brain pain.
* Weekly diagnosis reports are useful. Good way to know what’s happening.
* Grumpiness may only be an illusion (re:grumpy core engineer).
=== Firefox Webcompat Strategy Meeting (mike) ===
'''Hosted by Mike Taylor, many participants (can’t keep track of who’s speaking)'''
* Google Tier1 search for mobile (win)
* Fixed 15+ core interop bugs (webcompat P1, P2, P3) (win)
* Interop vs Web compatibility
* Improve broad web api compatibility with Blink, closing the webcompat gap
** We don’t have infinite resources, we should be strategic with our efforts
* Web Compat is hard to measure but we are making progress in this area (thanks Tim)
* Review of a compat bug’s lifecycle:
** Standards and specs →
** APIs →
** Gecko Platform →
*** Web Platform Tests
** Firefox ships a release →
** Web Devs →
** Web Users
*** Web Compatibility bugs
* Talked about what happens when a spec implementation goes well. CSS Grid is a good, newer example
* Many of the Webcompat bugs on the P1/P2 list have been old things that weren’t spec’d properly or at all
** If we had been paying attention to tests that were breaking in the past, would we have known?
** In the case of window.event we knew and made a decision not to implement
** Now that we have tests for what’s not working between browsers, do we know what’s being used and important?
** While we’re working on a feature, are the tests showing potential issues with implementations? We can link them to a webcompat issue afterwards.
** What metric can we track to know that compat issues will occur?
*** Where there are gaps between Gecko and Webkit / Blink and try to rank the severity of it
** Is there a way to surface the usage data along side the compat issues / platform tests?
** Microsoft has really good usage data on this. The challenge is that it will show that 0.136% of the internet uses a feature, but that could include Facebook, so you can’t remove it.
* Measurement spectrum
** What browsers implement
** What sites plan to enable
** What site have enabled
** (didn’t see this point)
* Platform Gap Metric Proposal (proposal #1)
** Take all the Chrome Status use counters and weight each one by its usage (toy sketch after this list)
** This data is noisy
** We are taking a closer look into this
** You could implement all the features that Chrome has, but may not fix your problem
* Webcompat Metric (proposal #2)
** We want to know that what we are working on in the platform is fixing real websites
** Site Compat Index
*** Complex equation (a rough sketch follows this list)
*** Website’s position in the top-200 list
*** Web compat reports affecting website
*** Number of duplicate reports
*** Priority and product weighting factor, indicating strategic importance
** Total Site Compat Index
*** Complex equation
*** Combines all the individual ratings of the Top 200 list
* We have a metric in our DX 3 year vision to track the number of top 100 sites that give us a tier 1 experience
* In 2019 we want to close the webcompat gap
** This is the process that we follow:
** https://wiki.mozilla.org/Compatibility/WebCompat_Tracking_And_Triage
** Can RelMan track these webcompat Px bugs with a query? (sketch after this list)
* Stuck bugs
** Sometimes we get caught up diagnosing a website
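
A toy sketch of proposal #1 above: weight every feature Gecko lacks by its Chrome use-counter share and sum the result. The input shape and numbers are invented; the real counters would come from Chrome Status data.
<syntaxhighlight lang="python">
# Hypothetical sketch of the platform gap metric (proposal #1).
def platform_gap_score(use_counters, gecko_supported):
    """use_counters: {feature: fraction of page loads using it (Chrome data)}.
    gecko_supported: set of features Gecko already implements."""
    return sum(usage for feature, usage in use_counters.items()
               if feature not in gecko_supported)

# Toy numbers, not real counter data:
counters = {"window.event": 0.30, "css-zoom": 0.05, "fetch": 0.40}
print(platform_gap_score(counters, {"fetch"}))  # roughly 0.35
</syntaxhighlight>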
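
And a rough sketch only of proposal #2's Site Compat Index: the real ("complex") equation lives in the linked document, so this just combines the listed ingredients (top-200 rank, report count, duplicates, and a strategic weighting factor) in one plausible way.
<syntaxhighlight lang="python">
# Hypothetical sketch only: NOT the real equation, just the listed
# ingredients combined plausibly (rank weight x report volume x weight).
def site_compat_index(rank, reports, duplicates, weight=1.0):
    """rank: position 1..200 in the top-sites list (1 = most popular).
    reports: open webcompat reports affecting the site.
    duplicates: duplicate report count, a proxy for user impact.
    weight: priority/product weighting factor."""
    rank_weight = (201 - rank) / 200  # popular sites count more
    return rank_weight * (reports + duplicates) * weight

def total_site_compat_index(sites):
    """sites: iterable of (rank, reports, duplicates, weight) tuples."""
    return sum(site_compat_index(*site) for site in sites)
</syntaxhighlight>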
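
On the RelMan question: Bugzilla's REST API can return the tracked bugs. The whiteboard tagging scheme below ([webcompat:p1] and so on) is an assumption; the actual convention is documented on the tracking-and-triage wiki page linked above.
<syntaxhighlight lang="python">
# Hypothetical sketch: pull webcompat Px bugs from Bugzilla's REST API.
# The whiteboard convention is an assumption; check the tracking page.
import requests

BUGZILLA_API = "https://bugzilla.mozilla.org/rest/bug"

def webcompat_px_bugs(priority="p1"):
    params = {
        "whiteboard": f"[webcompat:{priority}]",  # assumed convention
        "include_fields": "id,summary,status",
    }
    resp = requests.get(BUGZILLA_API, params=params)
    resp.raise_for_status()
    return resp.json()["bugs"]
</syntaxhighlight>
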
==== Questions? ====
* Do we have any idea of test suites that are missing pieces and causing webcompat breakage?
** Sometimes it’s compounded issues
* Do we know which platform areas these issues are in?
** We know there are a lot of issues with CSS and events (I think Karl said), possibly scrolling
* How do we take the gaps we find in bugs and highlight them in web platform tests?
** We can look back on bugs with compat type labels or Bugzilla dupes
=== Continued Vision and Alignment 2019 ( 👹 ) ===
'''Attendees: Webcompat team (not scribing individual people, but group contents)'''
==== Metrics discussion ====
(Webcompat Metric proposal #2)
* https://docs.google.com/document/d/1oAvIkGVM3HKUAumI4K_qV315ujCoeMbG_Z9aUBYBecw/edit
* Concerned that our metric will not reflect the severity or impact of the reports to users
* 2 severity-critical bugs may be more important than 20 severity-minor ones
* Going to fill this out and try to understand how that would work
==== About:compat page ====
* Need a decision if we can deliver it fast enough
** Does product want it?


== Agenda ==