Compatibility/Meetings/Sync w Honza

== April 4 2023 ==
=== Google offline feature (Honza) ===
* Draft document https://docs.google.com/document/d/1WIvVQ91HiOYL-o3pRh-7HRPGTm3tWo22BfrZFQKV2yw/edit
Honza: This seems done. I've shared it with others, and now I am waiting for feedback from Joe and Maire. They are interested in a bigger analysis, to help us better understand what is broken from the webcompat point of view, like we've seen from the UI point of view, and whether we can reveal more through future testing.
They want to know how expensive it will be to fix these features. Maybe the next step falls into the diagnosis backend. Could further testing help the diagnosis process, or maybe even find the root cause?
Raul: We can provide more testing, but we don't have the knowledge to pinpoint the root cause.
Paul: We'll probably need Engineering help to figure that out. Even for the initial testing, Denis helped us a lot to identify the problems.
Honza: Fair, then we shall consider this done from the QA perspective.
=== Testing Methodology (Honza) ===
* Gdrive folder https://drive.google.com/drive/folders/1TAACBgrMSYtPjUswT0EbpNp3CKwFzF6p
* Methodology notes https://docs.google.com/document/d/1cTvNdGLgwwttGXIxaIcslUwGPDhYyyUvnBy-ySBf1uc/edit
Honza: I've created this folder so we can put other relevant documents here besides the ones we already have, so they will be easier to find.
Honza: I was looking at the document with the top 10 social media websites and I think we can improve the structure. I've created a guideline doc with what info would be useful to have in our reports.
Raul: Should this document help us with new OKRs when we are testing different websites?
Honza: Yes. These kinds of documents could describe the methodology for how those features/websites were tested, so it's easier for others to understand what's happening there and what the situation is.
Honza: Looking at the results, I can see a lot of links redirecting to different issues, but that's not very insightful. I think it would be more helpful to write a summary for each site, describing which kinds of issues are most concerning for that specific website.
We could make this summary better by focusing only on the P1s after the Engineering team has triaged the issues.
Raul: There's one issue here: we don't really assign a priority ourselves. That's normally done by the dev team. Also, not all the issues are truly webcompat; some are ETP issues. Should we categorize them?
Paul: No, just focus on the P1 issues after they are triaged by the team and the priority is added.
Honza: So this document is not exclusively for the webcompat team; it goes higher than that. The goal is not to help with triage. It's for people outside the project to understand the problems going on with those websites.
What I was editing the most in the document was the context about those websites.
Paul: We'll provide a link with all the issues we are referring to and write a summary of them, maybe even categorize them if we see a pattern.
=== Q2 Proposal Review -  Top 100 Sites (Beaucoup list) (SV) ===
As discussed, we are planning the following [OKR](https://github.com/mozilla/webcompat-team-okrs/issues/271)
Is there an up-to-date list that we can use? Also, we are thinking of running the tests in TestRail, for each domain, where we can group sites (News, E-learning, Shopping, etc.)
We have this test suite used in the past, which we can tailor to be up-to-date and relevant: [link](https://testrail.stage.mozaws.net/index.php?/suites/view/39006&group_by=cases:section_id&group_order=asc)
Note:
The link requires being logged in. If you open it before logging in, you'll need to open it again after login to see the test cases.
Testing will be conducted on Desktop and mobile. Should we include iOS + Android on the mobile side, and Mac + Windows on Desktop?
[Paul]
1. I've discussed with the Mobile QA team and they only covered Android, so we can scratch iOS.
2. On the Desktop side, I think it's the WebCompat team's role to decide which Desktop platforms to include in our testing, based on markers like user numbers, the platforms with the most issues, etc.
3. The Mobile QA team was testing websites by region, but that isn't mandatory either; they only did that because the Alexa top list they were using split websites by region. It would also just add complexity for us if we wanted to do that, so I don't think it's worth it.
Raul: In the past we've used Alexa for the top 100 sites; in the current Beaucoup list, last time we had only around 20.
Honza: We actually have 100 websites now from the Beaucoup list, but those 20 were curated.
Honza: We can use the current list from Beaucoup or we can make our own list.
Paul: The Alexa top sites list was based on how often the pages were accessed, but that's not available anymore. Maybe webcompat has another list we could use?
Honza: We can also use the list made by Tranco and the spreadsheet from Beaucoup and compile a list that reflects the top 100 sites today. There is also the HTTP Archive.
Links:
https://tranco-list.eu/
https://www.similarweb.com/
https://docs.google.com/spreadsheets/d/1HcafFKM_bv-2O6qad011mTgCNWX2lEjvBNxTYld1GYk/edit#gid=1838995098
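Compiling one top-100 list from several ranked sources, as discussed above, could be sketched roughly as below. This is a hypothetical illustration: the list names and merging strategy (a simple Borda-style rank sum) are assumptions, not the team's actual method.

```python
def merge_ranked_lists(lists, top_n=100):
    """Merge several ranked domain lists into one top-N list.

    Each source list contributes its 1-based rank as a score (lower is
    better); a domain missing from a list gets a penalty of that list's
    length + 1. Ties are broken alphabetically for determinism.
    """
    all_domains = set().union(*(set(ranked) for ranked in lists))
    scores = {}
    for domain in all_domains:
        score = 0
        for ranked in lists:
            if domain in ranked:
                score += ranked.index(domain) + 1
            else:
                score += len(ranked) + 1  # penalty for absence
        scores[domain] = score
    return sorted(all_domains, key=lambda d: (scores[d], d))[:top_n]


# Example with placeholder data (not real Tranco/Beaucoup contents):
tranco = ["google.com", "youtube.com", "facebook.com"]
curated = ["youtube.com", "example.com"]
print(merge_ranked_lists([tranco, curated], top_n=3))
```

A domain that appears high in several sources floats to the top, which matches the goal of finding the overlap between the lists before asking Ksenia for help.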
Honza: Best would be to talk to Ksenia about this as well.
Paul: We will look over it and see if we can figure out an overlap; if we can't, we will ask for Ksenia's help.
However, what would be helpful is to find out which Desktop platforms we should focus our testing on.
Maybe we could pick the ones most used, or the ones most of the reported issues are coming from, or some other classification.
Honza: Next week maybe we should open a new topic with the team regarding this subject.
Honza: So we are making an analysis of which sites are not supported, and lately we are seeing an increase in them. This should also be one of our main goals: checking whether certain pages from the top 100 list are supported by Firefox.
Paul: By doing this regularly we can also pinpoint regressions. If possible, we are planning to do this twice a year.
Honza: Maybe we can also look at the trend in this case, to see how the webcompat situation evolved from one run to another.
Paul: Yes, we can have a summary in the report, and we can see the differences between runs.
Honza: That sounds good; it sounds like a good reason to do this OKR.
== March 21 2023 ==