Revision as of 21:41, 24 February 2021

Mozilla Foundation 2021 OKRs

Mozilla Foundation is fueling a movement for a healthy internet.

This document outlines the Mozilla Foundation’s organizational objectives and key results (OKRs) for 2021.

  • Objectives = “What do we want to do and why?”
  • Key Results = “How will we know if we’re successful?”

These objectives were developed based on our work in 2020 and are linked to our multi-year Trustworthy AI theory of change, with links to specific short-term outcomes.

In 2021, the Foundation will also continue to increase its focus on Diversity, Equity and Inclusion, both internally and externally. Most OKRs have DEI-related activities. The Foundation is also developing a multi-year DEI strategy that builds on past efforts, including our 2020 Racial Justice Commitments.


Related documents:

Org-wide OKRs

2021 OKRs are informed by Mozilla Foundation's three-year narrative arcs.

OKR 1: Making AI Transparency the Norm: Test AI transparency "best practices" to increase adoption by builders and policymakers.

Responsible: Ashley Boyd

Key Results:
1.1 X# of citations of Mozilla’s AI transparency best practices by builders.

Motivation: Our work on misinformation and political ads established us as a champion of AI transparency. In 2021, we will broaden this work by (a) working with builders to create a list of AI transparency best practices and (b) creating a transparency rating rubric for Privacy Not Included.

Sample Activities:

  • Develop a taxonomy + gap analysis of ‘AI + transparency’ best practices in consumer internet tools and platforms (H1).
  • Involve builders in the research, iteration and sharing of best practices.
  • Develop an AI transparency rating rubric for Privacy Not Included based on these best practices (H2).
KR Lead: Eeva Moore
1.2 25 citations of Mozilla data/models by policymakers as part of AI transparency work.

Motivation: Projects like Regrets Reporter and Firefox Rally show that citizens will engage in efforts to make platforms more transparent. In 2021, we want to test whether evidence gathered from this type of research is effective in driving enforcement and policy change related to AI transparency.

Sample Activities:

  • Recruit more YouTube Regrets users with movement partners to generate additional data and reporting, particularly in regions where AI transparency is gaining momentum.
  • Use YouTube Regrets findings to engage policymakers in key jurisdictions to demonstrate the need for AI transparency policies.
  • Use the Rally platform to run up to five in-depth research studies by Mozilla and others that demonstrate the value of transparency in guiding decisions regarding misinformation and AI.
KR Lead: Brandi Geurkink
1.3 5 ‘best practice’ reports published, providing evidence of the value of AI transparency to consumers.

Motivation: Our hope is that more AI transparency will give people more agency, and that this is something people want. However, we don’t know whether this is true. In 2021, we want to fund or run a set of experiments to see whether consumers value AI transparency in practice.

Sample Activities:

  • Generate best practice reports based on investments such as the Privacy Not Included rubric (developed in H1) and Umang Bhatt’s research on transparency in a consumer product (e.g., Spotify).
  • Feature findings from these and other reports in Internet Health Report, MozFest, D+D, etc.
  • Partner with cities to test efficacy of AI registry and transparency tools (pending funding).
  • Model + test transparent recommendation engine designs based on what we learned from YouTube Regrets (pending funding).
KR Lead: Becca Ricks

OKR 2: Modeling Good Data Stewardship: Accelerate more equitable data governance alternatives to advance trustworthy AI.

Responsible: J Bob Alotta

Key Results:

  • 2.1 7 projects tested with real users to identify building blocks for viable data stewardship models. (KR Lead: Mehan Jayasuriya)
  • 2.2 5 regulatory jurisdictions utilize our input to enable collective data rights for users. (KR Lead: Mathias Vermeulen)
  • 2.3 6 stakeholder groups established as constituents of the Data Futures Lab. (KR Lead: Kasia Odrozek)

OKR 3: Mitigating AI bias: Accelerate the impact of people working to mitigate bias in AI.

Responsible: Ashley Boyd

Key Results:

  • 3.1 X% increase in investments for AI + bias grantees. (KR Lead: Jessica Gonzales)
  • 3.2 X# people participate (share stories, donate data, etc.) in projects on mitigating bias in AI as a result of Mozilla promotion. (KR Lead: Xavier Harding)
  • 3.3 Pipeline of additional projects Mozilla can support to mitigate bias in AI established. (KR Lead: Roselyn Odoyo)

OKR 4: Growing Across Movements: Strengthen partnership with diverse movements to deepen intersections between their primary issues and ours, including trustworthy AI.

Responsible: J Bob Alotta

Key Results:

  • 4.1 Phase 1 landscape analysis is complete and we have identified partner movements. (KR Lead: Hanan Elmasu)
  • 4.2 The Foundation’s African Mradi workstream centering local expertise is developed. (KR Lead: TBD, pending hire)
  • 4.3 Synchronize internal operations to strengthen our ability to strategically partner externally. (KR Lead: Lindsey Frost Dodson)

OKR 5: Organizational Effectiveness: Enhance our organizational systems and capabilities to support more data-informed decision-making.

Responsible: Angela Plohman

Key Results:

  • 5.1 2022 planning and budget decisions driven by systematic evaluation of our work in 2021. (KR Lead: Lainie DeCoursy)
  • 5.2 100% of teams have workflows and reports that are supported by our integrated CRM. (KR Lead: Jackie Lu)
  • 5.3 Complete data analysis that reveals best approaches for converting ‘subscribers’ to ‘donors’. (KR Lead: Will Easton)