2025
Mapping non-emergency support structures
Documenting local help and support structures, especially for conflict resolution, across four wikis to inform recommendations for the Incident Reporting System.
Lead: Claudia Lo with Katie Coleman, for the Product Safety and Integrity team
Understanding User Needs for Deep Reading
Understanding the "deep reading" experience on Wikipedia, to inform improvements to the reading experience, particularly around discovery and search.
Lead: Tara Rooks with Viviana Ortiz, Eli Asikin-Garmager, and Debra Kumar for the Readers team
Centralized Contributions for Moderation
Discovering whether experienced editors want a centralized hub for finding new contribution suggestions, and if so, what it should include to encourage its use.
Lead: Claudia Lo, Jahnavi Mirashi for the Moderator Tools Team
Watchlist and Task Prioritization
Understanding how moderators and patrollers use their watchlist page and how they discover, organize and prioritize their on-wiki work.
Lead: Daisy Chen for the Moderator Tools Team
Peacock Check Concept Testing
Conversations with 4 readers and 1 editor about the Editing team's upcoming Peacock Check feature in Visual Editor.
Lead: Michael Raish with the Editing Team
Collaborative Translation
Understanding group processes & needs around translation.
Lead: Eli Asikin-Garmager and Jahnavi Mirashi
Reading and Editing in the iOS Wikipedia App
This study explores how English and Japanese Wikipedia app users engage with reading and editing, to inform future improvements to the iOS app experience.
Lead: Bethany Gerdemann, Mike Raish, Project Kobo K.K. with Haley Nordeen and the Mobile Apps team
Understanding the usage and impact of the Flagged Revisions extension
Research into how the Flagged Revisions extension is used across wikis and the impact it has on moderation workflows.
Lead: Claudia Lo, Tara Rooks for the Moderator Tools Team
2024
WikiProject Survey
Design, distribute, and analyze a survey of WikiProject contributors in order to identify projects for impact.
Lead: Michael Raish and Alex Stinson for the Campaigns team
Wikipedia Administrator Experiences
Causes behind administrator recruitment, retention, and departure patterns.
Lead: Eli Asikin-Garmager, Yu-Ming Liou, Claudia Lo, Caroline Myrick, with Bethany Gerdemann and Daisy Chen
Volunteer Archetypes V2
While V1 Archetypes research engaged with both off-Wiki volunteer contributors and previous research conducted by the WMF, V2 intends to provide a strong empirical basis for Archetypes by expanding the size and number of populations engaged.
Lead: Michael Raish, Tanja Andic, Sneha Patel, and Rita Ho
Wikipedia is an Antidote to Disinformation
Literature review highlighting the place of Wikipedia as a key part of the internet information ecosystem and its role in the fight against disinformation.
Lead: Claudia Lo, Martin Gerlach, Pablo Aragón, Diego Saez-Trumper, Leila Zia for the Global Advocacy Team
Volunteer Archetypes V1
Using survey methods, this project builds archetypes of non-Wikimedian volunteer internet contributors in order to support strategic decisions about whom to target for recruitment into the Movement.
Lead: Michael Raish, Tanja Andic, Sneha Patel, and Rita Ho
Patroller Work Habits Survey
Surveying the contributions and working habits of patrollers, to better guide future efforts to aid this group.
Lead: Claudia Lo, for the Moderator Tools Team
Current Usage and Future Adoption of Wikifunctions
This study generates a comprehensive understanding of how Wikifunctions might successfully integrate into Wikipedia.
Lead: Andrew Russell Green and Amin Al Hazwani, for the Abstract Wiki team
Cross-Wiki Uploads
User interviews of English and Arabic Wiki uploaders were conducted to understand the experience and mindsets of those who use the visual editor upload tool, and what leads them to upload copyrighted material.
Lead: Bethany Gerdemann with Eman Yahia for the Structured Content team
Wikipedia Subscription
User research to study the reading and information seeking habits of internet users, and their usage of subscription services, to help inform products that can deliver Wikipedia content in novel formats.
Lead: Gabriel Escalante and Purity Waigi, for the Inuka Team
iOS Navigation Refresh
Variant testing in English, Arabic, German, and Chinese for a redesign of the iOS navigation system, with the goal of ensuring a seamless user experience.
Lead: Olga Tichonova, with Bethany Gerdemann and Eman Yahia for the Apps team
MinT (Machine in Translation) Research
The goal of this project was to understand how to better leverage MinT to support more readers and contributors in their aims of accessing, interacting with, and contributing to Wikipedia content, as well as knowledge more generally.
Lead: Eli Asikin-Garmager in collaboration with Pau Giner and Anagram Research
Non-Editing Participation
Exploring the ways that people read on and off Wikipedia, and the expectations of new Wikipedia account holders for reading features.
Lead: Mike Raish and Eman Yahia, for the Core Experiences teams
Edit Check
An interview study exploring the WMF Editing team's "Edit Check" tool and its reception among small- and medium-size Wikipedias in sub-Saharan Africa.
Lead: Mike Raish, Sandra Abrouk, and the Editing Team
2023
Community Configuration
Interviews with Wikipedia administrators about the WMF Growth team's "Community Configuration" platform.
Lead: Mike Raish for the Growth team
Wikimedia Preview WordPress Plugin
User research to learn how site owners and site readers use and interact with the Wikimedia Preview WordPress plugin.
Lead: Hureo, with Eli Asikin-Garmager, Bethany Gerdemann, and the Inuka Team
Commons Administrator Interviews
Research to understand Commons Administrators' workflows, the challenges they face in moderating content, and their ideas for reducing deletions.
Lead: Bethany Gerdemann, with the Structured Content team
Commons Impact Metrics
Understanding affiliates data and metrics needs in the Commons space.
Lead: Daisy Chen for the Data Products team
Campaign Event Discovery Survey
Survey on most common and most effective campaign event discoverability pathways.
Lead: Claudia Lo with Bethany Gerdemann and Gregory Onyeahialam, for the Campaigns team
Incident Reporting System
Research exploring current harassment reporting workflows and harassment target/responder pain points on Korean and Indonesian wikis.
Lead: Daisy Chen. Requestors: Aishwarya Vardhana and Madalina Ana from the Trust & Safety team.
Non-Editing Participation Literature Review
A literature review to identify ways in which users can engage with potential forms of non-editing participation, thereby increasing their interest and involvement in the Wikis.
Lead: Mike Raish, with YUX and the Core Experiences teams
Non-Editing Participation Report
Recommendations based on a series of interviews to discover diverse avenues for user engagement through non-editing participation features.
Lead: Mike Raish, with YUX and the Core Experiences teams
Commons Uploader Experience
A project to understand the user experience of uploading images to Commons, with the purpose of improving the upload process while reducing deletion requests.
Lead: Claudia Lo, with Bethany Gerdemann, Criba Research and the Structured Content team
IP Masking Prototype Usability Testing
Moderated usability tests that explore the functions of an “IP Masking” prototype in English, Spanish, Arabic, and Japanese.
Lead: Mike Raish with Naoko Sato, and the Growth & Core Experiences teams
Future Learning Vectors
Interviews with educational, short-form video content creators about how, when, and why they use Wikipedia, in order to improve the quality and quantity of Wikipedia content that is disseminated to global audiences through learning-oriented video content.
Lead: Mike Raish, with Shobha S V and the Core Experiences teams
Trust & Wikipedia
An experiment to determine how embedding "trust signals" into the Wikipedia desktop interface affects understanding of how Wikipedia works and trust in its information.
Lead: Mike Raish, with Laura Stigliano and the Web team
Journey Transitions
This project explores whether notable moments cause users to deepen and expand their use of and contributions to Wikipedia, and what those moments are.
Lead: Daisy Chen with Mike Raish and the Core Experiences product group
Reading Wikipedia
Understanding what readers in diverse language communities think they are doing as they read Wikipedia, their motivations for reading, the navigational structures they perceive and use, and the ways in which they interact with information.
Lead: Mike Raish with YUX for the Core Experiences product group
Translatable Pages
Developing a better understanding of the challenges users face when creating and updating translatable pages in order to improve the user experience.
Lead: Eli Asikin-Garmager, and the Language Team
Image Suggestions for Article Sections
Usability tests for a feature that suggests images to experienced contributors to add to an article section.
Lead: Daisy Chen, with the Structured Content team
New Page Patrollers
An interview series with English Wikipedia New Page Patrollers to develop a shared understanding of how new page patrolling works on the English Wikipedia and to identify potential areas for improvement grounded in today’s patrolling practices.
Lead: Claudia Lo, with the Moderator Tools team
Wikistories Early Adopters Research
A project about Wikistory adopters, both creators and readers, focusing on their motivations and experiences creating and sharing stories.
Lead: Eli Asikin-Garmager and Claudia Lo
2022
Momentum: Contribution, Content, & Reading in Wiki Growth
This study tests several key hypotheses: that building awareness of the brand drives more people to read our content; that by growing readership the Movement itself grows as readers are converted into content contributors; that as the community grows so does the amount and diversity of content offered; and that more, and more diverse, content attracts more readers.
Lead: Jim Maddock with Michael Raish, Mikhail Popov, Isaac Johnson, and Margeigh Novotny
IP Editing on Japanese Wikipedia
An inquiry into why some editors don’t log in and how they might be encouraged to do so, through a comparative study of Arabic, Bengali, Spanish, and Japanese Wikipedias.
Lead: Michael Raish and the Growth team
Newcomer Positive Reinforcement
Cross-cultural feedback on approaches to Positive Reinforcement through review of static designs.
Lead: Mary Grace Reich with Michael Raish and the Growth team
Campaigns Event Registration Page User Testing
Usability testing of the campaign event registration feature that enables organizers and participants to register for events on the Wiki platform.
Lead: YUX with the Campaigns Team
Wikimedia Commons User Interviews and Data
A project that combines usage data and in-depth interviews with Commons users to understand the current state of the platform.
Lead: Jeff Howard with Jim Maddock and Margeigh Novotny
The Future of IP-based editing
A project that investigates the difficulties with moderating anonymous edits and how best to mitigate those issues.
Lead: Claudia Lo with the Anti-Harassment Tools team
HabLatAm
A cross-country study on the internet skills, habits, and behaviors of youth in Latin America.
Lead: Ana Chang & Eli Asikin-Garmager, with the Berkman Klein Center for Internet and Society
Section Translation Feedback Survey
Learning from the experiences of editors who used Section Translation during a Bengali Wikipedia article quality improvement competition in 2022.
Lead: Eli Asikin-Garmager with the Language team
Special Search Improvements: Spanish User Testing
When someone searches on Wikipedia but there isn’t an article matching their search term, they arrive at "Special:Search". This project tests new concepts for this feature with Spanish readers.
Lead: Mike Raish, Daisy Chen, Sneha Patel, UserTestingArabic, and Criba Research
Special Search Improvements: Arabic User Testing
When someone searches on Wikipedia but there isn’t an article matching their search term, they arrive at "Special:Search". This project tests new concepts for this feature with Arabic readers.
Lead: Mike Raish, Daisy Chen, Sneha Patel, UserTestingArabic, and Criba Research
Templates and Trust-o-meters: Towards a widely deployable indicator of trust in Wikipedia
This work identifies three key challenges: 1) empirically determining which metrics from community approaches most impact reader trust; 2) validating indicator placements and designs; and 3) demonstrating that such indicators can both lower trust and increase perceived trust in the system when appropriate.
Lead: Andrew Kuznetsov, Margeigh Novotny, Jessica Klein, Diego Saez-Trumper, Aniket Kittur
Readability on Wikipedia
A literature review of the factors that affect text readability on the web.
Lead: Taryn Bipat
Content Moderation in Medium-Sized Wikimedia Projects
This project aims to fill knowledge gaps in our understanding of how editors curate and moderate content on Wikimedia projects outside the largest and most well-researched communities.
Lead: Claudia Lo, with Sam Walton and the Moderator Tools pilot team
Event Organizers: A Study in 4 African Countries
This exploratory research project focused on organizations in four African countries to learn about the particular challenges faced by event organizers and the strategies they have developed to succeed in growing their communities.
Lead: YUX with the Campaigns team
Section Translation Post Improvements Testing
Section Translation brings translation support to mobile device editors, and this project provided usability testing after a number of initial tool improvements and at a time when it was becoming available in a greater number of wikis, including Thai Wikipedia.
Lead: Eli Asikin-Garmager, with Teak Research and the Language team.
Wikifunctions: Usability of Bangla Prototype
Usability testing of the Wikifunctions prototype with Bangla-speaking programmers and developers.
Lead: UserHub with Aishwarya Vardhana and Daisy Chen
Wikistories Africa Research
Research exploring how Wikistories might serve readers and story creators in African contexts.
Lead: Eli Asikin-Garmager, with Qhala Digital Consultancy, the Inuka Team, and Wikistories Team.
Wikistories Indonesia Concept Testing
The goal of this project was to collaborate with Wikimedia Indonesia and gather feedback from Indonesian wiki editor communities on early concepts and designs for Wikistories, focusing on the experience of potential Wikistory creators.
Lead: Eli Asikin-Garmager, with Ari Natarina.
2021
Communications and Mentorship
Understanding the wiki-related communications ecosystem, both on and off Wiki, that embedded community members employ for their Wiki related work.
Lead: Daisy Chen, with the Growth and Editing teams
Section Translation Entry Points
Improving discoverability of translation tools on Wikipedia.
Lead: Eli Asikin-Garmager, with Pau Giner, Anagram Research, and the Language team
Digital Spaces // Topical Neighborhoods
This project combined conversations with WMF employees and global volunteers with a literature review to support the internal conversation around the Digital Spaces // Topical Neighborhoods proposal.
Lead: Michael Raish
Media Matching
Conducted in support of the WMF Structured Data team, Media Matching attempted to identify barriers faced by experienced editors in English, Arabic, and Japanese Wikipedias.
Lead: Michael Raish, with the Structured Data team, Community Ambassadors and Project Kobo
Targets of Harassment
In order to support the Universal Code of Conduct’s phase 2 drafting committee, the Wikimedia Foundation has conducted a research project focused on experiences of harassment on Wikimedia projects.
Lead: Claudia Lo, with the Anti-Harassment Tools team
Wikimail Harassment
An analysis of community initiatives regarding Wikimail abuse over the past five years and general perspectives on harassment.
Lead: Jeff Howard
Long Tail Topic & Influence Ontology
This work provides a method for visualizing and comparing topical coverage from one wiki project to another for the purpose of better understanding where knowledge gaps and subject matter expertise exist across the Free Knowledge Movement.
Understanding Perspectives on Digital Education in African Contexts
Qualitative research examining the perspectives on the future of education from experts, students, teachers, and parents.
Lead: Ana Chang, with AfriqInsights
Verifiability on Wikipedia
This three-language study provides foundational understanding for how editors work with available resources to meet the standards required for article quality.
Lead: Ana Chang, with Sam Walton (WMF), Spiegel Institut, and Wikipedia Library team
Section Translation Usability Testing
Many Wikipedia contributors edit articles via mobile devices. Section Translation brings translation support to mobile device editors, and this project provided usability testing as the tool became available in the first wiki.
Lead: Eli Asikin-Garmager, with the Language team
Table of Contents & Sticky Header
This project tests the concept of a sticky header and the table of contents element as a part of a reader-focused desktop interface improvement initiative by the Web team.
Lead: Daisy Chen, with Criba Research, Urika Research, and Monafina F Silitonga
Taxonomy of Harassment
A taxonomy of negative social interactions on Wiki projects and a methodology to prioritize those in need of intervention.
Lead: Claudia Lo, with the Anti-Harassment Tools team
Wikifunctions, phases 1 & 2
A study that collects feedback on the function-builder prototype from technical users, nontechnical users, and those from outside of the Wikimedia ecosystem.
Lead: Jeff Howard
Decision-making & Influence
A collaboration to map the different types of group decision-making required to manage and improve a crowd-sourced content platform like Wikipedia and understand contributor communities as discrete social groups.
2020
Usability of the Wikipedia KaiOS app
This research evaluates usability and user understanding of the basic app elements and workflows for the Wikipedia KaiOS app.
Lead: Daisy Chen, with Hureo and the Inuka team
Multilingual Editor Experiences in Small Wikis
Supporting multilingual editors in small Wikipedias who are leveraging translation to contribute across knowledge and content gaps.
Lead: Eli Asikin-Garmager, with Anagram Research and the Language team
Patrolling Anonymous Edits
A study of how patrollers handle edits by logged-out users, and what pieces of information from IP addresses are necessary for their work in maintaining the overall health and quality of Wikimedia projects.
Lead: Claudia Lo, with the Anti-Harassment Tools team
CheckUser Improvements
Past research and Steward feedback surfaced many pain points in the CheckUser extension, the tool used by trusted functionaries to investigate sockpuppetry and other account abuse. This project ran multiple iterative rounds of usability tests on the enhanced tool.
Lead: Claudia Lo, with the Anti-Harassment Tools team
Trusted: Signals, Inferences & Indicators
The goal of this work is to develop systems for automatically detecting and characterizing the editorial debates behind Wikipedia articles for the purpose of surfacing indications of trustworthiness to the reader. This project is a collaboration between the Wikimedia Foundation and the CMU Human Computer Interaction Institute.
Researchers are analyzing which types of cues work best to help readers understand the true quality of an article and scraping talk pages for the signals and inferences that can drive these cues.
Lead: Andrew Kuznetsov of the Carnegie Mellon Human Computer Interaction Institute, with Margeigh Novotny
How Do News Organizations Use Wikipedia?
A literature review that seeks to understand more about how and why media and news organizations use Wikipedia.
Lead: Anna Rader
Interpersonal Communication on Wikipedia
A review of the available academic literature on interpersonal communication and coordination within the Wikipedia community, in particular the role of talk pages in creating and maintaining Wikipedia’s content.
Lead: Anna Rader
Content Moderation Explained
An overview of how a global community of volunteer moderators determine which contributions to Wikipedia are accepted and rejected.
Lead: Claudia Lo, with Lucy Blackwell
Machine Translation Meets Human Perception
The Machine Translation Meets Human Perception (MTMHP) study developed a protocol for evaluating how readers perceive and assess machine-translated content across three different languages, cultural contexts, and content domains.
Lead: Michael Raish, with Hureo and Bethany Gerdemann
Android App Suggested Edits
This project addresses user difficulties using and understanding the Contributions screens as part of the Section Editing feature.
Lead: Daisy Chen, with the Android App Team
Free Knowledge Movement Personas
This is a collection of all personas developed to date for desktop and mobile readers, new and veteran contributors, institutional curators and Movement organizers.
CheckUser Workflow & Tools
CheckUser is a critical tool for moderators in anti-vandalism efforts on Wikipedia. This research surfaces the pain points in the CheckUser workflow and opportunities to improve the tool.
Lead: Claudia Lo, with the Anti-Harassment Tools team
Content Translation Newcomer Survey
A customizable survey that can be used to easily, quickly, and reliably collect feedback from a more diverse pool of Content Translation users.
Lead: Eli Asikin-Garmager, with Pau Giner and Amir E. Aharoni
Article Section Translation Study
An evaluation of new designs for tools that help editors translate article sections and receive translation support on mobile devices.
Lead: Eli Asikin-Garmager, with the Language team and Pau Giner
Why Do People Edit Wikipedia?
This literature review examines the drivers of engagement and the motivators of sustained involvement in Wikipedia. It provides an overview of academic research on Wikipedia production and editor motivations, highlighting the themes and queries of scholarship in this area.
Lead: Anna Rader
WMF Remote Worker Experience
This is a lean, inward-facing research project that surfaced the challenges and highlights of working remotely for the Foundation, and offered opportunities for the Foundation and non-remote staff to improve the work lives of its globally distributed staff.
Readers’ Perceptions of Desktop Wikipedia
This study seeks to understand the experience and sentiment of new and casual readers on Wikipedia.
Lead: Daisy Chen, with Hureo and the Readers Web team
2019
Micro-contributions on the Android App
This research studies the factors that motivate and retain Wikipedia editors who use the section editor tool on the Android app.
Lead: Daisy Chen, with the Android team and Robin Schoenbaechler
Wiki Comparison
This initiative, led by Neil Shah-Quinn, established a baseline for comparing key characteristics of 732 wiki projects (number of editors, amount of content, readers, readers by platform, etc.).
Lead: Neil Shah-Quinn
New Page Patrol
Content integrity on Wikipedia relies on editors who patrol articles and respond to vandalism. This research brings to light the workflows of editors who patrol on Wikipedia(s) and the tools used in the course of their work.
Lead: Jonathan Morgan
Community Health Survey
An inquiry into the unique challenges faced by volunteer administrators in conflict resolution situations and what types of resources would be most beneficial to this work.
Lead: Michael Raish, with the Anti-Harassment Tools Team
IP Masking Impact Report
An examination of how edits by users who have not logged in affect administrator workflows.
Lead: Claudia Lo, with the Anti-Harassment Tools Team
Movement Organizers
The Movement Organizer Study sought to understand the life cycle of contributors by documenting paths, practices, challenges, and risks faced by successful organizers throughout the Free Knowledge Movement.
Lead: Abbey Ripstra, The Community Engagement Team, and Ana Chang of Concept Hatchery
Searching on Commons: Structured Data Across Wikimedia
This project tests usability of prototypes and captures users’ expectations and desires in their search experiences on the wikis.
Lead: Daisy Chen, with the Structured Data Team
Ethical & Human-Centered AI
This project identifies challenges and emerging opportunities to leverage AI technologies to further the mission of Wikimedia.
Lead: Jonathan Morgan
Harassment on Arabic Wikipedia
This study aimed to establish a baseline understanding of the frequency of harassment, trolling, and threatening communicative practices on Arabic Wikipedia, to explore the effects of these practices on editing and community health, and to guide future investigation and strategic decisions for the Anti-Harassment Tools team.
Lead: Mike Raish, with the Anti-Harassment Tools Team
Reporting System Rubric for the Community Health Initiative
A review of best practices in reporting systems for disputes and abuse across different platforms.
2018
Steward Spambot Workflow
An investigation into one of the most common tasks performed by stewards, global locks of suspected spambots that are active across multiple wikis.
Lead: Claudia Lo, with the Anti-Harassment Tools team
Augmentation
This is a position piece developed by the Wikimedia Foundation to frame the potential of machine-in-the-loop technologies in making all the world's knowledge available to all, and to ensure the process to assemble that knowledge is inclusive, balanced and safe.
Lead: Wikimedia Foundation, with David Goldberg
Contribution Workflow Taxonomy
This project details the workflow of contributors and assesses the level of difficulty they encounter with each step in the workflow in order to understand what contributes to editors' retention or burn-out.