Wikipedia adopts new AI strategy to ease workloads of human editors, volunteer staff



Wikipedia has announced plans that are focused on integrating generative AI into the workflows of its human editors rather than replacing them with it.

The Wikimedia Foundation, the non-profit that runs the ubiquitous online encyclopedia, said its newly adopted AI strategy aims to reduce the workloads of its unpaid volunteer team of moderators, editors, and patrollers so that they can focus more on quality control.

The move comes as other platforms such as Duolingo and Shopify are increasingly betting on AI to do work that human workers used to handle, fueling broader concerns about the impact of AI on job roles across industries.


However, the Wikimedia Foundation ruled out replacing the team of volunteers behind Wikipedia with AI.

“For nearly 25 years, Wikipedia editors have researched, deliberated, discussed, built consensus, and collaboratively written the largest encyclopedia humankind has ever seen. Their care and commitment to reliable encyclopedic knowledge is something AI cannot replace,” the organisation said in a blog post published on Wednesday, April 30.


“We will take a human-centered approach and will prioritize human agency; we will prioritize using open-source or open-weight AI; we will prioritize transparency; and we will take a nuanced approach to multilinguality,” it added.

Wikimedia said it will not use AI tools to generate content. Instead, AI will be used to remove “technical barriers” faced by its human volunteers and to automate “tedious tasks” such as background research, translation, and onboarding new volunteers.


AI will also be used to help Wikipedia’s editors find information more easily within the platform’s database, as per the group’s statement.

This is not the first time that Wikimedia has embraced AI. It already relies on AI to detect vandalism, translate content, and predict the readability of content uploaded to the site.

In April this year, Wikimedia announced it was building an open dataset of “structured Wikipedia content” specifically optimised to be scraped by bots for AI training purposes. The move was aimed at easing the strain on Wikipedia’s servers from relentless bot scraping of the original Wikipedia site, which had driven a 50 per cent rise in bandwidth consumption.

© IE Online Media Services Pvt Ltd
