Rise of the bots: Team completes first census of Wikipedia bots
Researchers at Stevens Institute of Technology, in Hoboken, N.J., have now completed the first analysis of all 1,601 of Wikipedia’s bots, using computer algorithms to classify them by function and shed light on the ways that machine intelligences and human users work together to improve and expand the world’s largest digital encyclopedia. The work, published in Proceedings of the ACM on Human-Computer Interaction, could inform the development and use of bots in commercial applications ranging from online customer service to automated microchip design.
“AI is changing the way that we produce knowledge, and Wikipedia is the perfect place to study that,” said Jeffrey Nickerson, a professor in the School of Business at Stevens and one of the study’s authors. “In the future, we’ll all be working alongside AI technologies, and this kind of research will help us shape and mold bots into more effective tools.”
By leveraging Wikipedia’s transparency and detailed record-keeping, Nickerson and his team used automated classification algorithms to map every bot’s functions as part of an interconnected network. By studying where those functions clustered, the team identified bot roles such as “fixers,” which repair broken content or erase vandalism; “connectors,” which link pages and resources together; “protectors,” which police bad behavior; and “advisors,” which suggest new activities and provide helpful tips.
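The paper itself is not reproduced here, but a minimal sketch can illustrate the general flavor of this kind of analysis: build a network linking bots to the functions they perform, project it onto the functions, and look for clusters of functions that tend to travel together. Everything in this sketch is an illustrative assumption, not the authors’ actual method: the bot names and function labels are invented, and networkx community detection stands in for whatever classification algorithms the team used.

```python
# Illustrative sketch only (not the study's code): cluster bot functions
# by co-occurrence to surface candidate "roles." All data is invented.
import networkx as nx
from networkx.algorithms import bipartite, community

# Hypothetical sample: each bot tagged with the functions it performs.
bot_functions = {
    "ExampleFixBot":     ["repair_links", "revert_vandalism"],
    "ExampleLinkBot":    ["interwiki_links", "categorize_pages"],
    "ExamplePatrolBot":  ["revert_vandalism", "warn_users"],
    "ExampleGreeterBot": ["welcome_users", "suggest_tasks"],
}

# Bipartite graph: bots on one side, functions on the other.
G = nx.Graph()
for bot, funcs in bot_functions.items():
    for func in funcs:
        G.add_edge(bot, func)

# Project onto functions: two functions are connected when at least
# one bot performs both of them.
functions = {f for funcs in bot_functions.values() for f in funcs}
F = bipartite.weighted_projected_graph(G, functions)

# Clusters of co-occurring functions hint at roles ("fixer",
# "connector", "protector", "advisor", ...).
for i, cluster in enumerate(community.greedy_modularity_communities(F)):
    print(f"role cluster {i}: {sorted(cluster)}")
```

On real edit-history data, the clusters that emerge from a projection like this would correspond to role labels of the kind the team reports, with each bot assigned to roles according to where its functions fall.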
In total, bots play nine core roles on Wikipedia, accounting for about 10 percent of all activity on the site, and up to 88 percent of activity on related projects such as the Wikidata platform. Most of that activity comes from more than 1,200 fixer-bots, which have collectively made more than 80 million edits to the site. Advisor-bots and protector-bots, by contrast, are less prolific, but they play a vital role in shaping human editors’ interactions with Wikipedia.
New members of online communities are more likely to stick around if they’re welcomed by fellow members — but Nickerson and his team found that new Wikipedia users who interacted with advisor- and protector-bots were significantly more likely to become long-term contributors than those greeted by humans. That remained true even when the bots were contacting users to point out errors or delete their contributions, as long as the bots were cordial and clear about their reasons.
“People don’t mind being criticized by bots, as long as they’re polite about it,” said Nickerson, whose team includes Feng Mai, graduate student Lei (Nico) Zheng and undergraduate students Christopher Albano and Neev Vora. “Wikipedia’s transparency and feedback mechanisms help people to accept bots as legitimate members of the community.”
Over time, some bots fell into obsolescence while others expanded and took on new roles. Studying the evolution of bots, and the ways that human-defined policies shape the bot ecosystem, remains a promising field for future research. “Are we heading for a world with a handful of multipurpose super-bots, or one with lots and lots of more specialized bots? We don’t know yet,” said Nickerson.
One thing is clear, though: Wikipedia’s bots, and the governance and feedback systems that have sprung up around them, offer lessons for commercial bot-builders. “The things we’re seeing on Wikipedia could be a harbinger of things to come in many different industries and professions,” said Nickerson. “By studying Wikipedia, we can prepare for the future, and learn to build AI tools that improve both our productivity and the quality of our work.”