Battle of the (Wikipedia) bots

Well-intentioned programmers have made bots to automate Wikipedia editing. Yet one person's treasure is another person's trash, resulting in bots clashing and consistently undoing each other's work.

Chances are you've dealt with a bot in your daily online activities. Depending on your age, it may have been an IRC bot that welcomed you to a chatroom. Today, it may be a Twitter bot that follows you to lure you into following its master; an email autoresponder or so-called live chat on a website; or the plethora of spam you delete daily. Perhaps you even received one of the 20 million messages sent by 70,000 "female" bots to trick users of the Ashley Madison website. Or your interaction with a bot today may have been the changes made to the most recent Wikipedia article you've read.

It's bad enough that many human editors seem bizarrely committed to Wikipedia's destruction, rushing to speedily delete new content; now researchers have found that Wikipedia bots are fighting over what their respective owners believe to be the "right" version of an article.

The paper, Even good bots fight: The case of Wikipedia, analyses the interactions between bots that edit Wikipedia articles, tracking the extent to which they undid each other's edits over the period 2001-2010. It also models how pairs of bots interact over time and identifies different types of interaction trajectories.

The researchers found Wikipedia bots, while well-intentioned to support the encyclopaedia, often undid each other's work, with some fights continuing for years.

Bots, being computer programs, have no capacity for emotion and are predictable automatons. Wikipedia's bots are intended to handle the repetitive and mundane tasks that maintain the encyclopaedia, and they are expected to follow Wikipedia's official bot policy, which requires, among other things, that bot accounts be explicitly flagged and approved.

For the most part, Wikipedia bots benefit the system. In 2014, about 15% of edits were performed by bots, identifying and undoing vandalism, enforcing bans, checking spelling, welcoming newcomers and so forth.

Yet, not all bots operate harmoniously. The researchers focused on edits and specifically reversions, where an edit had been undone.

Their exploration uncovered that, on average, a bot on the English-language Wikipedia reverted the work of another bot 105 times over the study period.
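To make the idea of a bot-on-bot revert concrete, here is a minimal sketch of how reverts can be detected in a page's revision history by content-hash matching: a revert restores the page to an earlier state, so the reverting revision's hash equals that of some previous revision. The revision format and field layout here are assumptions for illustration, not the researchers' actual code.

```python
def detect_reverts(revisions):
    """revisions: list of (editor, content_hash) tuples in chronological order.
    Returns a list of (reverter, reverted_editor) pairs."""
    seen = {}    # content_hash -> index of earliest revision with that content
    pairs = []
    for i, (editor, content_hash) in enumerate(revisions):
        if content_hash in seen:
            j = seen[content_hash]
            # Every edit between the restored state and this one was undone
            for reverted_editor, _ in revisions[j + 1:i]:
                if reverted_editor != editor:
                    pairs.append((editor, reverted_editor))
        else:
            seen[content_hash] = i
    return pairs

# Hypothetical three-revision history: BotB changes the page, BotA restores it
history = [
    ("BotA", "h1"),
    ("BotB", "h2"),
    ("BotA", "h1"),   # page content is back to state h1: a revert of BotB
]
print(detect_reverts(history))  # [('BotA', 'BotB')]
```

Counting such pairs per bot, article by article, is one straightforward way to arrive at revert statistics like the average quoted above.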

These bot conflicts could extend for years, with an average response time of about a month, likely due to the combination of the time needed to re-crawl articles and Wikipedia's constraints on bot activity frequency.

Ultimately, the research showed that bots, despite their predictability, interact as unpredictably and as inefficiently as humans.

The researchers suggest that even relatively "dumb" bots, therefore, have complex interactions, and that there are potential lessons for artificial intelligence, as well as for automated social media management, cyber-security and autonomous vehicles.

David M Williams

David has been computing since 1984 where he instantly gravitated to the family Commodore 64. He completed a Bachelor of Computer Science degree from 1990 to 1992, commencing full-time employment as a systems analyst at the end of that year. Within two years, he returned to his alma mater, the University of Newcastle, as a UNIX systems manager. This was a crucial time for UNIX at the University with the advent of the World-Wide-Web and the decline of VMS. David moved on to a brief stint in consulting, before returning to the University as IT Manager in 1998. In 2001, he joined an international software company as Asia-Pacific troubleshooter, specialising in AIX, HP/UX, Solaris and database systems. Settling down in Newcastle, David then found niche roles delivering hard-core tech to the recruitment industry and presently is the Chief Information Officer for a national resources company where he particularly specialises in mergers and acquisitions and enterprise applications.
