The Internet changed our information consumption habits. Information once came to us on a schedule, through newspaper print editions, radio programs, and television broadcasts; now it arrives around the clock. The expectation of instant information carries risks, chief among them misinformation and disinformation. Telling the two apart challenges internet users because many content websites appear legitimate. Wikipedia has earned the public's trust, yet it remains prey to mis-/dis-information. For media and brand teams, learning Wikipedia's tools for defending against mis-/dis-information aids the strategic planning and management of media relations and brand management programs, and helps keep the company's Wikipedia article an accurate, balanced telling of the company story.
When it comes to separating fact from fiction, it helps to know what to look for. First Draft, a non-profit organization advocating for quality and ethical journalism, identified seven types of mis-/dis-information:
Fabricated content – new content that is 100% false, designed to deceive and do harm.
Manipulated content – when genuine information or imagery is manipulated to deceive.
Imposter content – when genuine sources are impersonated.
False context – when genuine content is shared with false contextual information.
Misleading content – misleading use of information to frame an issue or individual.
False connection – when headlines, visuals, or captions don’t support the content.
Satire or parody – no intention to cause harm but has potential to fool.
Mis-/dis-information on a company Wikipedia article is referred to as vandalism, and vandalism comes in different forms. For example, it is not uncommon to see companies planting misinformation on their competitors' Wikipedia articles. The example below shows the edit history of a well-known auto parts retailer's Wikipedia article edited by a competitor.
[Screenshot: Wikipedia edit-history notification of an IP-address edit, with the IP traced to Advance Auto Parts]
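Anonymous edits like the one above are traceable because the MediaWiki API attributes them to an IP address rather than a username. As an illustration (not part of the article's guidance), the sketch below fetches an article's recent revisions and filters for anonymous edits; the article title and limit are placeholders you would substitute for your own company page.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"

def ip_edits(revisions):
    """Filter a list of revision dicts down to anonymous (IP) edits.

    In MediaWiki API responses, revisions made without an account
    carry an "anon" key, and "user" holds the editor's IP address.
    """
    return [r for r in revisions if "anon" in r]

def recent_revisions(title, limit=50):
    """Fetch an article's recent revisions from the MediaWiki API."""
    params = urlencode({
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",
        "rvlimit": limit,
        "format": "json",
    })
    with urlopen(f"{API}?{params}") as resp:
        data = json.load(resp)
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

# Example: ip_edits(recent_revisions("AutoZone")) would surface
# anonymous edits worth reviewing for possible vandalism.
```

Running this periodically gives a brand team an early warning when unexplained anonymous edits land on the company article.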
Company Wikipedia articles are easy vandalism targets. Following the 2008 financial crisis, for instance, the AIG and Assurant Wikipedia articles were continually vandalized. Both companies refrained from directly editing their articles; instead, they took an ethical approach, curating secondary sources and then petitioning for balance against the overtly negative content.
The recommended best practice for ethically controlling mis-/dis-information on a company Wikipedia article is:
1. Pinpoint the source of the mis-/dis-information: scroll down to the References section to identify it.
2. Click the reference URL to visit the website.
3. Confirm whether the source is on the Perennial Sources list. If it is on the list, the offending statement cannot be removed. If it is not on the list, proceed to Step 5.
4. Do not directly edit the company Wikipedia article.
5. Submit an Edit Request to replace the content.
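Steps 1–3 above can be partly automated. As a minimal sketch (not a definitive tool), the code below pulls the external links cited on an article via the MediaWiki API and checks their domains against a flagged set. The `FLAGGED_DOMAINS` entries are hypothetical placeholders; the real Perennial Sources list lives on Wikipedia and would need to be maintained by hand or scraped separately.

```python
import json
from urllib.parse import urlencode, urlparse
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"

# Hypothetical stand-ins for domains flagged on Wikipedia's
# Perennial Sources list (illustrative only).
FLAGGED_DOMAINS = {"example-tabloid.com", "example-rumor-blog.net"}

def external_links(title):
    """Fetch the external links cited on a Wikipedia article."""
    params = urlencode({
        "action": "query",
        "prop": "extlinks",
        "titles": title,
        "ellimit": "max",
        "format": "json",
    })
    with urlopen(f"{API}?{params}") as resp:
        data = json.load(resp)
    links = []
    for page in data["query"]["pages"].values():
        for link in page.get("extlinks", []):
            links.append(link["*"])
    return links

def flag_sources(links):
    """Return the links whose domain appears in the flagged set."""
    return [
        link for link in links
        if urlparse(link).netloc.removeprefix("www.") in FLAGGED_DOMAINS
    ]
```

A team would then review the flagged links manually (Steps 2–3) before deciding whether an Edit Request is warranted.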
Edit request response times range from a few weeks to a few months, because corrections depend on the availability and interest of Wikipedia's volunteer editors. About 77% of Wikipedia's content is written by just 1% of its roughly 133,000 volunteer editors.
A mis-/dis-information management protocol is one component of a Wikipedia program for managing the company brand story and brand messaging. Continuous engagement in the Edit Request system nurtures relationships with Wikipedians, who become unofficial brand ambassadors reacting to vandalism on the company's article. When mis-/dis-information appears, they can remove it promptly, keeping the article balanced and accurate.