We love Caltech graduate student Virgil Griffith's Wikipedia Scanner, a tool that has revealed to the public what we've always known: that people working at corporations, government agencies, and mass media outlets are duplicitous bastards. For instance, a State Farm employee buried commentary on its Katrina policy, a Fox News reporter erased an aggravated battery charge, and someone at the Israeli Embassy sorted out the Palestinian-Israeli conflict to his satisfaction. It's certainly good gossip to learn who's blotted out petty grievances, but you have to know what you're looking for. And therein lies the problem with Wikipedia Scanner.

You likely already suspect the worst from, say, Wal-Mart and just want to know what its employees may have sanitized, but you never question the page's other contributors, who may or may not be trustworthy. That's what persnickety researchers call observational bias. Which is why University of California, Santa Cruz professor Luca de Alfaro is leading the Wiki Lab. Its latest project gauges the trustworthiness of Wikipedia authors: it tracks every passage they submit and checks whether subsequent editors let it stand, reworked it, or deleted it outright. Trusted text displays normally; the more a passage has been rewritten, the more heavily the tool highlights it in shades of orange. Right now, trust coloring is just a demo, but hopefully it will dissuade gullible schoolchildren (a category in which we include most journalists) from using Wikipedia as a primary source.
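For the curious, here's a minimal sketch of how that kind of survival-based scoring might work. This is our own back-of-the-envelope illustration, not de Alfaro's actual algorithm; every name and threshold below is hypothetical.

```python
# Hypothetical sketch of survival-based trust coloring, loosely modeled on
# the idea described above. Not the Wiki Lab's real implementation.

from dataclasses import dataclass


@dataclass
class Fragment:
    text: str
    author: str
    revisions_survived: int = 0  # later revisions that left this text alone
    times_edited: int = 0        # later revisions that altered or reverted it


def trust_score(frag: Fragment) -> float:
    """Crude heuristic: text that survives many revisions untouched earns
    trust; text that keeps getting rewritten loses it. Returns 0.0 to 1.0."""
    total = frag.revisions_survived + frag.times_edited
    if total == 0:
        return 0.0  # brand-new text starts out untrusted
    return frag.revisions_survived / total


def highlight(frag: Fragment) -> str:
    """Map a trust score to a display style: darker orange = less trusted."""
    score = trust_score(frag)
    if score >= 0.8:
        return frag.text  # trusted text renders normally
    shade = "light orange" if score >= 0.4 else "dark orange"
    return f"[{shade}] {frag.text}"


if __name__ == "__main__":
    stable = Fragment("Katrina made landfall in 2005.", "editor_a",
                      revisions_survived=40, times_edited=2)
    contested = Fragment("The policy was entirely fair.", "pr_flack",
                         revisions_survived=1, times_edited=9)
    print(highlight(stable))     # survives scrutiny, shown plain
    print(highlight(contested))  # constantly rewritten, flagged dark orange
```

The point of the heuristic, and of the real tool, is the same: you don't have to know whose edits to distrust in advance, because the editing history itself tells on the shifty passages.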