Side-blotched lizards have a colourful set of mating strategies. Some males, with orange throats, are large and bully their way to a harem of mates. Others, with blue throats, are a bit smaller and bond with just a single female. And then there is a third type. These lizards, which have yellow throats, look very similar to females. In a rock-paper-scissors-type mating strategy, the big orange ones can chase off the medium blue ones, who, in turn, can keep a close eye on their single mate. The yellow ones, however, sometimes sneak into the big males’ harems. Biologists affectionately call such lizards “sneaky fuckers”.
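That rock-paper-scissors structure can be captured in a few lines of textbook evolutionary game theory. The sketch below is illustrative only, with invented payoff numbers rather than anything measured in the field: it runs replicator dynamics, in which each morph’s share of the population grows when its expected payoff beats the population average, so no colour stays on top for long.

```python
# A minimal sketch (not the biologists' actual model): replicator
# dynamics on a rock-paper-scissors payoff matrix. Orange beats blue,
# blue beats yellow, yellow beats orange; whichever morph grows too
# common hands an advantage to its counter.
import numpy as np

# payoff[i][j] = payoff to morph i when meeting morph j
# order: orange, blue, yellow (numbers are illustrative assumptions)
payoff = np.array([
    [0.0,  1.0, -1.0],   # orange bullies blue, loses mates to yellow
    [-1.0, 0.0,  1.0],   # blue loses to orange, guards against yellow
    [1.0, -1.0,  0.0],   # yellow sneaks past orange, foiled by blue
])

x = np.array([0.5, 0.3, 0.2])  # initial morph frequencies
dt = 0.01
for step in range(100_000):
    fitness = payoff @ x                    # expected payoff of each morph
    mean_fitness = x @ fitness              # population average
    x += dt * x * (fitness - mean_fitness)  # replicator equation
    x = np.clip(x, 1e-12, None)
    x /= x.sum()                            # keep frequencies summing to one
    if step % 20_000 == 0:
        print(f"orange {x[0]:.2f}  blue {x[1]:.2f}  yellow {x[2]:.2f}")
```

Run it and the three frequencies chase one another in a loop rather than converging: the arithmetic behind the lizards’ uneasy standoff.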
The lizards face a trade-off between abundance and the cost of verification. It is a trade-off that increasingly concerns humans, too, particularly voracious consumers of information. The cost of producing text, video or audio has plummeted; doing so now means a few prompts to a chatbot rather than hours of effort. As a consequence, roughly half of the articles published on the internet last year were AI-generated, according to Graphite, a marketing firm. A less endearing type of sneaky fucker is profiting. Content farms can pump out seemingly real “rage bait”: material that has little basis in reality but is designed to make viewers and readers furious and induce them to keep clicking.
How can internet users distinguish an article by a genuine but little-known news website from an outright fake? Market participants often rely on “costly signals” to make such choices. An employer looking for a new hire, say, may favour applicants with a degree that took effort and a certain amount of skill to obtain, since they might bring the same qualities to their work. The certificate allows employers to distinguish between good and bad applicants, creating what economists call a “separating equilibrium” and enabling transactions between willing participants. In the animal world, a peacock’s tail performs a similar role. A male lugging around a fan of feathers has to be fit to avoid predators.
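As a toy illustration of that logic (the numbers below are invented, not drawn from any labour-market study), consider two types of applicant facing the same degree but different costs of obtaining it:

```python
# A toy version of the job-market signalling story the paragraph
# describes. A degree is a costly signal: cheaper for able applicants
# to acquire than for the rest. Separation holds when only the able
# find the signal worth its cost.
PRODUCTIVE_WAGE, BASE_WAGE = 100, 50   # wages with / without the degree
COST = {"able": 30, "less_able": 60}   # effort cost of the degree, by type

for worker, cost in COST.items():
    worth_it = PRODUCTIVE_WAGE - cost > BASE_WAGE  # is the degree worth it?
    print(f"{worker}: acquires degree -> {worth_it}")
```

Because only able applicants find the signal worth its cost, the certificate carries information. If technology made the signal costless to fake, both lines would print True and the signal would die.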
During an earlier information age, newspaper branding provided a costly signal. In what was, in game-theoretic terms, a “reputation game”, newspapers would, issue by issue, reveal exactly how focused on the truth they were. Repeated interactions with readers, as they decided which paper to purchase, sustained a separating equilibrium. Content farms, however, can mimic these brands. Indeed, the term “fake news” originally described websites that copied reputable news pages but ran made-up stories. Are you reading this column on Economist.com or Economist.co? Or perhaps you are seeing only a snippet on social media, shorn of context and without a link to its source.
Other, similar signals have also been spoiled by AI. Lucid writing, realistic images and polished audio once helped readers distinguish between news designed to inform and news indifferent or hostile to the truth. A well-crafted analogy, perhaps featuring side-blotched lizards, might have told the reader that care and effort had gone into producing an article. For a time, rules of thumb helped tell AI output from the authentic: by counting the fingers on people in an image, or checking writing for em-dashes, consumers of information could gauge how likely the material was to have been machine-generated. The rising frequency of the word “delve” in academic articles and of “it’s worth noting that” on internet pages showed that more of both were being written by AI. But as chatbots have improved, such rules of thumb have become less useful.
Thus the separating equilibrium is devolving into a “pooling equilibrium”, in which there is no way to tell one type of creation from another. The problem with a pooling equilibrium, as George Akerlof of Georgetown University put it in a Nobel-prizewinning paper, is that “dishonest dealings tend to drive honest dealings out of the market”. Without a way to tell good from bad, buyers treat everything as bad. If careful drivers, for example, must pay the same insurance premiums as the truly risky, some may decide not to bother buying a car at all; if enough do so, insurers will start to quit the market. In the separating-information economy, true and false information are easy to distinguish; in the pooling one, doing so is impossible. In which case, why pay for any news? And if no one will pay for it, why produce it?
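Akerlof’s unravelling can be sketched numerically. In the toy market below (all numbers invented), readers pay a fixed markup on the average quality of whatever is on offer; outlets whose product costs more to make than the going price drop out, dragging the average, and the price, down with them:

```python
# A compact sketch of Akerlof's "lemons" unravelling, applied to news
# outlets rather than used cars; the numbers are illustrative.
import random

random.seed(1)
# each outlet's quality, and hence its cost of production, in [0, 1]
outlets = [random.random() for _ in range(10_000)]
MARKUP = 1.2  # readers value quality q at 1.2 * q

price = 1.0
for round_ in range(8):
    remaining = [q for q in outlets if q <= price]  # who can afford to stay
    if not remaining:
        print(f"round {round_}: market has unravelled entirely")
        break
    avg_quality = sum(remaining) / len(remaining)
    price = MARKUP * avg_quality  # readers pay for average quality only
    print(f"round {round_}: {len(remaining)} outlets left, price {price:.2f}")
```

Each round the price falls by nearly half, because the outlets priced out are always the best remaining ones. The market shrinks towards zero even though readers would happily pay for high-quality journalism if only they could identify it.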
New costly signals are therefore required. One option is to borrow the “chain of custody” techniques used to ensure evidence in a criminal trial is properly handled. This would involve embedding information in a video that flags when it was created and with what tools. If readers genuinely do care about authenticity and accuracy, they might flock to the platforms that handle such provenance data best and display it most prominently. Or perhaps print journalism will enjoy a revival. It has a chain of custody built in: investing in printing presses, distribution networks and so on tells a reader that the institution intends to be around for a while and to trade on its reputation. A content farm would struggle to get a fake magazine into a postman’s hands. Teachers are already setting more hand-written assessments, a kind of “proof of work” verifying that the student in question truly produced the exam script.
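What might such embedded information look like? The sketch below is a bare-bones stand-in for real provenance standards such as C2PA, with the tool names and manifest format invented for illustration: each editing step appends a record naming the tool and the time, bound to the content and to the previous record by a hash, so any later tampering breaks the chain.

```python
# A stripped-down sketch of the "chain of custody" idea: a hash-linked
# manifest recording when a file was created or edited, and with what.
import hashlib, json, time

def record(content: bytes, tool: str, prev_hash: str) -> dict:
    """Append-only provenance record bound to content and history."""
    body = {
        "tool": tool,
        "created": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "prev": prev_hash,
    }
    body["record_sha256"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return body

def verify(content: bytes, chain: list[dict]) -> bool:
    """Recompute every link; any edit to content or history breaks it."""
    prev = "genesis"
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "record_sha256"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if rec["prev"] != prev or rec["record_sha256"] != recomputed:
            return False
        prev = rec["record_sha256"]
    return chain[-1]["content_sha256"] == hashlib.sha256(content).hexdigest()

video = b"raw footage bytes"
chain = [record(video, "NewsCam 2.1", "genesis")]        # capture
edited = video + b" + colour grade"
chain.append(record(edited, "CutRoom 5.0", chain[0]["record_sha256"]))
print(verify(edited, chain))       # True: history is intact
print(verify(b"doctored", chain))  # False: content no longer matches
```

A real scheme would also sign each record with the toolmaker’s private key; bare hashes prove only that the history is internally consistent, not that it originated where it claims.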
Among side-blotched lizards, sneaky fuckers have survived for millions of years. The rock-paper-scissors nature of the lizards’ mating strategies means that if any of the colours becomes too common, one of the other types receives an advantage. That keeps the whole population in a competitive, if often brutal, harmony. Optimistically, then, a surge in the productivity of misinformation generators may provide a similar advantage to the journalistic equivalent of the blue-throated lizards—publications that jealously guard their reputation for truth-telling. ■