0733 GMT October 18, 2019
In the last two years, around two dozen people in India have been killed by lynch mobs inflamed by rumors on WhatsApp, the encrypted messaging service owned by Facebook.
WhatsApp has also been fingered for its role in other hateful or unsavory episodes in Brazil and Pakistan, according to theguardian.com.
In each case, the accusation is essentially the same: Disinformation and lies, often of an inflammatory kind, are effortlessly disseminated by WhatsApp and obviously believed by some of the recipients, who are thereby encouraged to do terrible things.
In terms of software architecture and interface design, WhatsApp is a lovely system, which is why it is a favorite of families, not to mention Westminster plotters, who are allegedly addicted to it.
Its USP is that messages on the platform are encrypted end to end, which means that not even Facebook, the app’s owner, can read them. This is either a feature or a bug, depending on your point of view.
If you’re a user, then it’s a feature because it guarantees that your deathless prose is impenetrable to snoopers; if you’re a spook or a cop, then it’s definitely a bug, because you can’t read the damned messages.
A few years ago, WhatsApp added a key new feature — an easy way to forward a message to multiple chat groups at once. One of your friends sends you an interesting (or striking or witty or horrifying or infuriating) video clip and there’s a neat button that enables you — instantly — to send it to other WhatsApp groups to which you belong. One click and it’s done.
This was the feature that was exploited by people spreading disinformation in India and elsewhere and that transformed WhatsApp from being “a simple, secure and reliable way to communicate with friends and family” (as the company’s website lovingly describes it) into an engine for spreading religious and ethnic hatred at lightspeed.
Features such as this — and the ‘retweet’ button on Twitter — are examples of something that networked technology does well: It reduces the ‘transaction costs’ (in terms of time, effort, hassle and expense) involved in doing something.
Amazon’s ‘1-click ordering’ system is an example: See something, click, confirm, done. And of course, in many areas, reducing transaction costs is a very good thing. It makes life easier, enabling us to spend less time doing bureaucracy, filling out forms, posting letters and so on. Just think of how easy it is nowadays to tax your car online: No more searching for insurance and MOT certificates, no need to queue at the post office, etc. Bliss.
What we have learned since social media took hold of society, though, is that there is at least one area — communication — where reducing transaction costs may not be such a good thing. The reason why lies, rumors, conspiracy theories, disinformation and memes sometimes go viral and spread like pandemics is because the transaction cost of passing them on is close to zero.
A large proportion of those who relay fake news, for example, turn out to have read only the headline, and the reasons people retweet may have as much to do with human psychology as with the content being relayed.
They may be ‘virtue signaling’, for example, indicating to their contacts that they are, as it were, on message. There’s good empirical evidence that lies spread faster and farther than truth online.
“Virality,” say the authors of an academic study of how three months’ worth of New York Times articles were shared, “is partially driven by physiological arousal. Content that evokes high-arousal positive (awe) or negative (anger or anxiety) emotions is more viral. Content that evokes low-arousal, or deactivating, emotions (eg sadness) is less viral.”
In June, in response to the horrific evidence of WhatsApp’s role in mob violence in India, where people forward more messages, photos and videos than in any other country in the world, the company ran a test in which it limited to five the number of groups to which a user could relay messages. The objective was to throttle, or at any rate slow, the viral spread of misinformation. WhatsApp is now extending that limit to users everywhere.
It’s a start and a welcome recognition of social responsibility from a company that has hitherto appeared blind to the harm its products can do. But since a WhatsApp group can have up to 256 members, the new limit means that an unscrupulous user — in India or elsewhere — can still broadcast to 1,280 others. That’s enough to spark a sizable mob. But I suppose we should be thankful for small mercies.
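The arithmetic behind that 1,280 figure, and behind the viral spread the limit is meant to slow, can be sketched in a few lines of Python. This is a back-of-the-envelope illustration using only the numbers quoted above (a forwarding limit of five chats and a maximum group size of 256); the function names are mine, and the multi-hop bound deliberately ignores overlap between groups, so it overstates real-world reach.

```python
# Toy fan-out calculation using the figures quoted in the article:
# a forwarding limit of 5 chats and up to 256 members per group.
FORWARD_LIMIT = 5   # maximum chats a message can be forwarded to at once
GROUP_SIZE = 256    # maximum members in a WhatsApp group


def max_direct_reach(forward_limit: int, group_size: int) -> int:
    """Upper bound on recipients from a single forwarding action."""
    return forward_limit * group_size


def max_reach_after_hops(hops: int,
                         forward_limit: int = FORWARD_LIMIT,
                         group_size: int = GROUP_SIZE) -> int:
    """Crude upper bound if every recipient re-forwards at the limit.

    Ignores the (considerable) overlap between real groups, so this
    overstates actual reach; it only shows how fast the bound grows.
    """
    return (forward_limit * group_size) ** hops


print(max_direct_reach(5, 256))   # 1280 recipients in one click
print(max_reach_after_hops(2))    # 1638400 if every recipient re-forwards once
```

Even with the cap in place, the bound grows geometrically with each hop of re-forwarding, which is why a per-user limit slows virality rather than stopping it.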
*John Naughton is a professor of the public understanding of technology at the Open University.