We know from the latest Facebook whistleblower that there are plenty of governments and others hard at work telling us lies or misleading half-truths. Many of them don’t even care that it’s obvious that the thousands of fans posting “likes” don’t actually exist.
We are getting wise to that. We are even learning more about how a dangerous mass-superstition spread like wildfire over the Internet: the QAnon conspiracy theory.
And all that poisonous misinformation about Covid? Light is being shone on that too. Veteran digital literacy expert Michael Caulfield, or as he styles himself “Director of Blended and Networked Learning”, has developed a useful guide to spotting Covid nonsense in 30 seconds.
CYBER-NONSENSE DECIPHERED
But what about the overall picture of the more-and-more sinister cyber-playground?
El País in Madrid has just published an article under the title “Lessons in fighting misinformation: Less critical thinking, more knowing what you’re looking at” about Caulfield’s four-step method for identifying reliable sources; it is now being used by more than 100 US universities.
In their first year, Caulfield, 51, explains, students are often given a few classes on how to search for information and check sources. But with the explosion of the Internet over the past 10 years it is now clear that what students thought were reliable sources were frequently nothing of the sort.
GRAB YOUR ATTENTION THEN DON’T LET GO
The problem is not so much the lies that are circulating, but how people are seduced into paying attention to them, then pulled down endless, confusing rabbit-holes, each more seductive than the one before.
A once-scarce commodity like information has suddenly become almost infinite. You no longer have to “pay attention” to understand something. No, millions of Web pages, videos or infographics are clamouring for your attention. Lies and exaggerations à la Trump are only two ways of doing that.
Caulfield’s four-step method is aimed at changing the way young people search for information. He calls it SIFT – see below – and it is now being taught at more than 100 American universities and dozens of middle schools.
1) DON’T start with critical thinking
The first target is misunderstood critical thinking, he told El País in a Zoom conversation from Vancouver, the campus of his university in the US North-West.
“Traditional critical thinking doesn’t work,” said Caulfield. “We ask students to take a document or a photo or some data and we tell them the most direct way to the truth is to look at it very carefully, to immerse themselves in it.”
But that approach fails. It’s not that close analysis is wrong in itself. It’s just not enough: “You have to know what you’re looking at first.”
Once something or someone has grabbed your attention, making the effort to analyse it gives the attention-grabbers a head-start: “A racist or anti-vaccine website aims to make you doubt what you have heard so far. Its effectiveness lies above all in the fact that the reader doesn’t know (and isn’t told) that this website, apparently serious, is in fact the work of a Nazi or a collective promoting homeopathy.”
2) SIFTing, a four-step method
Caulfield’s method is to avoid wasting critical thinking on unreliable sites. He calls it SIFT:
– Stop reading.
– Investigate the source.
– Find better coverage from more reliable sources.
– Trace photos, videos, and quotes back to their original context.
The method is based on a very simple resource: before reading down the page vertically, open another tab and investigate the source horizontally. In Spain, the information verification platform Verificat has proved useful for this.
The concept of “lateral reading” was devised by Sam Wineburg, a Stanford University professor on whose work Caulfield based his method (and who has developed a free-of-charge course in digital literacy).
Wineburg and a group of professors recently published a scientific paper about an experiment in lateral reading. Before explaining anything to them, they asked a group of 87 young people to judge whether a page was credible or not. Only three of them opened another tab to check the authors’ funding or CV.
LATERAL READING, LATERAL THINKING
The rest tried to unravel the answer by analysing the page using traditional critical thinking. Their methods were almost random, with no real basis: is it a .com or .org domain? Does it have a lot of links? Does it have a lot of ads? What does it say in “About Us”?
After four sessions on lateral reading, 67 of the 87 searched for information outside that page. In the first round, only two of the three who had read laterally knew that the information was of dubious credibility and could say why. The other 84 focused “exclusively on features that were irrelevant or could be manipulated” by the authors. After the sessions, 36 of the young people had discovered the suspect funding of the site.
3) Doubts lead to cynicism
Caulfield’s method is to focus the questions and so reduce the time it takes to get something resembling an answer. “People get so many opposing points of view that to know whether something is true or false they have to look at it in depth, analyse the data, download an Excel sheet,” he says.
“They feel that discovering the truth will be an arduous journey, so they throw up their hands and say ‘who knows?’,” and from there on they become cynical. “The risk then is to believe that no-one is telling the truth, that everyone is equally lying,” he argues. So his aim is to eliminate some of that cynicism.
Caulfield’s SIFT method has been demonstrably effective in both conservative and progressive areas. He has come to the conclusion that fewer people post deliberate misinformation than we think: “We overestimate the number of people like this because the people who shout the loudest on the Internet are the most engaged, the ones who post on Facebook 15 times a day. The impression may be that there are lots of people like that. But there aren’t.”
4) The return of context
The Internet has removed the context from a lot of information. Before, it was clear where things came from: an encyclopaedia, a news report, a neighbour. Everyone gave it the weight they thought it had. Now the confusion is extraordinary. A partisan blog can resemble traditional media, a biased dictionary copies Wikipedia, or an anti-vaccine post imitates the language of a scientific article.
Thanks to our traditional training, we give weight to information that comes to us in an apparently serious way. That is no longer enough.
Caulfield’s method focuses on the users. But platforms also have a responsibility. On a mobile phone it’s harder to open another browser tab and type in a new search. “We’ve been trying to persuade the developers of these apps to make the process easier. WhatsApp is experimenting with incorporating some tools into their messages and they may end up doing this,” he says.
He believes teaching people about misinformation should be like teaching road safety: using your indicator or recognising road signs doesn’t mean companies and authorities shouldn’t play their part.
“Car manufacturers include seat belts, airbags, or collision detectors and the people who plan and build roads look for ways to protect pedestrians and cyclists. Everyone has to work together. You can’t solve the problem just by asking people to be better or changing platforms. You have to make a safer ecosystem while teaching better tools.”
Javier Jimenez-Moratalla, the AEJ’s Brussels Special Representative, attended a meeting in the European Parliament a few days ago with Nandini Jammi of Check My Ads.
She insisted on the need to hold advertising platforms to account.
DISINFORMATION INDEX
At the same meeting, Ghita Harris-Newton from Google said the company was “fighting disinformation and foreign interference” and trying to ensure “a safe and secure environment for all our users”. She announced the opening of a new Google Safety Engineering Centre in Dublin focussed on content responsibility and on making things more transparent for regulators and policymakers across the EU.
FURTHER LINKS
Harvard guide to “lateral reading”