Thanks for sharing. I was able to read the entire article through a college database where my husband is an adjunct.
The fact checkers use methods similar to those Mike Caulfield suggests for students. If you haven't read his free online book, it's a great read:
https://webliteracy.pressbooks.com/


Juliann T. Moskowitz
Director of Library Media
St. Joseph High School
Trumbull, CT 06611
juliann14@hotmail.com

"Google can bring you back 100,000 answers. A librarian can bring you back the right one."

-- Neil Gaiman, author

From: CASL-L <casl-l-bounces@mylist.net> on behalf of Theresa Welch via CASL-L <casl-l@mylist.net>
Sent: Tuesday, January 14, 2020 9:24 AM
To: casl-l@mylist.net <casl-l@mylist.net>
Subject: [CASL-L] Follow up to CRAAP question
 
From the Marshall Memo (highlight added)

6. Who Is Best at Spotting Junk on the Internet? 

“Technology can do many things, but it can’t teach discernment,” say Sam Wineburg (Stanford University) and Sarah McGrew (University of Maryland) in this Teachers College Record article. “The Internet has democratized access to information but in so doing has opened the floodgates to misinformation, fake news, and rank propaganda masquerading as dispassionate analysis.” Wineburg and McGrew share the results of their study of three different groups’ ability to critically examine online material, how long they took, and the strategies they used. They watched 25 Stanford undergraduates, 10 Ph.D. historians, and 10 professional fact checkers as they looked at online material on bullying in schools, minimum wage policy, and teacher tenure.

Who did best? It wasn’t even close. “Only two of the 10 historians adroitly evaluated digital information,” say Wineburg and McGrew. “Others were often indistinguishable from college students. Both groups fell victim to the same digital ruses.” Only 20 percent of the undergraduates were able to identify the most reliable website for one of the issues, compared with only 50 percent of the historians – and 100 percent of the fact checkers. The amount of time needed to find a relevant source for another issue was 318 seconds for the students, 220 seconds for the historians, and 51 seconds for the fact checkers. 

The fact checkers did significantly better because they used two specific techniques that are eminently teachable: 

-   Taking bearings – “Before diving deeply into unfamiliar content, chart a plan for moving forward,” say Wineburg and McGrew. “Taking bearings is what sailors, aviators, and hikers do to plot their course toward a desired destination.” 

-   Lateral reading – This means immediately leaving the website being examined and opening new tabs along the browser’s horizontal axis, drawing on the resources of the Internet to learn more about the site in question and its claims. 

It’s interesting that these approaches are quite different from the Common Core skill of close reading: “read and reread deliberately” in order to “reflect on the meanings of individual words and sentences.” When looking critically at Internet content, quite a different approach is needed. “Instead of closely reading or ticking off elements on a list,” say Wineburg and McGrew, “[the fact] checkers ignored massive amounts of irrelevant (or less crucial) text in order to make informed judgments about the trustworthiness of digital information. In short, fact checkers read less but learned more.” 

 

“Lateral Reading and the Nature of Expertise: Reading Less and Learning More When Evaluating Digital Information” by Sam Wineburg and Sarah McGrew in Teachers College Record, November 2019 (Vol. 121, #11, pp. 1-40), available for purchase at https://bit.ly/2RbxpbL; Wineburg can be reached at wineburg@stanford.edu, McGrew at mcgrew@umd.edu