Chapter 2. How Fake News Spreads: Word of Mouth
News has always been disseminated by word of mouth. Early humans lived in small groups, moving from place to place as needs required. As the human population grew, there was greater need for communication. Contact between groups became more common, and the connections between groups became more complex.1 News was still spread by word of mouth, but there was more to tell. There were, of course, subsistence details to convey, but there was also family news to share, gossip to pass on, fashion trends to consider, and theological questions to answer. There were few means to verify news that came from outside the local group. If a traveler arrived from a distance and said that the people in the next large town were wearing silk rather than skins, there was no way to verify this information without visiting the distant place in person.
Presumably as people came to view local resources as belonging to the group, there might have been incentive to mislead outsiders about the size of the population protecting those resources or to understate the quality or amount of resources. If a resource was scarce or valuable, there might be reason to provide misinformation. However, because news was oral, there is no record. We can’t know exactly what was said.
Written Word
At about the same time that populations became sedentary and began to grow, groups began to create tools that would allow them to tell a story, keep track of numbers, give directions, and so on. In the Middle East, farmers, landowners, politicians, and family historians began to invent the means to keep track of, remember, and convey information.2 Some groups used pictures, some used counting devices, and eventually systems of writing were born. Written information posed its own set of problems.
First, there is the problem of writing material. Some people used stone for a writing surface.3 Marking stone takes a lot of time and effort. The result is permanent, but it is hard to carry around. Some groups used clay as a writing surface.4 This is a terrific material to use if you want to make your information permanent. Mark the clay, fire it, and the information is available for a long period of time. The downside of clay is that it is relatively heavy, it takes up a lot of room, and it breaks easily. This makes it somewhat difficult to transport. The Egyptians used papyrus (labor intensive and expensive).5 Native Americans used tree bark (delicate and easily damaged).6 People with herds of animals used animal skins to make parchment and vellum (not always available when required, lots of preparation needed).7 The Incas used knotted cords called quipus that acted as mnemonic devices as well as counting devices.8
Second, not everyone inside a group, let alone in neighboring groups, knew the secret of how to interpret the writing. If knowledge is power, knowing how to read allowed people to assume the reins of power and to limit access to information, thus controlling what people did or did not know. This control made people dependent on those who knew the secret. As we saw above, some people did not hesitate to offer fake news that served their own purposes by manipulating or influencing those who could not read.
While the elite used systems of writing, the nonliterate members of the group would have continued to use word-of-mouth transmission of information. Information was conveyed from those in power by proclamation. A representative of the leader would be sent to read out a message to those who could not read but who had a need to know. Again there was no guarantee that the information being read was written truthfully, nor that it was read accurately to the nonliterate public. What people knew in the early stages of literacy was controlled by the literate.
Different writing systems required translators to convey information between groups. Here again, the honesty and accuracy of the translation had a large effect on the exact information that people received. The same is true today. We often see articles that essentially “translate” information from highly technical and specialized fields into information most people can understand. The translator’s motives can influence what is reported and what language is used to report it. In the Wild West of the internet world, it’s hard to know what a translator’s motives are without spending an inordinate amount of time checking out the author’s credentials.
Printed Media
As more people became literate, it became harder to control information. More information appeared in printed form. More kinds of information were shared.9 Printed information was carried from place to place, and as new and faster means of transportation became available, people got news faster and more often. As means of spreading news widely and quickly, without intervention or translation, became more common, it was harder to control the messages people saw and heard. Newspapers, magazines, telegraph, and eventually radio, television, and the internet provided multiple avenues to transmit information without necessarily getting permission from the state or other power holder. As new media inventions became viable, they were used to share the news and other information, creating a wide range of options for news seekers.
Internet
With the birth and spread of the internet, it was thought that a truly democratic and honest means of sharing information had arrived. Control of the content accessible via the internet is difficult (but not impossible), making former information power holders less powerful. Anyone with access and a desire to share their thoughts could use the internet to do so. At first the technological requirements for creating a web page were beyond most individuals, but companies that saw a market built software that allowed “non-programmers” to create a web page without any knowledge of the computer code that was actually responsible for transmitting the message.
Information can now come from anywhere and at any time. Literally billions of actors can participate in the spread of information. The rate of flow of information and the sheer volume of information are overwhelming and exhausting. The democratization of information allows everyone and anyone to participate and includes information from bad actors, biased viewpoints, and ignorant or uninformed opinion, all coming at internet users with the velocity of a fire hose. The glut of information is akin to having no information at all, as true information looks exactly like untrue, biased, and satirical information.
Added to the overwhelming amount of information available today is the impossibility for anyone to know something about everything. The details about how things work or what makes them function are beyond most individuals. What makes a cellphone work? What happens when you store something “in the cloud”? How does a hybrid car engine know which part of the engine to use when? What is the statistical margin of error, and how does it affect polls? Are vaccines harmful? Did the Holocaust really happen? Arthur C. Clarke’s Third Law states, “Any sufficiently advanced technology is indistinguishable from magic.”10 What this means in terms of fake news is that people are vulnerable to being misinformed because, in a world where all things seem possible, they have little or no basis for separating truth from fiction. It’s hard to find a trusted source, so either all sources must be treated as trustworthy or all must be treated as suspect.
When the internet was made available to the general public in the 1990s, it was seen as a means of democratizing access to information. The amount of information that became available began as a trickle and turned into a Niagara, fed by a roaring river of new content. It became wearisome and then almost impossible to find a single piece of information in the torrent. Search engines were developed that used both human and computer power to sort, categorize, and contain much of the content on the internet. Eventually Google became the go-to means for both access to and control of the flood of information available, so ubiquitous that “google” itself became a verb.
Computerization of information has a number of benefits. Large amounts of information can be stored in increasingly small spaces. Records of many kinds have become public because they can be conveyed electronically. With the advent of the internet, people can benefit from the combination of computerization and access, allowing information to be sent and received when and where it is needed. New devices have been invented to feed the fast and furious appetite for information. New types of information and new avenues for communication have become commonplace in the last decade. More and newer versions of devices and platforms appear with increasing frequency. Originally this explosion of information available to the public was viewed as the democratization of power for the benefit of everyone, but this view didn’t last long.11
This utopian view of the benefits of the computerization of information began to be overshadowed almost immediately. The concept of free information for the masses required that someone other than the consumers of that information pay for it. To make paying for the internet attractive to advertisers and other buyers, data was needed. Automatic software programs were developed to perform repetitive tasks that gathered that data. These programs were known as bots, short for robots. What they collected became a commodity. Data collected by bots showed what sites were being used and what products were being purchased, by whom, and how often. This information could be used to convince advertisers to pay to place their advertisements on websites. The data could also be offered for sale to prospective clients to use for their own purposes. Through using bots, it became possible to harvest a wide variety of information that could be sold. Once bots were successfully programmed to collect and send information, that ability was expanded for uses far beyond simple advertising.
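As a concrete and purely hypothetical illustration of the kind of repetitive, data-gathering program described above, the short Python sketch below polls a list of made-up pages on a schedule and tallies successful visits. The page addresses, the polling interval, and the decision to count only reachable pages are assumptions made for the example; real commercial bots collect far richer data than this.

```python
# A minimal sketch of a repetitive data-gathering "bot." All site names
# here are hypothetical placeholders; a real advertising bot would be far
# more elaborate and would record who visited, when, and what they bought.
import time
from collections import Counter
from urllib.request import urlopen

PAGES = [
    "https://example.com/",          # hypothetical site being measured
    "https://example.com/products",  # hypothetical product page
]

visit_counts = Counter()

def poll_once():
    """Fetch each page once and record that it was reachable (a 'visit')."""
    for url in PAGES:
        try:
            with urlopen(url, timeout=5) as response:
                if response.status == 200:
                    visit_counts[url] += 1
        except OSError:
            pass  # skip unreachable pages; a real bot would log the failure

if __name__ == "__main__":
    # Repeating the same task on a schedule is the defining trait of a bot.
    for _ in range(3):
        poll_once()
        time.sleep(1)
    print(visit_counts.most_common())
```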
Social Media
The advent of social media presented another opportunity for advertising to specific and targeted groups of people. On social media sites such as Facebook and Twitter, information is often personal. These platforms are used to find like-minded people, to stay in touch with family and friends, to report the news of the day, and to create networks among people. These platforms provide an easy way to share information and to make connections. Social media networks provide a shorthand method of communication using icons to indicate approval and various emotions. This allows people to respond to items posted on their pages without actually having to write something themselves. If they enjoy something, the push of a button allows that message to be conveyed. If they wish to share the information with friends and followers, a single click can accomplish that task. It is possible for bots to be programmed to count those clicks and respond to them.
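To make the click-counting idea concrete, here is a minimal Python sketch. The event records, post identifiers, and the one-share threshold are invented for illustration; a real platform logs far more detail, but the principle of tallying reactions and then responding to the counts is the same.

```python
# A toy tally of reaction clicks. The event format is hypothetical.
from collections import defaultdict

# Each event: (post_id, reaction) -- a user pressed "like" or "share".
events = [
    ("post-1", "like"),
    ("post-1", "share"),
    ("post-2", "like"),
    ("post-1", "like"),
]

tallies = defaultdict(lambda: defaultdict(int))
for post_id, reaction in events:
    tallies[post_id][reaction] += 1

# A bot (or the platform itself) can now respond to the counts, for example
# by promoting any post whose share count crosses a chosen threshold.
for post_id, counts in tallies.items():
    if counts["share"] >= 1:
        print(f"{post_id} is being shared: {dict(counts)}")
```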
News outlets, advertisers, political parties, and many others have created web pages whose content can be directed to the accounts and networks of social media users by programmed algorithms called bots. The bots can be programmed to search for information on the internet that is similar to what a social media user has already clicked on, liked, or shared. They can then inject that new information into what the user sees.12 So, for example, rather than seeing stories from hundreds of news outlets, a bot will find news outlets that are similar to those already being viewed. Bots provide users with easy access to information about things they already like. By following links between accounts, bots can push information to the friends of a user as well. This means that friends begin to see the same array of information. Eventually one user and the friends and followers of that individual are seeing only information they agree with. This creates an information bubble that makes it appear that the likes of the group inside the bubble represent the likes of the majority of people (because the group inside the bubble never sees anything contrary to its preferences).
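The feed-narrowing behavior just described can be illustrated with a toy Python sketch. The outlets, topic tags, friend list, and overlap scoring below are all invented assumptions; production recommendation systems weigh many more signals, but ranking by similarity to past likes and then pushing the same items to a user's friends produces the same narrowing effect.

```python
# A toy illustration of a similarity-driven feed. Every name and topic tag
# here is hypothetical; the point is the narrowing loop, not the details.
LIKED_TOPICS = {"politics", "economy"}   # topics the user already clicks on

CANDIDATE_STORIES = {
    "outlet-a/story-1": {"politics", "scandal"},
    "outlet-b/story-2": {"sports"},
    "outlet-c/story-3": {"economy", "politics"},
}

FRIENDS = ["friend-1", "friend-2"]

def topic_overlap(story_topics, liked):
    """Score a story by how many of its topics the user has already liked."""
    return len(story_topics & liked)

# Rank stories by similarity to existing likes and keep only the matches,
# so the user never sees the story with no overlap at all.
ranked = sorted(
    CANDIDATE_STORIES.items(),
    key=lambda item: topic_overlap(item[1], LIKED_TOPICS),
    reverse=True,
)
feed = [story for story, topics in ranked if topic_overlap(topics, LIKED_TOPICS) > 0]

# Push the same items to the user's friends, narrowing everyone's view.
for friend in FRIENDS:
    print(friend, "sees", feed)
```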
In Imperva Incapsula’s 2015 annual report on impersonator bot and bad bot traffic trends, Igal Zeifman states, “The extent of this threat is such that, on any given day, over 90 percent of all security events on our network are the result of bad bot activity.”13 Social and political bots have been used for the purposes of collecting and sharing information. In the last decade, there has been a concerted effort to design bots and bot practices that work to steer populations in general toward a particular way of thinking; to prevent people from organizing around a specific cause; and to misdirect, misinform, or propagandize about people and issues.14 The bots work much faster than humans can and work 24/7 to carry out their programming.
Humans assist bots in their work by liking and sharing information the bots push at them, often without reading the information they are sending along. Tony Haile, CEO of Chartbeat, studied “two billion visits across the web over the course of a month and found that most people who click don’t read. In fact, a stunning 55% spent fewer than 15 seconds actively on a page. . . . We looked at 10,000 socially-shared articles and found that there is no relationship whatsoever between the amount a piece of content is shared and the amount of attention an average reader will give that content.”15 This means that once a message has reached a critical number of people via bots, those people will assist in the spread of that information even though more than half of them will not have read it. This manipulation of computer code for social media sites allows fake news to proliferate and to shape what people believe, even though the item is often never read beyond its headline or caption.
Notes
1. “History of Communication,” Wikipedia, last updated August 28, 2017, https://en.wikipedia.org/wiki/History_of_communication.
2. Joshua J. Mark, “Writing,” Ancient History Encyclopedia, April 28, 2011, www.ancient.eu/writing/.
3. “Stone Carving,” Wikipedia, last updated August 30, 2017, https://en.wikipedia.org/wiki/Stone_carving.
4. “Clay Tablet,” Wikipedia, last updated August 25, 2017, https://en.wikipedia.org/wiki/Clay_tablet.
5. Joshua J. Mark, “Egyptian Papyrus,” Ancient History Encyclopedia, November 8, 2016, www.ancient.eu/Egyptian_Papyrus/.
6. “Uses for Birchbark,” NativeTech: Native American Technology and Art, accessed September 6, 2017, www.nativetech.org/brchbark/brchbark.htm.
7. “Differences between Parchment, Vellum and Paper,” US National Archives and Records Administration, accessed September 6, 2017, https://www.archives.gov/preservation/formats/paper-vellum.html.
8. Mark Cartwright, “Quipu,” Ancient History Encyclopedia, May 8, 2014, www.ancient.eu/Quipu/.
9. Winstone Arradaza, “The Evolution of Print Media,” Prezi presentation, November 11, 2013, https://prezi.com/qpmlecfqibmh/the-evolution-of-print-media/; “A Short History of Radio with an Inside Focus on Mobile Radio,” Federal Communications Commission, Winter 2003–2004, https://transition.fcc.gov/omd/history/radio/documents/short_history.pdf; “Morse Code and the Telegraph,” History.com, accessed September 6, 2017, www.history.com/topics/inventions/telegraph; Andrew Anthony, “A History of the Television, the Technology That Seduced the World—and Me,” Guardian, September 7, 2013, https://www.theguardian.com/tv-and-radio/2013/sep/07/history-television-seduced-the-world.
10. Arthur C. Clarke, Profiles of the Future: An Inquiry into the Limits of the Possible (London: V. Gollancz, 1973), 39.
11. Peter Ferdinand, “The Internet, Democracy and Democratization,” Democratization 7, no. 1 (2000): 1–17, https://doi.org/10.1080/13510340008403642.
12. Tarleton Gillespie, “The Relevance of Algorithms,” in Media Technologies: Essays on Communication, Materiality and Society, ed. Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot (Cambridge, MA: MIT Press, 2014), 167–94; Alessandro Bessi and Emilio Ferrara, “Social Bots Distort the 2016 U.S. Presidential Election Online Discussion,” First Monday 21, no. 11 (November 7, 2016), http://journals.uic.edu/ojs/index.php/fm/rt/printerFriendly/7090/5653; Tim Hwang, Ian Pearce, and Max Nanis, “Socialbots: Voices from the Fronts,” Interactions, March/April 2012: 38–45; Emilio Ferrara, Onur Varol, Clayton Davis, Filippo Menczer, and Alessandro Flammini, “The Rise of Social Bots,” Communications of the ACM 59, no. 7 (July 2016): 96–104.
13. Igal Zeifman, “2015 Bot Traffic Report: Humans Take Back the Web, Bad Bots Not Giving Any Ground,” Imperva Incapsula Blog, December 9, 2015, https://www.incapsula.com/blog/bot-traffic-report-2015.html.
14. Samuel C. Woolley, “Automating Power: Social Bot Interference in Global Politics,” First Monday 21, no. 4 (April 4, 2016), http://firstmonday.org/ojs/index.php/fm/article/view/6161/5300; Peter Pomerantsev and Michael Weiss, The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money (Institute of Modern Russia and The Interpreter, 2014), www.interpretermag.com/wp-content/uploads/2015/07/PW-31.pdf; Bence Kollanyi, Philip N. Howard, and Samuel C. Woolley, Bots and Automation over Twitter during the U.S. Election, Data Memo 2016.4 (Oxford, UK: Project on Computational Propaganda, November 2016), http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2016/11/Data-Memo-US-Election.pdf; Paul Roderick Gregory, “Inside Putin’s Campaign of Social Media Trolling and Fake Ukrainian Crimes,” Forbes, May 11, 2014, https://www.forbes.com/sites/paulroderickgregory/2014/05/11/inside-putins-campaign-of-social-media-trolling-and-faked-ukrainian-crimes/; Brian T. Gaines, James H. Kuklinski, Paul J. Quirk, Buddy Peyton, and Jay Verkuilen, “Same Facts, Different Interpretations: Partisan Motivation and Opinion on Iraq,” Journal of Politics 69, no. 4 (November 2007): 957–74; Sara El-Khalili, “Social Media as a Government Propaganda Tool in Post-revolutionary Egypt,” First Monday 18, no. 3 (March 4, 2013), http://firstmonday.org/ojs/index.php/fm/rt/printerFriendly/4620/3423.
15. Tony Haile, “What You Think You Know about the Web Is Wrong,” Time.com, March 9, 2014, http://time.com/12933/what-you-think-you-know-about-the-web-is-wrong/.