The internet has a ton of positives. But...
Image: A cartoon of a frowning girl saying, "If television's a babysitter, the Internet's a drunk librarian who won't shut up" to a perplexed humanoid cat.
Credit: Dorothy Gambrell, "Large Mediums," Cat and Girl. Used under a Creative Commons CC BY-NC-SA 2.5 license.
One of the facts of life on the Internet and social media is that bad information is everywhere, and it can be difficult to tell when the information we find is skewed, misleading, or altogether false. Relying on bad information can cause real harm, and sharing misinformation can lead others to rely on it and come to harm as well.
News
Definition: Articles or programs that present facts or data
Biased or not? Presented with as little bias as possible
Purpose: To pass knowledge to the audience
Opinion
Definition: An article or program that analyzes the news or uses facts to support an argument
Biased or not? Biased, but clearly labeled as opinion
Purpose: To persuade the audience of something
Misinformation
Definition: Misleading, incorrect, or false information
Why do people and organizations create misinformation?
Human error: For example, a reporter mistakenly wrote that only 100 protesters marched in front of the state capitol when they meant to write 1,000.
Faulty fact-checking: In the same example, the reporter's editor didn't ask anyone to confirm the number of protesters, so the newspaper printed the wrong figure.
Outright lying (disinformation): In the same example, a radical media organization opposed the protest. To discredit the protesters, the organization told its audience that there were 5,000 violent rioters, even though it knew there were actually 1,000 protesters and no violence at all.
(Sometimes called "fake news." Be careful about that label, though: Calling accurate news "fake" because it doesn't support your point of view is itself a form of disinformation.)
Confirmation bias
Definition: "The search for and use of information to support an individual’s ideas, beliefs or hypotheses" (from the Catalogue of Bias). In other words, we believe what we want to believe, and it's hard to accept evidence that goes against our point of view.
How does this relate to misinformation? When we see articles and news items that seem to support our existing beliefs, we tend not to ask questions about the evidence. Likewise, we tend to closely scrutinize, or reject outright, information that doesn't support our beliefs. Both of these tendencies cause problems when we believe misinformation. The problem is even worse when we're stuck in filter bubbles (see below), which are created and reinforced by our reading and viewing history.
Filter bubble
Definition: "your own personal, unique universe of information that you live in online," created as algorithms learn from your history of searching and reading, to show you the information most likely to keep you engaged on a website or service (from Eli Pariser's TED Talk "Beware online 'filter bubbles'")
How does this relate to misinformation? The more often we see things we agree with, the harder it is to find out when we are misled or aren't given the entire story.
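To see how a filter bubble can form, here is a deliberately simplified sketch in Python. It is a hypothetical illustration, not how any real platform works: the toy feed ranks stories by how much their topics overlap with what you clicked before, so every click narrows what floats to the top next time.

```python
# Toy illustration (hypothetical) of an engagement-driven feed:
# stories whose topics match your click history rank higher,
# and each new click reinforces the pattern.

from collections import Counter

def rank_feed(stories, click_history):
    """Sort stories by overlap between their topics and past clicks."""
    # Count how often each topic appears in the stories you clicked.
    interest = Counter(topic
                       for story in click_history
                       for topic in story["topics"])

    # A story's score is the total interest in its topics
    # (Counter returns 0 for topics you've never clicked).
    def score(story):
        return sum(interest[t] for t in story["topics"])

    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Cats win award", "topics": ["cats"]},
    {"title": "Local election results", "topics": ["politics"]},
    {"title": "New cat cafe opens", "topics": ["cats", "food"]},
]

# You clicked one cat story, so cat stories now outrank everything else.
history = [{"title": "Funny cat video", "topics": ["cats"]}]

feed = rank_feed(stories, history)
```

After a single cat click, both cat stories rank above the election story, and clicking them adds more "cats" to the history, which pushes unrelated news even further down. Nothing in the sketch checks whether a story is accurate or important, only whether it resembles what kept you engaged before.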
References:
Above The Noise. (2017, May 3). Why do our brains love fake news? [Video]. YouTube. https://www.youtube.com/watch?v=dNmwvntMF5A
Big Think. (2018, December 18). How news feed algorithms supercharge confirmation bias | Eli Pariser | Big Think [Video]. YouTube. https://www.youtube.com/watch?v=prx9bxzns3g