Tweetie Pie

The Internet is a blessing and a curse.  True, it broadens the public square, and allows many more people to participate in the exchange of ideas; it democratizes information, so it has been said — no longer is public discourse merely the purview of great men (and they were mainly men) of letters.  Now, anyone with a smartphone (and who in the West doesn’t have a smartphone?) can put an idea out there, or comment on one.

The emblem of this is Twitter; the volume of “tweets” sent each day has been likened to a ten-million-page book.  Each tweet is limited to 280 characters, and the average tweet runs only about 33.  Given those factoids, on the one hand more people are “engaged” in the issues of the day.  But at what cost?

I would argue, on the other hand, the price we are paying for this information democracy is shallower, more hasty thought, a wholesale breakdown in the ability to communicate (particularly in writing), decreased attention spans, and less concern with facts.  As most tweets contain an image or video, the written content is secondary and less subject to scrutiny.  This is great for the new 21st century goal of communication — going viral.

[Image: Abraham Lincoln Internet quote]

A perhaps unintended consequence of the Twitterverse is that, with everyone from the pope to the president tweeting, news organizations have taken to reporting on this or that public official’s tweet.  Not wanting to sound like an old man yelling, “hey you kids, get off my lawn,” I will say only this: tweeting is something birds do.  And birds have, well, birdbrains, a word which, according to the Merriam-Webster Dictionary, connotes “a stupid person.”

What does this portend?  I am of the opinion that writing matters less and less these days; in my view, as someone who not only enjoys writing but sets aside time to do it every day, this also means reading and thinking matter less on the whole.  Any writer, from someone producing fiction to those who engage in various forms of non-fiction (including topical bloggers), will tell you that before pen is ever set to paper (that is an anachronism — read it “before one fires up their laptop and launches a word processor or web-posting application”), there is the necessary prerequisite of research.  Research is a kind of cognitive (thinking) process involving decisions of relevance and reliability.

As an example, suppose I wanted to write something about the 2020 presidential election in the United States.  Google, the most popular search engine, is a revolutionary research tool; within milliseconds of typing in a query, thousands of sources of information are presented.  It is up to the consumer, me, to sort through the results of my inquiry.  The search engine is neutral in principle, but not in practice: identifiable variables, whose individual weights are largely unknown, determine what is presented to me, and in what order.  Google uses an extremely complicated formula called a search algorithm to rank website content; it does not reveal the specifics of that algorithm — if it did, everybody would try to game the system to get ranked ahead of other sites rather than producing high-quality content.  The same is true of other search engines, but it is Google which has become the Kleenex of the information age.

Google deploys “Googlebots” (also known as web crawlers) to websites to analyze the content and evaluate its quality.  Think of Googlebots as automated website “critics” who take in a website as if it were a movie, analyze it, and report their findings to the moviegoing public (the users of the search engine).  Googlebots judge and rank websites using criteria and weighting from Google’s proprietary algorithm; there are many factors involved, but among the most important is the number of links elsewhere pointing to the website being analyzed — meaning the more a site is referenced by other sites, the higher it ranks.
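For readers curious what “links raise rank” looks like in practice, here is a minimal sketch of PageRank, the publicly published ancestor of Google’s ranking signals.  Google’s real algorithm is proprietary and far more elaborate; this toy version, with made-up site names, only illustrates the core idea that a page gains rank by being linked to, especially by pages that themselves rank well.

```python
# Toy PageRank: pages gain rank from incoming links. This is an
# illustrative sketch, not Google's actual (secret) algorithm.
DAMPING = 0.85  # probability a "random surfer" follows a link
                # rather than jumping to a random page

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank
    for _ in range(iterations):
        # every page keeps a small baseline share of rank...
        new_rank = {p: (1 - DAMPING) / n for p in pages}
        # ...and passes the rest along its outgoing links
        for page, outgoing in links.items():
            if outgoing:
                share = DAMPING * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical miniature web: two blogs link to "news-site".
toy_web = {
    "news-site": ["blog-a"],
    "blog-a": ["news-site"],
    "blog-b": ["news-site"],
}
ranks = pagerank(toy_web)
# "news-site" ends up ranked highest: two pages point to it,
# while nothing at all points to "blog-b".
```

Run on this three-page web, the site with the most incoming links floats to the top — which is exactly the dynamic the echo-chamber discussion below depends on: heavy cross-linking, regardless of truth, pushes content up the rankings.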

A major pitfall of this is related to our shrinking attention spans.  After the 2020 election, there was a great deal of discussion about election fraud.  Pro-Trump media became like an echo chamber, with various outlets promoting the debunked notion of widespread election fraud by linking to so-called “proof” advanced by other pro-Trump media.  Increased linking to a piece of content results in a higher search engine ranking, and if people aren’t applying thought to what they are reading, they might be inclined to accept the first thing they read as true.  “Hey, it was the top hit on Google, it must be true!”

But just being first, or most popular, doesn’t make something true.  Other criteria must be brought to bear:  how reputable is the source of this information, what about bias, are the claims it makes backed up by evidence, can I verify the sources of the evidence — wash, rinse, repeat:  how reputable is this evidentiary source, what about bias in the evidence, does the evidence come from a primary or a secondary source, can I corroborate it, and so on.

[Image: 9 out of 10 believe (apparently)]

Before a writer (blogger) writes, s/he goes through this research and this evaluation of its results.  Not so with a tweeter.  Moreover, if we are more lax in our expectations of written communication, we necessarily expect less of the thinking, and the reading, that fuel it.  Good writing is mental discipline, and that discipline carries over into the ability to process content; it is not merely ensuring subjects and verbs agree with each other, or avoiding dangling participles, or knowing what a gerund is, or being able to spell and use punctuation — all things Twitter has dispensed with.

My fear, and I think it is well-founded, is that in the post-truth world we inhabit, where someone can suggest with a straight face that something called “alternative facts” exists and not be admitted for psychiatric observation, we are living with the results of the demise of thoughtful and disciplined writing.

Copyright © 2021 matthewwilkinson.net — all rights reserved.