A Primer on Digital Thinking, Part 1: Counting and Measuring

The basic distinction between digital and analog is that digital means you count something and analog means you measure something. We can easily count discrete objects like apples, oranges, and people, or legs when classifying critters. Counting answers “How many?”

We can also count sand, sugar, and snowflakes, but most people don’t want to because the objects are so numerous. So we use a measurement: a grain of sand, a cup of sugar, a foot of snow. Similarly, we use measurements for water, honey, and other fluids: a glass of water, a bottle of honey, a lot of hot air. Measuring answers “How much?” using standards as analogies: a cup of milk is similar to a cup of water.
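The distinction maps neatly onto data types in programming. Here is a minimal sketch in Python (the quantities and variable names are my own invented illustration):

```python
# Counting answers "How many?": exact, discrete values, naturally integers.
apples = 3              # you can have 3 apples, never 3.2 apples
spider_legs = 8

# Measuring answers "How much?": continuous quantities, naturally floats.
sugar_cups = 1.5        # "a cup" is a standard we compare against
snow_feet = 0.75

# Counts support exact comparison; measurements carry error, so they
# should be compared within a tolerance instead.
print(apples == 3)                    # True, and meaningful
print(abs(sugar_cups - 1.5) < 0.01)   # True: close enough to 1.5 cups
```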

The digital revolution occurred when improvements in technology made it possible to analyze analog media such as photographs and phonograph records into bits, reproducing them more precisely and more durably. Ironically, the key digital concept “data” is not a countable noun: we say a gigabyte of data, not one data. In American English, “data” has become a mass noun that takes a singular verb, like “water” or “sand”.

An important axiom of counting is that all the items you’re counting are identical, or close enough that their differences don’t matter. Nobody likes to be thought of as a statistic, but as a society grows larger, it can be helpful to view everyone as an anonymous mass when approaching community-wide problems: studying traffic patterns, predicting disease, organizing clean water, etc.

Skepticism about a digital analysis is sometimes expressed as “you’re mixing apples and oranges.” A common example of this in the political arena is tallying Hispanics as “non-white” even though the US Census officially considers Hispanic to be an ethnic group, not a race. The US has about a million people who speak Arabic at home and nearly a million who speak Russian, but these ethnic groups are not considered non-white by the mainstream media.

Hispanics are a formidable minority, and well-meaning efforts to respect this large ethnic group often lead to the apples/oranges confusion. Ever since the American Community Survey of the US Census began tallying “languages spoken at home” in 2005, Spanish has outnumbered all other non-English languages combined: about 62% of Americans over age 5 who speak a language other than English at home speak Spanish. Hispanics have distinct cultural and religious traditions, but so do speakers of Russian and Arabic.

The growing fraction of minorities (including those of mixed race) is an established fact, but the conflation of different types of minorities (race and ethnicity) obscures the fact that whites are still far from becoming a racial minority. The US Census classifies 64.7% of Hispanics as white, contributing to 72.6% of the US population being white.

One of my gigs is performing cost-of-living surveys in Indonesia and Malaysia a couple of times a year. I noticed that a variety like Delicious apples from the US (or Pacific Rose apples from New Zealand, or Royal Gala apples from France) will have relatively consistent prices for “jumbo” size from one supermarket to another. But the price of unlabeled regular apples of the same variety sometimes varies, because the “regular” apples seem smaller in one supermarket than in another, which makes their prices not quite comparable. One could say “mixing apples with apples” is a subtler type of flaw in digital analysis.
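One standard remedy, when the information is available, is to convert the count (price per apple) into a measurement-based rate (price per kilogram) before comparing. A toy sketch in Python, with invented prices and weights:

```python
# Hypothetical survey numbers: per-apple price and average weight per apple.
# "Regular" apples at Store B run smaller, so per-apple prices mislead.
stores = {
    "Store A": {"price_per_apple": 0.90, "avg_weight_kg": 0.20},
    "Store B": {"price_per_apple": 0.80, "avg_weight_kg": 0.15},
}

for name, s in stores.items():
    price_per_kg = s["price_per_apple"] / s["avg_weight_kg"]
    print(f"{name}: ${s['price_per_apple']:.2f} each, ${price_per_kg:.2f}/kg")

# Store B looks cheaper per apple ($0.80 vs $0.90), but per kilogram it is
# actually more expensive ($5.33 vs $4.50).
```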

An example of this subtler manipulation of statistics would be the gender gap in academic salaries that became big news in the 1990s. At that time, female full professors were rare, and most had attained the rank only recently. So their salaries were being compared to those of male professors who had enjoyed 10, 20, or more years of cost-of-living increases. A wage gap was real, but it was exaggerated by assuming that all professors of the same rank at a given university should earn the same salary regardless of seniority. Exaggeration tends to be counterproductive because it contributes to polarization by arousing skepticism among “undecided” people who are potential supporters. A more granular analysis of a “gender seniority gap” across diverse sectors in the UK was published by The Economist in April this year.
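To see how pooling away seniority can manufacture or exaggerate a gap, consider a toy Python sketch with entirely invented salaries, in which pay depends only on years in rank:

```python
# Toy illustration with invented numbers: every full professor earns the
# same base salary plus the same annual raise, so there is no bias at all
# built into this fabricated pay formula.
BASE, ANNUAL_RAISE = 60_000, 1_500

def salary(years_in_rank):
    return BASE + ANNUAL_RAISE * years_in_rank

male_years = [25, 20, 15, 10]    # promoted to full professor long ago
female_years = [3, 2, 1, 1]      # promoted only recently

male_avg = sum(salary(y) for y in male_years) / len(male_years)
female_avg = sum(salary(y) for y in female_years) / len(female_years)
print(f"Raw 'gap': ${male_avg - female_avg:,.0f}")   # $23,625

# Comparing like with like -- equal years in rank -- the gap vanishes here:
print(salary(10) - salary(10))   # 0
```

In this fabricated data, the raw comparison shows a gap of roughly $23,600 even though the pay formula is identical for everyone; a careful analysis has to separate the seniority effect from actual bias before quantifying the latter.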

In the 2016 presidential election, exit polls showed that about 53% of white women voted for Donald Trump while only 43% voted for Hillary Clinton. To “clarify” Clinton’s loss of her own identity group, analysts added education as another factor, pointing out that 51% of white women who had a college degree voted for her. Clinton actually did even better among white women under 30, but most commentators chose to highlight the extra factor of education, implying that smart women preferred her.

I’ve provided a few examples of how statistics can be sliced, recombined, and spun to support one conclusion over another. This is not new: Darrell Huff’s book How to Lie with Statistics was published in 1954. What is new is the increasing amount of data in our lives, much of which we as individuals simply do not have the time or resources to question. In Part 2, I’ll discuss some of the broader cultural implications of the digital mindset, including a few disquieting trends from the 1990s that dovetail with the mindset but aren’t directly attributable to the digital revolution.

In closing, I want to emphasize that analog and digital are very much like yin and yang, two complementary forces or approaches to the world. A black-and-white photo can be reproduced with great precision by analyzing it into off/on pixels. A color photo follows the same principle, but with each primary color having an off/on position within a pixel.
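In code, that analysis is just a threshold test applied to each pixel. A minimal sketch in Python (the grayscale values and the threshold are invented for illustration):

```python
# A tiny "photo": grayscale values from 0 (black) to 255 (white).
photo = [
    [12, 200, 230],
    [40, 180, 15],
    [250, 90, 60],
]

THRESHOLD = 128  # the measuring step: how bright counts as "on"?

# The counting step: each pixel becomes one bit, off (0) or on (1).
bitmap = [[1 if value >= THRESHOLD else 0 for value in row] for row in photo]

for row in bitmap:
    print(row)   # [0, 1, 1], [0, 1, 0], [1, 0, 0]
```

In the simplest color scheme, the same test runs three times per pixel, once each for red, green, and blue, giving eight possible colors; real formats store many graduated levels per channel rather than a single bit.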

The reorganization of analog processes as digital processes is now commonplace, but the reverse is also true at some level. Most of us have experienced brownouts or other situations where a light flickered because the voltage wasn’t consistently above the threshold needed to keep it illuminated continuously. In computer terms, this means that a bit changing from 0 to 1 is not an absolute transition but something that depends on the question “How much?” Namely, is the current (or magnetic polarization) sufficient to qualify as “on” rather than “off”? In other words, the hardware must measure the flow of energy before it counts the bit as a 0 or a 1.
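Here is a sketch of that measure-then-count step in Python, with an invented logic threshold (real logic families define their own voltage levels):

```python
# Reading a bit means measuring before counting: sample an analog level,
# then compare it to a threshold to decide whether it counts as 0 or 1.
LOGIC_THRESHOLD_V = 2.5   # invented value; real logic families differ

def read_bit(voltage):
    """Answer "How much?" first, then report "How many?" (0 or 1)."""
    return 1 if voltage >= LOGIC_THRESHOLD_V else 0

# A brownout-style signal hovering near the threshold makes the bit flicker:
samples = [3.1, 2.6, 2.4, 2.55, 1.9, 3.0]
print([read_bit(v) for v in samples])   # [1, 1, 0, 1, 0, 1]
```

Real circuits typically add hysteresis, using two thresholds instead of one, so that a noisy signal hovering near the boundary does not flicker between 0 and 1.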
