The quest for truth is as old as our species, and likely far older. Organisms have always used misdirection, misrepresentation, and outright deception to get what they want and need, whether it be food, shelter, or mates. Modern humans, however, show a particular affinity for fabrication. For some, lying is a pastime, an entertainment; for others, it’s a sign of emotional illness, a compulsion.
Yet everyone is less than completely honest at least some of the time, and correctly assessing reality is critical if we are to flourish in complex environments where few of us are expert in more than a few areas. Sussing out truth from fiction is therefore a constant effort within our brains, particularly when we’re weighing direct statements from one another.
This is not a trifling issue. In the world of 2019, with the exponential growth of information sources, we’re all having to suss more and more, faster and faster. The real and the fake are increasingly intertwined, to the point of being indistinguishable from one another. Even worse, individual biases, preferences, and styles (among innumerable other factors) can lead to assumptions, which in turn lead to conclusions that merely validate those original assumptions. One person’s well-vetted report becomes another’s thinly disguised piece of propaganda. Getting to a reality that at least most of us can agree on is increasingly difficult, but it’s never been more important.
A large part of my professional background comes from the world of quality assurance, which pursues verity with a vengeance. That makes sense, because without a complete and accurate understanding of the true performance of a process, one cannot hope to improve it.
For most of the industrial age, the dearth of data kept pristine quality out of reach, acting as a sort of speed limit on continuous improvement. The tools of the trade (the hardware that measured process activity and, later, the software that helped interpret the acquired data) just weren’t fast or robust enough to offer more than scattered snapshots of any given process. Those snapshots, correctly deciphered, provided a piece of reality, a small slice of truth, but there was never enough quantity or granularity to push quality to its outer limits. Getting enough data to really understand a process from the inside out became a holy grail of sorts for quality people.
But then, a bit more than two decades ago, things began to change. The accuracy and speed of measuring equipment began to soar, matched by a rapid expansion of functionality and power in software. Data that once trickled in grain by grain soon began to arrive clump by clump, and eventually it poured in boulder by boulder until it collected in huge mountains. Quality engineers and data scientists went from delight to apprehension to something akin to fear. “How are we going to sort through all this stuff?” they wondered. By the years shortly after Y2K, the age of Big Data had fully arrived, and the holy grail had been transformed into a holy terror.
Yet today, data analysis is one of the key disciplines for any quality professional. We reached this point because the tools continued to evolve, as did the attitudes of the women and men doing the work. The point of Big Data is not to gather and use every piece of data possible. That’s impossible. No, the job is to use software, and well-constructed programs, to distill all that data into information that’s accessible and meaningful for whatever process you’re trying to improve. Discernment was, and still is, the order of the day.
Big Data software systems help quality managers make sense of the sheer weight of data, and better understand the actual performance of any given process, by using tools like machine learning, sampling, and predictive analysis. That works fine in, say, a manufacturing environment where facts are valued and indisputable; as we’ll see, however, these tactics tend to fall short in situations where truth is transactional, fluid, and debatable.
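To make the sampling idea concrete, here is a minimal sketch in Python, using simulated measurements and classic three-sigma control limits as a stand-in for the far richer analytics of a real Big Data system. Every number, variable name, and threshold below is an illustrative assumption, not anything drawn from an actual production line.

```python
import random
import statistics

# Hypothetical stream of process measurements (e.g., part diameters in mm).
# A real system might ingest millions of readings; here we simulate 100,000.
random.seed(42)
measurements = [random.gauss(10.0, 0.05) for _ in range(100_000)]

# Inject a handful of out-of-spec readings so the check has something to find.
for i in random.sample(range(len(measurements)), 20):
    measurements[i] += random.choice([-1, 1]) * 0.4

# Step 1: sample rather than crunch everything -- the "discernment" step.
sample = random.sample(measurements, 2_000)

# Step 2: estimate the process center and spread from the sample.
mean = statistics.fmean(sample)
sigma = statistics.stdev(sample)

# Step 3: flag readings that fall outside the three-sigma control limits.
upper, lower = mean + 3 * sigma, mean - 3 * sigma
outliers = [(i, x) for i, x in enumerate(measurements) if x > upper or x < lower]

print(f"process mean ~ {mean:.3f}, sigma ~ {sigma:.3f}")
print(f"{len(outliers)} readings fall outside the 3-sigma limits")
```

The point of the sketch is only this: a modest sample is enough to characterize the process, and the resulting limits distill a mountain of raw readings into a short list of signals worth a quality manager’s attention.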
This four-part article explores the nature of truth through the prism of quality, mass media, culture, and art. Here, in part one, we pose the question of why truth matters, and consider how the quality professional’s toolkit can help us mine mountains of data to uncover hidden nuggets of meaningful information.
In part two of this series, we’ll investigate the ugly truth and the pretty lies found in today’s mass media.
BIO:
Mike Richman is the principal of Richman Business Media Consulting, a marketing and public relations company working with clients in the worlds of manufacturing, consumer products, politics, and education. Richman also hosts the web television program NorCal News Now, which focuses on social, economic, and political issues in California. He is a contributor to (and former publisher of) Quality Digest.
He can be reached at:
mikerichman67@gmail.com