#423 – SHOW ME THE DATA – FRED SCHENKELBERG

Early in my career, I worked for an unreasonable person.

He wanted us, his engineering staff, to show him the data. He wanted us to gather, monitor, analyze, and display data regularly. Anytime we needed approval, funding, or resources, he wanted to see the data.

Essentially, this meant he wanted to understand the situation at hand.

The variability of line yield, for example, or the impact of the current tooling on product performance. If we wanted to start a project to make improvements, we had to show, with data, the current situation and why it needed improvement.

A table of summary statistics wasn’t good enough. He wanted the data in detail, plus the summaries, plots, and comparisons. The reports had to include the data sources, measurement techniques, and analysis steps, along with the box plots.

He keyed in on assumptions, both engineering and statistical. When I first started working there, I thought the only question he knew to ask was, “How do you know the data is normally distributed?”
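
Back then that check meant probability paper; today the same question takes a few lines. Here is a minimal sketch, assuming Python with SciPy and using made-up yield measurements for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical line-yield measurements (percent) -- illustrative data only.
yields = np.array([94.2, 93.8, 95.1, 94.7, 93.9, 94.4, 95.0, 94.1,
                   93.6, 94.8, 94.3, 94.9, 93.7, 94.5, 94.6])

# Shapiro-Wilk test: the null hypothesis is that the data come from a
# normal distribution, so a small p-value is evidence against normality.
stat, p_value = stats.shapiro(yields)
print(f"Shapiro-Wilk W = {stat:.3f}, p = {p_value:.3f}")

# A normal probability plot (the software version of probability paper):
# roughly straight points support the normality assumption.
(osm, osr), (slope, intercept, r) = stats.probplot(yields, dist="norm")
print(f"probability-plot correlation r = {r:.3f}")
```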

An Experiment Every Day

Beyond having to show the data, we were also expected to conduct an experiment every day. Every day.

In college, we learned the scientific method through conducting a few experiments over a semester.

Sure, it could be a simple experiment: hypothesis, observation, analysis, and conclusion. Or it could be rather complex, such as a series of accelerated life tests to determine the aging model for a new product. (Which only counted as one experiment.)

We did hypothesis tests and Taguchi L3 designs so often our statistics books wore out. Our engineering manager was happy to replace the references as needed.
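
For a sense of the simplest experiments we ran, a two-sample hypothesis test might look like this in Python today; the scenario (comparing a critical dimension at two tool settings) and the data are invented for the sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical measurements of a critical dimension (mm) at two tool settings.
setting_a = rng.normal(loc=10.00, scale=0.05, size=12)
setting_b = rng.normal(loc=10.04, scale=0.05, size=12)

# Hypothesis: the two settings produce the same mean dimension.
# Welch's t-test avoids assuming equal variances between the groups.
t_stat, p_value = stats.ttest_ind(setting_a, setting_b, equal_var=False)

alpha = 0.05  # chosen Type I error risk
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Reject equal means" if p_value < alpha else "No evidence of a difference")
```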

We could spot an opportunity to conduct an experiment while walking into the office or across the factory floor. From the material ordering process to final goods inspection, anything that could have variability, which is pretty much everything, was fair game for an experiment.

We answered questions such as: Could we measure a specific step in a process? Could we monitor variability earlier in the process? Could we determine which equipment setting contributed the most variability? We learned to ask questions, then go learn more about the subject.
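
For the last of those questions, one way to get an answer, sketched here with invented data, is to group measurements by equipment setting and see how much of the total variation the setting explains:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical readings grouped by three equipment settings.
groups = [rng.normal(100 + shift, 1.0, size=20) for shift in (0.0, 0.8, 1.5)]

all_data = np.concatenate(groups)
grand_mean = all_data.mean()

# One-way ANOVA partition: between-setting vs. within-setting variation.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = np.sum((all_data - grand_mean) ** 2)
eta_squared = ss_between / ss_total

print(f"fraction of variation explained by setting: {eta_squared:.2f}")
```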

We gathered data and ran experiments.

Looking Back

What a rich and supportive environment. The rule of the shop was to question everything.

Make informed decisions and share the data. We also learned the many statistical tools that allowed us to answer questions clearly.

We checked normality assumptions by gathering data and hand plotting on normal probability graph paper. We checked for measurement error by conducting gage repeatability and reproducibility studies anytime we wanted to use a gage to make a reading (this counted as an experiment).
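
A gage R&R study boils down to splitting the measurement variation into repeatability (the gage itself) and reproducibility (the operators). Here is a rough sketch of the ANOVA-based calculation for a balanced study, assuming Python with NumPy and simulated readings standing in for real ones:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated, hypothetical gage study: 10 parts x 3 operators x 2 repeat readings.
p, o, r = 10, 3, 2
part_effect = rng.normal(0, 2.0, size=(p, 1, 1))   # part-to-part variation
oper_effect = rng.normal(0, 0.5, size=(1, o, 1))   # operator (reproducibility) bias
noise = rng.normal(0, 0.3, size=(p, o, r))         # repeatability (gage) noise
y = 50 + part_effect + oper_effect + noise

grand = y.mean()
part_mean = y.mean(axis=(1, 2))
oper_mean = y.mean(axis=(0, 2))
cell_mean = y.mean(axis=2)

# Balanced two-way ANOVA sums of squares.
ss_part = o * r * np.sum((part_mean - grand) ** 2)
ss_oper = p * r * np.sum((oper_mean - grand) ** 2)
ss_int = r * np.sum((cell_mean - part_mean[:, None] - oper_mean[None, :] + grand) ** 2)
ss_tot = np.sum((y - grand) ** 2)
ss_err = ss_tot - ss_part - ss_oper - ss_int

ms_part = ss_part / (p - 1)
ms_oper = ss_oper / (o - 1)
ms_int = ss_int / ((p - 1) * (o - 1))
ms_err = ss_err / (p * o * (r - 1))

# Variance components from expected mean squares (clipped at zero).
var_repeat = ms_err
var_int = max((ms_int - ms_err) / r, 0.0)
var_oper = max((ms_oper - ms_int) / (p * r), 0.0)
var_part = max((ms_part - ms_int) / (o * r), 0.0)

var_grr = var_repeat + var_oper + var_int
var_total = var_grr + var_part
print(f"%GRR (share of total variance): {100 * var_grr / var_total:.1f}%")
```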

We learned about data summary techniques, about data plotting, about comparisons, confidence intervals, Type I and II (and Type III) errors. We learned about control charts, process mapping, and design of experiments.
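
To make one of those tools concrete, here is a sketch of individuals (I-MR) control chart limits in Python, using invented daily yield numbers; the 1.128 is the standard d2 constant for moving ranges of two points:

```python
import numpy as np

# Hypothetical daily yield readings (percent), one per production day.
yield_pct = np.array([94.1, 93.8, 95.0, 94.6, 93.2, 94.9, 95.3, 94.0,
                      93.7, 94.4, 95.1, 94.8, 93.9, 94.5, 95.2])

# Individuals / moving-range (I-MR) chart limits.
mr = np.abs(np.diff(yield_pct))        # moving ranges of consecutive points
center = yield_pct.mean()
sigma_hat = mr.mean() / 1.128          # estimate of short-term sigma
ucl = center + 3 * sigma_hat
lcl = center - 3 * sigma_hat

print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("out-of-control points:", yield_pct[(yield_pct > ucl) | (yield_pct < lcl)])
```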

We learned how to ask questions we could answer, then ask questions that were difficult to answer. We learned a lot.

The plant ran better every day. Yield steadily improved, throughput increased, and scrap declined. And we, as an organization, learned from our data.

Are you learning from your data?

Do you have a full tool chest of statistical tools that allow you to understand, present, and convince others using data? Have you done an experiment today?
