That's slightly mean, but the point still stands.
In addition to being an engineer, I once also taught engineering classes at the University of Utah. We used to tell students that if they flunked our classes they'd have to declare a business major instead. Sadly a sizable number of my students
did drop out of engineering and
did become business majors, and now they have nicer houses than mine. Be careful what you wish for.
Most people who aren't engineers have no idea what engineering actually is. That's why when Patrick says he knows "a little engineering," I can confidently say that he knows nothing about engineering. What most laymen believe is engineering is really just the mundane mechanics of making machines. Engineering is the
mindset of knowing how and why systems work; of the delicate interactions among machine, operator, and environment; and of how to satisfice among requirements and interests that are inherently in conflict. Predictably, Patrick demonstrates
zero knowledge of those ineffable qualities of the profession.
Ok, seriously, the idea shouldn't be limited to engineers, but (often named, almost never used by conspiracists) common sense*.
Normal people change their motor oil BEFORE it degrades too much...
True, but experienced mechanics know that changing your oil too frequently actually increases engine wear. And automotive engineers know why.
There are three types of knowledge acquisition at work there.
Propositional knowledge is simply what you know and believe because someone tells you something. It's offered to you as a proposition, and you simply take it on faith. Reading a book on swimming gives you propositional knowledge, but doesn't enable you to fare well if someone pushes you into the water.
Our friends tell us to change our oil every 3,500 miles and we vaguely know that it has something to do with suspended dirt particles, but we don't delve too deeply. We accept the propositional knowledge and dutifully change our oil. And we figure that if changing our oil every 3,500 miles is good, then changing it every 3,000 might be better. It isn't; it's needlessly wasteful and hard on the engine.
Practical knowledge is simply that which we acquire by doing something. Experienced mechanics get that experience by working on lots of cars for a long time. And part of what they observe over time is that cars that keep oil in them longer show less wear on their parts when they go to disassemble them. That's knowledge you can get propositionally (such as how you're getting it now), but isn't typically part of the average person's propositional world because there's only so much time in the day to get knowledge that way, and only so much room in the brain to store that kind of knowledge. (There's a cognitive reason for why practical knowledge is "stickier" that way, but this post is probably already going to be too long.)
Practical knowledge is getting pushed into the deep end. Having read the book on swimming, one is confronted immediately with how little the book really prepares you for the
experience of swimming, with all its largely elusive details and subjective impressions. A written description of swimming fails to convey the gestalt of the experience. Someone who swims even one pool length has already acquired more useful information on swimming than the person who has read a dozen books on the subject.
Deep knowledge, also called
causal or
logical knowledge, is that which we acquire by studying something in depth, either by composing new fields out of well-understood components, or decomposing the problem at hand into well-known contributing sciences. The automotive engineer understands the hydrodynamics of a fluid-film bearing and knows the relationships among viscosity, temperature, and chemistry over time. He understands the legitimate need in some cases for metal-to-metal contact (e.g., the uppermost piston ring) and studies the equilibrium achieved by metal-bearing additives in the motor oil over time and how those contribute to reducing wear in MTM interfaces.
Only after years of mastering those sciences (chemistry, metallurgy, fluid dynamics, calculus, probability and statistics, classical mechanics [the physics kind, not the automotive kind]) can the engineer state confidently how he can know that oil has a break-in period, and that to change out your oil before you have reaped the full benefits of broken-in oil is to do more harm than good.
The average customer says, "Common sense tells me that if I wait too long to change my oil, it's bad for the engine; therefore I should change my oil frequently and that is conversely good for the engine." The mechanic says, "No, don't do that; from what I've seen the engines do better if you stick to Ford's schedule." The engineer says, "Here's a graph of zinc deposition on wear surfaces over time, and another of peak hydrodynamic rolling RPMs for a given viscosity; that's why you should leave the oil in a bit longer."
they realize that "best-before-dates" don't mean that the food goes bad instantly at 0:01 AM of the following day...
Indeed. Most of us have practical knowledge of food-borne illnesses. Hence we accept the propositional knowledge that the date stamped on the package is a date we should trust, and has probably been derived according to logical knowledge provided by food chemists and microbiologists about the prevalence of harmful organisms and mean growth rates.
If I stumble out to the kitchen for breakfast and notice that my milk expired yesterday, a number of factors influence my future decisions: the availability of eggs and bacon instead of cereal, the amount of money I have, my degree of hunger, the proximity of places to get more milk.
I might pour the expired milk down the drain and have eggs and bacon instead, reminding myself to pick up more milk on the way home from work. I might skip breakfast. That is, I trust the propositional knowledge and act accordingly to play it safe.
Or I might be a famished college student with a limited income who really needs that meal. In that case I'll still open the lid and sniff and decide that it really doesn't smell all
that bad. That is, having rejected the propositional knowledge, I'm now undertaking some empiricism to determine in a practical sense whether the milk is good enough. I'm attempting to measure the spoilage. I may end up drinking milk laden with bacteria, but I'll have made a reasonably informed decision.
The concept of "good enough" embodies the notion of acceptable risk. And part of common sense is the assessment and assumption of risk, even if we do it unconsciously. When we exceed the speed limit on our way to work, we are weighing the risk of being late against the risk of bodily harm or legal consequences.
The fact that we all do this leads some to believe that all risk assessment is simply common sense. In fact, what we learn from psychologists about risk is primarily that (a) risk estimation varies greatly from situation to situation, and over time; and (b) informal risk estimations are typically irrational. This leads engineers to develop formalisms for evaluating, measuring, and assessing risk, precisely so that we can reason about it instead of relying on emotion.
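To make the speeding example concrete, here is a toy sketch of the kind of formalism meant here: an explicit expected-loss comparison. All the probabilities and dollar figures are invented for illustration; the point is only that a formalism puts each factor on the table where it can be challenged, instead of being weighed by feel.

```python
# Toy expected-loss risk comparison (all numbers hypothetical).
# An informal "common sense" judgment weighs these factors by gut feel;
# a formalism makes each probability and cost explicit.

def expected_loss(outcomes):
    """Sum of probability * cost over a list of (probability, cost) pairs."""
    return sum(p * cost for p, cost in outcomes)

# Option A: drive at the speed limit -> only the risk of being late.
drive_legally = [(0.30, 50.0)]           # 30% chance of a $50-equivalent late penalty

# Option B: speed -> lower lateness risk, but adds ticket and crash risk.
speed = [
    (0.10, 50.0),     # 10% chance of still being late
    (0.02, 200.0),    # 2% chance of a ticket
    (0.001, 50000.0), # 0.1% chance of a costly crash
]

print(expected_loss(drive_legally))  # 15.0
print(expected_loss(speed))          # 59.0 -- worse, despite "saving time"
```

With the numbers laid out, the intuition "speeding saves me from being late" can be tested and, here, rejected; change the numbers and the answer may flip, which is exactly why the formalism matters more than any one result.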
So when an engineer says, "That's good enough," the layman often responds, "It's not good enough; what about X?" And when we delve deeper we discover how little the layman really knows about X, whereas the engineer may have studied it in depth. There are many, many factors that affect risk assessment, and laymen know about very few of them.
Expired food also introduces the "good enough" concept of a design margin. The expiration date is chosen such that natural variances in bacterial growth don't result in people eating bad food. We realize that food degrades gradually and will remain safe for some time past the printed date.
Engineers are smart enough to know that they don't know everything, including natural variances in the properties they deal with. The "Check Engine" light typically comes on
before the car bursts into flames, to tell us that a condition has been detected that requires attention. The design margin in the car, however, enables it to be driven to the nearest mechanic. We anticipate and observe impending failure before we actually act on it.
We specify that structural beams should be able to carry much more than the load we intend to put on them, not just in case we momentarily overload them, but also in case a beam has unseen defects. This is what gives us the confidence to build buildings.
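The beam example can be sketched as arithmetic. The factor of safety and the loads below are made-up illustrative numbers, not values from any real design code:

```python
# Toy design-margin calculation (all numbers hypothetical).
# We size the beam for the expected load times a factor of safety,
# so momentary overloads and unseen defects still leave reserve capacity.

def required_capacity(expected_load, factor_of_safety):
    """Minimum capacity the beam must be designed to carry."""
    return expected_load * factor_of_safety

def margin_remaining(capacity, actual_load):
    """Fraction of capacity still in reserve under a given load."""
    return 1.0 - actual_load / capacity

capacity = required_capacity(expected_load=10_000, factor_of_safety=2.5)
print(capacity)                          # 25000.0 -- design capacity
print(margin_remaining(capacity, 12_000))  # a 20% overload still leaves 52% in reserve
```

The interesting part is the second function: the margin isn't visible in the finished beam, which is why, as noted below, laymen can't always see where the reserve lies or how much is needed.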
Reserve capacity is a common-sense concept, but its implementation from case to case may not be intuitively apparent. It's not always possible to see where the margin lies or how much is needed.
smart people change to snow tires when they expect snow, not just after it fills the street etc
Indeed, because we have propositional knowledge in the form of a calendar that approximately models the change of seasons.
We realize that the changeover process takes time and is therefore "expensive." We realize that driving with snow tires on bare roads may be noisy and perhaps not very fuel-efficient. But we understand that preparedness in this case outweighs inefficiency and noise. That's the concept of satisficing. We know enough about each of the variables to make what we feel is a comfortable compromise.
But what if we could change tires in 30 seconds simply at the push of a button? What if snow tires cost 20 times as much as regular tires and wore out faster? Changing some of the variables changes the decision. Likewise, in engineering, decisions that may seem superficially similar are actually very different because of differences in the attendant variables, differences that may not be obvious or intuitive to the layman. It often takes deep knowledge to determine defensibly what really matters.
*Somehow I felt the need to defend the often abused "common sense"
Yes, you should. Keep in mind that when conspiracy theorists say "common sense," what they really mean is "my uninformed opinion." They just dress it up in that euphemism.
In fact common sense is what gets us through the day. One of the most widely quoted engineers, Henry Petroski, can be effectively summarized as saying that engineering is merely well-informed common sense. The problem Patrick is having here is that he skipped the step of acquiring appropriate information. The degree to which he has to inform himself before his "common sense" can produce, say, a reasonably safe airliner is colossal: years of full-time effort and many tens of thousands of hours of practical employment.
Much of what we do as engineers appeals to common sense once it has been explained. The key is,
once it has been explained. The layman can usually appreciate, according to common sense, what an engineer has labored to produce. But that does not mean that the layman can
derive such things
from common sense. Comprehension after the fact does not equate to insight before the fact.
That is the very bad habit we must break every engineering student of before he can succeed as an engineer. Because engineering concepts seem intuitively obvious after the fact, students assume their intuition predicts true principles. It doesn't, necessarily.
Always study;
always test;
always determine causation -- even if you seem silly for doing so.
Common sense is, in an ironic twist, a perfect example of "good enough" judgment to get through the day. That doesn't mean that our common-sensical diagnosis of our back pain as the effects of an unsuitable mattress makes us all doctors. But it lets us get by and make some practical headway with our lives.