Saturday, September 12, 2009

The Mysterious Field of Engineering Systems

Norman Augustine June 16, 2009 Running Time: 0:51:40

About the Lecture
One of the nation’s revered technology leaders dispenses anecdotes and wisdom on the slippery subject of engineering systems (or systems engineering). Norm Augustine just can’t get a handle on the discipline: “No one agrees on what it is, or what it does.” After years at companies like Lockheed Martin, Augustine has come up with “Norm’s Rules,” and can at least define ‘system’ as “having two or more elements that interact,” and ‘engineering’ as “creating the means for performing useful functions.” But these definitions don’t get you very far in the real world.

Augustine shows a fuel control system, which some engineers might view as part of a propulsion system. In turn, aeronautical engineers might think of the entire airplane as a system, and transport engineers view aircraft as merely components in systems incorporating airports, highways, and shipping lanes. Augustine continues up the ladder until “our system that started as a fuel controller…seems to have the whole universe as a system.” Like Russian Matryoshka dolls, systems can always be embedded within larger systems. Even if you try to simplify a system in terms of just a few objects with a binary, on-off interaction, things can get complex very quickly. Five elements in a system can exist in more than a million possible states. Says Augustine, “A typical earth satellite has nearly one million parts; a 747 over 5 million. How does that make you feel about flying?”
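Augustine doesn’t spell out the counting behind the million-state figure, but a simple combinatorial model shows how quickly binary interactions compound. A minimal sketch, assuming each of the five elements and each directed link between them can independently be on or off (an illustrative assumption, not Augustine’s actual model):

```python
# Combinatorial state explosion in a small system.
# Illustrative model: each element and each directed interaction
# between a pair of elements is independently on or off.
def state_count(n_elements: int) -> int:
    directed_links = n_elements * (n_elements - 1)  # ordered pairs
    binary_features = n_elements + directed_links   # each on/off
    return 2 ** binary_features

for n in range(2, 6):
    print(n, state_count(n))
# Five elements already yield 2**25 = 33,554,432 states -- well past a million.
```

Under this (assumed) counting, the jump from four elements (65,536 states) to five (over 33 million) is exactly the kind of explosion that makes exhaustive analysis of even tiny systems impractical.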

Distinguishing the significant interactions and the important external influences on a system is central to designing and problem solving. And these days, engineers must include politics, public policy, and economics as part of their systems. “The trick is to bound the scope of the system so it’s not too large to be analyzed and not too small to be representative.” Doing this right is “why systems engineers should be paid so much.”

Augustine concludes with his “Dirty Dozen” systems engineering traps, which have led to embarrassing bust-ups, monumental failures, and real tragedies. Notable among these: “the ubiquitous interface” (or absence thereof). He describes how two flight control groups, one working in metric units and the other in English units, accidentally sent a Mars-bound spacecraft whizzing off into deep space. There’s the “single-point failure,” exemplified by the collapse of a football-field-sized satellite dish due to a poorly designed bracket. There’s software, “which, like entropy, always increases”: a Mariner spacecraft headed in the wrong direction due to a missing hyphen in 100 thousand lines of code. The problem with most systems ultimately is that they “contain human elements … and humans sometimes do irrational things.”
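The interface trap is easy to reproduce in miniature. A toy sketch of the unit mismatch, with hypothetical function names and numbers (the real Mars Climate Orbiter software and values are not reproduced here): one module reports thruster impulse in pound-force seconds while its consumer silently assumes newton-seconds:

```python
# Toy illustration of a unit-mismatch interface failure.
# Hypothetical functions and numbers -- not the actual flight software.
LBF_S_TO_N_S = 4.44822  # 1 pound-force second in newton-seconds

def report_thruster_impulse(impulse_lbf_s: float) -> float:
    """Ground team reports impulse in lbf*s (English units)."""
    return impulse_lbf_s

def apply_trajectory_update(impulse_n_s: float) -> float:
    """Navigation team's code expects newton-seconds (SI)."""
    return impulse_n_s

reported = report_thruster_impulse(100.0)   # meant as 100 lbf*s
assumed = apply_trajectory_update(reported) # silently read as 100 N*s
actual_n_s = 100.0 * LBF_S_TO_N_S           # the true SI value
error_factor = actual_n_s / assumed
print(round(error_factor, 2))  # each burn underestimated by ~4.45x
```

Nothing in either function is wrong in isolation; the failure lives entirely in the unstated assumption at the interface, which is precisely Augustine’s point.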
