Dear Colleagues,
Murphy’s Law is quoted especially often in engineering, and as most of you will know, the gist of it is: ‘If anything can go wrong, it will’. Often we add: ‘And at the most inopportune moment and in the most damnable way.’ (A little unfair, perhaps – it assumes a malevolent universe.)
The Law’s (supposed) origin lies in an engineering blunder. Captain Ed Murphy, an engineer at Edwards Air Force Base, was dealing with a technician who wired a strain gauge bridge backwards. The bridge was intended for G-force tests, but as a result of the gaffe it gave a zero reading. There is vigorous debate as to whether the fault lay with the engineer’s wiring diagram or with the technician’s wiring. Whatever happened, the law is now firmly linked with Captain Murphy. Debate aside, however, the set-up should have been meticulously tested before being put into service, which would have avoided what ensued.
I reckon this reflects a fundamental principle of engineering: every measurement carries uncertainty. We can never know the value of anything to an arbitrary level of precision. At the subatomic level, Heisenberg’s uncertainty principle makes this point: the more precisely you measure the position of a particle, the less precisely you can know its momentum (and vice versa). This is not a reflection on our ability to measure, but on the nature of the system itself.
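For those who like the formal statement, the principle is usually written as the inequality below. This is the standard textbook form, not anything from Murphy’s story, with Δx the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant:

```latex
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```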
In safety engineering, we do what is called a risk analysis, where we identify the types of harm that could arise from what we are planning to make or do. We then estimate the probability of each occurrence and multiply it by the severity of the harm (from a broken finger to an amputated arm), and we end up with the risk. If the risk is great enough, we can apply a dollar value to it and determine how much time and money to spend reducing it to a tolerable level. It is impossible to make anything unequivocally safe. So as far as Murphy’s Law and risk management are concerned, if you have conducted your risk assessment correctly, then you must expect that the consequences it predicts will eventually come to pass.
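To make the Risk = Probability × Severity arithmetic concrete, here is a minimal sketch in Python. The hazard list, the severity scale, and the tolerable-risk threshold are purely illustrative assumptions of mine, not figures from any standard:

```python
# A minimal sketch of the Risk = Probability x Severity arithmetic
# described above. The hazards, scales, and tolerable-risk threshold
# are illustrative assumptions, not figures from any standard.

# Severity rated 1 (minor harm, e.g. broken finger) to 5 (major harm,
# e.g. amputated arm); probability as expected occurrences per year.
hazards = [
    ("pinch point on guard door", 0.5, 1),
    ("unguarded rotating shaft", 0.05, 5),
]

TOLERABLE_RISK = 0.2  # hypothetical threshold (occurrences/year x severity)

for description, probability, severity in hazards:
    risk = probability * severity  # Risk = Probability x Severity
    if risk > TOLERABLE_RISK:
        print(f"{description}: risk {risk:.2f} exceeds the tolerable level "
              f"- spend time and money to reduce it")
    else:
        print(f"{description}: risk {risk:.2f} is tolerable")
```

The point of the multiplication is that a frequent minor hazard and a rare severe hazard can carry comparable risk, which is why both factors must be estimated before deciding where to spend the mitigation budget.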
And remember: “Absence of proof is not proof of absence”. For example, with the ill-fated space shuttle Columbia, a proper risk analysis of the foam chunks falling off the external tank and striking the thermal protection tiles would presumably have predicted the loss of the craft and crew during re-entry. However, NASA believed that because they had not seen such a failure so far, they would continue not to see one.
So please continue to embed Murphy’s Law in your daily engineering life and constantly seek to minimise the consequences of risk.
Given our constant litany of IT woes, I think we are all familiar with Murphy’s Computer Laws (Finagle’s Rules) – well, I certainly am:
What Every Computer Consultant Needs to Know:
1) In case of doubt, make it sound convincing.
2) Do not believe in miracles. Rely on them.
Thanks to Keith Armstrong for his illuminating article.
Yours in engineering learning
Steve Mackay