The Weakest Link Fallacy

A popular piece of wisdom in the security field is that a system is only as secure as its weakest link. In fact, it is probably one of the few sayings from our community that is well known to the general public. This is no big surprise. The saying sounds reasonable to anyone who has had his bike stolen because of a weak lock or a cut chain. It also sounds obvious to anyone who has had his house burgled through the back door, while the front door was locked tight.
Unfortunately, the statement is wrong.
The point is that secure systems, or at least properly designed secure systems, are never built as a chain. They are over-engineered, like planes or bridges. If an engine on a plane fails, the plane can still fly and land safely at an airport. The suspension of a bridge does not fail when a single screw breaks.
Security in a system can be "over-designed" in many different ways.
A house may have locks on all doors and windows, but a burglar alarm as well: a kind of intrusion detection system, in a way. If the first line of defence fails because a lock is broken, the intruder still has to bypass the alarm. The security of the system is not as weak as the weakest lock: it is as weak as the combination of the weakest lock _and_ the strength of the burglar alarm.
A company's internal network is protected from the external Internet by a firewall that blocks all unwanted incoming (and outgoing) traffic. But the company will also run intrusion detection and/or network monitoring software to detect any illegal traffic that still gets through. And the computers on the network will each run antivirus software. Bypassing just one of these measures is not enough to break the security of the company.
Alternatively, logging is an often-used second line of defence. Recording all events in the system is a good way to _detect_ illegal activity. Logging by itself does nothing to _prevent_ that activity. However, using pattern analysis or fraud detection software you will be able to detect anomalous behaviour, and to prevent _repeated_ abuse.
A smart card based electronic purse system, for instance, will record all transactions performed by all purses in the system. Clones can be detected, for instance because the same card number is used at locations far apart at roughly the same time. Counterfeits (non-existent card numbers) can be detected easily as well. Should attackers succeed in breaking the smart card and manage to increase the balance on the purse illegally, then such software will be able to detect that, based on its own knowledge of the correct purse balance and the recorded spending pattern.
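To make the clone detection idea concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not the algorithm any real purse system uses: all names (`find_clones`, `distance_km`, the transaction tuple layout) are invented, and "roughly the same time at far-apart locations" is approximated by an impossible travel speed between consecutive uses of the same card number.

```python
# Hypothetical sketch: spotting cloned cards in a central transaction log.
# A transaction is assumed to be (card_id, unix_time, (lat, lon)).
from math import radians, sin, cos, asin, sqrt

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # Earth radius ~6371 km

def find_clones(transactions, max_speed_kmh=900):
    """Flag card ids whose consecutive uses imply impossible travel speed."""
    suspects = set()
    last_seen = {}  # card_id -> (time, location) of most recent transaction
    for card_id, t, loc in sorted(transactions, key=lambda x: x[1]):
        if card_id in last_seen:
            t_prev, loc_prev = last_seen[card_id]
            dt_hours = (t - t_prev) / 3600 or 1e-9  # avoid division by zero
            if distance_km(loc_prev, loc) / dt_hours > max_speed_kmh:
                suspects.add(card_id)  # faster than any plane: likely a clone
        last_seen[card_id] = (t, loc)
    return suspects
```

For example, the same card number used in Amsterdam and New York ten minutes apart would be flagged, while an Amsterdam-to-Paris pair two hours apart would not.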
Repeated abuse of a card can be prevented in the following way. Card numbers that are involved in such fraud can be blacklisted. A blacklist contains all card numbers that are no longer valid, and is stored on all card accepting devices (aka readers). Readers will no longer accept cards on such lists in transactions. The blacklists on the readers are regularly updated. This way, repeated abuse of the same card is still possible, but only for a short time. Usually this is enough to limit the possible damage, and therefore the risk associated with this attack.
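The reader-side mechanism can be sketched as follows. Again this is an invented illustration (the `Reader` class and its method names are assumptions), just to show the shape of the scheme: a card is accepted unless its number appears on the locally stored blacklist, which the back end refreshes periodically.

```python
# Hypothetical sketch of a card-accepting device (reader) with a blacklist.
class Reader:
    def __init__(self):
        self.blacklist = set()  # revoked card numbers known to this reader

    def update_blacklist(self, revoked_card_numbers):
        """Apply a periodic blacklist update pushed from the back end."""
        self.blacklist |= set(revoked_card_numbers)

    def accept(self, card_number):
        """Refuse a transaction for any card on the blacklist."""
        return card_number not in self.blacklist
```

Between the moment a fraud is detected centrally and the moment the next update reaches a reader, the abused card still works; that window is exactly the "short time" mentioned above.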
All these examples clearly show that in general, for a properly designed secure system, security does not fail when the weakest link fails. It is very unfortunate that we have taught the general public the wrong lesson. With every clever hack that gets publicity, people assume that the whole system has become insecure, and they develop the wrong intuition about the security of systems in general.
We need a better analogy. As a first idea, I propose to compare security with a wall that you can shoot at. If you break a security measure, you shoot a hole in the wall. Sometimes that hole is big enough to crawl through. But more often than not you need to shoot more holes.
It's a first idea... Better ideas are most certainly welcome.
Last Version -
(Note: changeover from CVS to dotless svn version numbers on Jan 19, 2008, and changeover to GIT versioning on May 30, 2013.)
Maintained by Jaap-Henk Hoepman