Security through obscurity

In security engineering, security through obscurity (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system. A system or component relying on obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that if the flaws are not known, that will be sufficient to prevent a successful attack. Security experts have rejected this view as far back as 1851, and advise that obscurity should never be the only security mechanism.

History

An early opponent of security through obscurity was the locksmith Alfred Charles Hobbs, who in 1851 demonstrated to the public how state-of-the-art locks could be picked and who, in response to concerns that exposing security flaws in the design of locks could make them more vulnerable to criminals, said "Rogues are very keen in their profession, and know already much more than we can teach them."[1]

There is scant formal literature on the issue of security through obscurity. Books on security engineering cite Kerckhoffs' doctrine from 1883, if they cite anything at all. For example, in a discussion about secrecy and openness in Nuclear Command and Control:

"[T]he benefits of reducing the likelihood of an accidental war were considered to outweigh the possible benefits of secrecy. This is a modern reincarnation of Kerckhoffs' doctrine, first put forward in the nineteenth century, that the security of a system should depend on its key, not on its design remaining obscure."[2]

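Kerckhoffs' doctrine lends itself to a short illustration. The following Python sketch (the function names and both toy ciphers are invented for the example, and the one-time pad is simplified) contrasts a scheme whose only secret is its hard-coded algorithm with one whose algorithm is public and whose security rests entirely on the key:

    import secrets

    # Security through obscurity: the "secret" is the algorithm itself.
    # If the shift value leaks or the code is disassembled, every message
    # ever protected by it is exposed, and the scheme cannot be repaired
    # without replacing the algorithm.
    def obscure_encrypt(plaintext: bytes) -> bytes:
        SECRET_SHIFT = 47  # hidden in the implementation, not a key
        return bytes((b + SECRET_SHIFT) % 256 for b in plaintext)

    # Kerckhoffs' doctrine: the algorithm (XOR with a fresh random key,
    # i.e. a one-time pad) is public; only the key is secret, and a
    # compromised key exposes a single message and is simply replaced.
    def keyed_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
        key = secrets.token_bytes(len(plaintext))
        return bytes(p ^ k for p, k in zip(plaintext, key)), key

    ciphertext, key = keyed_encrypt(b"attack at dawn")
    assert bytes(c ^ k for c, k in zip(ciphertext, key)) == b"attack at dawn"
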
In the field of legal academia, Peter Swire has written about the trade-off between the notion that "security through obscurity is an illusion" and the military notion that "loose lips sink ships"[3] as well as how competition affects the incentives to disclose.[4]

The principle of security through obscurity was more generally accepted in cryptographic work in the days when essentially all well-informed cryptographers were employed by national intelligence agencies, such as the National Security Agency. Now that cryptographers often work at universities, where researchers publish many or even all of their results and publicly test others' designs, or in private industry, where results are more often controlled by patents and copyrights than by secrecy, the argument has lost some of its former popularity. An example is PGP, whose source code is publicly available; it is generally regarded as a military-grade cryptosystem.

There are conflicting stories about the origin of this term. Fans of MIT's Incompatible Timesharing System (ITS) say it was coined in opposition to Multics users down the hall, for whom security was far more an issue than on ITS. Within the ITS culture the term referred, self-mockingly, to the poor coverage of the documentation and obscurity of many commands, and to the attitude that by the time a tourist figured out how to make trouble he'd generally got over the urge to make it, because he felt part of the community. One instance of deliberate security through obscurity on ITS has been noted: the command to allow patching the running ITS system (altmode altmode control-R) echoed as $$^D. Typing altmode altmode control-D (that is, copying what the echo appeared to show) set a flag that would prevent patching the system even if the user later got it right.[5]

Criticism

Security through obscurity alone is discouraged by standards bodies. The National Institute of Standards and Technology (NIST) in the United States specifically recommends against the practice: "System security should not depend on the secrecy of the implementation or its components."[6] However, NIST also states, "For external-facing servers, reconfigure service banners not to report the server and OS type and version, if possible. (This deters novice attackers and some forms of malware, but it will not deter more skilled attackers from identifying the server and OS type.)"[6]
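
The service banner NIST refers to can be observed with a few lines of Python; the hostname below is a placeholder, and the snippet is a sketch of banner grabbing rather than part of the NIST guidance:

    import http.client

    # Retrieve the HTTP "Server" banner of a host (placeholder hostname).
    conn = http.client.HTTPSConnection("www.example.com", timeout=5)
    conn.request("HEAD", "/")
    banner = conn.getresponse().getheader("Server")
    conn.close()

    # A verbose banner such as "Apache/2.4.54 (Unix)" lets an attacker
    # match the exact version against published vulnerabilities; a
    # hardened server reports just "Apache", or nothing at all, which
    # deters novices but not skilled protocol fingerprinting.
    print(banner)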

A system may still use obscurity as one layer of a defense-in-depth strategy. Despite its criticism of security through obscurity, NIST suggests concealing the operating system type and version from attackers in order to deter novices exploiting known flaws, as quoted above.[6] An attacker's first step is usually to identify the target system, and if specific details are not readily available, that step may be delayed. However, attackers with high skill and motivation will obtain the information they need through other means, making such obscurity measures ineffective against them.
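
As an illustration of obscurity as one layer among several, consider this hypothetical Python sketch (the port number, environment variable, and message format are all invented for the example): the service listens on a non-default port, which may delay casual scanning, while a keyed MAC check provides the actual access control:

    import hashlib
    import hmac
    import os
    import socketserver

    # Real control: each request line must end with an HMAC tag computed
    # under a shared key (read from an environment variable here).
    SHARED_KEY = os.environ.get("SERVICE_KEY", "change-me").encode()

    class Handler(socketserver.StreamRequestHandler):
        def handle(self):
            line = self.rfile.readline().strip()
            body, _, tag = line.rpartition(b":")
            expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest().encode()
            # Constant-time comparison; reject anything without a valid tag.
            if hmac.compare_digest(tag, expected):
                self.wfile.write(b"accepted\n")
            else:
                self.wfile.write(b"rejected\n")

    # Obscurity layer: 49152 is an arbitrary non-default port. It may slow
    # casual scanning, but any port scanner will find the service; the MAC
    # check above is what actually gates access.
    if __name__ == "__main__":
        with socketserver.TCPServer(("127.0.0.1", 49152), Handler) as server:
            server.serve_forever()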

The technique stands in contrast to security by design and open security, although many real-world projects include elements of all three strategies.

Security through minority

A variant of the basic approach is to rely on the properties (including whatever vulnerabilities might be present) of a product that is not widely adopted, thus reducing the exposure of those vulnerabilities (should they become known) to random or automated attacks. This approach goes by a variety of names, "minority"[7] being the most common; others are "rarity",[8] "unpopularity",[9] "scarcity", and "lack of interest".

This variant is most commonly encountered in explanations of why the number of known vulnerability exploits for products with the largest market share tends to be higher than a linear relationship to market share would suggest,[7] but is also a factor in product choice for some large organizations.

A good example of this is Macintosh computers, which tend to be at lower risk of malware infection than their Windows counterparts. However, Macintoshes are, on average, no harder to infect than Windows machines. There are simply many more Windows computers than any other kind, so malware authors target Windows to reach the largest audience. This inverts the usual principle of supply and demand: normally demand triggers supply, but here the large "supply" of one type of machine triggers a greater "demand" for malware targeting it.

Security through minority may be helpful for organizations that will not be subject to targeted attacks, suggesting the use of a product in the long tail. However, finding a new vulnerability in a market-leading product is likely harder than in an obscure product, because the "low-hanging fruit" vulnerabilities are more likely to have already turned up, which may suggest market leaders are better for organizations that expect many targeted attacks. The issue is further complicated by the fact that a new vulnerability in a minority product makes every known user of that (perhaps easily identified) product a target. With market-leading products, the likelihood of being randomly targeted with a new vulnerability remains greater.

The whole issue is closely linked with, and in a sense depends upon, the widely used concept of security through diversity: the wide range of "long tail" minority products is clearly more diverse than a market leader in any product type, so a random attack is less likely to succeed.

Security through obsolescence is a variation of security through minority. This entails using obsolete network protocols (e.g. IPX instead of TCP/IP) to make attacks from the Internet difficult. For example, ATMs often use X.25 networks.

References

  1. Stross, Randall. "Theater of the Absurd at the T.S.A.". The New York Times. Retrieved 5 May 2015.
  2. Anderson, Ross (2001). Security Engineering: A Guide to Building Dependable Distributed Systems. New York, NY: John Wiley & Sons, Inc. p. 240. ISBN 0-471-38922-6.
  3. Peter P. Swire (2004). "A Model for When Disclosure Helps Security: What is Different About Computer and Network Security?". Journal on Telecommunications and High Technology Law. 2. SSRN 531782.
  4. Peter P. Swire (January 2006). "A Theory of Disclosure for Security and Competitive Reasons: Open Source, Proprietary Software, and Government Agencies". Houston Law Review. 42. SSRN 842228.
  5. "security through obscurity". The Jargon File.
  6. "Guide to General Server Security" (PDF). National Institute of Standards and Technology. July 2008. Retrieved 2 October 2011.
  7. Kiltak (December 19, 2006). "Mac Users Finally Waking Up to Security" (Blog). [Geeks are Sexy] Technology News. Retrieved 1 May 2008.
  8. Schneier, Bruce. "Crypto-Gram Newsletter: August 15, 2003". Retrieved 1 May 2008.
  9. CmdrTaco (July 23, 2001). "When 'Security Through Obscurity' Isn't So Bad". Slashdot. Retrieved 1 May 2008.