This is an extended, less-edited version of an article appearing in IEEE Security and Privacy in December 2012. This version specifically identifies all of the textbooks I reviewed while looking at information security design principles. I have added an Afterword to note a ninth security principle added to the second edition of my textbook Elementary Information Security. Here is the citation for the published article:
The information security community has a rich legacy of wisdom drawn from earlier work and from sharp observations. Not everyone is old enough or fortunate enough to have encountered this legacy first-hand by working on groundbreaking developments. Many of us receive it from colleagues or through readings and textbooks. The Multics time-sharing system (Figure 1 – photo by Tom Van Vleck) was an early multi-user system that put significant effort into ensuring security. In 1974, Jerome Saltzer wrote an article outlining the security mechanisms in the Multics system (Saltzer, 1974). The article included a list of five “design principles” he saw reflected in his Multics experience. The following year, Saltzer and Michael Schroeder expanded the article into a tutorial titled “The Protection of Information in Computer Systems” (Saltzer and Schroeder, 1975). The first section of the paper introduced “basic principles” of information protection, including the triad of confidentiality, integrity, and availability, and a set of design principles.

Over the following decades, these principles have occasionally been put forth as guidelines for developing secure systems. Most of the principles found their way into the DOD’s standard for computer security, the Trusted Computer System Evaluation Criteria (NCSC, 1985). The Saltzer and Schroeder design principles were also highlighted in security textbooks, like Pfleeger’s Security in Computing (Pfleeger, 1989), the first edition of which appeared in 1989.

Different writers use the term principle differently. Some apply the term to a set of precisely worded statements, like Saltzer and Schroeder’s 1975 list. Others apply it in general to a collection of unidentified but fundamental concepts. This paper focuses on explicit statements of principles, like the 1975 list. On the whole, the principles were concise and well stated. Many have stood the test of time and are reflected in modern security practice. Others have not.
In 2008, after teaching a few semesters of introductory information security, I started writing my own textbook for the course. The book was designed to cover all topics required by selected government and community curriculum standards. Informed by an awareness of Saltzer and Schroeder’s design principles, but motivated primarily by the curriculum requirements, the textbook, titled Elementary Information Security, produced its own list of basic principles (Smith, 2012). This review of design principles arises from the mismatch between the classic list and this more recent list. The review also looks at other efforts to codify general principles, both by standards bodies and by other textbook authors, including a recent textbook co-authored by Saltzer himself (Saltzer and Kaashoek, 2009).

The Saltzer and Schroeder List

Saltzer and Schroeder’s 1975 paper listed eight design principles for computer security, and noted two additional principles that seemed relevant if more general. The eight design principles were:

Economy of mechanism: keep the design as simple and small as possible.
Fail-safe defaults: base access decisions on permission rather than exclusion.
Complete mediation: every access to every object must be checked for authority.
Open design: the design should not be secret.
Separation of privilege: a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access with a single key.
Least privilege: every program and every user should operate using the least set of privileges necessary to complete the job.
Least common mechanism: minimize the amount of mechanism common to more than one user and depended on by all users.
Psychological acceptability: the human interface should be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly.
There were also two principles that Saltzer and Schroeder noted as being familiar in physical security but applying “imperfectly” to computer systems:

Work factor: compare the cost of circumventing the mechanism with the resources of a potential attacker.
Compromise recording: mechanisms that reliably record that a compromise of information has occurred can be used in place of more elaborate mechanisms that completely prevent loss.
Today, of course, most analysts and developers embrace these final two design principles. The arguments underlying complex password selection reflect a work factor calculation, as do the recommendations on choosing cryptographic keys. Compromise recording has become an essential feature of every secure system in the form of event logging and auditing.

Security Principles Today

Today, security principles arise in several contexts. Numerous bloggers and other on-line information sources produce lists of principles. Many are variants of Saltzer and Schroeder, including the list provided in the Open Web Application Security Project’s wiki (OWASP, 2012). Principles also arise in information security textbooks, more often in the abstract sense than in the concrete. Following recommendations in the report Computers at Risk (NRC, 1991), several standards organizations also took up the challenge of identifying a standard set of security principles.

Most textbook authors avoid making lists of principles. This is clear from a review of twelve textbooks published over the past ten years. This is even true of textbooks that include the word “Principles” in the title. Almost every textbook recognizes the principle of least privilege and usually labels it with that phrase. Other design principles, like separation of privilege, may be described with a different adjective. For example, some sources characterize separation of privilege as a control, not a principle. Pfleeger and Pfleeger (2003) presents its own set of four security principles. They are, briefly, easiest penetration, weakest link, adequate protection, and effectiveness. These principles apply to a broader level of security thinking than the Saltzer and Schroeder design principles. However, the text also reviews Saltzer and Schroeder’s principles in detail in Section 5.4. The remaining few textbooks that specifically discuss design principles generally focus on the 1975 list.
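As an aside, the work factor reasoning behind password rules is easy to make concrete. The sketch below is illustrative only: the guess rate is an assumed figure, not a measured one, and the two password policies are hypothetical examples.

```python
# A minimal sketch of a work factor calculation, in the spirit of the
# principle: estimate the cost of the easiest brute-force attack and
# compare it against the attacker's resources. The guess rate below is
# an assumed, illustrative figure, not a measured one.

def search_space(alphabet_size: int, length: int) -> int:
    """Number of candidate passwords of exactly the given length."""
    return alphabet_size ** length

def expected_attack_seconds(space: int, guesses_per_second: float) -> float:
    """Expected brute-force time: on average half the space is searched."""
    return (space / 2) / guesses_per_second

# Compare two policies: 8 lowercase letters vs. 8 mixed-case letters and digits.
weak = search_space(26, 8)
strong = search_space(62, 8)
rate = 1e9  # assumed: one billion offline guesses per second

print(f"26^8: {weak:.2e} candidates, ~{expected_attack_seconds(weak, rate):,.0f} s expected")
print(f"62^8: {strong:.2e} candidates, ~{expected_attack_seconds(strong, rate):,.0f} s expected")
print(f"work factor ratio: {strong // weak}x")
```

The larger alphabet buys roughly a thousand-fold increase in work factor at the same length, which is exactly the kind of attacker-cost comparison the principle calls for.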
The textbook by Smith and Marchesini (2008) discusses the design principles in Chapter 3. The two textbooks by Bishop (2003, 2005) also review the design principles, in Chapters 13 and 12, respectively.

“Generally Accepted Principles”

Following Computers at Risk, standards organizations were motivated to publish lists of principles. The OECD published a list of eight guidelines in 1992 that established the tone for a set of higher-level security principles, including Accountability, Awareness, Ethics, Multidisciplinary, and Proportionality. In its 1995 handbook, “An Introduction to Computer Security,” NIST presented the OECD list and also introduced a list of “elements” of computer security (NIST, 1995). Following the OECD’s lead, this list presented very high level guidance, addressing the management level instead of the design or technical level. For example, the second and third elements are stated as follows: “Computer Security is an Integral Element of Sound Management” and “Computer Security Should Be Cost-Effective.” The following year, NIST published its own list of “Generally Accepted Principles and Practices for Securing Information Technology Systems” (Swanson and Guttman, 1996). The overriding principles drew heavily from the elements listed in the 1995 document. The second and third elements listed above also appeared as the second and third “Generally Accepted Principles.”

The OECD list also prompted the creation of an international organization that published “Generally Accepted System Security Principles” (GASSP) in various revisions between 1996 and 1999 (I2SF, 1999). This was intended to provide high-level guidance for developing more specific lists of principles, similar to those used in the accounting industry. The effort failed to prosper. Following the 1999 publication, the sponsoring organization apparently ran out of funding.
In 2003, the Information System Security Association tried to restart the GASSP process and published the “Generally Accepted Information Security Principles” (ISSA, 2004), a cosmetic revision of the 1999 document. This effort also failed to prosper.

In 2001, a team at NIST tried to produce a more specific and technical list of security principles. This became “Engineering Principles for Information Technology Security” (Stoneburner et al., 2004). The team developed a set of thirty-three separate principles. While several clearly reflect Saltzer and Schroeder, many are design rules that have arisen from subsequent developments, notably in networking. For example:

Principle 16: Implement layered security (ensure no single point of vulnerability).
Principle 20: Isolate public access systems from mission critical resources (e.g., data, processes, etc.).
While these new principles captured newer issues and concerns than the 1975 list, they also captured assumptions regarding system development and operation. For example, Principle 20 assumes that the public will never have access to “mission critical resources.” However, many companies rely heavily on Internet sales for revenue. They must clearly ignore this principle in order to conduct those sales.

Training and Curriculum Standards

When we examine curriculum standards, notably those used by the US government to certify academic programs in information security, we find more ambiguity. All six of the curriculum standards refer to principles in an abstract sense. None actually provide a specific list of principles, although a few refer to the now-abandoned GASSP. A few of Saltzer and Schroeder’s design principles appear piecemeal as concepts and mechanisms, notably least privilege, separation of privilege (called “segregation of duties” in NSTISSC, 1994), and compromise recording (auditing). The Information Assurance and Security IT 2008 curriculum recommendations (ACM and IEEE, 2008) identify design principles as an important topic, and provide a single example: “defense in depth.” This is a restatement of NIST’s Principle 16.

Saltzer and Kaashoek

Co-authors Saltzer and Kaashoek published the textbook Principles of Computer System Design in 2009 (Saltzer and Kaashoek, 2009). The book lists sixteen general design principles and several specific principles, including six security-specific principles. Here is a list of principles that were essentially inherited from the 1975 paper:
Here are new – or newly stated – principles compared to those described in 1975:
Neither of the uncertain principles listed in 1975 made it into this revised list. Despite this, event logging and auditing is a fundamental element of modern computer security practice. Likewise, work factor calculations continue to play a role in the design of information security systems. Pfleeger and Pfleeger highlighted “weakest link” and “easiest penetration” principles that reflect the work factor concept. However, there are subtle trade-offs in work factor calculations that may make it a poor candidate for stating as a concise and easy-to-apply principle.

Elementary Information Security

The textbook Elementary Information Security presents a set of eight basic information security principles. While many directly reflect principles from Saltzer and Schroeder, they also reflect more recent terminology and concepts. The notion of “basic principles” stated as brief phrases seems like a natural choice for introducing students to a new field of study.

The textbook’s contents were primarily influenced by two curriculum standards. The first was the “National Training Standard for Information System Security Professionals” (NSTISSC, 1994). While this document clearly shows its age, it remains the ruling standard for general security training under the US government’s Information Assurance Courseware Evaluation (IACE) Program (NSA, 2012). In February 2012, the IACE program certified the textbook as covering all topics required by the 1994 training standard. The second curriculum standard is the “Information Technology 2008 Curriculum Guidelines” (ACM and IEEE Computer Society, 2008). The textbook covers all topics and core learning outcomes recommended in the Information Assurance and Security section of the Guidelines.

To fulfill their instructional role, each principle needed to meet certain requirements. Each needed to form a memorable phrase related to its meaning, with preference given to existing, familiar phrases.
Each had to reflect the current state of the practice, and not simply a “nice to have” property. Each had to be important enough to appear repeatedly as new materials were covered. Each principle was introduced when it played a significant role in a new topic, and no sooner. Students were not required to learn and remember a set of principles that they didn’t yet understand or need. This yielded the following eight principles:

Continuous Improvement
Least Privilege
Defense in Depth
Open Design
Chain of Control
Deny by Default
Transitive Trust
Separation of Duty
The textbook’s list focused on memorable phrases that were widely accepted in the computer security community. Principles introduced in earlier chapters always resurface in examples in later chapters. In retrospect, the list is missing at least one pithy and well-known maxim: “Trust, but verify.” The book discusses the maxim in Chapter 13, but does not tag it as a basic principle.

Omitted Principles

For better or worse, three of the 1975 principles do not play a central role in modern information security practice. These are simplicity, complete mediation, and psychological acceptability. We examine each below.

There is no real market for simplicity in modern computing. Private companies release product improvements to entice new buyers. The sales bring in revenues to keep the company operating. The company remains financially successful as long as the cycle continues. Each improvement, however, increases the underlying system’s complexity. Much of the free software community is caught in a similar cycle of continuous enhancement and release. Saltzer and Kaashoek (2009) call for “sweeping simplifications” instead of overall simplicity, reflecting this change.

Complete mediation likewise reflects a sensible but obsolete view of security decision making. Network access control is spread across several platforms, no one of which makes the whole decision. A packet filter may grant or deny access to packets, but it can’t detect a virus-infected email at the packet level. Instead, it forwards email to a series of servers that apply virus and spam checks before releasing the email to the destination mailbox. Even then, the end user might apply a digital signature check to perform a final verification of the email’s contents.

Psychological acceptability, or the “principle of least astonishment,” is an excellent goal, but it is honored more in the breach than in the observance.
The current generation of “graphical” file access control interfaces provides no more than rudimentary control over low-level access flags. It takes a sophisticated understanding of the permissions already in place to understand how a change in access settings might really affect a particular user’s access.

Conclusion

Only a handful of Saltzer and Schroeder’s original 1975 design principles have stood the test of time. Nonetheless, this represents a memorable success. Kerckhoffs, a 19th century European cryptographic expert, published a list of principles for hand-operated cipher systems, some of which we still apply to cryptosystems today. But most experts only recognize a single principle as “Kerckhoffs’s Principle,” and that is his view on Open Systems: a cryptosystem should not rely on secrecy, since it may be stolen by the enemy. In addition to the Open Systems principle, both the principle of least privilege and of separation of privilege appeared on the 1975 list and are still widely recognized by security experts.

Perhaps lists of principles belong primarily in the classroom and not in the workplace. The short phrases are easy to remember, but they may promote a simplistic view of technical problems. Students need simplicity to help them build an understanding of a more complex reality.

Afterword

The second edition of Elementary Information Security was published in 2016. Chapter 4 introduced a ninth security principle: Trust, but verify. When Saltzer and Schroeder developed their design principles, computer engineers looked upon system design as a process to develop an artifact with specific properties. We assumed that we could implement a property and the system would retain it. A “secure” computer would remain secure as long as it was operated correctly. Today, we’re painfully aware that important properties are very hard to achieve and ensure, especially since modern systems evolve over time.
We wanted to assume that the security systems would block attacks without requiring intervention or even awareness by the system’s administrators. Modern systems implement numerous logging and assessment procedures to try to detect potential vulnerabilities or the effects of current or previous attacks. Trust, but verify seems like a pithy but effective statement of that design requirement.

Kerckhoffs

A reader noted that I’d identified Kerckhoffs as a French crypto expert when he was in fact born in Holland. I called him French because he was working in Paris and published his famous article in a French military journal. As someone who was born in one place, brought up in another, and spent his adult life elsewhere, I don’t know how to conclusively identify a person’s homeland. Now I’m calling him European, which should solve the dilemma for now.

References

ACM and IEEE Computer Society, 2008. Information Technology 2008 Curriculum Guidelines, http://www.acm.org/education/curricula/IT2008%20Curriculum.pdf (retrieved March 1, 2012).
Bishop, Matt, 2003. Computer Security: Art and Science, Boston: Addison-Wesley.
Bishop, Matt, 2005. Introduction to Computer Security, Boston: Addison-Wesley.
I2SF, 1999. “Generally Accepted System Security Principles,” International Information Security Foundation.
ISSA, 2004. “Generally Accepted Information Security Principles,” Information System Security Association.
Kerckhoffs, Auguste, 1883. “La cryptographie militaire,” Journal des sciences militaires IX.
NCSC, 1985. Trusted Computer System Evaluation Criteria, Ft. Meade, MD: National Computer Security Center.
NIST, 1995. “An Introduction to Computer Security,” NIST SP 800-12, Gaithersburg, MD: National Institute of Standards and Technology.
NSA, 2012. “IA Courseware Evaluation Program – NSA/CSS,” web page, National Security Agency, http://www.nsa.gov/ia/academic_outreach/iace_program/index.shtml (retrieved Feb 29, 2012).
NRC, 1991. Computers at Risk: Safe Computing in the Information Age, Washington: National Academy Press, http://www.nap.edu/openbook.php?record_id=1581 (retrieved Feb 29, 2012).
NSTISSC, 1994. “National Training Standard for Information Systems Security (INFOSEC) Professionals,” NSTISSI 4011, Ft. Meade, MD: National Security Telecommunications and Information Systems Security Committee.
OWASP, 2012. “Category: Principle – OWASP,” web page, Open Web Application Security Project, https://www.owasp.org/index.php/Category:Principle (retrieved Feb 29, 2012).
Pfleeger, Charles, 1997. Security in Computing, 2nd ed., Prentice-Hall.
Pfleeger, Charles, and Shari Pfleeger, 2003. Security in Computing, 3rd ed., Prentice-Hall.
Saltzer, Jerome, 1974. “Protection and the control of information sharing in Multics,” CACM 17(7), July 1974.
Saltzer, Jerome, and Frans Kaashoek, 2009. Principles of Computer System Design: An Introduction, Burlington, MA: Morgan Kaufmann.
Saltzer, Jerome, and Michael Schroeder, 1975. “The protection of information in computer systems,” Proc IEEE 63(9), September 1975.
Shannon, Claude, 1949. “Communication Theory of Secrecy Systems,” Bell System Technical Journal 28(4).
Smith, Sean, and John Marchesini, 2008. The Craft of System Security, Boston: Addison-Wesley.
Smith, Richard, 2012. Elementary Information Security, Burlington, MA: Jones and Bartlett.
Stoneburner, Gary, Clark Hayden, and Alexis Feringa, 2004. “Engineering Principles for Information Technology Security,” SP 800-27 Rev. A, Gaithersburg, MD: National Institute of Standards and Technology.
Swanson, Marianne, and Barbara Guttman, 1996. “Generally Accepted Principles and Practices for Securing Information Technology Systems,” SP 800-14, Gaithersburg, MD: National Institute of Standards and Technology.

Textbooks Reviewed but not Cited

Forouzan, 2008. Cryptography and Network Security, McGraw-Hill.
Gollmann, 2006. Computer Security, 2nd ed., Wiley.
Newman, 2010. Computer Security: Protecting Web Resources, Jones and Bartlett.
Stallings, 2003. Network Security Essentials, Prentice-Hall.
Stallings, 2006. Cryptography and Network Security, Prentice-Hall.
Stallings and Brown, 2008. Computer Security: Principles and Practice, Prentice-Hall.
Stamp, 2006. Information Security: Principles and Practice, Wiley.
Whitman and Mattord, 2005. Principles of Information Security, 2nd ed., Thomson.