Back in 1975, Saltzer and Schroeder set out eight design principles for building secure computer systems in their paper "The Protection of Information in Computer Systems". Interestingly enough, these are still relevant today, so I thought I would sum them up for anyone interested.
Principle of Least Privilege
Of all the security principles, this one gets the most lip service. It is the one that most people remember. It’s a good one but far from the only one.
The principle of least privilege restricts how privileges are granted. A subject (a user, process, program, etc.) should be given only those privileges it needs in order to complete its task.
Simply put – if the subject doesn’t need permission to do something, it should not have it. We want to reduce the attack surface.
Example: elevated privileges should be reduced once the operation is complete.
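One way to sketch this in code is a context manager that grants a privilege only for the duration of an operation and revokes it afterwards. The privilege names and the in-memory `granted` set are purely illustrative:

```python
from contextlib import contextmanager

# Hypothetical in-memory privilege set for a subject.
granted: set[str] = set()

@contextmanager
def elevated(privilege: str):
    """Grant a privilege only for the duration of the block, then revoke it."""
    granted.add(privilege)
    try:
        yield
    finally:
        # Elevated privileges are reduced once the operation is complete,
        # even if the operation raised an exception.
        granted.discard(privilege)

with elevated("db:admin"):
    assert "db:admin" in granted   # privilege held only inside the block
assert "db:admin" not in granted   # automatically revoked afterwards
```

The `finally` clause is the important part: revocation happens on both success and failure, so elevation can never be left dangling.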
Principle of Fail-Safe Defaults
This security principle restricts how privileges are initialized when an object is created. Unless a subject is explicitly granted access to an object, access should be denied.
The default access to an object is NONE.
Additionally, if the subject fails to complete whatever task it set out on, it should undo the changes it made and revert the system back to a stable, consistent state. That way, even on failure the system remains safe.
Example: database transactions which fail should be rolled back.
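This is easy to demonstrate with Python's built-in sqlite3 module, whose connection context manager commits on success and rolls back on an exception. The account table and the "no negative balances" rule are just for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100)")
conn.commit()

try:
    with conn:  # commits on success, rolls back on any exception
        conn.execute("UPDATE accounts SET balance = balance - 150 WHERE name = 'alice'")
        # Simulate a failed business rule: balances may not go negative.
        (balance,) = conn.execute(
            "SELECT balance FROM accounts WHERE name = 'alice'"
        ).fetchone()
        if balance < 0:
            raise ValueError("insufficient funds")
except ValueError:
    pass  # the transaction was rolled back; the system stays consistent

(balance,) = conn.execute("SELECT balance FROM accounts WHERE name = 'alice'").fetchone()
assert balance == 100  # original balance restored after the failed operation
```

The failed withdrawal leaves no trace: the rollback restores the stable state the system was in before the transaction started.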
Principle of Economy of Mechanism
Economy of mechanism is all about simplifying the design and implementation of security mechanisms.
Security mechanisms should be as simple as possible!
The simpler a design is, the fewer possibilities there are for errors. Errors lead to vulnerabilities, which lead you to update your resume.
Additionally, testing is much easier with a simple design. Complex mechanisms often make many assumptions about how the system and its environment work. If any of those assumptions is wrong or overlooked, problems may arise. Be especially cautious with external entities.
Example: reduce the attack surface – don’t install OS features and roles you don’t need, and don’t start services that are not needed.
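In code, the same spirit favors the simplest mechanism that does the job. A plain allowlist, for instance, is trivial to read, test, and audit, compared to a complex rule engine. The role names here are hypothetical:

```python
# A simple, auditable authorization check: a plain allowlist.
# There is very little here that can go wrong or be misconfigured.
ALLOWED_ROLES = {"reader", "editor"}

def can_view(role: str) -> bool:
    """Return True only for roles explicitly on the allowlist."""
    return role in ALLOWED_ROLES

assert can_view("reader")
assert not can_view("superuser")  # anything not explicitly listed is rejected
```

A reviewer can verify this mechanism at a glance, which is exactly the point of economy of mechanism.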
Principle of Complete Mediation
Complete mediation requires that every access to every object be checked for authority. In particular, it restricts the caching of access decisions: a cached result may be stale.
Example: a directory service such as Active Directory (or any LDAP directory) requires that every access to an object be checked. Whenever the subject tries to do something, the OS must mediate.
Each time the subject interacts with an object, the check should be performed again rather than served from a cache.
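A minimal reference-monitor sketch makes the point (the subjects, objects, and ACL here are illustrative): every access consults the authoritative ACL, so a revocation takes effect on the very next request instead of lingering in a cache.

```python
# Authoritative access-control list, mapping (subject, object) -> allowed actions.
acl = {("alice", "report.txt"): {"read"}}

def check(subject: str, obj: str, action: str) -> bool:
    """Mediate every access by consulting the current ACL -- no result caching."""
    return action in acl.get((subject, obj), set())

assert check("alice", "report.txt", "read")

acl[("alice", "report.txt")].discard("read")     # permission is revoked...
assert not check("alice", "report.txt", "read")  # ...and the next access reflects it
```

Had the first decision been cached, the revocation would silently not apply – which is precisely what complete mediation forbids.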
Principle of Open Design
The principle of open design asserts that secrecy does not add security…it just makes things more obscure. More importantly – security should not depend on the secrecy of the design or implementation.
Security through obscurity is not security!
Secrecy adds little if anything toward the security of a system. It can however be a crutch that weak development relies upon as a shortcut to security.
This isn’t to say we will not keep our passwords and cryptographic keys secret – those are not algorithms. Proprietary software and trade secrets often rely on a little sprinkle of obscurity. You usually find out about it from the news when a breach is disclosed.
Example: published cryptographic algorithms such as AES are openly analyzed by the entire community; their security rests on the secrecy of the key, not of the algorithm.
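Python's standard library illustrates this nicely with HMAC: the algorithm is completely public and well studied, yet without the key an attacker cannot forge a valid authentication tag. The key and message below are placeholders:

```python
import hmac
import hashlib

# HMAC-SHA256 is a published, openly analyzed algorithm; only the key is secret.
secret_key = b"example-key"  # placeholder; in practice, randomly generated and stored securely

tag = hmac.new(secret_key, b"important message", hashlib.sha256).hexdigest()

# Anyone can read the algorithm's design, but without the key the tag cannot be forged.
forged = hmac.new(b"wrong-key", b"important message", hashlib.sha256).hexdigest()
assert tag != forged

# The legitimate key holder can always reproduce and verify the tag.
expected = hmac.new(secret_key, b"important message", hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```

All of the security lives in the key – exactly what open design asks for.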
Principle of Separation of Privilege
I like to think of this one in terms of scenes from movies like The Hunt for Red October: to launch the nuclear payload, the captain of the ship and another high-ranking officer must both insert their keys and turn them at the same time.
Separation of privilege restricts access to system entities. A system should not grant permission based on a single condition.
Example: in IT, when someone requests elevated permissions, the request must be approved by a separate entity, and possibly also accompanied by a document describing the work to be done.
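A two-person rule is simple to express in code. This sketch (the approver names are hypothetical) grants elevation only when at least two distinct approvers have signed off – a single condition is never sufficient:

```python
def may_elevate(approvals: set[str]) -> bool:
    """Grant elevation only with sign-off from at least two distinct approvers."""
    return len(approvals) >= 2  # a single approver is never sufficient

# One key turned: the launch does not proceed.
assert not may_elevate({"captain"})

# Two distinct keys turned at the same time: access is granted.
assert may_elevate({"captain", "weapons_officer"})
```

Using a set rather than a counter matters: the same person approving twice still counts as one condition.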
Principle of Least Common Mechanism
The principle of least common mechanism aims to limit sharing: mechanisms used to access resources should not be shared between subjects, or at least the sharing should be minimized.
Example: we don’t reuse passwords across service accounts and other subjects.
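As a small sketch (the service names are made up), each service account gets its own independently generated credential via Python's secrets module, so compromising one does not expose the others:

```python
import secrets

# Each service account receives its own independently generated credential,
# so no access mechanism is shared between subjects.
services = ["backup", "monitoring", "deploy"]
credentials = {name: secrets.token_urlsafe(32) for name in services}

# Every credential is unique -- nothing is reused across subjects.
assert len(set(credentials.values())) == len(services)
```

The same idea extends to shared channels, shared temp directories, and shared caches: the less subjects have in common, the smaller the blast radius.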
Principle of Psychological Acceptability
It is imperative to recognize the human element in computer security. Security mechanisms should not make the resource more difficult to access than if the security mechanisms were not present.
Security and usability are frenemies!
An excess of security often reduces usability. Conversely, additional usability can sometimes weaken security. When error messages are thrown, for example, we must consider who will read them. We want to give only the information that is required and no more.
Example: if a user tries to log in and enters the wrong password, the error message should not say the password is incorrect – that would imply the user name checks out!
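A hedged sketch of that login check (the user store is an illustrative plaintext dict; a real system would store salted hashes): the same deliberately vague message is returned for a bad user name and a bad password, so an attacker cannot enumerate valid accounts.

```python
# Illustrative user store; a real system would store salted password hashes.
users = {"alice": "correct horse battery staple"}

def login(username: str, password: str) -> str:
    """Return the same vague error for unknown users and wrong passwords."""
    if users.get(username) == password:
        return "welcome"
    return "invalid username or password"  # deliberately does not reveal which part failed

# An attacker probing user names learns nothing either way:
assert login("alice", "wrong password") == login("mallory", "anything")
```

The message stays usable for legitimate users (they know to re-check both fields) while leaking nothing extra to an attacker.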