June 2004 CSO Magazine




 

Keep It Simple

If you're not thoughtful about your approach to balancing computer security with computer usability, you may end up with neither

BY SIMSON GARFINKEL


ONE OF THE HARDEST things about computer security is making so-called secure computers easy to use. Indeed, building computers that are both secure and usable is so difficult that many IT professionals believe that security and usability are antagonistic goals that must be balanced.

Think, for example, about passwords. Computers without passwords are easy to use, but not very secure; anyone who sits down at the machine's keyboard or logs on over the network can access anything he wants. However, access controls—long, difficult-to-guess passwords that prevent the bad guys from breaking in and learning the computer's secrets—make computers difficult to use. So organizations naturally weigh security needs against user convenience.

The problem with this balancing act is that it often produces systems that are neither secure nor usable. The extremely usable system without passwords won't be much use if somebody breaks in and deletes all of its files. And the secure system with the hard-to-guess passwords won't be very secure after users post their passwords on little yellow stickies.

One reason that security traditionally has been viewed as the enemy of usability has to do with the way that security was incorporated into many traditional computing environments. Until very recently, security was frequently an extra—something added to existing operating systems and applications. Want to encrypt your business plan? Start with a word processing application, save the document in a file, then go back to that file and encrypt it with a file encryption program to add the missing security. Of course, the unencrypted copy of the business plan is still floating around on your hard disk even after you delete it, so you also have to run a special program to sanitize the drive.

All of these extra steps take work and require training. Make a mistake, and you might unknowingly compromise the system's security or, even worse, wipe out your data.

Today, features like file encryption and disk sanitization are built directly into applications and operating systems. The result is that using cryptography to protect a document is now much easier. For example, both Microsoft Word and Adobe Acrobat let you put a "password" on a file when you save it. This so-called password is actually used to generate an encryption key that, in turn, is used to encrypt your document. When you go to open the file, the application sees that the file has been encrypted and prompts you for the password once again. A valid password can be used to decrypt the file, while an invalid one results in gibberish.
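
Under the hood, the pattern is straightforward: stretch the password into a key, then encrypt the document with that key. The Python sketch below shows the general idea using the third-party cryptography package; the function names are mine, and it illustrates the technique rather than the scheme Word or Acrobat actually shipped.

    import base64
    import os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def derive_key(password: str, salt: bytes) -> bytes:
        # Stretch the password into a 256-bit key; the random salt and the
        # high iteration count slow down guessing attacks.
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=200_000)
        return base64.urlsafe_b64encode(kdf.derive(password.encode()))

    def encrypt_document(plaintext: bytes, password: str) -> bytes:
        salt = os.urandom(16)
        token = Fernet(derive_key(password, salt)).encrypt(plaintext)
        return salt + token               # the salt travels with the ciphertext

    def decrypt_document(blob: bytes, password: str) -> bytes:
        salt, token = blob[:16], blob[16:]
        # A wrong password produces a wrong key, and decryption simply fails;
        # nothing useful can be read out of the stored bytes.
        return Fernet(derive_key(password, salt)).decrypt(token)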

Sanitizing disk drives is also getting easier. Apple's Mac OS version 10.3, for example, gives users the option to "Empty Trash" or "Secure Empty Trash." Choose the Secure option and the operating system overwrites every block in each deleted file. Likewise, the Mac OS disk format program now allows you to click a button labeled "options" to explicitly wipe every block.
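
Conceptually, a secure delete overwrites a file's contents before removing its name from the file system. Here is a bare-bones Python sketch of that idea; treat it as an illustration only, since journaling file systems, backups and flash media can all keep copies that a simple overwrite never touches.

    import os

    def secure_delete(path: str, passes: int = 1) -> None:
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))   # replace the old contents in place
                f.flush()
                os.fsync(f.fileno())        # force the overwrite out to disk
        os.remove(path)                     # only then drop the directory entry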


Using the User
Building security into desktop applications in a way that makes the security easy to use can be a difficult task—most programmers have a hard time building systems that are usable or secure in the first place.

Consider the problem of basic application design. Usability engineering is difficult for most programmers because a programmer usually designs software for a single user: himself. But different people use complicated software differently. Often, people who think like the original programmer find the program easy to use, while people who think differently find the program incomprehensible.

Academics have learned a lot over the past decade about how to build software that's user-friendly for a broad spectrum of users. An important principle for usability is that of iterative design. Once an application's user interface is designed, it's important to put it in front of users and watch how they use it, sometimes in a small focus-group setting. Ideally, the developers will observe both inexperienced and expert users attempting to use the prerelease applications. They can observe the users' frustrations and then go back and fix the code.

Iterative design is very successful for designing word processing and stock-trading applications. But when it comes to security measures, iterative design isn't enough. That's because users are ill-equipped to make valid security judgments.

Consider once again the issue of file encryption. There's a big difference between using a password to create an encryption key, and simply storing the password in a file and then checking when the file is opened to see if the password in the file matches what the user typed in. In the first case the contents of the file are truly unrecoverable unless you can guess the password. In the second case, the file's contents can be recovered by opening the file with another program and looking at the raw data.
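
The two approaches are easy to confuse at the keyboard but easy to tell apart in code. The hypothetical Python fragment below contrasts them: scheme A derives a key from the password and encrypts the data (a plain SHA-256 hash stands in here for the salted, iterated derivation a real product should use), while scheme B simply writes the password and the data side by side.

    import base64
    import hashlib
    from cryptography.fernet import Fernet

    # Scheme A: the password becomes a key, and the key encrypts the data.
    # Without the password, the bytes on disk are unintelligible.
    def save_encrypted(path: str, data: bytes, password: str) -> None:
        key = base64.urlsafe_b64encode(hashlib.sha256(password.encode()).digest())
        with open(path, "wb") as f:
            f.write(Fernet(key).encrypt(data))

    # Scheme B: the data is stored in the clear, and the "password" is only a
    # gate that the well-behaved application checks before displaying it. Any
    # other program that opens the file sees everything, password included.
    def save_gated(path: str, data: bytes, password: str) -> None:
        with open(path, "wb") as f:
            f.write(b"PASSWORD=" + password.encode() + b"\n")
            f.write(data)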

Observing a focus group may tell you how difficult it is to find the dialog box for password-protecting a file. But focus group participants are paid to run the software, not to look for ways to compromise its security. The group won't be able to tell you that Microsoft Word uses the RC2 encryption algorithm with a 40-bit key when it saves a file with a password, while Intuit's Quicken doesn't use any encryption at all.

Yet this difference has real security implications: use a block-level disk editor to open a password-protected Word file and you'll see just gibberish, but look at the contents of a Quicken file and you'll see the names of all the payees in the check register.
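
You don't even need a disk editor to run this test. A few lines of Python in the spirit of the Unix strings utility will pull the printable runs out of any file; a genuinely encrypted file gives back random fragments, while a merely password-gated one gives back the data itself.

    import re
    import sys

    def printable_strings(path: str, min_len: int = 4):
        # Yield runs of printable ASCII at least min_len characters long,
        # much as the Unix `strings` utility does.
        with open(path, "rb") as f:
            blob = f.read()
        for match in re.finditer(rb"[ -~]{%d,}" % min_len, blob):
            yield match.group().decode("ascii")

    if __name__ == "__main__":
        for s in printable_strings(sys.argv[1]):
            print(s)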

Clearly, synergy between usability and security requires that software start with a secure substrate. No matter how easy Intuit makes putting a password on that Quicken file, the underlying data will never be secure.


Good Recovery
A good user interface sitting atop a strong security substrate is a good start, but it's still not enough to create applications where security and usability go hand-in-hand. That extra step—something I call "secure usability"—comes from a user interface that guides the user to secure practices by making other practices difficult or impossible.

Let's return to Apple's two commands for emptying the computer's trash. Both of these commands cause the operating system to display a dialog box that asks for confirmation. But the two dialog boxes have subtly different wording. Choose "Empty Trash" and the operating system warns: "You cannot undo this action." But if you choose the "Secure Empty Trash" command, the confirmation box states: "If you choose Secure Empty Trash, you cannot recover the files."

What's the difference between an action that cannot be undone and files that cannot be recovered? A practitioner versed in computer forensics will home in on the word "recover" in the second warning. There are many tools for recovering data that has been accidentally deleted. Norton Utilities for Macintosh, for instance, comes with three "data recovery" programs—Volume Recover, UnErase and FileSaver. The wording of Apple's warnings subtly implies that the data recovery tools will be able to retrieve files that have been deleted but not those that were securely deleted.

This subtle difference is exceedingly important, but is almost certainly lost on the majority of Apple's users. One reason, I suspect, is that the phrase "Secure Empty Trash" doesn't have any obvious parallel in day-to-day life.

What might make more sense would be to have the operating system integrate the ideas of delete, sanitize and recover with a single user interface. Gone would be the trash can; instead, the computer's disk would be used to house a large database that would hold many intermediate versions of every file that you had ever worked on. There would be no option of "Empty Trash"; instead, the computer would automatically delete intermediate files as necessary to free up space. Of course, at times users might want to remove all traces of a document from their hard drives. To do that, they could select the file and drag it to an electronic shredder, which would have a direct parallel to the physical world.
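
In code, such a design might expose an interface along these lines. The class and method names below are invented for illustration; a real implementation would live in the file system and would overwrite on-disk blocks the way the secure-delete sketch above does, rather than shuffling objects in memory.

    import os
    import time
    from collections import defaultdict

    class VersionedStore:
        """A toy stand-in for a disk that keeps every intermediate version."""

        def __init__(self):
            self._versions = defaultdict(list)   # name -> list of (timestamp, bytes)

        def save(self, name: str, contents: bytes) -> None:
            self._versions[name].append((time.time(), contents))

        def prune(self, keep_last: int = 10) -> None:
            # Stands in for "delete intermediate files as necessary to free up space".
            for versions in self._versions.values():
                del versions[:-keep_last]

        def shred(self, name: str) -> None:
            # The electronic shredder: overwrite every stored version, then forget the file.
            versions = self._versions[name]
            for i, (stamp, contents) in enumerate(versions):
                versions[i] = (stamp, os.urandom(len(contents)))
            del self._versions[name]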

I believe that we can ultimately resolve many of the apparent conflicts between security and usability in a way that addresses both concerns. In the case of passwords, the answer would be to use fairly short passwords but to constantly monitor users' behavior to see if they do anything out of the ordinary. If a salesman, for instance, starts trying to download secret plans for an unannounced product, I would want that salesman stopped—even if he authenticated using a password, a smart card and an iris scanner. The balance between security and usability should be fluid, not fixed.
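
A toy policy check makes the point. In the sketch below, which is entirely hypothetical, authentication gets a user in the door, but anything outside that user's ordinary pattern of access is held for review rather than silently allowed.

    # What counts as "ordinary" for each role; in practice this would be
    # learned from observed behavior rather than hard-coded.
    ORDINARY_ACCESS = {
        "sales": {"price-list", "customer-contacts"},
        "engineering": {"source-code", "design-docs"},
    }

    def review_access(role: str, resource: str, authenticated: bool) -> str:
        if not authenticated:
            return "deny"
        if resource in ORDINARY_ACCESS.get(role, set()):
            return "allow"
        # Authenticated, but out of the ordinary: a salesman reaching for the
        # unannounced product plans lands here, smart card and iris scan or not.
        return "hold-for-review"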


Simson Garfinkel, CISSP, is a technology writer based in the Boston area. He is also CTO of Sandstorm Enterprises, an information warfare software company. He can be reached at machineshop@cxo.com.







Most Recent Responses:

Microsoft should be pressed into developing a system similar to Apple's "Secure Empty Trash". Is there any evidence that they might be working on such a solution?

Robert C. Frink

Perhaps Microsoft will one day add a virtual document shredder alongside the Recycle Bin icon. If you want a document to be deleted and unrecoverable, use the Shredder. If you want it deleted, use the Recycle Bin. I can picture the animation already for shredding a document in my mind's eye!

B. McCarron
Security Officer










http://www.csoonline.com/read/060104/shop.html