furrbear: (Default)
[personal profile] furrbear
Great entry over on Bruce Schneier's blog about the Security Mindset. It's spot-on in its description of how security folks think about things. (Hint: there's a good reason I get a mail-in paper ballot, and it's not convenience.) They tend to look at the entire system and pick out the weak links in the security chain. All those Hollywood crypto-cracking plot devices? Fairy tales. No reasonable attacker attacks the cryptography. During the Cold War, the KGB's best method for obtaining NATO secrets was a $100-a-night hooker and a bottle of Scotch.

One also develops a sense for "How much is enough?" A good friend of mine makes the analogy that, for most computer users, really large crypto keys are like securing your home's front door with a bank vault door while neglecting the large picture window right next to it.
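The arithmetic behind that analogy is easy to sketch. Here's a back-of-envelope comparison (assuming a hypothetical attacker who can test a billion guesses per second) of brute-forcing a modern crypto key versus guessing a weak password guarding the same data; the numbers show why nobody bothers with the vault door:

```python
# Back-of-envelope: brute-force cost of a crypto key vs. a weak password.
# The guess rate below is an assumption for illustration, not a benchmark.

AES_128_KEYSPACE = 2 ** 128        # possible 128-bit keys
AES_256_KEYSPACE = 2 ** 256        # possible 256-bit keys
WEAK_PASSWORD_SPACE = 62 ** 8      # 8 alphanumeric chars (~2.2e14)

GUESSES_PER_SECOND = 10 ** 9       # assumed attacker speed

def years_to_exhaust(space, rate=GUESSES_PER_SECOND):
    """Worst-case time to try every possibility, in years."""
    return space / rate / (60 * 60 * 24 * 365)

print(f"AES-128 brute force: {years_to_exhaust(AES_128_KEYSPACE):.2e} years")
print(f"AES-256 brute force: {years_to_exhaust(AES_256_KEYSPACE):.2e} years")
print(f"8-char password:     {years_to_exhaust(WEAK_PASSWORD_SPACE):.4f} years")
```

The key takes longer than the age of the universe either way; the password falls in days. The password is the picture window.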

Enjoy!
The Security Mindset

Uncle Milton Industries has been selling ant farms to children since 1956. Some years ago, I remember opening one up with a friend. There were no actual ants included in the box. Instead, there was a card that you filled in with your address, and the company would mail you some ants. My friend expressed surprise that you could get ants sent to you in the mail.

I replied: "What's really interesting is that these people will send a tube of live ants to anyone you tell them to."

Security requires a particular mindset. Security professionals -- at least the good ones -- see the world differently. They can't walk into a store without noticing how they might shoplift. They can't use a computer without wondering about the security vulnerabilities. They can't vote without trying to figure out how to vote twice. They just can't help it.

SmartWater is a liquid with a unique identifier linked to a particular owner. "The idea is for me to paint this stuff on my valuables as proof of ownership," I wrote when I first learned about the idea. "I think a better idea would be for me to paint it on your valuables, and then call the police."

Really, we can't help it.

This kind of thinking is not natural for most people. It's not natural for engineers. Good engineering involves thinking about how things can be made to work; the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal. You don't have to exploit the vulnerabilities you find, but if you don't see the world that way, you'll never notice most security problems.

I've often speculated about how much of this is innate, and how much is teachable. In general, I think it's a particular way of looking at the world, and that it's far easier to teach someone domain expertise -- cryptography or software security or safecracking or document forgery -- than it is to teach someone a security mindset.

Which is why CSE 484, an undergraduate computer-security course taught this quarter at the University of Washington, is so interesting to watch. Professor Tadayoshi Kohno is trying to teach a security mindset.

You can see the results in the blog the students are keeping. They're encouraged to post security reviews about random things: smart pill boxes, Quiet Care Elder Care monitors, Apple's Time Capsule, GM's OnStar, traffic lights, safe deposit boxes, and dorm room security.

One recent one is about an automobile dealership. The poster described how she was able to retrieve her car after service just by giving the attendant her last name. Now any normal car owner would be happy about how easy it was to get her car back, but someone with a security mindset immediately thinks: "Can I really get a car just by knowing the last name of someone whose car is being serviced?"

The rest of the blog post speculates on how someone could steal a car by exploiting this security vulnerability, and whether it makes sense for the dealership to have this lax security. You can quibble with the analysis -- I'm curious about the liability that the dealership has, and whether their insurance would cover any losses -- but that's all domain expertise. The important point is to notice, and then question, the security in the first place.

The lack of a security mindset explains a lot of bad security out there: voting machines, electronic payment cards, medical devices, ID cards, internet protocols. The designers are so busy making these systems work that they don't stop to notice how they might fail or be made to fail, and then how those failures might be exploited. Teaching designers a security mindset will go a long way toward making future technological systems more secure.

That part's obvious, but I think the security mindset is beneficial in many more ways. If people can learn how to think outside their narrow focus and see a bigger picture, whether in technology or politics or their everyday lives, they'll be more sophisticated consumers, more skeptical citizens, less gullible people.

If more people had a security mindset, services that compromise privacy wouldn't have such a sizable market share -- and Facebook would be totally different. Laptops wouldn't be lost with millions of unencrypted Social Security numbers on them, and we'd all learn a lot fewer security lessons the hard way. The power grid would be more secure. Identity theft would go way down. Medical records would be more private. If people had the security mindset, they wouldn't have tried to look at Britney Spears' medical records, since they would have realized that they would be caught.

There's nothing magical about this particular university class; anyone can exercise his security mindset simply by trying to look at the world from an attacker's perspective. If I wanted to evade this particular security device, how would I do it? Could I follow the letter of this law but get around the spirit? If the person who wrote this advertisement, essay, article or television documentary were unscrupulous, what could he have done? And then, how can I protect myself from these attacks?

The security mindset is a valuable skill that everyone can benefit from, regardless of career path.

This essay originally appeared on Wired.com.

Date: 2008-03-26 06:05 am (UTC)
From: [identity profile] grizzlyzone.livejournal.com
I once had a customer who was complaining that his car radio had been stolen while on a trip to New York. I figured the simple answer would have been to replace the radio. Not so. It looked like the old radio had been removed with a chainsaw.

There is always another way to "skin the cat".

Date: 2008-03-26 09:38 am (UTC)
From: [identity profile] barengeist.livejournal.com
Now that the FBI can snag anyone on anything merely by posting some links on a server somewhere, this immediately came to mind. What better way to completely ruin someone's life than to slip them one of these poison links through email or IM, or behind a tasty picture in a LiveJournal post?

Date: 2008-03-26 12:56 pm (UTC)
From: [identity profile] sultmhoor.livejournal.com
Have you ever read the document (from Verisign, I think) that details what kind of security measures you're supposed to have in place in order to store customer credit card numbers, say, for future purchases? It's kind of comical, the extremes they expect you to go to.

I must be one of those security-minded types; it doesn't usually take me very long to start finding flaws in things. I've demonstrated for people how very easy it is to defeat mechanical locks, get past security personnel, and walk away with things they thought secure. And I love novel approaches, even though many times they're simply wrong. LiveJournal used to be a place for that kind of novel approach...

Date: 2008-03-26 01:34 pm (UTC)
From: [identity profile] cipherpunk.livejournal.com
The longer I'm in this field, the more I realize people use the word "security" to mean two orthogonal ideas. Let one axis be the level of subjective safety you feel in your life, and the other axis be the level of objective safety you actually have.

If you feel perfectly safe walking around downtown Fallujah simply because you have a freshly slaughtered chicken hanging around your neck, well, you're maxing out one scale and zeroing the other. That doesn't make your security practice a priori stupid -- far be it from me to complain about an irrational practice that, in the main, is harmless and allows you to continue to function in an extraordinarily hostile environment -- but it should be seen for what it is.

What's pissing me off is when people mix up the two, and they do it all the time.

I also think this is responsible for the Cassandra Effect in computer security. People live their lives in a small bubble where they have a subjective feeling of security. When a Cassandra comes along and points out their near total lack of objective security, the net effect--even if the Cassandra does nothing untoward--is to immediately and critically reduce the subjective feeling of security. Thus, by talking frankly about security problems in an objective sense, we are attacking systems in a subjective sense: and we experience pushback from the people we are trying to help as a result.
