Sunday, March 29, 2015

Catch Up - Part IV - Code Freeze 2015

Code Freeze had a security focus this year.  And although much of it was specific to particular systems, the lessons were still general.  I know I've been talking about Stephen Checkoway's presentation on how to hack a car to my team for months.  He and Bruce Schneier both had great presentations.  The breakout sessions about bitcoin and secure email were a little less inspiring, but gave me time to go have lunch with Ming at the local Asian restaurant.  Checkoway's takeaway is that cars were built with the expectation that as long as you were within the firewall/existing system you were safe.  That led to the car's internal buses being so interconnected that if you could hack the radio, force a buffer overflow via a CD, or get the car outside the 4G network so it had to fall back to a traditional phone connection, you could control nearly everything, right up to driving it remotely (honking the horn, shutting it down, listening in if there's a phone, unlocking the doors, starting it, etc.).  It's a good lesson if you trust your database to serve up clean data all the time just because it's your database.
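That last point translates directly into code: data from an internal source deserves the same validation as external input. Here's a minimal sketch of what that might look like; the schema and field names are hypothetical, not from the talk.

```python
# Hypothetical example: validate records even when they come from
# "our own" database -- an internal source can be compromised too.

def validate_record(record):
    """Return True only if the record matches the expected shape."""
    if not isinstance(record, dict):
        return False
    # Required fields with expected types (made-up schema).
    schema = {"id": int, "name": str, "balance": float}
    for field, expected_type in schema.items():
        if not isinstance(record.get(field), expected_type):
            return False
    # Range checks: don't assume internal data is sane.
    if record["id"] <= 0 or len(record["name"]) > 256:
        return False
    return True

rows = [
    {"id": 1, "name": "alice", "balance": 10.0},   # well-formed
    {"id": -5, "name": "bob", "balance": 3.5},     # bad id
    {"id": 2, "name": "eve", "balance": "1e9"},    # wrong type
]
clean = [r for r in rows if validate_record(r)]
```

Only the first row survives the filter; the other two would have sailed through if we'd trusted the source just because it was ours.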

The panel at the end was good and elaborated on some of what Schneier was discussing earlier in the conference.  Schneier doesn't like the worst-case approach in security.  There's too much of that in his opinion (operating in the moment), although he recognizes it's somewhat at odds with his message about considering your adversary in advance, and it leads to "muddling through."  That's an old-rules approach, and "the old rules don't apply."  Checkoway agrees and sees it as positive that there's been a shift from "something might happen" to "something will happen and we have to deal with it."

Bruce Schneier noted that getting companies to share their security data is valuable, but not easy (companies worry about their reputation).  Sharing virus information and other data helps many other industries and initiatives.  I include a few of my notes below.  Mostly for me:

  • Systems that support people instead of replacing people involve reduced switching costs (example, the military).
  • Speed is the advantage in an iterative loop of the hacker versus the hacked.
    • Observe, orient, decide, act (OODA)
    • Detection = IT.  Response = People.
    • So improve the people: train them, use the OODA loop, and consider the switching costs.
  1. "We are losing control of our IT infrastructure."  This means we're also losing visibility, and we're using devices we have less visibility into and control over.
  2. Attacks are more sophisticated in their skill and focus.  There is relative security and absolute security.  "A sufficiently motivated attacker will get in.  Full stop."  The attacker has the advantage.
  3. There is an increase in government involvement in cyberspace, pro and con.  Some countries are selling cyberweapons (as arms dealers; turnkey solutions).
  • The economy affects security:
    •  Switching costs (WordPerfect vs. Word)
    •  Managerial costs
    •  Fixed costs (stamping out more copies)
    •  Lemons Market (cheap and easy wins and that’s not always good)
    •  Risk seeking when it comes to losses, risk averse when it comes to gains.
    • “People will drive to work and fear terrorism.  What are you thinking?”
    •  Security is hard to sell.  Management is willing to "take the chance."  Think burglar alarms and car alarms.
  •  Look up the Asian Disease Experiment on Wikipedia.
  •  Look up his Crypto-gram newsletter.
  • There’s an MSST degree in security at the U of MN.
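The Asian Disease Experiment mentioned above is the classic Tversky and Kahneman framing study, and it's the source of that "risk seeking for losses, risk averse for gains" note.  The two programs in each frame are statistically identical, which a quick expected-value check makes obvious (the numbers are from the standard version of the experiment):

```python
# Tversky & Kahneman's framing problem: 600 people at risk.
from fractions import Fraction  # exact arithmetic, no float rounding

p = Fraction(1, 3)

# "Gain" frame: Program A saves 200 for sure;
#               Program B saves all 600 with probability 1/3.
ev_a_saved = 200
ev_b_saved = p * 600 + (1 - p) * 0

# "Loss" frame: Program C: 400 die for sure;
#               Program D: nobody dies with p = 1/3, all 600 die with p = 2/3.
ev_c_deaths = 400
ev_d_deaths = p * 0 + (1 - p) * 600

# The expected outcomes are identical in both frames...
assert ev_a_saved == ev_b_saved    # both 200 saved
assert ev_c_deaths == ev_d_deaths  # both 400 dead
# ...yet people tend to pick A (risk averse over gains)
# and D (risk seeking over losses).
```

Same math, opposite choices, purely because of how the outcomes are framed, which is exactly why security (framed as avoiding a loss) is such a hard sell.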
