Blockchain dreams and beyond - Why security technologies fail in the marketplace

Blockchain. It's the latest computer security technology craze. Everybody is trying to build on the Bitcoin story, get funding, and start the next great security company.

The math is cool. Anonymity, distributed ledgers.

All conceptually very, very neat.

Lots of attention and some funding.

Lots and lots of hype.

Don't hold your breath.

Because we've been here before.

I've been here before.

My Sad, Sad Story of SecurePlay - somewhat short version

Back in the 1990s when the Internet was new, Nova on PBS had a special on this "new thing"... the Internet (I don't remember quite when). Near the end of the show, there was a short piece on Internet Gambling and how "there was no way to protect against cheating".

It stuck in my head. I'm a crypto guy and a game guy.

I thought I could solve the problem.

And I did.

I came up with a set of cryptographic protocols to secure games against cheating. I didn't quite know what to do with them... shared them with some friends who knew a patent attorney... and the fun began.
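(The actual SecurePlay protocols live in the patents. But to give a flavor of the general idea, here is a hedged, textbook-style sketch in Python - commit-and-reveal randomness, so neither player can change their contribution after seeing the other's. Purely illustrative, not the patented design.)

```python
import hashlib
import secrets

# Textbook commit-and-reveal coin flip (an illustration of the general idea,
# not the SecurePlay protocols themselves). Each player commits to a random
# value, then reveals it; neither can change their choice after seeing the
# other's, so the combined result is fair.

def commit(value: bytes, nonce: bytes) -> str:
    """Publish this hash first; it binds you to `value` without revealing it."""
    return hashlib.sha256(nonce + value).hexdigest()

def verify(commitment: str, value: bytes, nonce: bytes) -> bool:
    """Check that a revealed value matches the earlier commitment."""
    return commit(value, nonce) == commitment

# Each player picks a random value and a nonce, and publishes only the commitment.
a_value, a_nonce = secrets.token_bytes(16), secrets.token_bytes(16)
b_value, b_nonce = secrets.token_bytes(16), secrets.token_bytes(16)
a_commit = commit(a_value, a_nonce)
b_commit = commit(b_value, b_nonce)

# After both commitments are exchanged, the values and nonces are revealed and checked.
assert verify(a_commit, a_value, a_nonce)
assert verify(b_commit, b_value, b_nonce)

# The fair coin flip comes from both contributions, so neither side controls it alone.
flip = (a_value[0] ^ b_value[0]) & 1
print("heads" if flip else "tails")
```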

So, we filed patents and thought we could sell the patents to people (foolish us).

So, we built software (SecurePlay) that implemented the patents (and more) to try to become "the SSL of online gaming". Heck, we even won a startup business contest in DC.

We had one problem that a lot of security technologies don't have... the questionable legality of Internet gambling (in the US), but that really wasn't the thing that killed us.

And we were probably dead from the day we started.

No one cared (or cares) enough.

Because we thought that game companies (both computer games and gambling) would buy security technology that would make their product better (more secure).

But they don't need to be better (more secure) to get more customers.

... or, rather, the customers they would get by being more secure were (and are) in the distant future.

... they were better off spending money on just getting more customers.

... or making better games.

... and they could always just say they were secure (which they do...even if they've been caught with serious security problems).

After all, consumers (and most businesses) can't tell the difference.

If I was doing this today, I'd call SecurePlay "blockchain for online gaming"... and I might even get a bit of money... but, I suspect, if I tried to sell "game security middleware" again... it would fail, again.

Because security technologies face some real challenges in the marketplace.

Back to the future

There are other hot new security technologies. They do come in waves... "Threat intelligence" and "AI for security" definitely are up there today.

But, if it's your money, consider the following:

* No legal framework


This has been a particular problem for cryptographic security technologies since the digital signature days (at least). Back in the day, I spoke at the First NIST Public Key Invitational Workshop (sometime in the mid 1990s). We worried about the legal risks of digital signatures, the potential for fraud... heck, we worried about the storage costs for signing email (2,000 bits per email - horrifying)!

One topic I raised was how much money we were all going to make as expert witnesses on how easily digital signatures could be undermined in practice and therefore digitally signed contracts were going to be meaningless.

Totally wrong of course.

Digital signatures "exist" but they mean little.

I've bought digital certificates for $8 per year. Free ones exist now. The only reason to buy one is customer expectation.

Basically, consumers have been trained to "trust" those silly little lock logos on their browsers. While your data may be encrypted to the store where you buy stuff or the site where you share your personal information... hacking that encrypted link isn't today's security problem at all.

No one has adequately worked out building a true digital signature system with all of its implications. For example, I just "digitally signed" a contract to buy a property - using Comic Sans to "sign" my name as my "digital signature".

The system uses a "trusted third party" whose site is full of security rhetoric - emphasis on the quotes around "trusted".

And that's OK... but it sure isn't based on cryptography.

It's easy and it's cheap.

Basically, "digital signatures" have knocked out the market for fax machines for contracts.

Woo hoo.
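(For contrast, here is a hedged sketch of what a signature actually based on cryptography involves: sign with a private key, verify with the public key, and any tampering breaks verification. It assumes the third-party Python "cryptography" package, and the contract text is a made-up placeholder, not any real e-signing service.)

```python
# Minimal sketch of a cryptographic signature, as opposed to typing your name
# in a cute font. Assumes: pip install cryptography. The contract text below
# is a placeholder for illustration only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

contract = b"I agree to purchase the property at 123 Example St."

# The signer holds a private key; anyone can verify with the matching public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

signature = private_key.sign(contract)

# Verification succeeds only if the signed text is exactly what was agreed.
try:
    public_key.verify(signature, contract)
    print("signature verifies")
except InvalidSignature:
    print("signature does NOT verify")

# Change one character of the contract and verification fails.
try:
    public_key.verify(signature, contract + b" plus my firstborn child")
    print("tampered contract verifies (should never happen)")
except InvalidSignature:
    print("tampered contract rejected")
```

The ten lines of math are the easy part - it's everything around them (key issuance, revocation, liability) that never got worked out for ordinary contracts.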

* Greed

Everyone has tried to WAY overmonetize the value of computer security technologies. Once upon a time, Verisign wanted us all to buy $50 personal digital certificates and the banks got greedy with SET... and interesting (and potentially market changing) technologies went nowhere.

The killer app in most of these markets is lowering intermediary or operational costs, but security tech vendors mostly RAISE costs - and die.

Charles Schwab and Etrade changed the stock brokerage business by radically lowering transaction costs (using vanilla tech smartly - first phone-based ordering and then basic Internet browsers).

If you find a way to radically lower transaction costs, you will often find that the security technology is largely unnecessary - people will live with you as a vanilla trusted third party - because you are cheap(er).

* Security Suicide

Suppose they buy your security solution. If they sell to their customers based on your security, they are a slave to you... after all, if you sell to their competitors as well, then the industry becomes a commodity wrapped around your security tech.

So, it is irrational to license from you as their marketing becomes marketing for you instead of them.

But if they buy you, then they still incur the same marketing costs... so why buy you?

So, you are forced to become a competitor to the incumbents in the industry. Except you are starting at zero... and they can simply assert or build "good enough" security.

(Read about Sawstop's attempts to radically improve safety for table saws. They've succeeded somewhat.)

* Switching costs

Switching an industry over to a new platform is hard and expensive. And, since in many cases the incumbents can simply lower costs (see the credit card processing industry), it is difficult to be a new entrant.

(Square and the like have done well with small companies: the switching costs are low, and small merchants have been gouged by the processors for years, so a lot of businesses were happy to be rid of them for any number of reasons.)

* Who pays

Often, the incumbent operators (especially if there are separate infrastructure companies) simply pass on their costs to their end users... so the operators don't see any benefit and the end users don't have enough leverage to push a change (credit card security being a pretty easy example again - merchants basically absorb all of the liability in the system).

In short... the problem isn't tech or security, no matter how cool, it is the real business...

... and the history of cool security tech - from encryption to digital signatures to... pretty much everything - is that we forget the business side.

Escaping NSA's Accidental Cryptography Trap

Towards Dynamic Relationship Security 

Once upon a time....

The cryptographic systems that drive the Internet today came together in the late 1980s and early 1990s. RSA and other public key cryptographic techniques became widely known and were deployed in early Netscape, Mosaic, and other browsers and web servers. At the time, the biggest consumer of cryptography was really the US government, led by NSA and NIST*.

Things were different. The "big" public/private security program was to move email away from SMTP to X.400, with the classified world focused on the Defense Messaging System (DMS).

It didn't happen... we are still using SMTP as the basis of email today, and the X.509 certificate format may be the only vestige of those days.

But, many of the assumptions that drive our security systems came from those times and those concerns.

Two of which are causing us real problems today:

1. The focus on "confidentiality" and using encryption to protect it.
2. The core model of "trusted" or hardware-based encryption.

Data disclosure vs. Identity Integrity

The underlying thinking of Cold War cryptography is about protecting sensitive data from an enemy for short or long periods of time.

War plans, intelligence, ... all sorts of secrets.

The keys that drive the cryptography might be compromised, but the systems are reliably managed by a central national authority (for the classified world in the US, NSA; the original idea for the unclassified world was something like the Post Office or NIST).

Basically, these systems are very hierarchical, centrally managed and controlled, with the idea that the system itself, being the government, is inherently trustworthy and inherently secure.

The government, as it has always done for military systems, would "know" when keys had been compromised (such as a military position being overrun) and use its control of communication and security systems to remove the bad guys from the system.

... the problems begin to emerge:

  • Today, there are around 50 "certificate authorities" who issue the long-term certificates your browser trusts. These systems are highly variable and the only thing you can say is that, at best, someone's credit card cleared somewhere when a certificate was issued... maybe.
  • While "confidentiality" is nice, our biggest problems are not "data in transit" - the real-time collection of sensitive information - but "data at rest" - the harvesting of information from a server where access controls have been defeated and so any encryption that is used is irrelevant.

The problem isn't an outside "bad guy", but a subverted "good guy".

  • Also, we are often more concerned about subversion of identity - making sure that if an access control failure occurs, it doesn't spread, and that if a key is compromised, it isn't used elsewhere.

Basically, I care more about you using my stolen credit card number than I do about you having a copy of my Amazon purchases for January.

And this leads us right into the second big problem.

Our systems are designed assuming long-term keys are stored securely

... and they aren't.

Back in the late 1980s and early 1990s, most cryptographic systems were still implemented in dedicated hardware devices. Computing was expensive and cryptography, at the time, was very processor intensive, so it made sense that cryptography would be housed in a dedicated device, as it had been since there were widgets that could do cryptography at all.

In the US, for DMS (and unclassified email at the time), the idea was to use the FORTEZZA card, a PCMCIA (later PC Card) peripheral which did all the cool crypto stuff.

As we now know, the plummeting price of hardware in the 1990s and since virtually eliminated the need for special purpose processors except for video. While there were cryptographic peripherals, the growing demands of ecommerce and the pervasive need for some level of security essentially moved almost all commercial security implementations into software. Computing capability got way ahead of our ability to consume it... so... why not implement security in software? It's "free" after all.

BUT, in most server environments, cryptographic keys are in software... and servers are where we've seen our major data breaches.

While I never gave my credit card to Yahoo, I certainly did to Target and Home Depot, yet somehow, their breaches never triggered a replacement of my credit card or a change of my credit card number.

ASIDE: Things continue to change... hardware has finally gotten so inexpensive that dedicated security processors and peripherals are now economically viable in many environments (see chip-and-PIN credit cards), though the ever-decreasing cost of general purpose processors will continue to challenge them.

Those keys that businesses hold... the private keys behind the public key certificates that encourage us to "trust" online firms... are at real risk.

After all, when a data breach occurs, it is rarely found immediately; the problem often festers for months.

We have built up our security systems to solve a set of problems that don't reflect our needs or vulnerabilities.


Towards Dynamic Relationship Security

And it isn't just our cryptographic systems. Our access control methods (username and password, username and token, or username and biometrics) have similar vulnerabilities to server breaches as do our credit card, checking, and other payment systems.

Our main requirements are:

  • Protecting the relationship between parties for future actions
  • Protecting ourselves in the face of (nearly) inevitable breaches of individuals' secrets, private keys, and identity and authorization information
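To make the first requirement a bit more concrete, here is a rough illustration (a sketch, not a finished design): derive short-lived, per-relationship keys from a longer-term secret, so a key harvested from one server, one relationship, or one time period is useless anywhere else. It assumes the third-party Python "cryptography" package, and the party names are placeholders.

```python
# Illustrative sketch (not a finished design) of "relationship" keys:
# each party pair and time period gets its own derived key, so a key
# harvested in a breach is useless for other relationships or later periods.
# Assumes: pip install cryptography. Party names are placeholders.
import secrets
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

master_secret = secrets.token_bytes(32)  # would live somewhere well protected

def relationship_key(master: bytes, party_a: str, party_b: str, period: str) -> bytes:
    """Derive a short-lived key bound to a specific relationship and time period."""
    info = f"{party_a}|{party_b}|{period}".encode()
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=info,
    ).derive(master)

# Different relationships and different months get unrelated keys.
k1 = relationship_key(master_secret, "alice", "examplestore.test", "2024-01")
k2 = relationship_key(master_secret, "alice", "examplestore.test", "2024-02")
k3 = relationship_key(master_secret, "alice", "otherstore.test", "2024-01")
assert len({k1, k2, k3}) == 3  # all distinct; losing one says nothing about the others
```

The details matter far less than the property: no single long-lived key sitting on a server is worth stealing.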


* NOTE: I worked at NSA during part of this period and I had some involvement in the security of some of the systems at the time. Nothing here is even vaguely classified. This narrative is a colorful recollection and reconstruction and should not be considered authoritative, but a "good security story" to scare kids and CISOs.

Get more No BS security insights and help


If you are interested in more articles like this, check out my Patreon and become a supporter. If you have a security question, ask me!



Wash your Forking Hands! Better security begins with basic digital hygiene

We've all been there. You're visiting friends and someone has to go to the bathroom. While they're out, you sit, you wait... maybe look around a bit... but you are always listening,

listening

... not listening for that "flushing sound"... though that one is pretty key,

but for the sound of running water interrupted a couple of times...

Wash your forking hands!*


Because it is pretty awkward and a bit disgusting when they've left the bathroom and you didn't hear that washing sound...

We've only really taken hand washing seriously since 1867 when Lord Lister published his paper on the use of carbolic acid to wash hands.

Just 150 years ago.

And it's still a problem today at home... even in hospitals... and doctors know better.

Digital Hygiene - rethinking better security

This past weekend, I started reading the book, Better, by the surgeon, Atul Gawande. He opens the book with an extended discussion of...

handwashing

And its importance for infection control.

Fascinating stuff (I highly recommend the book and I'm still in the early parts of it).

What struck me was how computer security is very much like infection control.

Except, we've gotten wrapped up with all of our high-tech toys - our cryptography, biometrics, IDSs and IPSs, etc., etc. etc.

We are constantly looking for some security magic bullet.

But we don't wash our forking hands.

We don't practice basic digital hygiene.

It isn't really fun or sexy, but perhaps it is a better way to get better security.

And it isn't just one thing. There is no magic bullet.

Infection control works when everybody is involved. At every stage of each and every process.

There are changes in practices (washing hands, gloves). There are technologies (chemical and heat-based sterilization, disposable instruments). There are simple tools (alcohol gels instead of hand soap).

But, for computer security...we haven't built up our basic "digital hygiene" practices.

Instead of creating a comprehensive security regime of imperfect elements that strengthen security together, we keep looking for a "special security solution".

It hasn't worked.

The state of digital security today is no better, and probably worse, than it was when I started in the field in the mid 1980s.

We've gone from DES to elliptic curve cryptography, passwords to biometrics, hash functions to blockchains.... and security as experienced on the ground is still pretty awful.

So, rather than thinking big, let's start thinking small.

The No BS Security Guides that I'm creating are my effort to help make actual security better.

One step at a time.

Let's start washing our digital hands.





* In the quite funny TV series, The Good Place, the main character finds herself unable to swear because she has died and gone to "The Good Place". Every time she attempts to swear, her words are changed and f*cking becomes forking.

