Craig Gentry, a cryptographer working at IBM’s Thomas J. Watson Research Center in the suburbs outside New York City, recently received a phone call that changed his life. His work on an experimental and mainly theoretical type of encryption, called homomorphic encryption, just won him a MacArthur “Genius Grant.”
The encryption method lets a third party run computations on encrypted data without ever decrypting it. Paul Ducklin, a security researcher working for Sophos, laid out a neat summary of how this works:
Imagine, however, if I could simply take your encrypted search terms, leave them encrypted, search for them directly in the still-encrypted database, and get the same results.
If I can perform calculations directly on your encrypted data, yet get the same results that you get from the unencrypted data, we both win enormously from a security and privacy point of view.
You don’t need to give me any decryption keys at all, so you no longer have to trust me not to lose, steal or sell your data. (You still have to trust me to tell you the truth about any results I work out for you, but that is a completely different issue.)
And I no longer need your decryption keys, so I can’t lose or abuse your data even if I wanted to.
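To make the idea concrete, here is a minimal sketch of the Paillier cryptosystem, a classic *additively* homomorphic scheme (not Gentry's construction, and with toy key sizes that offer no real security): multiplying two ciphertexts produces a ciphertext of the sum of the plaintexts, so a server can add your numbers without ever seeing them.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# The product of two ciphertexts decrypts to the SUM of the plaintexts.
# Tiny primes, for illustration only -- never use sizes like this in practice.
import math
import secrets

def keygen(p: int, q: int):
    """Build a Paillier key pair from two primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda(n)
    g = n + 1                      # standard simple choice of generator
    # With g = n + 1, the decryption constant mu is just lam^-1 mod n.
    mu = pow(lam, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m: int) -> int:
    n, g = pub
    n_sq = n * n
    r = secrets.randbelow(n - 1) + 1       # random blinding factor in [1, n-1]
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(pub, priv, c: int) -> int:
    n, _ = pub
    lam, mu = priv
    n_sq = n * n
    L = (pow(c, lam, n_sq) - 1) // n       # L(x) = (x - 1) / n
    return (L * mu) % n

pub, priv = keygen(61, 53)                 # n = 3233
a, b = encrypt(pub, 42), encrypt(pub, 99)
# Multiplying ciphertexts adds plaintexts -- without ever decrypting a or b:
assert decrypt(pub, priv, (a * b) % (pub[0] ** 2)) == 42 + 99
```

Paillier handles only addition; the breakthrough in Gentry's work is supporting both addition and multiplication on ciphertexts at once, which is what makes arbitrary computation possible.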
For security-conscious cloud and SaaS providers, this is a very big deal. Gentry has been working on homomorphic encryption for years, and the first big steps to commercialization came out last year when IBM released an open source software package for developers called HElib. The HE stands for homomorphic encryption.
John Launchbury, a DARPA program manager, told Co.Labs that “Originally cryptography was all about keeping communications private. Then it became standard to use cryptography for securing stored data, in case someone steals your computer. Now with the prevalence of cloud computing, it is becoming clear that we also need to be serious about data confidentiality even while computing with it, in case someone is able to observe the computation as it proceeds.”
“Homomorphic encryption,” he added, “is one way to enable this: it is a form of encryption that allows computations to be performed on data without having to decrypt the data. You could store information on a cloud server, have the cloud provider perform some tasks on the data, without the cloud provider ever learning anything about your data. This could have profound implications for improving our privacy. Unfortunately, the performance challenges are so serious that it cannot yet be used in practice.”
Writing back in 2009, security expert Bruce Schneier explained that homomorphic encryption is important because it could potentially make security much easier for distributed software systems:
Any computation can be expressed as a Boolean circuit: a series of additions and multiplications. Your computer consists of a zillion Boolean circuits, and you can run programs to do anything on your computer. This algorithm means you can perform arbitrary computations on homomorphically encrypted data. More concretely: if you encrypt data in a fully homomorphic cryptosystem, you can ship that encrypted data to an untrusted person and that person can perform arbitrary computations on that data without being able to decrypt the data itself. Imagine what that would mean for cloud computing, or any outsourcing infrastructure: you no longer have to trust the outsourcer with the data.
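Schneier's point, that a scheme supporting the right arithmetic on ciphertexts can evaluate any circuit, is easiest to see in a partially homomorphic scheme. Textbook RSA (shown below with toy parameters, purely as an illustration) is *multiplicatively* homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. A fully homomorphic scheme like Gentry's supports both addition and multiplication, which together suffice for arbitrary Boolean circuits.

```python
# Textbook RSA is multiplicatively homomorphic:
# Enc(a) * Enc(b) mod n is a valid encryption of a * b.
p, q = 61, 53                        # toy primes; far too small for real use
n = p * q                            # RSA modulus, 3233
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 12
c = (enc(a) * enc(b)) % n            # computed on ciphertexts only
assert dec(c) == a * b               # the product, recovered without decrypting a or b
```

Because RSA offers only multiplication, it cannot evaluate arbitrary circuits; closing that gap, cheaply enough to matter, is exactly the problem Gentry's research attacks.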
Although Schneier went on to be skeptical about practical applications for homomorphic encryption (in a post that, to be fair, was written years ago), IBM has been taking out patents on the method that hint at eventual commercialization.
Gentry didn’t invent homomorphic encryption — partially homomorphic schemes date back decades — but in 2009 he constructed the first fully homomorphic scheme, and his research is going a long way toward making the technique usable. Over the next five years, Gentry will receive a no-strings-attached grant of $625,000 from the MacArthur Foundation to follow his passions. In a few years, if his work makes its way to the marketplace, it might solve a lot of our current problems with privacy protection and data security.