When someone starts talking about privacy online, a discussion of encryption is never too far off. Whether it is a relatively old standby like Tor or a much newer and more ambitious effort like Ethereum (more on this later), privacy equals encryption. With the exception of investigative journalism and activist interventions, geeks, hackers, and privacy advocates seem to have nearly universally adopted a “good fences make good neighbors” approach to privacy. This has come at significant cost. The conflation of encryption with privacy mistakes what should be a temporary defensive tactic for a long-term strategy against corporate and government spying. It is time that we discuss a new approach.
The prevailing logic seems sound: runaway government and corporate surveillance is often accomplished through the abuse of pre-existing data or the interception of daily digital life. We may be tracked via geotagged vagueposts about our flaky friends, or Kik messages between activists might be intercepted as they go from sender to receiver. End-to-end encryption is meant to prevent the latter sort of surveillance and is often compared to a paper security envelope: the network only knows the sender and recipient, and the content of the data is obscured. There are lots of protocols and technologies that provide end-to-end encryption, the most prevalent being HTTPS, which verifies the identity of both parties and keeps the digital envelope closed and secure as it traverses the series of tubes. Services like Gmail, Facebook, and Twitter all use HTTPS, and you can tell by looking at the address bar in your browser. Chrome even turns it a happy, reassuring shade of green.
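The envelope metaphor is easy to see in miniature. Here is a toy sketch (emphatically not real TLS, just a one-time pad built from the standard library for illustration) of what end-to-end encryption promises: the routing metadata stays readable to the network, while the body is opaque to everyone but the two endpoints:

```python
# Toy illustration of the "security envelope": the address fields stay in
# the clear so the network can route the message, but the body is opaque.
import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: applying the same key twice recovers the data.
    return bytes(b ^ k for b, k in zip(data, key))


def seal(sender: str, recipient: str, body: str, key: bytes) -> dict:
    # Metadata stays readable; only the body is encrypted.
    return {"from": sender, "to": recipient,
            "body": xor_bytes(body.encode(), key)}


def open_envelope(envelope: dict, key: bytes) -> str:
    return xor_bytes(envelope["body"], key).decode()


message = "meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints
envelope = seal("alice", "bob", message, key)

assert envelope["body"] != message.encode()      # the network can't read it
assert open_envelope(envelope, key) == message   # the recipient can
```

Real protocols like HTTPS negotiate keys and verify identities with certificates rather than pre-sharing them, but the division of labor is the same: the network handles the address, never the contents.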
If we were to take the security envelope to its logical conclusion, however, our attitudes toward data security would appear preposterous, if not deeply insufficient. If the government were opening our snail mail to the same degree it is vacuuming up our digital communications, private citizens’ first inclination might be to spend the extra money on tamper-evident or tamper-proof envelopes, but would we really continue to innovate primarily in better envelopes? Would we go on to make a private postal service even if we knew that the government had given itself the lawful authority, if not the capacity, to search private as well as publicly conveyed correspondence? Would we continue to pour resources and effort into making a better envelope when the problem was obviously bad government?
There is a sort of digital dualism at play here. While efforts to develop encryption are rarely questioned, I think similar tactics for different problems would be criticized as both a stop-gap measure and overly defensive at a moment that demands an offensive strategy. That is, we should be building the capacity to weather tyranny so that we may fight against it, but creating a new normal of balkanized communication is wrongheaded.
To be clear, I’m not saying we shouldn’t be working on encryption while we fight the good fight against privacy invasion. I just don’t want us to mistake a coping strategy for a solution to a big problem. We tend to confuse advancement in encryption technology with social progress in the fight against government overreach. Even the most politically radical Anon, who most certainly is engaged in offensive strategies in every sense of the word, never seems to wish for the day when all of these good fences become unnecessary.
Ethereum, the latest invention to be touted as “artillery in the running battle between technology and governments,” bills itself as nothing less than “web 3.0.” Its creators describe it as a total reorganization of how the Internet is run and how data is stored. Using the spare space on personal hard drives and processors, Ethereum offers encrypted, distributed, public, and unalterable transaction records for everything from bank transactions to sexts. That means no one is in control of the system, so authorities can’t shut it down, nor can data be disappeared or held in private databases.
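The “unalterable” claim rests on hash chaining. A minimal sketch (vastly simplified; real Ethereum adds consensus, signatures, and much more on top) shows why a past record cannot be quietly rewritten:

```python
# A minimal hash chain: each record's hash commits to the record before it,
# so altering any past entry changes every hash that follows, and the
# tampering is evident to anyone holding a later hash.
import hashlib


def chain(records):
    prev = "0" * 64  # placeholder "genesis" hash
    out = []
    for record in records:
        prev = hashlib.sha256((prev + record).encode()).hexdigest()
        out.append((record, prev))
    return out


ledger = chain(["alice pays bob 5", "bob pays carol 2"])
tampered = chain(["alice pays bob 500", "bob pays carol 2"])

# Changing the first record ripples through every subsequent hash.
assert ledger[1][1] != tampered[1][1]
```

Because every hash commits to the one before it, rewriting an early record forces recomputing the entire chain, which the rest of a distributed network would notice and reject. This is the mechanical basis of the claim that no central authority can disappear the data.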
Anything that, by design, hinders the accumulation of power is a good thing in my book. I like that the technology forces a kind of anarchic or rhizomatic politics. I tend to think of horizontal organization as running on interpersonal trust, but hackers don’t seem to see it that way. Even when it comes to advertising its decentralized infrastructure, Ethereum and similarly designed cryptocurrency organizations choose to cast the lack of central authority as “trustless” rather than trustful. It is disturbing to me that a metric for good design is the lack of trust in one another. That is the sort of thinking that got us into this mess in the first place.
Modern hierarchical bureaucracies, as Max Weber observed early in the twentieth century, make it possible to act without interpersonal trust. I know that a doctor I have never met is qualified because she has credentials and licenses from organizations that have knowable and somewhat static requirements. In a sense, we outsource interpersonal trust to large institutions so that we may trust people that we haven’t taken the time to get to know. We might achieve the same thing through aggregating lots of opinions, so long as we trust the aggregator. Once there was even the slightest suspicion that Yelp was in the business of removing bad reviews for a fee, the ratings of independent individuals became suspect.
Instead of handing over our trust to organizations like professional associations, governments, or corporations, hackers would have us move that trust to algorithms, protocols, and blockchains. Of course human organizations can be co-opted and corrupted, but so can algorithms. Coding is just as much a human (and thus social) endeavor as organizing a government or creating a business. But even if technologies weren’t vulnerable to human faults, our problems do not come from organizations and code working incorrectly. Most of them are doing exactly what they are supposed to do: corporations have fiduciary responsibilities to seek profits above all other things, and just as the invention of the train also brings about the invention of the derailment, so too does the invention of the nation state yield war. We don’t need more things that let us go about our lives not trusting. We need to get rid of or deeply reform the institutions that foster distrust and fear.
If we build a world full of trustless technologies, what happens when we feel ready to trust again? Even the anarchists who fought in the Spanish Civil War against Franco organized their militias without officers, salutes, or rank. They recognized that means and ends were deeply intertwined; you don’t get to a vastly better world by reproducing its undesirable elements. I do not know what a more communitarian technology would necessarily look like, but I know we have to start by changing our ethic first.
David is on Twitter.
7 Comments
Anon — April 2, 2015
As always, I appreciate your writing and thoughts on these important but unfortunately niche issues. I agree that the term "trustless" is imperfect and invites mistrust. Distrust, however, is an important part of creating accountable social, political, and technological systems. The best definition of trust I have found is by Dan Geer*. He described trust as "the availability of effective recourse". Encryption and free software do not eliminate trust, as you stated in your piece. However, they do make effective recourse more available. They also decrease the power differential between software developers and service providers and the users who trust them.
This is the same process that has occurred with forms of government. History has shown monarchy and dictatorship to be undesirable forms of government because of the lack of effective recourse. Faced with this injustice, people created systems with greater levels of effective recourse like representative democracies and separate branches of government. Obviously the political systems we have today are insufficient and demand reform if not outright replacement. However, I don't see designing systems with greater levels of distrust as exclusive to those aims.
I think the distinction between trustless and trustful technologies is manufactured. We may be getting too hung up on the words and political leanings of those who create software like Ethereum to recognize that those systems actually support a common goal. People are willing to cede power and invest trust in others as long as there is effective recourse. Trust will always be necessary, and building political and technological systems that increase the availability of effective recourse should be the goal.
The whole talk is here: https://www.youtube.com/watch?v=nT-TGvYOBpI
Comradde PhysioProffe — April 2, 2015
Very interesting post, and totally agreed that we should be working to reduce the need to worry about governments spying on us. I have a few thoughts about why there is so much more attention paid on the Internet to making communication unsurveillable than to physical communications, like letters.
(1) There are a lot of people who love to program and love cryptography, and so they just love inventing clever shitte like this.
(2) It is vastly easier to roll out a new Internet anti-surveillance invention and see if it takes hold of people's attention and usage than it is to roll out special tamper-proof envelopes.
So I don't think the effort being put into these things is purely that we've just become resigned to the fact that the government is going to try to surveille everything we ever do on the Internet. Although there definitely is a sense of resignation out there. I am not a political activist and I don't really deal with confidential information, so my attitude at this point is basically, "Fucke itte. What the fucke is the govt gonna do with my communications, anyway?" And I do totally get that this is not a healthy attitude, but I only have so much "give a shitte" bandwidth.
Dumbdowner — April 9, 2015
Great post, Nathan.
I agree, we shouldn't think that algorithms are any more trustworthy than institutions.
There are two kinds of trust: blind trust, and researched trust. The former is the lazy kind which people are so afraid of: trusting strangers. The latter is getting to know the stranger well enough that you can trust with confidence, and the stranger is no longer strange. In a way, it's going beyond the need to trust. It's getting to know the person, institution (or algorithm) so well that nothing they do is much of a surprise.
The only problem is that it takes time and work to do the latter. People don't think they have time to do the work, because they are too busy with matters they feel are more important. In reality, they are busy doing what's less important.