PRISM is not strictly speaking surveillance. It looks like surveillance, it feels like surveillance, but it lacks the main purpose of surveillance: creating a disciplinary power relationship. When scholars talk about surveillance in a rigorous sense, they’re mostly talking about Foucault’s theory of the Panopticon. The original Panopticon was a plan for a prison, with the cells arranged so that a single guard could watch all the prisoners, and dispatch punishments and rewards as appropriate to the prisoner’s behavior. Eventually, according to Foucault, the desires of the warden would be internalized by the prisoners, and they would behave as planned. They would be disciplined.
Foucault’s genius was noting that the architecture of the Panopticon allowed the mechanisms of power to operate at very little cost, because prisoners could not tell when they were under observation, and so would always have to behave as if they were being watched. Additionally, panopticonic structures were everywhere: in classrooms, hospitals, urban renewal of medieval districts into broad boulevards, even the bureaucratic organization of the modern state into administrative districts and statistical agencies, to the point that a scholar describing yet another panopticon is met with a sigh and shrug.
As Whitney Boesel of Cyborgology noted, when we look for a disciplinary purpose in these NSA programs, we find nothing. Despite editorials in the New York Times and on ThinkProgress with a Foucault 101 explanation of the panopticon, and gestures towards the chilling effect and future potential harms, it's difficult to point to any specific thought or speech act that someone did not have as a consequence of the potential that they might be added to an NSA database. People appear to be totally free to say and think whatever they want online, including espousing flatly anti-democratic opinions across the political spectrum. These programs are no more panopticonic than 20th-century statecraft in general.
Privacy has a totemic value in American political discourse, but privacy as a concept is fuzzy at best. Philosophically, my colleague Jathan Sadowski describes privacy as "that which allows authentic personal growth," a kind of antithesis to the disciplining and shaping of the panopticon. Legally, American privacy originates in a penumbra of rights defined in the 4th Amendment (protection from arbitrary search and seizure), 9th Amendment (other, unspecified rights), and 14th Amendment (right to due process). Privacy has further become established as part of the justification for reproductive freedom in Griswold v. Connecticut and Roe v. Wade, loading it with all the baggage of the culture war.
But what is privacy, really? Future Supreme Court Justice Louis Brandeis, in an influential 1890 essay, described it as "the right to be let alone." Brandeis's essay was published in the context of an intrusive popular press using the then-new technology of instant photography to violate the privacy of New York society members. Brandeis extended the basic right of a person to limit the expression of their "thoughts, sentiments, and emotions" into a fundamental divide of public and private spaces. Since then, mainstream legal thought has attempted to apply Brandeis's theory of privacy to new technology and new concerns, with varying degrees of success.
Brandeis's metaphor breaks down in the face of Big Data because Brandeis was concerned with the gradations of privacy in space (it is acceptable to be photographed on the red carpet at a premiere, unacceptable on your doorstep, totally illegal inside your home), and computers and data are profoundly non-spatial. There is no "Cyberspace." That's an idea cribbed from a science-fiction book written by a man who'd never seen a computer. Spatial metaphors fundamentally fail to capture what computers are doing. Computers are, mathematically speaking, devices that turn numbers into other numbers according to certain rules. These days, we use computers for lots of things: science, entertainment, but mostly accounting and communication. And for the latter two uses, the phrase "my personal data" (which inspires so much angst) confuses personal to mean both "about a person" and "belonging to a person."
Advocates of strict privacy control tend to confuse the two. Privacy is contextual, social, and protean, so I'd like to analyze something concrete instead: secrets. A secret is something that a person or a small group knows, which they do not want other people to know. Most "personal data" is actually part of a transaction, whether you're buying a stick of gum at the gas station or looking at pictures stored on a remote server. You're free to keep records of your side of the transaction, yet we're outraged when the other side keeps records as well. We could ask the other side of the transaction to delete its records, or not share them, but at its strongest this is a normative approach. There's no force behind it.
Moving from the normative 'ought' to 'is' requires a technological fix. Physical privacy is important, but walls and screens are far more sure than the averted gaze. It's wrong to steal, and valuable things are locked up. The digital equivalent to walls and locks is cryptography—math that makes it difficult to access a file or a computer system. Modern crypto is, technically speaking, very good. AES-256 is unbreakable within the lifespan of the universe, assuming it's correctly used. The problem with cryptography is that it's very rarely used according to the directions. People use and reuse weak passwords, they leave themselves logged in on laptops that get left in taxis, or they plug in corrupted USB keys, compromising entire networks.
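Using crypto "according to the directions" is mostly about handling keys and passwords properly. As a minimal sketch (Python standard library; the password, salt size, and iteration count here are illustrative values I chose, not anything from this essay), deriving a key from a password should go through a salted, deliberately slow function like PBKDF2 rather than the password itself or a bare hash:

```python
import hashlib
import os

# Derive a 256-bit key from a password with PBKDF2 (stdlib).
# A random salt means two users with the same weak password still
# end up with different keys; a high iteration count makes
# brute-force guessing slow.
password = b"correct horse battery staple"
salt = os.urandom(16)       # stored alongside the ciphertext, not secret
iterations = 200_000        # illustrative value

key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
assert len(key) == 32       # 256 bits
```

None of this helps, of course, if the laptop holding the derived key is left logged in in a taxi — which is the essay's point.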
There is a very real chance that there is no such thing as digital privacy or security; that Stewart Brand's slogan that "information wants to be free" is true in the same way that "nature abhors a vacuum." The basic architectures of computing, x86 and TCP/IP, are decades old and inherently insecure in the way they execute code. Cloud services are even worse. We as users don't own those servers; we don't even rent them. We borrow them. Google and Facebook aren't letting us use their services out of the goodness of their hearts, and the data that we enter (personal data in both senses) is the source of their market power. Sure, there are cryptographically secure alternatives (DuckDuckGo, Hushmail, and Diaspora come immediately to mind), but their features are lacking and relatively few people use them. Crypto is hard, and it runs directly against the business model of major internet companies. The best way to keep a secret is not to tell anybody, and if you have a real secret, I'd strongly advise you never to tell a computer.
Practically, not even the director of the CIA follows that advice. Unless you're Amish, you have to tell computers things all the time, which leads to the problem of what the government can and should not do with all that data. I personally don't like the "if you've done nothing wrong, you have nothing to fear" arguments advanced by advocates of the security state, because the historical record shows plenty of good reasons to distrust American intelligence agencies, ranging from mere incompetence (missing the imminent collapse of the Berlin Wall, Iraq's non-existent WMDs, the Arab Spring) to outright criminality (CIA-backed assassinations in the 60s and 70s, COINTELPRO, Iran-Contra). But this wasn't some kind of rogue operation: data was collected according to the PATRIOT Act, overseen by FISA judges, and Congress was informed. Certainly it was according to the letter of the law rather than the spirit, but it happened within the democratically elected, bipartisan mechanisms of government just the same. It's hard to deny that many voters were willing to make that trade of liberty against security.
The dream of counter-terror experts everywhere is some kind of perfect prediction machine, a device which could sift through masses of data and isolate the unique signature of a terrorist plot before it materializes. This is a fantasy. Signals intelligence and social network analysis are immensely useful for mapping a known entity and determining its intentions, but picking 'lone wolves' out of a mass of civilians is a different beast entirely. Likewise, data mining can do great work on large and detailed datasets, but since 2001 there have been only a handful of terrorist attacks in America and Europe (local insurgencies have very different objectives and behaviors). There is no signature of an imminent terrorist attack. Realistically, what these systems can do is very rapidly and precisely reconstruct the past, making the history of an event legible in order to determine the extent of an attack and hunt down co-conspirators.
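The base-rate problem behind this can be put in numbers. As a toy calculation (the detector accuracy and plotter count are invented for illustration, not drawn from any real program), even an implausibly good classifier drowns in false positives when what it is looking for is this rare:

```python
# Hypothetical numbers: 100 actual plotters in a population of
# 300 million, screened by a detector that catches 99% of plotters
# and falsely flags only 1% of innocent people.
population = 300_000_000
plotters = 100
true_positive_rate = 0.99
false_positive_rate = 0.01

true_alarms = plotters * true_positive_rate                     # ~99 people
false_alarms = (population - plotters) * false_positive_rate    # ~3 million people
precision = true_alarms / (true_alarms + false_alarms)

# Virtually every person flagged is innocent.
assert false_alarms > 1_000_000
assert precision < 0.001
```

No realistic detector comes close to those error rates, and the real ratio would be far worse.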
What's happening isn't really surveillance; the millions of people buying 1984 are reading the wrong part of the book. Orwell's Party is terrifying not because of the torture chambers in the Ministry of Love, but because it can say "We have always been at war with Eurasia, and the chocolate ration has been increased to 2 oz" (when last month it was 3 oz), and what the Party says is true. Rewriting history is dangerous for nations, but as Daniel Solove has eloquently pointed out, for individuals the proper literary comparison isn't 1984, it's Kafka's The Trial, where the protagonist is bounced powerlessly and senselessly through an immense bureaucracy.
The political problems of these programs and Big Data are not the same as the problems of secret prisons, torture chambers, and non-judicial executions, though all those things are very real and very dangerous to civil liberties. The more common assaults are the unnecessary audit, the line at the airport, the job application rejected because of a bad credit score, and the utter lack of recourse that we as citizens have against these abuses by large-scale organizations, including corporations and governments.
We could pass laws forcing basic changes in how computers work and how data is collected, such as deleting everything as soon as it comes in or packing databases with random chaff. But purely legal solutions to technological problems are almost never effective, and usually add another layer of complexity to the existing mess. As any security expert will tell you, security through obscurity is no security at all, and anonymous data is far from anonymous. Giving up and living as suspects under glass, fugitives in our own lives, is equally unappealing.
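"Random chaff" is not just a throwaway idea; randomized response is a classic version of it. In this sketch (the parameters and function names are mine, not from the essay), each stored answer is individually deniable — any single "yes" may just be a coin flip — yet the aggregate rate is still recoverable:

```python
import random

def respond(truth: bool, rng: random.Random) -> bool:
    """Flip a coin: heads, answer truthfully; tails, answer at random."""
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

# P(yes) = 0.5 * truth_rate + 0.25, so the population rate can be
# estimated from the noisy records even though no single record
# proves anything about its respondent.
rng = random.Random(0)
truth_rate = 0.30
answers = [respond(rng.random() < truth_rate, rng) for _ in range(100_000)]
observed = sum(answers) / len(answers)
estimate = 2 * (observed - 0.25)
assert abs(estimate - truth_rate) < 0.02
```

The trade-off is exactly the one the paragraph suggests: the database becomes useless for judging individuals while remaining useful for statistics.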
The alternative is recognizing that in a world of omnipresent computation, leaving traces behind is inevitable, but that rather than the uncertain shield of privacy, we can wield a sword of truth. To ask for privacy is to ask to be forgotten, something both impossible and generally undesirable. We should have the right to set the record straight, to demand to know what is known about us as individuals and as a population, and to appeal what are currently non-judicial and unaccountable actions. Brandeis's right to be let alone is not the right to disappear, but rather the right to demand that those who would try to harass us reveal themselves and defend their actions.
This essay was originally published at As We Now Think.
I like this article, but I suspect that your proposal of allowing people to retrieve and correct stored personal data would be ineffective.
Even if the appeals process is straightforward (and not corrupt), people still must invest time in
(1) learning where and how their personal data is being collected
(2) retrieving this data and reviewing it for errors
(3) learning about the appeals process
(4) filing an appeal.
The investment is substantial, all to correct a situation that should not happen in the first place. An appeals system is still guilt-by-default, and individuals with limited resources and knowledge will be left vulnerable. Furthermore, the appeals process is extremely vulnerable to corruption: it is easy to hamstring an appeals system by making the bureaucracy opaque and inefficient.
It would be as if trials began with "we are sentencing you to five years; you have six months to appeal this decision". We see comparable abuses of civil law with the so-called copyright and patent trolls. These predatory law firms have made a killing by capitalizing on the fact that the appeals process is more costly than capitulating to exploitation. Recently, we have made some headway in shutting down predatory law firms, but only because what they were doing was already illegal, and so they could be dealt with in one blow.
Mis-use of personal data can in some ways be *worse* than guilt-by-default. Personal data can be misused against someone silently for years. By the time the effects are realized, it can be too late for an appeal to right the damage.
I suspect that these practices need to be explicitly illegal, so that one class-action lawsuit can wipe the entire database. Requiring that each individual invest time and energy and know-how to "clear their name" effectively legalizes and protects hoarding and exploitation of personal data.
No right to be forgotten? Short version:
Despite the public outrage concerning privacy, I predict no meaningful reforms. To protect our privacy, the solutions will have to be DIY rather than policy.
Long version:
The distinction between the corporate right to track and the government right to track is in my opinion esoteric. Practically, these two will find ways to cooperate regardless of the law or public opinion. And the corporate right to track is, as Biff points out, essentially unassailable. While I may keep my gas station receipt, the gas station should be expected to destroy theirs? And if not, what possible policy solution could there be? Even if it were feasible to have regulators or third-party curators oversee what goes on in these databases, I seriously doubt that it would be effective, or that this could find basis in our body of law.
Some of us may recall how the 9/11 commission report explained that while the government had collected considerable evidence of the plot beforehand, there were strict rules which prevented the CIA and FBI from talking to one another and so on. As incredible as it may seem, it may be that sometimes the government urge to track is weaker than the government urge to do as little as possible and then hide behind red tape. I think the game has changed significantly since then; all those kind of walls have been bulldozed, and politicians understand now that they may actually win or lose elections because of terrorist attacks. Today I think everyone understands that no matter who is in office, the government will actively aggregate all information available to it and continuously investigate all possibilities.
Film and literature from days gone by have immortalized the notion of the frontier, where a man can leave his old life behind, move to a town where no one knows him, and begin anew, and likely no one will find him again. In days when dust bowls followed depressions, this was the closest we had to a safety net; the frontier meant that you were entitled to a clean slate. Although it no doubt informed our American concept of privacy, all of that is gone now. In modern times we cannot and will not be able to separate ourselves from our past.
What I think we instead should fight for is this. That no one has a right to be forgotten, that while the government may record my image at intersections without my permission as they please, I should also have the right to film police officers on duty, be they on the street, during an interrogation, what have you. That anything it is legal for me as a private citizen to see with my two eyes, it is legal for me to record with a video camera and upload live to youtube. As I understand it, many states now have laws against filming police officers, but it is crucial that we reverse this to prevent abuses of power, just as protection for whistleblowers is crucial to curb government abuse of power. We may be constantly being tracked and there may be nothing we can do about it, but the flip side is that we may together constantly track the government and make its abuses known.
Going forward, our notion of privacy expectations under the law will simply be updated, and children will grow up used to the idea that any time they say something unfortunate online it will be easily searchable, that any time they go anywhere and buy anything, someone in or very close to the government is taking note, and so on. At some point we will be asking ourselves the question: if you left bullying messages on a classmate's Facebook page when you were twelve, should your potential employer know that during job interviews, and worry about whether you can work well on a team? The answer will be, despite what should or shouldn't be, that if your potential employer wants to know, they can, and they will evaluate the significance for themselves. Santa Claus will take on a new meaning for children; someone is always watching, he knows when you've been bad or good, and he never forgets.
Rather than policy solutions to privacy issues, cryptography gives people the tools to do it themselves, and this is probably the only way it can be done. If you want privacy in your emails, use PGP or something like it, and run the implementation on your own machine. Don't trust Google; don't trust any business to protect your secrets for you. If you want to transact anonymously, you will have to do something like fund a credit card with bitcoins. If the assumptions crypto is based on are sound, your secrets will be safe. And while no one really knows the truth about these assumptions, they have stood now for decades, and it seems unlikely that the state of affairs can be improved by proving or eliminating them. Despite the shortcomings, modern crypto seems to be quite robust in practice. This is why you will periodically see things on Slashdot about, e.g., the Obama administration proposing that all companies that make crypto software be required to build in a special backdoor for the government.
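The "run the implementation on your own machine" point can be made concrete. This is a toy sketch (a one-time pad built from the Python standard library, standing in for real tools like PGP): the encryption happens locally, so whatever service stores the file only ever sees ciphertext:

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    """XOR two equal-length byte strings together."""
    return bytes(a ^ b for a, b in zip(data, pad))

# Encrypt on your own machine; the pad (the key) never leaves it.
message = b"meet me at noon"
pad = secrets.token_bytes(len(message))
ciphertext = xor_bytes(message, pad)   # this is all the cloud ever sees

# Decrypt locally later with the same pad.
assert xor_bytes(ciphertext, pad) == message
```

A one-time pad is information-theoretically secure but impractical (the pad must be as long as the message and never reused); PGP and its relatives use hybrid public-key schemes instead, but the trust boundary is the same: keys and plaintext stay on your machine.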
Probably the biggest idea in theoretical crypto that hasn't quite made it into practice is "homomorphic encryption". This is a technical term for "private cloud computing". The idea is that I should be able to send you an encrypted data file and some encrypted C code, and you should be able to run the code on the data and give me back the output *without decrypting any of it*. You simply combine the ciphertexts, which you cannot read, in some black-box way; magically the computation happens on the encrypted data, and you are left with a third ciphertext containing the output, which you also cannot read, and which you send back to me. In practice, this could mean that you could use a service like Google Docs while Google remains unable to read your data, or even to know whether you are using Google Docs or Picasa. Obviously, Google seems very unlikely to cooperate with such a scheme; but it's possible that Linux types will implement it on a large scale, or that consumers will demand it eventually. Currently the state of affairs is that we can do homomorphic encryption under the same assumptions as ordinary encryption, with runtime that is *asymptotically good* but not really practical.
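A miniature of the idea, though not the real thing: textbook (unpadded) RSA happens to be homomorphic for multiplication, so a server can multiply two ciphertexts without ever seeing the plaintexts. The parameters below are classic toy values for illustration and are wildly insecure; fully homomorphic schemes extend this sort of trick to arbitrary computation:

```python
# Toy textbook RSA: n = 61 * 53 = 3233, with e*d ≡ 1 (mod lcm(60, 52)).
# Insecure demonstration values only.
n, e, d = 3233, 17, 2753

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# The "server" multiplies ciphertexts it cannot read...
c_product = (ca * cb) % n

# ...and the key's owner decrypts the product of the plaintexts.
assert decrypt(c_product) == (a * b) % n   # 42
```

Real homomorphic encryption has to support addition *and* multiplication (and handle noise growth), which is where the impractical runtimes come from.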
What does this mean for people today? Privacy protections under the law are going the way of the cowboys, the constitution notwithstanding. In the near future, only power users will actually have privacy from the government and others. But the trend for the last 15 years has been that the power users of today are the typical users of tomorrow, and always sooner than you expect. I predict that more and more of this kind of technology will see widespread adoption, especially if advocates make efforts to raise awareness of and support these projects.
Thank you Mike and Chris!
As Chris pointed out, what's good for the goose is good for the gander, and we need better mechanisms of accountability and transparency pointed toward government. Whole systems of secret courts are anathema, and the police should always have cameras on them. There's the Open Data Initiative/Data.gov, which is interesting but has gotten bogged down in tricky details of linking different data standards and database architectures.
Mike: What is "mis-use of personal data" vs "use of personal data"? What shouldn't we know about each other?
I have to admit I don't have good ideas for the practicalities of an appeals process. As much as I wish there were a decent way to demand accountability, in practice it'd be swarmed by schizophrenics demanding the CIA stop gang-stalking them and Sovereign Citizen hucksters. Worst case, the rich and powerful would have a whole separate system to manipulate what is said about them, as with UK libel laws.
Mostly I was just really irritated at the "end of privacy and democracy" op-eds.