CPH:DOX. What Ed said

The world’s most famous whistleblower, Edward Snowden, skyped into Denmark on March 23 to comment on a range of matters, from coronavirus to Tonje Hessen Schei’s iHuman to the dangers of ceding authority over news provision to the likes of Facebook.

Snowden joined Danish journalist and docmaker Henrik Moltke in conversation, which kicked off, naturally, with coronavirus, it being le sujet du jour. Here is a snapshot of what Snowden said.

“The coronavirus is a serious problem, but it is a transient problem. We will eventually have the vaccine, or even if we don’t, we will eventually have herd immunity, and in two years this problem will be gone,” he said.

“But the consequences of the decision that we make now are permanent. And I think this is crucial to bear in mind from the perspective of a free society – a virus is harmful but the destruction of rights is fatal. This is a permanent thing that we don’t give back.” 

“When we lose a right that we fought a revolution for, that there was a movement founded for, that took a hundred years of effort to win, and then we lose it in a moment of panic, this is the connection with 9/11.”

He added: “We prioritise and order society for the defence of the individual and the common collective good, and this is derived from the protection of rights. If we begin destroying rights, sacrificing rights in order to improve things, we are actually making things worse.”

The subject turned to Tonje Hessen Schei’s iHuman, which has been made available for Danes to watch online over the course of the festival. Moltke informed us that Schei had herself contracted coronavirus, but that she had signalled “she was fine, under the circumstances.”

“AI is promising a lot of things that are not possible,” Snowden observed. “In the film, for example, there is a person who is saying, ‘look, what we can do is look at someone’s face and have a machine determine their sexual orientation’. 

“This should be chilling to all of us, but we need to understand that this is exactly how any kind of large institution prefers to function.

“They want to prioritise efficiency, but in free societies efficiency is actually dangerous. We try to limit the exercise of efficiency because we don’t want probabilistic [reconnaissance]. We don’t want people to be creating algorithms to say that this person is probably a criminal, or this person is probably a member of this disfavoured minority group…. 

“[So] we intentionally make the act of policing difficult. We make them get warrants. We make them go through hoops to try and seek evidence. We limit the kind of powers they can use. We limit when they can use force and to what extent, we limit how people can be imprisoned [and] the limits that they can be held under. We stack the deck against power because that is the only way to guarantee a measure of liberty. When you concentrate too much power in too few hands, this is called tyranny, and it seems to me in a lot of ways we are missing this.”

“This gets us back to the [question of] can a computer tell someone’s innermost thoughts and the answer is ‘no’. What it is doing is assigning probabilities based on what are effectively stereotypes that are being generated based on models. They are ingesting, for example, the corpus of Facebook photos or something like that, and they go through the profile and all of these people who identify as being of a certain orientation, they [say] that’s Group A and then they take everybody else and say that’s Group B.

“The question is… is that gain [in] efficiency worth the cost to civil and public, and ultimately, individual liberty?” he asked.

He then turned to the issue of news provision and the responsibilities which should, or should not, be granted, assigned or ceded to the new giants of online.

“I think this is actually controversial, but there is a little bit of a mistake in the common belief that we need to get tech companies to deal with this misinformation online, because in order to do that tech companies have to be put into the position where they are the ‘deputy’ of truth.” This was especially so, Snowden underlined, in a world where governments have a vested interest in suppressing or denying what is or isn’t true so as to avoid being exposed politically.

“The question is: do we want to empower what are already some of the most powerful institutions in the world (Facebook is the internet for an extraordinary percentage of the population) to make these decisions, to give them censorship authority… or do we educate people to help them understand what is reliable information, what is factual information and what is specious rumour… 

“If you trust Facebook to tell people what is good for them and what’s bad for them, Facebook will exercise this authority ultimately in the fullness of time to the benefit of Facebook, not to the benefit of society.”

Towards the end, Snowden was pinned down on the politics of Russia, where he still lives in exile, and how protest can be voiced there.

“The politics in this place are historically problematic, but we have to recognise that the government is constantly going to struggle, the government is constantly going to abuse the people, and the only thing that will create a better government is […long pause…] dissent. 

“I mean, it’s ultimately people standing up [saying] ‘this is too far, I don’t agree with this, I don’t like this, I don’t like you, I don’t like the way you speak, I don’t like this policy, I don’t like, don’t like, don’t like… And I do like this’. That is democracy. That is how we reach consensus. If we don’t express our opinions [and] beliefs, but most especially if we are not willing to stand for our beliefs, then we have no influence.”