SHARE Interview: Ella Jakubowska on biometric mass surveillance
Ella Jakubowska from European Digital Rights (EDRi) spoke to SHARE Foundation’s Filip Milošević and shared her thoughts on the dangers of biometric mass surveillance for human rights and freedoms.
Filip: To start, could you tell us who you are and what you do?
Ella: My name is Ella Jakubowska and I am a policy and campaigns officer at European Digital Rights, EDRi.
Filip: Thanks for being with us, Ellie. Recently in Serbia we have started facing the prospect of mass surveillance: lots of cameras are being installed around the city. Reading your paper, I see that you have done a good job of defining some of the core problems, and there are several of them. So perhaps you could say a few sentences about them together, and explain the most fundamental problems that people should understand when it comes to mass surveillance and facial recognition technology.
Why biometric mass surveillance should be banned
Ella: We see so many risks and threats posed by this growing and uncontrolled use of facial recognition and other biometric processing that it’s almost hard to know where to start. Because, when we think about the type of society that we want to create and the world that we want to live in, it feels like biometric mass surveillance is the complete antithesis to a world that we want if we’re thinking about that in terms of our rights and our freedoms as citizens.
Citizens’ rights and freedoms
Ella: When systems are installed throughout our public spaces, spaces we have the right to enjoy, to use as places to express ourselves, to protest, to hold power to account, the whole environment suddenly changes if there are cameras trained on us all of the time. Across Europe and across the world this is currently being done in an unaccountable, non-transparent way. We know that different cameras are being equipped to process our sensitive data in a really wide range of ways, and if that data is combined with other data, the power relations that structure our society change. What I mean by that is: if these cameras are being installed throughout our cities, and there is a blurring of who is responsible for developing them, for setting the parameters by which they pick people out, and then for storing and processing that data and matching it with other data sources, then suddenly we no longer know who is capturing that data about us. We don't know which private actors might be involved, or what sort of influence they might have over our governments and the people whose role it is to keep us safe.

And really, throughout the biometric industry you see a lot of private actors whose motivation is to make money, and when they take sensitive things like our faces and our bodies, there are just so many ways that can be used against us. If we don't know what's happening, if we have no clear evidence that these systems were introduced with safeguards and protections for us as citizens and individuals, which right now we're really not seeing, then it really opens the door for a lot of different shady actors to watch us and build up patterns of our movements.
If you belong to a community that is already overpoliced or surveilled to a high degree, which could be people of colour, people from certain religious groups, or human rights defenders and political dissidents, the idea that both public and private actors can suddenly build up this picture of where you go and who you meet is very dangerous, because it can be used to target you even more. So the real problem for us is that those who already hold disproportionate amounts of power stand to gain more and more, while those already in positions of powerlessness are made even more powerless by this dynamic of who gets to watch and who is watched. And this is really frightening, because it means that these private actors who want to make money from our faces and our bodies, and from watching us, will have a really high level of control and influence over our governments, and may have more technical knowledge than our governments… It's a really complex web of different actors gaining power, but the ones who lose power are the citizens. We no longer have control and freedom in our public spaces; we lose our ability to be anonymous in public, which is really fundamental to our ability to take part in democracy and to express ourselves freely.
Ella: If we have less diversity in the people who represent us and in the voices that are heard in our communities, and all we hear are the rich, the powerful and the highly educated who have knowledge of these systems, that's really not the kind of world we want to create, or the kind of society that would benefit the vast majority of people. The changes in people's behaviour when they are constantly watched have been well substantiated, and if you extrapolate that to a societal level and think about how we might all change our behaviour when we know we're being watched… It doesn't mean that we were doing anything wrong, but it means that we suddenly become very aware of what we're doing. That's where these things start having a chilling effect: if we're all suddenly aware that there are cameras trained on us all the time, and that things could be used against us, we no longer feel so comfortable expressing how we feel, and we might choose to stop meeting certain people because of how it would look… We will change how we go about our lives.
Freedom of expression
Ella: There's a real sense of empowerment in being able to express yourself differently, and if you're suddenly forced to conform, that poses a real threat to your identity. It challenges your sense of dignity, who you are as a person, and who you're allowed to be in your society, in a way that's very dangerous. What we've concluded in our paper, as EDRi, is that this creates a slippery slope towards authoritarian control. A mass surveillance society that wants to put us all in boxes will dangerously disincentivize people from being individuals, and will instead create societies of suspicion and fear, and a sense that everybody is a potential suspect. Function creep is one of the really big problems we see with these mass surveillance infrastructures, especially when they use our face data or other sensitive data about our bodies, our health and who we are as people. We know for a fact that once these systems and structures are in place, they will be reused in ways they were not initially intended for, and that means that safeguards will not have been put in place for these new uses.
Function creep
Ella: So we know that even from an economic point of view it looks good for a government to say: "We've already got these systems, we can now do all these great shiny technological things with them. Why would we waste them? Why would we not do more and more?" And from a human rights point of view that's absolutely terrible, because it's being driven by the technology rather than by the sort of societies we want to create, by thoughts of how we protect people and build vibrant democratic communities that everyone can take part in. And these techno-solutionist ideas are often pushed by private actors. Again, that speaks to who's really in charge, who has the power over our public spaces. And if we don't know who has the power, how can we hold them to account?

Linked to that, normalisation is another really big problem that we see. Even a use that might be less dangerous from a fundamental rights perspective, like unlocking your personal phone, where you control the data and nothing leaves your device, still creates a sense that our faces are something to be commodified, something we can use in lieu of a password. And actually that's not the case. Our faces have a really special quality because they are so linked to our identity; if we start seeing them as interchangeable with a password, we really start undermining that value, and that poses a lot of questions about our dignity and autonomy as human beings. And as more and more private actors come in to find ways to monetize the data being collected on us, it means that by becoming comfortable letting our faces be used every day in all sorts of applications, we're giving a carte blanche to private companies to commodify and objectify our faces and use them to sell us things, to infer things about us, and to predict and make judgements about us which could then be used to control us.
Ella: Once all these different systems are in place and can track us across time and place, multiple databases can also be introduced that all fit together. Suddenly there is this intimate picture of who we are, not just showing where we go and who we interact with, but linked to our faces and our bodies in a way that means we can never be anonymous: the moment our face is detected, that picture of us, and maybe how we walk and therefore what health problems we might have, will be linked. It could be linked with our criminal records. It could be linked with our personal data, our health data, our online browsing… There is really massive potential for these different databases to be layered on top of each other. And suddenly we'll have systems, and in some cases we already do, that know vast amounts about us and can so easily be used against us. Once you're being tracked across time and place, and especially once various databases are brought in, in very opaque ways, there is suddenly the possibility of very authoritarian methods of social control. China is a very good example of this. It has been quite widely reported over the last few years that they have introduced a social credit score linked to people's identity. People are then rewarded or punished for doing things like buying alcohol, or for interacting with family compared to interacting with known political dissidents. And their scores are then used to control their access to their fundamental rights: are they allowed to leave the country? Can they get car insurance? So suddenly it's not just that your life is being watched; it's also that your life is being analysed, and someone far away is making a judgement about whether what you're doing is in line with their vision of control.
And often it's not a "someone", it's an algorithm, which adds a whole other layer of opaqueness and a whole lot of dangers in terms of how biases can be embedded in technology. Any society that looks to stratify people based on how they look, their health, or data and things about them is an incredibly authoritarian and sinister society. The societies throughout history that have tried to separate and stratify people based on data about them are exactly the sort of authoritarian societies we want to stay as far away from as possible. We think that if governments are really going to listen, it needs pressure from all parts of society. It needs people holding power to account, calling out surveillance when they see it, and contributing to the civil society organisations and activists who are trying to reveal these secretive rollouts and to make sure this is something for public debate, for all of us to decide: not for private companies who want to make money out of us, and not for police forces who want to save money and cut corners. This is for our societies and communities, and it needs to be something we all collaborate on together.