SHARE Interview: Ella Jakubowska on biometric mass surveillance

Ella Jakubowska from European Digital Rights (EDRi) spoke to SHARE Foundation’s Filip Milošević and shared her thoughts on the dangers of biometric mass surveillance for human rights and freedoms.

Filip: Hey Ellie. I can’t hear you…

Ella: But I can hear you, wait… I’ve got a funny desk with different levels, but I can use some books just to make sure… It’s fine.

Filip: So you can just tell us who you are and what you do.

Ella: My name is Ella Jakubowska and I am a policy and campaigns officer at European Digital Rights, EDRi.

Filip: Thanks for being with us, Ellie. Recently in Serbia we have started facing these issues of possible mass surveillance – lots of cameras are being installed around the city. Reading your paper, I see that you’ve done a good job of defining some of the core problems. There are several of them, so maybe you could say a few sentences about them together and explain the most important problems that people should understand when it comes to mass surveillance and facial recognition technology.

Why biometric mass surveillance should be banned

Ella: We see so many risks and threats posed by this growing and uncontrolled use of facial recognition and other biometric processing that it’s almost hard to know where to start. When we think about the type of society that we want to create and the world that we want to live in, biometric mass surveillance feels like the complete antithesis of the world we want, if we’re thinking about it in terms of our rights and our freedoms as citizens.

Citizens’ rights and freedoms

Ella: When we start having systems installed throughout our public spaces – spaces we have the right to enjoy, to use as places to express ourselves, to protest, to hold power to account – suddenly the whole environment changes if there are cameras trained on us all of the time. It’s being done across Europe and across the world in a currently unaccountable, untransparent way. We know that different cameras are being equipped to process our sensitive data in a really wide range of ways. For example, if that data is combined with other data, the power relations that structure our society change. What I mean by that is: if we’ve got these cameras being installed throughout our cities, and there is a blurring of who is responsible for developing them, for setting the parameters by which they might pick people out, and then for storing and processing that data and matching it with other data sources – suddenly we no longer know who is capturing that data about us, we don’t know which private actors might be involved in that and what sort of influence they might have over our governments and the people whose role it is to keep us safe. And, really, throughout the biometric industry you see a lot of private actors whose motivation is to make money, and so when they’re taking sensitive things like our faces and our bodies, there are just so many ways that that can be used against us. If we don’t know what’s happening, if we’ve not got clear evidence that these systems have been introduced in ways that are safeguarded, with protections for us as citizens and individuals – which right now we’re really not seeing – then it really opens the door for a lot of different shady actors to be watching us and building up patterns of our movements.
If you belong to a community that already gets overpoliced, watched or surveilled to a high degree – that could be people of colour, people from certain religious groups, but also human rights defenders and political dissidents – the idea that both public and private actors can suddenly build up this picture of where you go and who you meet with is actually very dangerous, because that can be used to target you even more. So the real problem for us is that those who already have disproportionate amounts of power stand to gain more and more, and those who are already in positions of powerlessness will be made even more powerless by this dynamic of who gets to watch and who is watched. And this is really frightening, because it means that these private actors who want to make money from our faces and our bodies and from watching us will have a really high level of control and influence over our governments, and may have more technical knowledge than our governments… It’s a really complex web of different actors who are gaining power, but the ones that lose power are the citizens. We no longer have control and freedom in our public spaces, and we lose our ability to be anonymous in public, which is really fundamental to our ability to be involved in democracy and to express ourselves freely.

Public space

Ella: If we have less diversity in the people that represent us and in the voices that are heard in our communities, and all we hear are the rich and the powerful and the highly educated who have knowledge of these systems, it’s really not the kind of world that we want to create or the kind of society that would benefit the vast majority of people. The changes in people’s behaviour when they’re being constantly watched have been well substantiated, and if you extrapolate that onto a societal level and think about how we might all change our behaviour if we know we’re being watched… It doesn’t mean that we’re doing anything wrong, but it means that we’ll suddenly become very aware of what we’re doing. That’s where these things start having a chilling effect, because if we’re all suddenly aware that there are cameras trained on us all the time, that things could be used against us, we no longer feel so comfortable expressing how we feel, and we might choose to stop meeting with certain people because of how it looks… We will change how we go about our lives.

Freedom of expression

Ella: There’s a real sense of empowerment from being able to express yourself differently, and suddenly, if you’re forced to conform, this poses a real threat to your identity. It really challenges your sense of dignity, of who you are as a person and who you’re allowed to be in your society, in a way that’s very dangerous. What we’ve concluded in our paper, as EDRi, is that this creates a slippery slope towards authoritarian control. Having a mass surveillance society that wants to put us all in boxes will dangerously disincentivize people from being individuals, and instead will create societies of suspicion and fear and a sense that everybody is a potential suspect. Function creep is one of the really big problems that we see with these mass surveillance infrastructures, especially when they use our face data or other sensitive data about our bodies, our health and who we are as people. We know for a fact that once these systems and structures are in place, they will be reused in ways that were not initially intended, and that means that safeguards will not have been put in place for these new uses.

Function creep

Ella: So we know that even from an economic point of view it looks good for a government to say: “We already got these systems, we can now do all these great shiny technological things with them. Why would we waste it? Why would we not do more and more?” And from a human rights point of view that’s absolutely terrible, because that’s being driven by the technology rather than by the sort of societies we want to create, by thoughts of how we protect people and create vibrant democratic communities that everyone can take part in. And these techno-solutionist ideas can often be pushed by private actors. Again, that speaks to who’s really in charge, who’s got the power over our public spaces. And if we don’t know who’s got the power, how can we hold them to account? Linked to that, normalisation is another really big problem that we see, because even a use that might be less dangerous from a fundamental rights perspective – like unlocking your personal phone, where you control the data and nothing leaves your device – still creates this sense that our faces are something to be commodified, something that we can use in lieu of a password, and actually that’s not the case. Our faces have a really special quality because they’re so linked to our identity, and if we start seeing them as interchangeable with a password we start really undermining that value, which poses a lot of questions about our dignity and autonomy as human beings. And as we see more and more private actors coming in to try to find ways to monetize the data being collected on us, it means that by becoming comfortable letting our faces be used every day in all sorts of applications, we’re giving private companies carte blanche to commodify and objectify our faces and use them to sell us things, to infer things about us, and to predict and make judgements about us which could then be used to control us.


Ella: Once we have all these different systems in place that can track us across time and place, we also have the fact that multiple databases can be introduced that all fit together. Suddenly, an intimate picture of who we are emerges – not just showing where we go and who we interact with, but linked to our faces and our bodies in a way that means we can never be anonymous, because the moment our face is detected, this picture of us – maybe even how we walk, and therefore what health problems we might have – will be linked. It could be linked with our criminal records. It could be linked with our personal data, our health data, our online browsing… There is really massive potential for these different databases to be layered on top of each other. And suddenly we’ll have these systems – in some cases we already do – that know vast amounts about us and can so easily be used against us. Once you’re being tracked across time and place, and especially once various databases are being brought in, in very opaque ways, there is suddenly the possibility of very authoritarian methods of social control. China is a very good example of this. It’s been quite widely reported over the last few years that they have introduced a social credit score linked to people’s identity. People are then either rewarded or punished for doing things like buying alcohol, or for interacting with family as opposed to interacting with known political dissidents. And their scores are then used to control their access to their fundamental rights. Are they allowed to leave the country? Are they able to get car insurance? So, suddenly, it’s not just that your life is being watched – it’s also that your life is being analysed, and someone far away is making a judgement about whether what you’re doing is in line with their vision of control.
And often it’s not a “someone” – it’s an algorithm, which adds a whole other layer of opaqueness and a whole lot of dangers for how biases can be embedded in technology. Any society that looks to stratify people based on how they look, their health, or the data held about them is an incredibly authoritarian and sinister society. The societies throughout history that have tried to separate and stratify people based on data about them are the sort of authoritarian societies that we want to stay as far away from as possible. We think that if governments are really going to listen, it needs pressure from all parts of society. It needs people holding power to account, calling out surveillance when they see it, and contributing to the civil society organisations and activists that are trying to reveal these secretive rollouts and to make sure that this is something open to public debate – something for all of us to decide, not for private companies who want to make money out of us, not for police forces who want to save money and cut corners. This is for our societies and communities, and it needs to be something that we all collaborate on together.

SHARE has brought Google to Serbia

Any requests and objections which the citizens of Serbia may have regarding their personal data processed by Google can now be resolved through the company’s representative in Serbia. Google, as one of the first tech giants to comply with the new Serbian law, wrote a letter to the Commissioner for Information of Public Importance and Personal Data Protection, i.e. Serbia’s Data Protection Authority, on May 21st, 2020, stating that its representative would be “BDK Advokati” from Belgrade.

YouTube, Chrome, Android, Gmail, Maps and many other digital products without which the internet is unimaginable are an important segment of an industry which relies entirely on processing personal data. With significant delay and numerous difficulties, states have begun bringing some order to this field, which directly interferes with basic human rights. The European Union has set the standard by adopting the General Data Protection Regulation (GDPR), and the new Law on Personal Data Protection in Serbia, in application since August 2019, followed this model.

Although they have been operating in Serbia for a long time, global tech corporations treat most developing countries as territories for the unregulated exploitation of citizens’ data. At the end of May 2019, three months before the application of the new Law on Personal Data Protection, SHARE Foundation informed the 20 biggest tech companies from around the world about their obligations towards the citizens of Serbia whose data they process.

Twitter responded by saying that they were working on it. A global platform for booking airline tickets, eSky, contacted us and appointed their representative in Serbia. In December 2019, we filed misdemeanor charges with the Commissioner.

Read more: community strikes back against mass surveillance

Serbian citizens have launched a website as a response to the deployment of state-of-the-art facial recognition surveillance technology in the streets of Belgrade. Information regarding these new cameras has been shrouded in secrecy, as the public was kept in the dark on all the most important aspects of this state-led project.

War, especially in the past hundred years, has propelled the development of exceptional technology. After the Great War came the radio; decades after, the Second World War brought us McLuhan’s “global village” and Moore’s law on historic trends. Warfare itself has changed too – from muddy trenches and mustard gas to drone strikes and malware. Some countries, more than others, have frequently been used as testing grounds for different kinds of battle.

Well into the 21st century, Serbia still does not have a strong privacy culture, which has been left in the shadows of past regimes and widespread surveillance. Even today, direct police and security agencies’ access to communications metadata stored by mobile and internet operators makes mass surveillance possible. 

As appearances matter most, control over the flow of information is a key component of power in the age of populism. We have recently seen various developments in this context – Twitter shut down around 8,500 troll accounts pumping out support for the ruling Serbian Progressive Party and its leader, the country’s President Aleksandar Vucic. These trolls are also frequently used to attack political opponents and journalists who expose the shady dealings of high-ranking public officials. Reporters Without Borders and Freedom House have noted a deterioration in press freedom and democracy in the Balkan country.

However, a new threat to human rights and freedoms in Serbia has emerged. In early 2019, the Minister of Interior and the Police Director announced that Belgrade would receive “a thousand” smart surveillance cameras with face and license plate recognition capabilities, supplied by the Chinese tech giant Huawei. The governments of Serbia and China have been working on “technical and economic cooperation” since 2009, when they signed their first bilateral agreement. Several years later, a strategic partnership was forged between Serbia’s Ministry of Interior and Huawei, paving the way for the implementation of the project “Safe Society in Serbia”. Over the past several months, new cameras have been widely installed throughout Belgrade.

This highly intrusive system has raised questions among citizens and human rights organisations, who have pointed to Serbia’s interesting history with surveillance cameras. Sometimes these devices have conveniently worked and their footage has somehow leaked to the public; in other cases, they have not worked, or recordings of key situations have gone missing, just as conveniently. Even though the Ministry was obliged by law to conduct a Data Protection Impact Assessment (DPIA) of the new smart surveillance system, it failed to fulfil the legal requirements, as warned by civil society organisations and the Commissioner for Personal Data Protection.

The use of such technology to constantly surveil the movements of all citizens, who are now at risk of suddenly becoming potential criminals, runs counter to the fundamental principles of necessity and proportionality, as required by domestic and international data protection standards. In such circumstances, with no public debate whatsoever and no transparency, the only remaining option is a social response, as reflected in the newly launched website.

“Hiljade kamera” (“Thousands of Cameras”) is a platform started by a community of individuals and organisations who advocate for the responsible use of surveillance technology. Their goals are citizen-led transparency and to hold officials accountable for their actions, by mapping cameras and speaking out about this topic to the public. The community has recently started tweeting out photos of cameras in Belgrade alongside the hashtag #hiljadekamera and encouraged others to do so as well.

The Interior Ministry has yet to publish a reworked and compliant Data Protection Impact Assessment (DPIA) but the installation of cameras continues under sketchy legal circumstances.

Bojan Perkov is a Policy Researcher at the SHARE Foundation. His interests and areas of work include freedom of expression and online media, as well as all other issues related to online expression such as hate speech, net neutrality, censorship, etc. Twitter: @Bojan_Perkov.

Read more:

The right to privacy in the time of coronavirus: freedom’s last line of defence?

Dr Mihajlo Popesku, Head of Research, Auspex International
Catalina Bodrug, Research Scientist, Auspex International

Earlier this month, Auspex conducted two large-scale online surveys1 in the UK and Italy, focusing on residents’ behavioural and emotional responses to the Covid-19 pandemic and resulting lockdown, with samples of 2,001 respondents in each country – representative by age, gender, region and socioeconomic class. 

As part of our analysis, we were able to identify various groups or segments in each country, with tendencies to engage in either constructive or destructive behaviours: those who panic and despair, those who remain calm and optimistic, and those who thrive and flourish in isolation. A full infographic report of our findings is provided here. One of the most interesting insights, for Share Foundation readers, was that people in both countries strongly reject the idea of data monitoring as a means of tackling the spread of the coronavirus.

We asked both British and Italian respondents to rate the acceptability of eleven actual and potential government interventions. An overwhelming majority of British and Italian residents were prepared to countenance certain measures to contain the epidemic, including closing pubs/restaurants (82% UK, 78% Italy), washing their hands for 20 seconds (84% UK, 73% Italy) and enforced staying at home (both 71%). For Britons, however, the monitoring of personal data ranked as the least acceptable measure, with an approval rating of just 16.7%. In Italy, the situation was not dissimilar, with the measure ranking second to last with an approval rate of 23.2%. What is more, residents of both countries are more likely to favour curfews and remaining in lockdown over having their personal data tracked.

These insights suggest that both Italians and Britons are aware of the importance and sensitivity of data protection rights, and that any attempt to infringe their privacy is generally regarded as the ultimate loss of freedom.

Our next exercise focused on understanding the differences between those people who accept and those who reject the monitoring of personal data. For this purpose, we merged the two samples, to try to find regularities and patterns, regardless of the respondents’ nationality. 

People who accept data monitoring (20%) are members of a more mature segment, with 1 in 3 aged over 65. Their emotional response to the crisis is ambivalent, with a mix of increased anxiety and happiness being most often reported. This is a very alarmed and anxious segment, with 2 in 3 reporting that their country is in a state of emergency. Compared to a month ago, this segment now feels more positive about government authority, wants to spend more time with family, feels more positive toward homeschooling, strongly supports closing borders to foreign visitors, is increasingly interested in social justice and “woke” activism, and feels more patriotic, creative and self-reliant. They believe that the best thing is for the country to remain united, and to get behind the Prime Minister, government and institutions – even if it means taking drastic actions to help tackle the spread of the disease. They see Covid-19 as a very serious situation, in which everyone is at risk. Fighting the coronavirus is a team effort, requiring mass compliance and the use of all means necessary – even if this involves the monitoring of personal data. On average, people in this group are better informed about Covid-19, are more compliant, and have engaged more frequently in constructive behaviours. In the event of institutional collapse/meltdown these people would help others and attempt to repair the damage. Their values are duty and tradition. Their lifestyle centres around travelling, exploration and education.

People who reject data monitoring (80%) make up a younger segment. These individuals report that the pandemic has inspired mostly the worst in them. They are feeling increasingly deflated, bored, stressed or tired as a result of the crisis. They are significantly more concerned with isolation and loneliness. This group is more likely to have experienced negative feelings such as anger or resentment over the government’s interventions, and in the event of civil unrest due to Covid-19, they are more likely to engage in mass protests or to leave the country. They also show significantly less support for and trust in institutions, and are more likely to be lax in complying with official instructions. They seem to be in a vulnerable position, as they were somewhat more likely to be professionally affected by the pandemic. This segment scored significantly higher in neuroticism, which indicates their fragility, sensitivity and irritability. They need to feel safe and secure. People in their social circle are mostly scared of not having enough to live with dignity. This group likes music, history, computer games and lifestyle content. They care about animal and workers’ rights.

Key takeaways:

  • The monitoring of personal data by government or state institutions as a means of tackling the spread of coronavirus is very unpopular, with the vast majority of both Italians and Britons finding it unacceptable.
  • Those who are supportive of their respective government, men, and members of the older generation are more likely to accept the monitoring of their personal data.
  • Those who are most fearful of Covid-19 are also the most likely to accept the monitoring of their personal data.
  • Data privacy appears to be the ‘final line of defence’ of personal freedom, as people are more willing to accept curfews and to be confined to their homes than they are to lose their privacy and have their personal information monitored or tracked – whatever the reason behind it.

1The study was fully anonymous and no personally identifiable information was collected. Our respondents came from a mix of 70 different access panels in the UK, and 73 panels in Italy, adjusting for the sample coverage and sampling frame bias.

Dr Mihajlo Popesku, Head of Research at Auspex International in London, is a marketing scientist working on applied social research and statistical modeling of consumer/voter behaviour.

Catalina Bodrug, Research Scientist at Auspex International in London, is working on research design and statistical data analysis. She graduated in Economy and Business at UCL.

Read more:

A Password Pandemic. How did a COVID-19 password end up online?

The username and password to access the COVID-19 Information System were publicly available on a health institution’s web page for eight days. This period was long enough for the page to be indexed by Google and, although invisible on the web page itself, the credentials were accessible through a simple search. After discovering the matter on the 17th of April, we immediately informed the competent authorities.

Screenshot of the webpage with login credentials for the COVID-19 Information System

The COVID-19 Information System is centralized software for collecting, analyzing and storing data on all persons monitored for the purpose of controlling and suppressing the pandemic in Serbia.

How did we get this data?

Along with the state of emergency, the Government of Serbia introduced numerous measures to tackle the pandemic, which included collecting and processing personal data under unprecedented circumstances. The Government informed citizens about these measures through vague and undetailed conclusions, none of which specified who was supposed to process citizens’ data and how.

In an effort to understand the data flow and its implications for citizens’ rights, we explored the new normative framework through publicly available sources. By searching keywords on Google, we accidentally discovered the page containing access information for the COVID-19 Information System. The data had been published on the 9th of April.

In addition, we also managed to obtain manuals with instructions for navigating the centralised system webpage.

Which data was at risk?

As per the Government’s Conclusion on establishing the COVID-19 Information System, a significant number of health institutions are required to use this software to keep records on cured, deceased and tested persons (whether positive or negative), as well as on persons currently being treated, in self-isolation or placed in temporary hospitals, including their location data. The system also contains data on persons who are possible disease carriers due to their contact with infected persons. The institutions are required to provide daily data updates, which form the basis of the daily report read at 15:00.

While attempting to clarify how our data is being stored, we could not have imagined that we would discover the access password and thus be able to enter the system – just like anyone else who may have found this webpage. It was immediately clear to us that citizens’ most sensitive data was endangered and that the integrity of a system crucially important in the fight against the pandemic could not be guaranteed.

We did not log into the system, which would in any case have recorded such an attempt. Instead, we reported the case to the competent authorities: the Commissioner for Information of Public Importance and Personal Data Protection, the National CERT and the Ministry of Trade, Tourism and Telecommunications. Aware of the risk of misuse arising from the accessibility of citizens’ sensitive data, we decided to notify the public of the incident only after making sure that the authorities had prevented unauthorized access to the system.

Report of the breach sent to competent authorities by email

How did the competent bodies react?

Less than an hour after our report, we were informed that initial steps had been taken in response to the incident, making sure that the web page containing the username and password was no longer publicly available.

Given the scope of the case, we may expect further action from the competent bodies. The Commissioner has the authority to initiate monitoring in line with the Law on Personal Data Protection, the competent ministry is in charge of the inspection monitoring in line with the Law on Information Security, whereas the National CERT has the obligation to provide advice and recommendations in case of an incident.

Who’s to blame?

Aware of the pressure on health services at the peak of the pandemic, we agreed that, for now, it would be appropriate not to publish information on the specific health institution in which the incident took place. On the other hand, there is no doubt that the scale of this incident demands that responsibility for its occurrence be properly determined.

The national legislative framework provides various mechanisms to prevent these kinds of situations, but practice often falls far short of the prescribed standards. Although they handle particularly sensitive data, health workers are often unaware of all the risks present in the digital era. Health institutions are required to appoint a data protection officer, but due to limited resources, persons with insufficient expertise and unrelated primary job duties are usually appointed to this position. In this specific case, the data protection officer may well have been someone who cares for COVID-19 patients on a daily basis.

As today’s data protection demands the involvement of an IT expert, this requirement places an additional burden on public health institutions’ budgets. Sometimes it means that the same person deals with all technical issues within an institution, while being paid far less than their private sector counterparts and without the opportunity to build further information security expertise.

The COVID-19 Information System established by the Government represents a key point in a complex architecture for collecting and processing all the defined data. Data collection occurs through different channels, and a single health institution is only one entry point into the system. In such a system it is rather difficult to implement protection measures at the entry-point level, meaning they should be defined at the central level, which would significantly lower the risk of incidents. Based on this case, we have concluded that only one user account was created for each health institution, which makes it impossible to determine individual responsibility for misuse of the system.
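By way of contrast, a minimal sketch of the alternative design – individual accounts plus a central audit trail – could look like the following. All names here are invented for illustration and are not part of the actual system; the point is that when each health worker has their own account, every access is attributable to a person rather than to an institution.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditLog:
    """Central audit trail: one entry per data access, naming the individual."""
    entries: list = field(default_factory=list)

    def record(self, username: str, action: str, record_id: str) -> None:
        self.entries.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "who": username,      # an individual user, not a shared institutional account
            "action": action,
            "record": record_id,
        })

log = AuditLog()
log.record("nurse.petrovic", "read", "patient-0042")
log.record("dr.jovanovic", "update", "patient-0042")

# Every entry now names the individual responsible for the access.
print([e["who"] for e in log.entries])
```

With a single shared account per institution, the "who" field could never be more specific than the institution itself, which is exactly the accountability gap described above.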

What should have been done?

Without doubt, this is an ICT system of special importance within which special categories of personal data are processed. As such, all measures stipulated by the Law on Information Security and the Law on Personal Data Protection must be undertaken during its development and implementation. SHARE Foundation explored these measures in great detail in its Guidebook on Personal Data Protection and Guidebook on ICT Systems of Special Importance.

In any case, it is necessary to fully implement the privacy by design and security by design principles, which entail the following regarding access to a system:

  • Every system user has their own access account
  • Every system user is authorised to process only the data necessary for their line of work
  • Access passwords are not shared over an open network
  • A password complexity standard is in place
  • The number of incorrect password entries is limited
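As a minimal sketch of how the last three principles might be enforced in an application layer (all names and thresholds here are hypothetical, not taken from the Covid-19 Information System):

```python
import re

MAX_FAILED_ATTEMPTS = 5  # hypothetical limit on incorrect password entries


def password_meets_standard(password: str) -> bool:
    """Example complexity standard: at least 12 characters, with
    upper- and lower-case letters, a digit and a symbol."""
    return (
        len(password) >= 12
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^a-zA-Z0-9]", password) is not None
    )


class UserAccount:
    """One account per user, so that actions in the system can be
    attributed to an individual rather than a whole institution."""

    def __init__(self, username: str):
        self.username = username
        self.failed_attempts = 0
        self.locked = False

    def record_failed_login(self) -> None:
        # Lock the account once the limit on incorrect entries is reached
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILED_ATTEMPTS:
            self.locked = True
```

Per-user accounts like this are what make individual responsibility for misuse determinable, which a single shared institutional account cannot provide.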

Our accidental discovery on Google revealed a breach of security and data protection standards within the health system. The state of emergency instituted due to the pandemic cannot serve as an excuse for a job poorly done, nor as an obstacle to an immediate and detailed analysis of the Covid-19 Information System’s compliance with security standards.

Read more:

Attachment: Data flow in the Covid-19 Information System

Facebook starts tracking electoral and political advertising in the Balkans

Facebook has announced that, starting in mid-March, it will expand its system for transparency and authenticity verification of ads about elections and politics to 32 additional countries, including Serbia and North Macedonia, where elections are to take place very soon.

This turn of events follows the efforts of SHARE Foundation and its international partners to point out to Facebook’s representatives that Western Balkans countries were excluded from those in which Facebook actively monitors political advertising. The issue is especially important in light of the election campaigns in Serbia and North Macedonia, given possible manipulations, the lack of transparency in ad funding and the use of non-political pages to advertise for political purposes.

Facebook, Inc. will thus expand the transparency of political advertising on its main social networking platform and on Instagram in these countries. Until now, such policies were implemented mainly in response to suspicions of foreign interference in the 2016 US presidential election and Brexit referendum. The Cambridge Analytica scandal, in which the data of tens of millions of citizens was leaked, and the state pressure that followed also pushed Facebook to improve the transparency of its platform.

The Facebook Ad Library will provide access to information on total advertising expenses and the number of ads, as well as data about specific ads, such as their demographic target group and geographic scope. To enable analysis of political advertising, Facebook will give researchers, journalists and the public access to the Ad Library API. In addition, by the end of April it will be possible to download a report with aggregated data on ads about elections and politics for the 32 new countries.
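For researchers, querying the Ad Library API amounts to calling a Graph API endpoint with an access token. The sketch below builds such a query in Python; the endpoint path, parameter names and fields follow Facebook’s public Ad Library API documentation at the time of writing, but should be treated as illustrative and checked against the current API reference:

```python
from urllib.parse import urlencode

# Graph API endpoint for the Ad Library (version is illustrative)
AD_ARCHIVE_URL = "https://graph.facebook.com/v6.0/ads_archive"


def build_ads_archive_query(access_token: str,
                            search_terms: str,
                            countries: list) -> str:
    """Build a query URL for political/issue ads reaching the given countries."""
    params = {
        "access_token": access_token,
        "search_terms": search_terms,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": ",".join(countries),
        # Fields of interest: who paid, spend range, demographic reach
        "fields": "page_name,funding_entity,spend,demographic_distribution",
    }
    return f"{AD_ARCHIVE_URL}?{urlencode(params)}"
```

A journalist monitoring the Serbian and North Macedonian campaigns could, for example, request ads reaching those two countries with `build_ads_archive_query(token, "elections", ["RS", "MK"])` and fetch the resulting URL with any HTTP client.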

All actors, including political parties, candidates and other organisations wishing to post ads about elections or politics on Facebook and Instagram, will be required to register as advertisers, so that it can be seen who paid for an advertisement. Advertisers must also confirm their identity with official documents issued by the state in which they wish to publish ads, and provide additional information such as a local address, telephone number, email and website if they wish to use the name of a Facebook page or organisation in the disclaimer. If they do not register, Facebook may restrict their posting of ads about politics and elections during the verification process.

SHARE files complaints against Facebook and Google

SHARE Foundation filed complaints with the Commissioner for Information of Public Importance and Personal Data Protection of Serbia against Facebook and Google for their failure to comply with the obligation to appoint representatives in Serbia for data protection issues. In May this year, before the new Serbian Law on Personal Data Protection began to apply, SHARE Foundation sent letters to 20 international companies calling upon them to appoint representatives in Serbia, in accordance with the new legal obligations.

Appointing representatives of these companies is not a formality: it is essential for Serbian citizens to exercise the rights prescribed by law. In the current circumstances, companies like Google and Facebook treat Serbia, like many other developing countries, as a territory for unregulated exploitation of citizens’ private data, even though Serbia harmonised its rules with the EU Digital Single Market by adopting the new Law on Personal Data Protection. These companies recognise Serbia as a relevant market, offer their services to citizens of the Republic of Serbia and monitor their activities. In the course of doing business they process large amounts of Serbian citizens’ data and make huge profits. The new law, on the other hand, guarantees citizens numerous rights in relation to such data processing, but at the moment exercising these rights would face many difficulties.

Among other things, these companies do not provide clear contact points for our citizens; they mostly offer application forms in a foreign language. Our experience has shown that such forms are inadequate, both because they require Serbian citizens to have an advanced knowledge of a foreign language and because this communication is mostly handled by programs that send generic automated responses.

Although the fine the Commissioner may impose under the domestic Law on Personal Data Protection, in this case 100,000 Serbian dinars (around $940 or €850), would not have a major impact on the budgets of these gigantic companies, we believe it would show that the competent authorities of the Republic of Serbia intend to protect our citizens and that these companies are not operating in accordance with domestic regulations.

Complaint against Google

Complaint against Facebook

Unlawful video surveillance with face recognition in Belgrade

The impact assessment of video surveillance on human rights conducted by the Serbian Ministry of Interior did not meet the legal requirements, and the installation of the system lacks basic transparency. The process should therefore be suspended immediately, and the authorities should engage in an inclusive public debate on the necessity, implications and conditions of such a system.

The installation of smart video surveillance in Belgrade, with thousands of cameras and face recognition software, has raised public concern. Three civil society organisations (CSOs) – SHARE Foundation, Partners for Democratic Change Serbia (Partners Serbia) and Belgrade Center for Security Policy (BCSP) – published a detailed analysis of the MoI’s Data Protection Impact Assessment (DPIA) on the use of smart video surveillance and concluded that the document meets neither the formal nor the material conditions required by the Law on Personal Data Protection in Serbia.

The Commissioner for Personal Data Protection of Serbia also published his opinion on the DPIA, confirming the findings of the aforementioned organisations. According to the Commissioner, the DPIA was not conducted in line with the requirements of the Law on Personal Data Protection.

The MoI’s DPIA missed the opportunity to address all issues of public interest, as well as the obligation to fulfil both the formal and material terms required by the Personal Data Protection Law. The DPIA does not meet the minimum legal requirements, especially in relation to smart video surveillance, which is the source of greatest interest and concern among the domestic and foreign public. The methodology and structure of the DPIA do not comply with the requirements of the Personal Data Protection Law. The positive effects on crime reduction described in the DPIA are overestimated, as relevant research and comparative practices have been used selectively. It has not been established that the use of smart video surveillance is necessary for public safety, or that the use of such invasive technology is proportionate, considering the risks to citizens’ rights and freedoms.

The MoI should suspend further introduction of smart video surveillance systems. In addition, the MoI and the Commissioner should initiate an inclusive public debate on video surveillance legislation and practice that will be in line with a charter on the democratic application of video surveillance in the European Union.

Policy brief – Serbian government is implementing unlawful video surveillance with face recognition in Belgrade

Open Letter: Facebook’s End-to-End Encryption Plans

4 October 2019

Dear Mr. Zuckerberg,

The organizations below write today to encourage you, in no uncertain terms, to continue increasing the end-to-end security across Facebook’s messaging services.

We have seen requests from the United States, United Kingdom, and Australian governments asking you to suspend these plans “until [Facebook] can guarantee the added privacy does not reduce public safety”. We believe they have this entirely backwards: each day that platforms do not support strong end-to-end security is another day that this data can be breached, mishandled, or otherwise obtained by powerful entities or rogue actors to exploit it.

Given the remarkable reach of Facebook’s messaging services, ensuring default end-to-end security will provide a substantial boon to worldwide communications freedom, to public safety, and to democratic values, and we urge you to proceed with your plans to encrypt messaging through Facebook products and services. We encourage you to resist calls to create so-called “backdoors” or “exceptional access” to the content of users’ messages, which will fundamentally weaken encryption and the privacy and security of all users.


Access Now
ACM US Technology Policy Committee
American Civil Liberties Union
Americans for Prosperity
Association for Progressive Communications (APC)
Asociación por los Derechos Civiles (ADC), Argentina
Bolo Bhi
Canadian Internet Registration Authority
Centro de Ensino e Pesquisa em Inovação (CEPI), FGV Direito SP, Brasil
Center for Democracy & Technology
Center for Studies on Freedom of Expression (CELE), Universidad de Palermo
Defending Rights & Dissent
Derechos Digitales, América Latina
Digital Rights Watch
Državljan D
Electronic Frontier Foundation
Electronic Privacy Information Center
Engine – for digital rights
Fight for the Future
Free Press
Freedom of the Press Foundation
Fundación Karisma, Colombia
Future of Privacy Forum
Global Forum for Media Development
Global Partners Digital
Hiperderecho, Peru
Human Rights Watch
Index on Censorship
Instituto de Referência em Internet e Sociedade (IRIS), Brazil
Instituto de Tecnologia e Sociedade do Rio de Janeiro (ITS)
International Media Support (IMS)
Internet Society
Internet Society – Bulgaria
Internet Society UK England Chapter
ISUR, Universidad del Rosario, Colombia
IT-Political Association of Denmark
Iuridicum Remedium, z.s.
LGBT Technology Partnership
National Coalition Against Censorship
New America’s Open Technology Institute
Open Rights Group
Paradigm Initiative
PEN America
Prostasia Foundation
R3D: Red en Defensa de los Derechos Digitales
Ranking Digital Rights
Restore The Fourth, Inc.
Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)
SHARE Foundation
S.T.O.P. – The Surveillance Technology Oversight Project

BIRN and SHARE Join Efforts to Counter Digital Freedom Violations

In Southern and Eastern Europe, where online disinformation campaigns increasingly endanger guaranteed individual freedoms and internet safety is in notable decline, BIRN Hub will partner with SHARE Foundation to monitor digital threats and emerging trends, raise awareness of violations of digital freedom and issue policy recommendations.

The organisations will identify the main players involved in disinformation and propaganda by establishing a Digital Monitoring database. The database will cover the state of digital rights in targeted countries by documenting cases of violations of digital rights and freedoms, with descriptions of cases and corresponding sources.

The project, supported by Civitates, will monitor digital freedom violations in Bosnia and Herzegovina, Croatia, Hungary, North Macedonia, Romania and Serbia.

The database will be part of the broader online BIRN Investigative Resource Desk (BIRD), a new resource platform for investigative journalists expected to launch this fall. The interactive database will allow the general public to access data collected through the monitoring system.

The use of SHARE Foundation’s expertise will result in the creation of a detailed methodology and guidelines for monitoring violations of digital rights and freedoms, as well as training for monitors to successfully gather data and file them in the newly created database. A three-day training for monitors will be held in the second half of July in Perast, Montenegro.

In parallel, BIRN journalists will produce and publish five investigations related to the topic. On the basis of monitoring activities, a one-of-a-kind cross-regional report will be produced, to be presented at the closing event.

The database will provide the data for periodic reports on the state of digital rights and freedoms in the targeted countries. The resulting cross-regional report will compile the collected data to introduce the public to trends in violations of digital freedoms.

Continuous monitoring and reporting on digital threats will contribute to BIRN’s wider efforts to promote accurate and unbiased information. It will strengthen the capacities and skills of the network’s journalists, as well as expose and counter the threats that journalists and other engaged individuals regularly face.