If the aim of social protection projects is to increase coverage and emergency protection, especially for the most vulnerable members of our communities, we think it’s important to take into account the well-documented problems associated with integrating automation and social registry databases – such as the risk of data being leaked, or used for political profiling, persecution, or discrimination. For example, where the World Bank finances automation, there are serious, well-known concerns about the maintenance and sustainability of systems which rely on up-to-date and accurate data, especially in countries where social protection services are severely under-resourced. Finally, where the World Bank ties digital and biometric verification to access to urgent cash payments or other protection needs, those who need support the most may be inadvertently excluded – a widely documented risk associated with such practices.
To find out more about some of the potential concerns and problems highlighted in the table above, here are further resources:
Does technology always improve social protection systems?
In some contexts, technology may contribute to accelerating processes, increasing efficiency, and providing important solutions in crisis and emergency situations. For example, implementing digital payment systems allowed agencies to disburse benefits without risking contact and transmission while countries were going into lock-down.
However, certain technologies also bring with them serious risks to fundamental rights, equality and accessibility. This is particularly concerning when important safeguards for privacy and data protection are not built in by design. These technologies also bring challenges such as costs to the environment, and, as development agencies have previously pointed out, increased financial cost, technical complexity, and challenges in relation to maintenance and sustainability.
Prior work by PI and our global partners has documented how the lack of careful consideration of the implications of increased digitalisation, automation, and intrusive data collection in social protection programmes is likely to result in the arbitrary surveillance, targeting, and profiling of those seeking assistance. Importantly, this may further hinder efforts to reach those most in need, leading to exclusion and further exacerbating existing inequalities.
The programmes we looked into for this piece raise similar concerns as to whether the introduction of the technology will improve the enjoyment of the rights to social protection, dignity, and autonomy, or whether new risks have emerged which have gone unaddressed and unmitigated.
For example, in Mozambique an independent observer analysed the emergency social protection eligibility criteria. They found that in many communities, people were frustrated by the lack of information on the programme, including who is eligible, how much they are entitled to, and how frequently the benefit is distributed. Additionally, throughout the roll-out of the cash-transfer programme, there were reports of people being threatened and harassed over the phone by individuals demanding that benefit claimants hand over the phones to which the benefit is linked. It is not clear how these individuals obtained beneficiaries’ phone numbers.
In Lebanon, SMEX found that the platforms used to sign up for COVAX vaccines, request permits to move around during lock-down and store sensitive information including passport numbers and home addresses, were not properly secured.
In Angola, in connection with a cash transfer-based social protection project supported by the World Bank, the government undertook a process of ‘data validation’, in which government teams visited specific neighbourhoods to validate provisional lists of beneficiaries which had been ‘systemically generated’. The government described these visits as an opportunity for people to learn whether their name is on the beneficiary list, but the eligibility criteria and the way these lists were generated remained undisclosed.
So, what does this have to do with surveillance, privacy, and equality?
Setting up systems which continuously collect massive amounts of personal data without implementing parallel human rights safeguards, including effective data protection, increases government agencies’ and third parties’ ability to surveil specific individuals and communities.
These risks are not theoretical or anecdotal; they reflect well-documented, systemic concerns with the use of data and technology in the design and deployment of social protection programmes. The incredible efforts of civil society, academics, and investigative journalists around the world have documented various examples evidencing the risks associated with digitalising social protection programmes. This includes exclusion and invasive, undignified surveillance of people in our communities who are often living through precarious conditions.
This is especially concerning in countries such as Haiti, Lebanon, Nigeria, Jordan, and Morocco, where political dissidents, human rights defenders, and journalists have faced repression, work under the threat of violence and arbitrary detention, and where there is limited respect for the rule of law. In these contexts, in order to protect their safety and security, activists and human rights defenders who may be targeted by security forces have a particular need to protect their personal and sensitive data. They should not be prevented from accessing urgent social protection because they are unwilling to provide biometric data, for example, or because they fear information-sharing between state agencies.
Additionally, PI has previously highlighted how minoritised communities – most notably women, trans and gender diverse people – are impacted by welfare surveillance and ID systems. It is important to note that people from persecuted or marginalised communities, and people who have been subjected to discrimination and racism, will often need urgent social protection in times of crisis. This includes LGBTIQ+ persons, people from ethnic minorities that have been subjected to persecution, victims of gender-based violence, people threatened with deportation (such as undocumented refugees and migrants), and people who need to access sensitive medical care such as abortions or HIV treatment. For example, in Malaysia, activists from the civil society group Justice For Sisters have highlighted that individuals from the LGBTIQ+ community are disproportionately impacted by poverty “due to criminalisation, social stigma and discrimination”. They therefore argue that systems which provide access to welfare must also protect the right to privacy in order to ensure LGBTIQ+ individuals are able to access welfare without increasing their vulnerability to persecution.
Biometric data collection and information-sharing across government agencies may disincentivise people from accessing much-needed social protection – particularly people who are facing extreme poverty, but at the same time, want to protect their identity from groups, institutions, or public officials that could use this data to cause harm or violate their rights. This has already been acknowledged by other development agencies such as GIZ, the German development agency.
Whilst the World Bank adopts an apolitical approach to its financing and priorities, we believe that accounting for the context in which these programmes are deployed is crucial. Not doing so fails to consider the power dynamics at play which should inform how to design and implement a project. This includes, for example, a good understanding of the communities at risk in certain contexts, and proper identification of the necessary, sometimes context-specific, safeguards required, depending on the regulatory and legal context and the role of the rule of law – which many of the safeguards built into these projects depend on in order to be effective.
Strengthening safeguards (1): due diligence, privacy, and human rights impact assessments
The World Bank has repeatedly incorporated references to data protection in its social protection work, acknowledging the importance of protecting the right to privacy in the context of progress towards more equal, fair, and human rights-based social contracts. Notably, based on the additional information provided by the World Bank’s Social Protection and Jobs team to PI, we also know that as part of any project planning and design process, the World Bank will undertake wide-ranging legal due diligence exercises, including around data protection legislation. In countries where the Bank considers that the data protection legislation in force conforms with international data protection standards, that legislation is deemed adequate. In countries where no data protection legislation has been implemented, the Bank will generally include contractual obligations requiring relevant government and implementing agencies to adhere to data protection standards.
While national legislative frameworks and contractual obligations implementing international data protection standards are certainly necessary, and represent an initial step towards strengthening people’s rights to control how their data is used and processed, we are concerned about whether these legal safeguards and contractual provisions are, in fact, enforced or enforceable in practice. For example, it is not generally clear which domestic authority within a state is responsible, or indeed accountable, when it comes to upholding these standards. Additionally, of the case studies we looked at, only Nigeria, Angola, Lebanon, and Morocco have implemented data protection legislation.
Strengthening safeguards (2): practical ways of mitigating risks to personal data and protecting human rights
Through our research, we also came across provisions within the World Bank’s project implementation documents which require grievance mechanisms to be set up as part of the social protection programmes. These are an essential tool for individuals and communities to be able to challenge decisions around eligibility or report mistreatment.
Whilst these measures are welcome, at the same time, based on our review of publicly available project implementation documents and the related social impact assessments for the countries we looked into, it is not clear whether or not the World Bank put in place the necessary mitigations or safeguards to limit harms which result from issues such as inter-agency unauthorised use of sensitive personal data, data leaks, algorithmic bias, inaccurate data on registry systems, lack of enforcement of data protection laws, lack of transparency around automated eligibility criteria, or the lack of alternative (non-digital) means of accessing social safety-net payments.
The hardship faced throughout the global Covid-19 pandemic by people who work within informal economies, those who are unemployed, and anyone who is undocumented in their state of residence uncovered severe gaps in governments’ ability to uphold people’s socio-economic rights to health, food, and housing. We recognise that the World Bank, along with other development banks, finances a diverse range of projects aimed at strengthening states’ infrastructure to deliver these rights – especially during crises. At the same time, our goal is to ensure that in the process of designing and implementing financing agreements which seek to solve these problems, we are not inadvertently laying the foundations for intrusive welfare systems.
Whilst we appreciate the urgency created by Covid-19 and the specific needs which emerged, the rationale and approach of many of these projects were not new. Organisations like the World Bank and others have been deploying social protection projects for decades. The safeguards we are calling for – which we have yet to see put into place – should have been part of the default approach to designing and implementing social protection projects, protecting people and their data from the outset. Had these safeguards already been built in by design and default, they would have provided a far better starting point from which to face and respond to the unprecedented socio-economic crisis which emerged with the Covid-19 pandemic.
They can get it right. It’s a choice.
Social protection should not have to come at the cost of people’s fundamental freedoms and rights to dignity and equality. We can build agile and resilient social protection systems without accepting disproportionate interferences with people’s fundamental rights as the price of doing so.
Safeguards already exist and can help mitigate the risks and concerns which have been documented: from undertaking comprehensive human rights due diligence, to enforcing existing legal and regulatory obligations – including data protection and equality laws. These are just a few examples of how development institutions can rein in the ambitions of governments and companies to surveil and exploit people and their data. As part of a broader, systematic approach to the governance of social protection programmes, we need to see these safeguards built in by design and default.
Unfortunately we have not seen international institutions implementing these safeguards, even prior to the pandemic. They should be ensuring that the benefits which may emerge from technological advancements are also designed to empower and serve individuals and communities. Technology must be a tool which advances people’s enjoyment of their fundamental rights and freedoms equally, freely and with dignity.
We must see entities like the World Bank (and other international agencies working on social protection and development) demonstrate how their approach and decision-making processes, in terms of design, financing, and technical assistance, are underpinned by a commitment to ensuring that social protection programmes serve the needs and realise the rights of all. In particular, they must serve the needs of the people these organisations aim to assist: “the poorest and most vulnerable”. This includes ensuring that the governments they work with are committed to upholding their obligations to protect and respect people and their rights, in addition to taking proactive steps to progressively realise these rights.