PI’s response to DCMS’ consultation on data protection reform in the UK.


Now is the time to strengthen, not weaken, data protection to keep us all safe. Here we outline some areas of our consultation response that highlight the impact of the proposed loss or weakening of many important protections:

Broadening Consent and Purpose Limitation:

PI urges caution with regard to provisions that would undermine the strict conditions around obtaining consent. The GDPR placed stronger conditions on obtaining consent, and in our work we have seen how various actors constantly seek to undermine them. Introducing concepts such as “general” or “broad” consent risks resulting in people’s (sensitive) personal data being used for purposes that go far beyond what they could originally have foreseen.

PI has investigated this issue extensively, and we are shocked at how intrusive and harmful data collection has become under the cover of “consent”, including the collection of health data that could, under this proposal, be characterised as “scientific research”. For example, see Your Mental Health For Sale and An Unhealthy Diet of Targeted Ads.
PI’s investigation, No Body’s Business But Mine: How Menstruation Apps Are Sharing Your Data, found that several apps were sharing sensitive health data with third parties, a practice that was not made explicit in their privacy policies.

Purpose limitation is one of the core principles of data protection law. Application of the principle ought to consider the factors listed in Article 6(4). The question of purpose limitation is intrinsically linked to what individuals can reasonably expect to be done with their personal data.

For example, in the complaints that PI filed against Clearview AI together with three other organisations across the EU, we demonstrated that the re-use of even publicly available personal data, such as facial images posted on social media or websites, for processing in a biometric database clearly falls outside such expectations.

UPDATE: Since we submitted these complaints, several data protection authorities have agreed with us: the ICO issued a £7.5 million fine in the UK, and the Italian data protection authority issued a €20 million fine.

Removing the “balancing exercise” for applying legitimate interests:

Legitimate interests of the controller or a third party may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding. Using this legal basis fundamentally requires controllers to carry out a balancing exercise between the specific interests they seek to protect and the impact of the processing on the data subject’s rights and freedoms.

Essentially, the balancing exercise lies at the heart of using legitimate interests as a legal basis for processing personal data. Removing the balancing exercise will in most cases result in processing operations that have a disproportionate or onerous impact on data subjects’ rights.

In its submission before the ICO, PI illustrated how the legitimate interests legal basis ensures fair processing of individuals’ personal data, and how facial recognition companies often abuse it by failing to take into consideration the implications of their processing operations for data subjects’ rights, or by engaging in disproportionate data exploitation practices.

AI and Machine Learning: Removing Article 22 of the UK GDPR

The importance of the protection afforded by Article 22 of the UK GDPR cannot be overstated, and the suggestion that it be removed constitutes a grave threat to individuals. Article 22 is designed to guard against the risks of automated decision-making. These risks are identified by the ICO as follows:

– Profiling is often invisible to individuals.

– People might not expect their personal information to be used in this way.

– People might not understand how the process works or how it can affect them.

– The decisions taken may lead to significant adverse effects for some people.

In relation to the risk of errors, the government must consider that Article 22 exists to guard against mistakes that could be time-consuming and costly to government entities. Here we present an example from PI’s report, “Benefitting whom? An overview of companies profiting from “digital welfare””:

“We filed a series of FOI requests to four London councils (Ealing, Islington, Camden, Croydon) concerning the London Counter Fraud Hub, a system designed by the Chartered Institute of Public Finance & Accountancy (CIPFA) to detect fraud in applications for the council tax single person discount. The system was meant to process large amounts of data to identify fraudsters. It was a cause of great concern when it was first revealed in the media. With growing discussion of algorithmic bias and the revelation that the system had a 20% failure rate, many feared they would see their benefits cut unfairly.”

Introducing a fee for Subject Access Requests:

This would be very problematic, particularly for gig-economy workers and other lower-income individuals. It is only through subject access requests that they are able to obtain the information that companies like Deliveroo, Uber and Amazon collect about them. Data collection by delivery companies is very opaque: workers are not told how much data is collected about them or how this data is later used. The only way for them to protect themselves is through data subject access requests. Given that gig-economy workers tend to be much lower paid, fees for data subject access requests would have a significant negative impact on their ability to protect their rights. This is especially concerning in light of the inherent power imbalance between delivery platforms/employers and their workers. See PI’s case study, The Gig Economy and Exploitation.

Removing consent for analytics cookies:

We disagree with the framing of the proposal, which presents analytics cookies as harmless and consent notifications as merely bothersome for users.

“Analytics cookies and similar technologies” are currently a gateway to personal data collection and processing for micro-targeted advertising, and much more. PI’s research into data collection by mental health websites revealed that answers to depression tests were shared with third parties as a result of these technologies being blindly deployed, without a real assessment of how much data they collect or for what purpose. Our investigation into diet ads online revealed similar issues.

Given the complexity of online advertising and its heavy reliance on tracking and other invasive data collection processes, removing the need for consent would open the door to indiscriminate surveillance by private companies. Our devices and the web are already full of tracking and spying technologies, and consent is currently the only protection users have at their disposal to limit, however imperfectly, how they are tracked and monitored.

The question posed here should not be about removing consent requirements, but rather about what can be done to rein in such gratuitous data collection in the first place.




