This piece is part of a collection of research demonstrating how data-intensive systems built to deliver reproductive and maternal healthcare are not adequately prioritising equality and privacy.
Digital health apps of all kinds are being used by people to better understand their bodies, their fertility, and to access health information. But there are concerns that the information people both knowingly and unknowingly provide to these apps, which can be very personal health information, can be exploited in unexpected ways.
Apps that support women through pregnancy are one example where data privacy concerns are brought sharply into the spotlight.
There are more pregnancy apps than for any other medical topic. And there is some evidence that apps may be a useful part of maternal and reproductive healthcare. For example, one study found that an app used as part of a larger telehealth intervention improved asthma control in pregnancy, and another found that a smartphone-based reminder system was helpful in the management of post-natal urinary incontinence.
However, these apps were both part of specific studies and specifically designed medical interventions. That's not true of most of the apps you might come across in the wild, nor of the increasing number of apps being developed and released by private companies, charities, and even the UN.
A Wired report revealed that pregnancy apps have serious content and disinformation problems, but the problems don't stop there – PI has found persistent issues with such apps, including reproductive cycle apps, ranging from data sharing practices to basic access to certain apps.
PI has conducted its own research into data privacy and smartphone apps before. For example, when we looked at some of the most popular apps in the world, we found that 61% transferred data to Facebook the minute the user opened the app. When we turned our attention to menstruation-tracking apps, we found that many of the companies we looked at did not take adequate precautions with the health data that people enter into the apps. In fact, two shared extensive and deeply personal sensitive data with third parties, including Facebook.
Apps that track women’s fertility cycles and provide reproductive health information have proliferated over the past few years. The Flo menstruation app alone has a global user base of 200 million.
These include apps like Flo that aim to give users visibility into their menstrual cycle, apps that aim to take the place of or complement contraception, apps that aim to give people heightened visibility into their fertility, apps that give you information on your foetus' development, and beyond.
So what data privacy protections are these apps and the companies developing them putting into place to protect the oftentimes very personal information users enter into the app? And what sort of data are these apps collecting, without users necessarily being aware?
It's not a surprise that the proliferation of reproductive health apps has reportedly resulted in some apps providing subpar content. It has been reported that some of the pregnancy apps you are most likely to find in the wild have a serious content problem. A recent Wired investigation found them to be "a fantasy-land-cum-horror-show, providing little realistic information about the journey to parenthood. They capitalize on the excitement and anxiety of moms-to-be, peddling unrealistic expectations and even outright disinformation to sell ads and keep users engaged". The report even concluded that "they are yet another way the internet and America's health care system are failing pregnant people".
Many of the currently available apps are created by companies that exist not to support pregnant people, but to make money. And one way to make money from apps is to collect and share or sell users' personal data. Content quality suffers too: 72% of the apps reviewed by one study did not cite any actual medical literature at all.
Nina Jankowicz, the disinformation researcher who wrote the Wired report, found this to be the case in her research. For example, one of the apps she gave her email address to, a mandatory step to access the app, subscribed her to emails from "Pottery Barn Kids" which she couldn't unsubscribe from. This could mean that Nina's email address and other information about her was shared with Pottery Barn Kids, and potentially others, without her knowledge.
Below we have looked at two reproductive health apps to understand the potential concerns arising from how they treat users' personal data and the overall design of such services.
Badger Notes is a UK maternity app that allows women to view their medical notes, self-refer to the maternity department, and more.
Badgernet is intended to form part of the maternity care pathway and is used both by doctors and patients. It includes the Badger Notes app, which allows a person to see their future appointments, postnatal notes, a summary of the baby's care, and more.
It's designed to be used by both the maternity care team and pregnant women, and it is directly involved in the provision of medical care. As of 2018, Badgernet was being used in 250 hospitals around the UK.
In 2018, Clevermed – the company which owns and operates Badgernet – was surprisingly candid about the risks of its system.
Describing the period when the system ran on an older broadband solution, managing director Peter Badger said:
“We would have situations where we had to do a national system update, which involved downloading about 50MB to 10,000 workstations at once, and it would flat-line the bandwidth for three or four hours at a time, and nobody could select a patient to view until everybody had downloaded the updates.”
According to a 2018 report by Computer Weekly: “In that three- to four-hour window, clinicians were unable to access patient records. In an acute obstetric scenario where, for example, this data might be vital in determining whether or not to deliver a baby via an emergency caesarean section, this could result in unforeseen challenges for the clinical team, and potential risks to the patient.”
That's a terrifying potential consequence of a platform like Badgernet. Though the NHS's old, bandwidth-limited broadband system has since been upgraded, the fact that updates to the Badgernet platform might have put women and their babies at risk is a concern.
Another potential concern with Badgernet is that the platform can hold very personal information about people.
The information shared with Badgernet is health information, which is a special category of data under UK law and requires extra protections.
The ICO explains on its website that a Data Protection Officer (DPO) must be appointed if an organisation's core activities consist of large-scale processing of special categories of data, such as health data.