Mind your own business: menstruation tracking apps and Roe v Wade

Posted on 9 November 2022

Millions use menstruation tracking apps worldwide, and the decision to overturn Roe v Wade has increased the risk to many users of these apps.

In this post, we discuss how the way technology funding is secured in 2022 encourages app developers to undermine the privacy of menstruation tracking app users. We consider the underlying structures which put pressure on app developers to share user data, the inconsistent promises made in privacy policies, and the impact this has on user privacy. We also offer some suggestions for app developers, and for users of menstruation tracking apps who want to better protect their privacy.

The value of menstruation tracking apps

Menstruation tracking apps claim to have a meaningful impact on their users' health far beyond simply monitoring the menstruation cycle itself. Nutritional and exercise plans, as well as medication schedules, that are sensitive to a user's cycle aim to be more effective than equivalent plans which ignore it.

Those using menstruation tracking apps trust app developers with information that they’d usually only give to a medical professional or write in a diary. You can trust your diary not to work with third parties to nudge you into spending more time writing in it, but the same isn't always true of an app. Menstruation tracking apps have come under much more scrutiny since the decision to overturn Roe v Wade – and rightly so. A key concern following the decision is that law enforcement in US states which have banned abortion could demand that the companies running apps (or their embedded trackers – more on that later) hand over user data as part of investigating whether somebody has unlawfully had an abortion.

Most of the apps we reviewed when writing this article make two claims: (i) we don't share your data with third parties, and (ii) we will never sell your data. Let's break these statements down:

Establishing the issue: we don't share your data with third parties

There's a disconnect between what developers mean when they say "We don't share your data with third parties", and what most people would understand that statement to mean.

Most users would assume this statement means their data is only held by the company that’s made the app. But the Amazon, Microsoft or Google servers where your data is stored, and which developers need in order to run their services, typically aren't treated as third parties. We also regularly see mobile apps where this statement appears in a template privacy policy while the provider is using all sorts of embedded cookies and software development kits from third parties – the promise about not sharing data has simply been left in the template.

Establishing the issue: we will never sell your data (somebody else will)

We will never sell your data. When this statement is made, it's likely that the developers genuinely don't intend to sell their users' data. Where they often get things wrong is in failing to adequately look into the other businesses they work with, who frequently do sell data. A key issue here is the third-party tracking technology integrated into health apps.

App developers are often under pressure to secure funding, and key talking points when pitching for investment are how "engaged" users are with the app, and how the user base is growing. To obtain the stats for user engagement and growth, apps regularly build in technology from third parties to generate information such as how long the menstruation tracking app is being used for, when the user has logged their cycle for a whole month, and the user's approximate – or sometimes precise – location. The tracking technology also enables app developers to understand their user base in extreme detail, so they can make tweaks to their apps, making them easier to use and hopefully preventing users from trying out different apps instead.

The integrated tracking technology is a particular problem from a privacy perspective when information gleaned about you – or your device – is combined with other information. If a menstruation tracking app integrates with a social network, for example, that network may well know that you haven't had a period for the last two months. Start searching for morning sickness advice on websites that share information with the same social network, and this information can be combined with your app analytics data to profile you as someone who is likely to be pregnant. Next thing you know, you'll be receiving targeted advertising for pregnancy support. A key issue here is that even though the developer may not 'sell' your data directly, the trackers they are integrating into their apps DO sell, or at the very least share, data. There is real and obvious potential for harm here – particularly, for example, following a miscarriage.
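
To make this concrete, here is a minimal, entirely hypothetical sketch (in Kotlin) of the kind of event an embedded tracker typically receives. The AnalyticsSdk interface, event name and properties below are our own illustrative assumptions rather than any real vendor's API – the point is how routine "engagement" telemetry can encode sensitive health signals.

    // Illustrative sketch only: "AnalyticsSdk" stands in for the kind of third-party
    // tracking SDK described above; it is not any real vendor's API.
    interface AnalyticsSdk {
        // A common pattern: a named event plus a free-form map of properties,
        // forwarded to the tracker's own servers.
        fun logEvent(name: String, properties: Map<String, Any>)
    }

    class CycleTracker(private val analytics: AnalyticsSdk) {

        // The sort of call an app might make to demonstrate "engagement" to investors.
        fun onCycleLogged(daysSinceLastPeriod: Int, coarseLocation: String, sessionSeconds: Int) {
            analytics.logEvent(
                name = "cycle_logged",
                properties = mapOf(
                    // Each property looks harmless in isolation...
                    "days_since_last_period" to daysSinceLastPeriod,
                    "coarse_location" to coarseLocation,
                    "session_length_seconds" to sessionSeconds
                )
            )
            // ...but once a tracker joins these events with data from other apps and
            // websites, "no period logged for two months" becomes a pregnancy inference.
        }
    }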

The ability to challenge law enforcement requests

This type of tracking and data sharing becomes even more of an issue if the third-party trackers don't have a particularly robust stance on law enforcement requests. Even if the company that develops your app won't share data with law enforcement without a subpoena, that commitment doesn't mean very much if the trackers embedded in the app are happy to hand over the same data to save the time and cost of resisting a request.

The practice of integrating trackers into apps has been happening for years – but the stakes are inevitably higher when dealing with health data, and they have been raised again by the overturning of Roe v Wade. Following the decision, there is a compelling moral imperative for developers to fully understand the trackers they're integrating into their apps. The decision of whether to share user data isn't just in the hands of the company that made the menstruation tracking app – it rests with every technical stakeholder in the chain, from the company hosting your data right the way through to the trackers.

It's not just the developers at fault here though – there is an entire system of moving parts which all contribute to the risk to user privacy. It's the investors who love to see a growing and engaged user base, the developers who need investment and want to keep customers, and the trackers with vast sales teams of people expertly trained to explain how useful their technology is (but who often give short, template answers which fail to adequately answer questions about the privacy issues associated with their business model).

Suggestions for developers

  1. Restrict what data you collect from the user. Remove registration requirements, don't collect identifiers from your users, and give them control over how long their data is retained for.
  2. Design your systems such that even if you were asked to hand over user data, you wouldn't be able to hand over anything meaningful – see the sketch after this list for one way this can look.
  3. Don't include third party trackers in your app.
  4. If you're including trackers in your app, limit your providers to privacy friendly options which aren't US companies and don't store data in the US. Make sure you're being transparent with users about this, and seriously consider what data you're transferring to the trackers. Many trackers let developers choose exactly what data they're allowed to collect, and how that data is categorised. One menstruation tracking app integrated Facebook's tracking technology and configured the app to report to Facebook when the user was pregnant – don’t be like that developer.
  5. Avoid operating your app from a US company, and store encrypted user data either locally on the user's device, or in privacy friendly clouds.
  6. Spend time considering your position on law enforcement requests and publish it.
  7. Build a narrative of privacy and user trust into your pitch decks and conversations with investors.
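
As a rough illustration of points 2 and 5, here is a minimal sketch in Kotlin using the standard javax.crypto APIs: cycle entries are kept encrypted on the device with a locally held key, so a demand made to the developer can't yield anything readable. The class and method names are our own, and real key management – on Android, for example, a platform keystore – is deliberately glossed over.

    import java.io.File
    import java.security.SecureRandom
    import javax.crypto.Cipher
    import javax.crypto.KeyGenerator
    import javax.crypto.SecretKey
    import javax.crypto.spec.GCMParameterSpec

    // Sketch of suggestions 2 and 5: cycle entries never leave the device unencrypted,
    // and the developer never holds the key needed to read them.
    class LocalCycleStore(private val file: File, private val key: SecretKey) {

        fun save(entryJson: String) {
            val cipher = Cipher.getInstance("AES/GCM/NoPadding")
            cipher.init(Cipher.ENCRYPT_MODE, key)   // a fresh random IV is generated here
            val ciphertext = cipher.doFinal(entryJson.toByteArray(Charsets.UTF_8))
            // Store the IV alongside the ciphertext; both are needed to decrypt.
            file.writeBytes(cipher.iv + ciphertext)
        }

        fun load(): String {
            val bytes = file.readBytes()
            val iv = bytes.copyOfRange(0, 12)                 // GCM IV is 12 bytes here
            val ciphertext = bytes.copyOfRange(12, bytes.size)
            val cipher = Cipher.getInstance("AES/GCM/NoPadding")
            cipher.init(Cipher.DECRYPT_MODE, key, GCMParameterSpec(128, iv))
            return cipher.doFinal(ciphertext).toString(Charsets.UTF_8)
        }

        companion object {
            // Generate a 256-bit AES key; in a real app this would be created once and
            // kept in secure, device-local storage rather than anywhere the developer can reach.
            fun newKey(): SecretKey =
                KeyGenerator.getInstance("AES").apply { init(256, SecureRandom()) }.generateKey()
        }
    }

A design like this doesn't remove the need to think about trackers – if the same entries are also handed to an analytics SDK, the local encryption achieves little – but it does mean a subpoena served on the developer turns up ciphertext rather than a user's health history.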

Suggestions for users

Given what's been discussed so far, it's unreasonable to suggest that the app users are the ones who should bear responsibility for fixing things – and that isn't the intention here. There are a few things users can do to either better understand how their data is used, or start to put pressure on the app developers:

  1. Some brief online research will often give a sense of how well an app handles its users' data. This isn't a complete fix to the issue, but it can help you avoid the known ‘privacy unfriendly’ options.
  2. You could install a VPN app designed to pick up and block trackers, such as Lockdown, and use that to test the menstruation tracking app to see if it's loaded with trackers. This is an additional admin burden for sure, and these apps often drain your battery quickly and have privacy concerns of their own, but they can be an effective tool to understand what's really going on with other apps.
  3. Be vocal. Contact customer support at the apps you use, complain when they don't get it right, and post on Twitter if they ignore you. Falling user engagement and consistent complaints are two of the most effective ways to spark change.

This post barely scratches the surface in terms of the relevant issues here. But overarching all of this is a system that encourages app developers to focus on user engagement and growth at the expense of privacy.

For apps which collect only small amounts of data, the risk is lower. But for menstruation tracking apps, the cost of inadequately considering privacy is much higher than in most other sectors. We've already started to see some positive steps being taken by menstruation tracking apps, but generally only in response to user backlash sparked by a dedicated few who've really dug into an app's practices and gone public with their findings. With the overturning of Roe v Wade, the need to do better has never been more acute.
