You Probably Shouldn’t Follow Medical Advice From Your Period Tracker
By Hannah Smothers | Mon, 28 Oct 2019
For a relatively simple thing, period tracking apps are riddled with controversy; in the past few years, they’ve been accused of selling private health data to Facebook, taking funding from anti-abortion sources, and causing scores of unintended pregnancies. Now a new report from the New York Times identifies yet another flaw in two period trackers that are trying to mine their piles of data to predict health problems: Both Clue and Flo seem to be leading a disproportionate number of users to believe they have polycystic ovary syndrome, or PCOS, thanks to faulty screening methods and thin science.
Clue and Flo recently introduced tools to evaluate users’ risk for PCOS, a hormonal imbalance disorder that often causes irregular periods. But as the New York Times reported on Sunday, neither app performed high-level clinical studies to test the accuracy of its PCOS assessment, nor did either properly study the potential for over-diagnosis. There are other issues, too: Clue relied on virtual patients, rather than human ones, to test its assessment tool; Flo neglected to ask about eating disorders and workout routines (two things that affect menstrual regularity); and Flo’s questions were found to be worded in a way that pushes users toward false positives.
Because of these problems, an inordinately high percentage of users have been told they may have PCOS when they’re actually either completely healthy or just have a benign hormonal imbalance. Clue’s developer didn’t disclose the number of users the app suspected of having PCOS, but Flo reportedly recommended that about 38 percent of users who completed the health assessment see their doctor about PCOS. For context, studies estimate the prevalence of PCOS at somewhere between 4.6 and 8 percent.
The issues with Clue and Flo’s proprietary health assessments are indicative of a growing problem in digital health tracking: As the New York Times pointed out, the FDA generally doesn’t have to vet the effectiveness of consumer apps or tools that merely make suggestions and aren’t used for actual diagnostics. While apps that rely on user-reported data may be well-suited to help a person predict when their next period will be, for example, they’re still in the wild-west stage when it comes to suggesting health risks or outcomes. Aside from the stress and chaos of a false positive (women the New York Times spoke with said they were briefly concerned they’d be unable to get pregnant), this can lead to unnecessary testing and associated costs—especially with something like PCOS, a disorder that’s extremely difficult to diagnose and doesn’t have a totally agreed-upon set of diagnostic criteria.
Even when the FDA does grant approval to consumer apps and tools, major flaws can remain that are less likely to exist in a doctor’s office. Earlier this year, several studies found problems with the results delivered by at-home DNA kits like 23andMe and Ancestry, which were found to produce a high number of false positives for sometimes devastating health conditions. While the FDA authorized direct-to-consumer (DTC) kits to report on certain mutations in the BRCA breast cancer genes in 2018, it did so with many qualifiers—signaling that a spit tube isn’t a replacement for in-office medical testing. Experts also question the ability of the Apple Watch’s ECG feature to warn people about atrial fibrillation (AFib), even though Apple said it worked with the FDA to develop the feature.
As more consumer apps and services join the extremely popular game of using their mounds of data to push users toward healthcare, the scrutiny applied by the FDA should arguably be more severe. To be fair, the existing brands aren’t suggesting they can be used in lieu of a doctor; they’re merely suggesting that, Hey, maybe you should see one? But tools that ultimately funnel people into the healthcare system and (potentially) into medical debt shouldn’t be able to skate by with thin evidence of effectiveness just because they’re making suggestions instead of firm diagnoses. If consumer apps are going to market themselves on their technology’s ability to tell you about your own health, based on the data you fork over, the recommendations should be as valid as possible. Right now, it certainly appears that they’re not.