If the world of cybersecurity seems like a free-for-all without rules or grown-ups, that’s because it kind of is. The United States doesn’t have a federal data-privacy law. But the Federal Trade Commission (FTC), a government agency, can fine companies that lie or are deceptive about how they use data. (Remember the Facebook Cambridge Analytica scandal? The FTC was able to serve some justice on that.) Maneesha Mithal, the agency’s associate director of the Division of Privacy and Identity Protection, explains how the FTC is holding companies accountable and what still needs to be done. (Ahem, pass some legislation!)
Marie Claire: How does the FTC regulate privacy?
Maneesha Mithal: We sue companies that don’t do enough to protect your data. We’ve brought cases against Uber, the credit bureau Equifax, and hotel chain Wyndham for failing to adequately protect their customers’ sensitive data, failures that led to large-scale breaches. In the Equifax case alone, we said that the company’s lax practices let hackers access the Social Security numbers and other data of more than 147 million people. We’ve also charged companies with making false claims about how they collect, use, and share data. For example, we charged Facebook with sharing people’s information with certain apps, which violated the privacy settings those people had chosen. We obtained a $5 billion penalty from the company [in 2019]—the largest privacy penalty ever—and imposed major changes in how the company approaches privacy.
MC: Research supports the notion that data-privacy concerns disproportionately impact women, people of color, and other minorities. How can we better protect these populations?
MM: To protect women’s privacy, we’ve brought cases against websites that post revenge porn, stalkerware apps (which abusers install on the phones of their unsuspecting victims), and apps that secretly sell your location or health data to others. As for protecting minority populations, one practice we’ve tried to address is algorithmic discrimination. Companies can amass large data sets and apply algorithms to detect patterns. Those patterns, in turn, can give them valuable insights into their customers or job applicants. While these algorithms can help companies spot systemic biases, they can also lead to discrimination. We’ve urged companies to test their algorithms rigorously to eliminate unlawful bias; if they don’t, they could be subject to a fine.
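Mithal’s point about rigorously testing algorithms can be made concrete with one simple check. The sketch below, in Python, compares a model’s favorable-outcome rate across demographic groups—a basic “disparate impact” test. All of the data, group labels, and function names here are illustrative assumptions, not a method the FTC prescribes; the 0.8 threshold alludes to the informal “four-fifths rule” used in some employment contexts.

```python
def selection_rates(outcomes, groups):
    """Rate of favorable outcomes (1) per demographic group."""
    totals, favorable = {}, {}
    for outcome, group in zip(outcomes, groups):
        totals[group] = totals.get(group, 0) + 1
        favorable[group] = favorable.get(group, 0) + outcome
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Lowest group selection rate divided by the highest.
    Values well below ~0.8 are often treated as a red flag."""
    return min(rates.values()) / max(rates.values())

# Hypothetical hiring-model decisions (1 = advance, 0 = reject)
outcomes = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rates = selection_rates(outcomes, groups)
print(rates)                          # {'A': 0.6, 'B': 0.4}
print(disparate_impact_ratio(rates))  # ~0.67, below the 0.8 flag
```

In this toy data set, group B is advanced at two-thirds the rate of group A, which is exactly the kind of pattern a rigorous audit would surface before a regulator does.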
MC: What do you think is the most pervasive privacy issue today?
MM: Important privacy issues have come to the forefront as a result of the coronavirus pandemic. For example, we’re hearing concerns about the privacy of videoconferencing services, which we’ve all come to rely on. We’ve put out consumer education on how to make sure unauthorized people don’t show up at your meetings, how to make sure your video and audio are not inadvertently on, and how to avoid security problems by not clicking on unknown links.
More broadly, we’ve seen concerns about the sharing of health data to facilitate contact tracing. We need to balance our privacy with the need to use data to improve health outcomes. In using consumer data for public health purposes, we’ve urged companies to implement privacy-protective technologies, use anonymized, aggregated data where possible, and delete the data when the pandemic is over.