Californians will soon have sweeping digital-privacy rights stronger than any seen before in the U.S., posing a significant challenge to Big Tech and the data economy it helped create.
So long as state residents don’t mind shouldering much of the burden of exercising those rights, that is.
Come Wednesday, Jan. 1, roughly one in 10 Americans will gain the power to review their personal information collected by large companies around the world, from purchase histories and location tracking to compiled “profiles” that slot people into categories such as religion, ethnicity and sexual orientation. They can also force these companies — including banks, retailers and, of course, tech companies — to stop selling that information or even to delete it in bulk.
The law defines data sales so broadly that it covers almost any information sharing that provides a benefit to a business, including data transfers between corporate affiliates and with third-party “data brokers” — middlemen who trade in personal information.
It remains unclear how the law will affect the business of targeted advertising, in which companies like Facebook amass reams of personal data and use it to direct ads to specific groups of people. Facebook says it doesn’t share that personal information with advertisers.
Still, because it applies to any company that meets a threshold for interacting with state residents, the California law might end up serving as a de facto national standard. Early signs of compliance have already started cropping up in the form of “Don’t sell my personal information” links at the bottom of many corporate websites.
“If we do this right in California,” says California Attorney General Xavier Becerra, the state will “put the capital P back into privacy for all Americans.”
California’s law is the biggest U.S. effort yet to confront “surveillance capitalism,” the business of profiting from the data that most Americans give up — often unknowingly — for access to free and often ad-supported services. The law is for anyone who was ever weirded out when an ad popped up for a product they had just searched for, or who wondered just how much privacy they were giving up by signing into the briefly popular face-changing tool FaceApp.
But there are catches galore. The law — formally known as the California Consumer Privacy Act, or CCPA — seems likely to draw legal challenges, some of which could raise constitutional objections over its broad scope. It’s also filled with exceptions that could turn some seemingly broad protections into coarse sieves, and it affects only information collected by businesses, not government.
For instance, if you’re alarmed after examining the data that Lyft holds on you, you can ask the company to delete it. Which it will legally have to do — unless it claims some information meets one of the law’s many exceptions, among them provisions that allow companies to continue holding information needed to finish a transaction or to keep it in a way you’d “reasonably expect” them to.
“It’s more of a ‘right to request and hope for deletion,’” says Joseph Jerome, a policy director at privacy group Common Sense Media/Kids Action.
A more fundamental issue, though, is that Californians are largely on their own in figuring out how to make use of their new rights. To make the law effective, they’ll need to take the initiative to opt out of data sales, request their own information, and file for damages in the case of data breaches.
“If you aren’t even reading privacy agreements that you are signing, are you really going to request your data?” asks Margot Kaminski, an associate professor of law at the University of Colorado who studies law and technology. “Will you understand it or sift through it when you do get it?”
State residents who do make that effort, but find that companies reject their requests or offer only halting and incomplete responses, have no immediate legal recourse. The CCPA defers enforcement action to the state attorney general, who won’t be empowered to act until six months after the law takes effect.
When the state does take action, though, it can fine businesses up to $7,500 for each violation of the law — charges that could quickly add up depending on how many people are affected.
The law does offer stronger protection for children, for instance by forbidding the sale of data from kids under 16 without consent. “The last thing you want is for any company to think that we’re going to go soft on letting you misuse kids’ personal information,” Becerra, the attorney general, said at a press conference in December.
Many of the CCPA’s quirks trace back to the roundabout way it became law in the first place. A few years ago, San Francisco real estate developer Alastair Mactaggart asked a friend who worked at a tech company if he should be concerned about news reports on how much companies knew about him. He expected an innocuous answer.
“If you knew how much we knew about you, you’d be terrified,” he says the friend told him.
With help, Mactaggart produced a ballot initiative that would let California voters implement new privacy rules. Although initially a long shot, the proposal quickly gained steam amid news of huge data breaches and privacy leaks.
That drew the attention of Silicon Valley, whose big companies considered the ballot initiative too risky. Moving the proposal into the normal legislative process would give them influence, the chance to pass amendments, and above all time to slow down what seemed to be a runaway train.
“I always knew I was signing up for a fight,” Mactaggart says.
The developer agreed to pull the initiative off the ballot and have it introduced as a bill. In slightly changed — or weakened, per critics — form, it passed. Gone, for instance, was a provision that would have allowed people to sue when companies improperly declined to hand over or delete data.
The coming year will provide the first evidence of how much protection the CCPA actually offers — and how thoroughly Californians will embrace it.
Among other limitations, the law doesn’t really stop companies from collecting personal information or limit how they store it. If you ask a company to delete your data, it can start collecting it again next time you do business with it.
Mary Stone Ross, incoming associate director of the Electronic Privacy Information Center and co-author of the original ballot initiative, worries that CCPA might just unleash a firehose of data on consumers. “A business could actually drown a consumer in information so the important pieces are lost,” she says.
There’s a way to avoid that: ask only which categories of information a company holds, such as demographics, preferences or interests. But it’s not clear how many people will know to do that.
The law’s biggest impact, in fact, may lie in how it requires companies to track what data they have, where they keep it, and how to get it to people when requested, says Jen King, director of consumer privacy at Stanford Law School’s Center for Internet and Society. That effort alone, which can be substantial, might cause corporations to reconsider how much data they decide to hold onto.
That may lead to some unintended consequences and even corporate attempts to discourage people from using the law. The job-search site Indeed.com, for instance, now explains that when anyone opts out of data sales under CCPA, it will also ask them to delete their associated accounts and all personal information.
Such people will still be able to use the website without logging in. Indeed said in a statement that it routinely transfers personal information such as job-seeker resumes to employers as part of its service. Because it believes those transfers may qualify as “sales” under the CCPA, Indeed will not hold such information for people who opt out of data sales under the law.