California Enforces Privacy Rules, France Sets Priorities, FTC Warns on AI Training
By Robert Bateman and Privado.ai
In this week’s Privacy Corner Newsletter:
California privacy agency wins appeal: CPRA regulations are now enforceable
The California Privacy Protection Agency (CPPA) has successfully appealed against a decision by the Sacramento Superior Court to delay the enforcement of regulations promulgated under the California Privacy Rights Act (the “CPRA regulations”).
⇒ What’s the background to this case?
The CPRA amended the CCPA in several important ways. Most relevant here, it required the CPPA to adopt regulations covering 15 specified areas.
The law said that the CPPA’s “timeline” for making these regulations was July 1, 2022, and that the regulations would be enforceable on July 1, 2023.
⇒ Did the CPPA make the regulations as required?
The CPPA has finalized regulations covering only 12 of the 15 areas specified in the CPRA. And most importantly, those regulations were approved nine months late, on March 29, 2023.
Once the regulations were approved, CalChamber sued the CPPA. The group pointed out that the CPRA set a deadline of July 2022 for regulations in all 15 areas and set an enforcement date exactly 12 months later.
CalChamber asked the court to delay the enforcement of the regulations until 12 months after regulations in all 15 areas had been approved. The group argued that businesses should not have less time to prepare just because the CPPA was so slow in making the rules.
⇒ What did the trial court say?
At trial, the court granted a partial victory to CalChamber, delaying the enforcement of each set of regulations until 12 months after its approval (a decision that has since been appealed).
If the July 2023 deadline stood, the trial court pointed out, the CPPA could pass regulations with instant or even retroactive effect, giving businesses no time to prepare.
⇒ What did the appeal court say?
The appeal court sided with the CPPA, overturning the trial court's delay. The regulations are therefore enforceable now.
The appeal court found that the trial judge had wrongly interpreted the CPRA as providing a time period (12 months) rather than two fixed dates that happened to fall 12 months apart. The appeal court also referenced the information provided to Californians when they voted on the CPRA.
⇒ So which regulations are now ‘live’?
The CPPA now has the power to enforce these regulations, covering areas such as privacy notices, the sale of personal information, consumer rights, and much, much more.
These regulations interpret the CCPA (as amended by the CPRA). So, if you’ve put time and effort into CCPA compliance, you should be pretty close to meeting the requirements of the CPPA’s regulations.
But remember—a violation of the regulations is a violation of the CCPA. And the regulations contain a lot more detail than the CCPA itself.
Three other sets of regulations—on risk assessments, cybersecurity audits, and automated decision-making technology—are still in draft but will be enforceable immediately once finalized and approved.
French regulator sets out 2024 GDPR investigations priorities
France's data protection authority (DPA), the CNIL, has announced its "priority topics" for the coming year.
⇒ What do these priorities mean in practice?
The CNIL picks its priority topics each year and focuses on enforcement in the relevant areas over the following 12 months.
Last year, the CNIL conducted 340 investigations, down slightly from 345 in 2022. The regulator’s priority topics in 2023 were smart cameras, mobile apps, and bank and medical records.
The CNIL has issued some of the largest penalties under the GDPR and ePrivacy Directive.
The regulator is particularly active in the area of cookies, having issued large cookies-related fines against companies such as Google, Meta, Microsoft, and French adtech firm Criteo in recent years.
⇒ So what are this year’s priorities?
The CNIL’s priorities for 2024 will be:
The CNIL says that around 30% of its investigations will focus on these topics.
FTC: Want to train AI on user data? Changing your terms might be unlawful
The US Federal Trade Commission (FTC) has warned that companies seeking to train AI models on user data must not change their terms and conditions without providing proper notice or getting consent.
⇒ There’s an AI training case from 2004?
The 2004 case referenced by the FTC involved Gateway Learning Corporation (GLC), which developed the “Hooked on Phonics” range of educational software.
This early example of FTC privacy enforcement did not involve AI training but a retroactive change to GLC’s privacy notice that purportedly allowed the company to share its users’ data with third parties.
⇒ So how is that illegal?
In GLC’s case, the violations were rather obvious: the company retroactively changed its privacy notice to permit data sharing with third parties, without notifying users or obtaining their consent.
The FTC decided this was an unfair and deceptive practice under the FTC Act.
⇒ How does this relate to AI?
The same principle applies to AI training, the FTC says.
“Even though the technological landscape has changed… the facts remain the same: A business that collects user data based on one set of privacy commitments cannot then unilaterally renege on those commitments after collecting users’ data.”
In other words: Don’t say one thing and then do another. Significant changes to your terms of use or privacy notice might require consent.
⇒ What if we never told people we wouldn’t use their data to train our AI?
Good question.
Answer: “It depends” (sorry)—on which laws apply to you, what you said in your privacy notice, and what exactly “training your AI” means.
If you’re bound by a “purpose limitation” principle, as under laws like the GDPR or CCPA, you’ll likely need consent (or another valid legal basis) before repurposing people’s information as training data.
If your privacy notice or terms and conditions provide an exhaustive list of the ways in which you use your customers’ data, you might need consent before expanding that list to AI training.
But “getting consent” doesn’t have to be prohibitive. If some AI feature will truly benefit your users, perhaps many will want to opt in.
But tread carefully—and remember Zoom’s bad press last year.
What We’re Reading