New HIPAA rules on health data, Belgian bank wins AI case, and UK in hot water

By Robert Bateman and Privado.ai

In this week’s Privacy Corner Newsletter:

  • The OCR releases updated guidance on using tracking technologies under HIPAA
  • The Belgian DPA issues a surprisingly permissive decision on training AI models
  • The UK government gets sanctioned for a sub-standard DPIA
  • What we’re reading: Recommended privacy content for the week.


New HIPAA guidance: Don’t disclose health data via trackers

The Office for Civil Rights (OCR) has revised its guidance on using tracking technologies under the Health Insurance Portability and Accountability Act (HIPAA).

  • The OCR has updated guidance from last July to “increase clarity for regulated entities and the public” regarding cookies, pixels, and other tracking technologies.
  • The updated guidance reiterates that HIPAA-covered entities and business associates may not use tracking technologies “in a manner that would result in impermissible disclosures” of protected health information (PHI).
  • The OCR emphasizes that using tracking technology in a way that does not reveal information about a user’s “past, present, or future health, health care, or payment for health care” is not necessarily an impermissible disclosure under HIPAA.

⇒ Is it possible to use tracking technologies without violating HIPAA?

The OCR doesn’t rule out all uses of tracking tech. This latest guidance explains what is and is not allowed under HIPAA.

The guidance is nuanced and distinguishes several types of tracking scenarios. However, one statement is particularly clear:

“...disclosures of PHI to tracking technology vendors for marketing purposes, without individuals’ HIPAA-compliant authorizations, would constitute impermissible disclosures.”

⇒ What constitutes a disclosure of PHI?

That’s where the guidance gets more complicated. The OCR distinguishes several tracking use cases, each with different compliance obligations. We’ll focus on these two scenarios:

  • Tracking on “user-authenticated webpages”
  • Tracking on “unauthenticated webpages”

⇒ Tracking on user-authenticated webpages

“User-authenticated webpages” require users to log in prior to access. The OCR’s guidance is strictest in this context because, in the regulator’s view, using tracking technologies on authenticated webpages can lead to the disclosure of PHI.

The OCR says tracking technologies can access PHI on user-authenticated webpages, including:

  • IP address
  • Medical record number
  • Home or email addresses
  • Dates of appointments
  • Diagnosis and treatment information
  • Prescription information
  • Billing information

Because users have logged in, it’s easier for vendors to link their identities to information about their health conditions.

As such, if you’re using tracking tech on a user-authenticated webpage, you must meet HIPAA requirements (we’ll look at these below).

⇒ Tracking on unauthenticated webpages

Unlike user-authenticated webpages, unauthenticated webpages don’t require people to sign in.

The OCR says that tracking technologies on “many” unauthenticated webpages do not have access to PHI. But they might—and if they do, HIPAA applies.

The important thing is whether the information disclosed to vendors (e.g., Facebook or Google) is related to an individual’s “past, present, or future health, health care, or payment for health care.”

For example:

  • A student visits a hospital’s oncology webpage to research a term paper. A pixel discloses the student’s IP address to Facebook, along with information about the fact that the student visited the page. The student isn’t visiting the webpage in the context of seeking healthcare, so this would not be a disclosure of PHI.
  • But the next day, a cancer patient visits the same webpage, and the tracker does the same thing. This scenario likely would involve a disclosure of PHI.
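In both scenarios, the tracker typically fires exactly the same request; nothing in the pixel itself knows why the visitor is on the page. The sketch below is a generic illustration of the kind of data such a pixel sends (the endpoint, parameter names, and cookie name are hypothetical, not any specific vendor’s API):

```typescript
// Illustrative sketch only: a generic analytics pixel, not any specific vendor's API.
// The endpoint, parameter names, and cookie name below are hypothetical.

function readVendorCookie(name: string): string {
  // Pull the vendor's identifier cookie, if one has been set for this browser.
  const match = document.cookie.match(new RegExp(`(?:^|; )${name}=([^;]*)`));
  return match ? decodeURIComponent(match[1]) : "";
}

function firePixel(pageUrl: string): void {
  const params = new URLSearchParams({
    url: pageUrl,                         // e.g. https://hospital.example/oncology
    ref: document.referrer,               // where the visitor came from
    cid: readVendorCookie("_vendor_id"),  // identifier linking this browser across visits
  });

  // The HTTP request itself also carries the visitor's IP address and user agent,
  // whether the visitor is a student researching a paper or a patient seeking care.
  new Image().src = `https://tracker.example/collect?${params.toString()}`;
}

firePixel(window.location.href);
```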

⇒ How am I supposed to know whether the user is a student or a cancer patient?

While the OCR doesn’t say this outright, you probably can’t distinguish what types of people are visiting your unauthenticated webpages without tracking them.

That’s why covered entities should exercise caution when using tracking technologies.

If a webpage simply displays your facility’s opening hours, it’s unlikely that trackers would disclose PHI to vendors. But if a webpage displays more sensitive information about specific health conditions, consider whether PHI might be involved.
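As an illustration of one way to act on that distinction (a hypothetical sketch of ours, not something the OCR prescribes), a site could gate third-party trackers behind a per-page classification so that vendor scripts only load on pages with no condition-specific content:

```typescript
// Hypothetical sketch of one possible approach; the OCR does not prescribe this.
// The path prefixes and vendor script URL below are placeholders.

const SENSITIVE_PATH_PREFIXES = ["/oncology", "/conditions", "/appointments", "/billing"];

function isSensitivePage(pathname: string): boolean {
  // Treat any page about specific conditions, care, or payment as potentially PHI-relevant.
  return SENSITIVE_PATH_PREFIXES.some((prefix) => pathname.startsWith(prefix));
}

function maybeLoadTrackers(): void {
  if (isSensitivePage(window.location.pathname)) {
    // Possible PHI disclosure: skip third-party trackers here unless the HIPAA
    // conditions listed below (authorization or permission, a BAA, risk analysis) are met.
    return;
  }
  // Lower-risk pages, such as opening hours or directions, load the vendor script as usual.
  const script = document.createElement("script");
  script.src = "https://tracker.example/pixel.js";
  document.head.appendChild(script);
}

maybeLoadTrackers();
```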

If there’s a risk of PHI disclosures, HIPAA compliance kicks in, which means (among other things):

  • Ensuring any disclosure of PHI is covered either by a HIPAA Privacy Rule permission or the individual’s valid authorization (the OCR says simple “accept/reject” cookie banners are not sufficient)
  • Putting in place Business Associate Agreements (BAAs) with any tracking vendors receiving PHI
  • Carrying out risk analysis and management procedures before using the tracking technologies
  • Providing breach notification to individuals if you have disclosed PHI via tracking tools without meeting HIPAA requirements

Given how often the OCR has talked about tracking technologies over the past 12 months, we might expect some enforcement in this area soon.


Belgian DPA greenlights bank’s AI model training under ‘legitimate interests’

The Belgian Data Protection Authority (DPA) has found that a bank was justified in relying on “legitimate interests” to train AI models on customers’ transaction data without consent.

  • The bank collected the transaction data as part of a discount program based on customers’ consent.
  • The bank argued that the further processing of the transaction data to train the bank’s AI models was compatible with the original purpose for which the personal data was collected.
  • The Belgian DPA found that the model-training purpose was not compatible with the original purpose—but could be justified under the bank’s legitimate interests given the minor impact on the data subject.

⇒ What are the facts of the case?

Please bear in mind that this analysis is based on an automated translation of this case provided by the good people at DeepL. The original is available here.

There are two parties to this case:

  • The complainant, “Mr X”
  • The defendant, “Y Bank”

Y Bank runs a program offering personalized discounts based on people’s transaction data. Mr X gave his consent to participate in the program.

Y Bank also uses the transaction data to train its AI model. The bank claims that this processing is for “research or statistical purposes” and is compatible with the purpose for which it collected the information.

Mr X withdrew his consent from the discount program and objected to the use of his transaction data to train the AI model.

Y Bank complied with Mr X’s request after one month. In the meantime, the bank was allegedly using Mr X’s transaction data to train its AI model.

Mr X said that this meant it was “de facto impossible” to object to the processing of his transaction data, as the purpose of the processing (training the AI model) had already been achieved by the time his request was fulfilled.

⇒ What did the Belgian DPA say?

The Belgian DPA considered whether training the AI model was compatible with the original purpose.

The DPA found that training the AI model was not a “statistical or research purpose” as there was no link to any scientific, historical, or statistical goal.

Therefore, the DPA found that Y Bank required a separate legal basis for this processing.

So, the DPA considered whether “legitimate interests” was a suitable legal basis for this activity and concluded that it was.

Here’s how the Belgian DPA approached the “legitimate interests assessment” on the bank’s behalf:

  1. Purpose: Getting better insights into customers’ activities to offer discounts on third-party products can be a legitimate purpose.
  2. Necessity: Y Bank’s training of the AI model can be considered necessary to meet that purpose.
  3. Balancing: There was an appropriate balance between Mr X’s and Y Bank’s interests.

In its analysis of the “balancing test”, the Belgian DPA noted that:

  • There “should never be any attempt” to re-identify individuals whose data is included in the training set.
  • The AI models were “merely algorithms that no longer contain personal data”.
  • No data was passed to third parties.
  • No special category data was involved.
  • Data subjects could opt out (the DPA does not appear to have been bothered by the one-month delay cited by Mr X).
  • The impact on Mr X was “extremely small”.

As such, the Belgian DPA found that Y Bank could rely on “legitimate interests” to train its models on the transaction data collected via its discounts program.


UK government receives GDPR warning and enforcement notice over ankle-tagging immigration program

The UK’s Information Commissioner’s Office (ICO) has issued a warning and an enforcement notice to a government department, the Home Office, following a scheme that involved the GPS-tracking of migrants.

  • The Home Office scheme required people who had arrived in the UK via “unnecessary and dangerous routes” to wear ankle tags in order to be granted bail.
  • The ICO found that the Home Office had failed to conduct an adequate Data Protection Impact Assessment (DPIA) and had violated the GDPR’s “accountability” principle.
  • The ICO has ordered the Home Office to re-write certain documentation and notices associated with the scheme within 28 days.

⇒ What’s the background to this case?

The Home Office was running a pilot scheme intended to reduce the number of people it detained for entering the UK via dangerous routes (for example, on “small boats” across the English Channel).

Under the pilot, people would be offered immigration bail if they agreed to wear ankle tags to track their location via GPS.

The ICO did not find that the scheme was illegal but found that the Home Office had failed to carry out a proper DPIA.

⇒ What were the issues with the Home Office’s DPIA?

The ICO found that the Home Office’s DPIA:

  • Failed to set out all the processing operations and purposes in a sufficiently clear and consistent way
  • Did not include an assessment of the necessity and proportionality of the scheme
  • Failed to demonstrate that there were no less intrusive alternatives to ankle tagging
  • Did not include an objective assessment of the risks to people’s rights and freedoms
  • Did not propose any measures to address the risks

The ICO found that a warning and an enforcement notice were the correct remedies in this situation and has given the Home Office 28 days to submit revised documentation.


What We’re Reading
