Stream: patient empowerment
Topic: In the news
Michele Mottini (Sep 03 2019 at 13:12):
https://www.nytimes.com/2019/09/03/technology/smartphone-medical-records.html
Dave deBronkart (Sep 03 2019 at 16:38):
Thanks @Michele Mottini. This is going to be a big issue in the US at least.
The point of the article is that although the HIPAA law provides some privacy protection for personal health data held by "covered entities" (doctors, insurers, etc.), NO SUCH PROTECTION exists for health data once it's outside those walls: if I download it anywhere, all HIPAA protection ends.
So, the article points out, patients will be vulnerable to misuse of that data, e.g. by insurance companies or employers.
GDPR question: My uneducated impression is that GDPR forbids downstream use of such data. Is that correct?
John Moehrke (Sep 03 2019 at 16:47):
This is just a restatement of fact... once an individual gets a copy of THEIR data, they can do what they wish with that data. This is true of GDPR as well.
John Moehrke (Sep 03 2019 at 16:53):
The main difference in GDPR is that outside of the individual, all others must be clear about what the purpose of their use will be, and they cannot do anything other than the approved purpose. This is not something said explicitly in the USA, but it is an expectation of all Privacy Principles... that is, when data are gathered, they are gathered with a defined and approved purpose, AND that is the only thing that can be done with the data. Expanding the purpose must be done explicitly by going back to the individual to get authorization for the new purpose. Exceptions do exist, for example if the data are de-identified. In the USA it is rather common for data to be used for other purposes through simply changing the privacy policy silently. This reuse is often what people find objectionable and creepy.
Dave deBronkart (Sep 03 2019 at 16:58):
This is just a restatement of fact... once an individual gets a copy of THEIR data, they can do what they wish with that data. This is true of GDPR as well.
In practice, though, it's not JUST a restatement of fact. An important reality in the US at least is that there are data brokers with no ethics at all who grab data from anywhere they can, aggregate it into blobs of undefined quality, and sell it to buyers with unknown intent. And, as my friend Andrea, quoted in that article, says, once it's escaped from the fence, there's no way to know what else gets done to it.
I know I'm mixing up different uses and data types here. My point is that it's messy in every sense.
Perhaps the more important question in the long run will be whether in reality most people care! I've been astounded at the shrugs I've seen from people of a newer generation than me.
In any case it's something to be aware of.
John Moehrke (Sep 03 2019 at 17:18):
The newer generation cares a lot, but approaches it very differently than us old folks.
Jenni Syed (Sep 03 2019 at 17:19):
I feel like the newer generation is more aware of it, but tooling generally fails horribly at making it "easy" to find the lines or to determine who is a good actor and who is a bad one, and there's a lack of teeth when a bad actor does something
John Moehrke (Sep 03 2019 at 17:19):
What you describe is within what I said... In GDPR the coverage of privacy is absolute, whereas in the USA healthcare privacy is only imposed upon those that are "Covered Entities". Thus, as you indicate, an intermediary can scrape data -somehow- and do what they want with it without any action against them
Jenni Syed (Sep 03 2019 at 17:20):
E.g.: the response of "you gave your data to the app" in the US is currently what that article calls out
John Moehrke (Sep 03 2019 at 17:22):
The "you gave your data to that app" is true in GDPR too... the main difference is in transparency (and other privacy principles) that the app is mandated to follow because of GDPR... whereas in the USA a malicious app is free to NOT explain everything to the user, and does not need to abide by what it does say. This is a big difference... but in both cases the patient (individual) CAN share their data with whatever/whomever they want.
John Moehrke (Sep 03 2019 at 17:25):
Which is why there are flashlight apps that harvest your data and location behind the scenes... the user has no idea that the app is anything other than a nice UI to turn the flashlight on and off... This is where Google/Apple/etc. do step in and de-register such an app for not following the BUSINESS governance that their app store requires... so one can say that the same thing is provided by business governance... but I would say that simply does not feel the same
Grahame Grieve (Sep 03 2019 at 19:18):
Grahame Grieve (Sep 03 2019 at 19:20):
and new one right now: https://thehealthcareblog.com/blog/2019/09/03/patient-controlled-health-data-balancing-regulated-protections-with-patient-autonomy/
Dave deBronkart (Sep 03 2019 at 20:48):
Re those blog posts - I know we can get anything we want published on THCB, if we decide we should. A hot topic is always a good place to point to something like FHIR, where it's relevant.
Grahame Grieve (Sep 05 2019 at 22:58):
Josh Mandel (Sep 06 2019 at 02:45):
@Dan Gottlieb FYI, re: Procure
Michele Mottini (Sep 06 2019 at 13:29):
I wrote to Common Health expressing our interest in connecting to their API
Dave deBronkart (Sep 10 2019 at 15:44):
Continuing from last week -
Re those blog posts - I know we can get anything we want published on THCB, if we decide we should. A hot topic is always a good place to point to something like FHIR, where it's relevant.
Do we have any sort of publicity / social media function, in HL7 at large or FHIR specifically? If not, we need one. Real-time, hot, busy.
Last updated: Apr 12 2022 at 19:14 UTC