FHIR Chat · Consent Fatigue · patient empowerment

Stream: patient empowerment

Topic: Consent Fatigue


view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 05:05):

While "consent fatigue" is still not a really common expression:
https://trends.google.com/trends/explore?date=today%205-y&q=%22consent%20fatigue%22

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 05:08):

Is anyone else worrying about this? Anyone thinking like me when I see:
a) in a hospital: "Thanks for checking in. Do you consent to the use of your data by anyone in this hospital?" (and then realizing that some people can use their role and my "consent" in the hospital to send data to insurance companies)

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 05:10):

b) when some websites use dark patterns, like a shiny green "Click here to accept all data processing" button vs. "There is no Reject all; please deselect these 960 data-sharing options by clicking them manually, one by one".

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 05:11):

does anyone else feel that as patients, we cannot go through this when we are fragile?

view this post on Zulip René Spronk (Feb 22 2020 at 09:18):

Definitely. The Dutch initially had an approach where one had to consent to the sharing of 182 potential categories of data. That's unworkable. Some may indeed be designing such systems on purpose, to obfuscate what one is consenting to, using "consent fatigue" to get people to agree to things they wouldn't have agreed to if they had really understood what it was all about.
As such it's up to legislators, and to architects of national data exchanges or creators of national consent registries, to come up with something coarse-grained that's understandable to the average person.
IMHO it's better to have a limited number of coarse-grained policies that can be parametrized (using XACML) than to have a zillion fine-grained policies. Consent fatigue will surely set in in the latter case.
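
As a rough illustration of the coarse-grained, parametrizable approach (not XACML itself; the policy names, parameters, and evaluation logic below are hypothetical):

```python
from dataclasses import dataclass, field

# Hypothetical coarse-grained policy templates a patient can actually understand.
# Each template is parametrized instead of being split into hundreds of toggles.
COARSE_POLICIES = {
    "treatment-sharing": "Share my record with clinicians directly involved in my care",
    "emergency-override": "Allow break-the-glass access in an emergency",
    "research-use": "Allow de-identified use of my data for research",
}

@dataclass
class PatientPolicyChoice:
    """One coarse-grained choice plus its parameters (e.g. which organizations)."""
    policy_id: str                      # key into COARSE_POLICIES
    permit: bool                        # the patient's single yes/no answer
    parameters: dict = field(default_factory=dict)

def is_permitted(choices: list[PatientPolicyChoice], policy_id: str, context: dict) -> bool:
    """Evaluate a request against the patient's coarse-grained choices."""
    for choice in choices:
        if choice.policy_id != policy_id:
            continue
        allowed_orgs = choice.parameters.get("organizations")
        if allowed_orgs and context.get("requesting_org") not in allowed_orgs:
            return False
        return choice.permit
    return False  # default deny when no choice has been recorded

# Example: three answers instead of 182 category-level questions.
choices = [
    PatientPolicyChoice("treatment-sharing", True, {"organizations": ["Hospital A"]}),
    PatientPolicyChoice("emergency-override", True),
    PatientPolicyChoice("research-use", False),
]
print(is_permitted(choices, "treatment-sharing", {"requesting_org": "Hospital A"}))  # True
print(is_permitted(choices, "research-use", {"requesting_org": "University X"}))     # False
```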

view this post on Zulip Brendan Keeler (Feb 22 2020 at 16:13):

Had many discussions while in the Netherlands about the drawbacks and dangers of fine-grained consent. Fatigue due to volume is one, but fatigue due to complexity is up there as well.

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:26):

Yes.

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:26):

This (patient empowerment) seems the right stream for a provocative statement for feedback: as a patient I also don't want to be burdened by complex consent. I do not want my GP to explain to me all that can happen to my information and ask me if I agree. I must have a trusted system that will only share the data that is lawful to share, for the lawful purposes and for the lawful time.

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:26):

In other words, I want to be able to trust that complex part of the system - if needed, with my eyes closed.

view this post on Zulip Josh Mandel (Feb 22 2020 at 16:37):

The closest story I've heard about how to make this workable is to provide a substrate of fine-grained controls but allow common roll-up packaging, so you can sign on to a bundle of consent choices recommended by a consumer advocacy group, etc.
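
A rough sketch of that roll-up idea, with hypothetical control identifiers and bundle names: the patient makes one understandable choice (e.g. a bundle recommended by an advocacy group), and it expands into the underlying fine-grained settings, which can still be overridden individually.

```python
# Hypothetical fine-grained controls (the "substrate").
FINE_GRAINED_CONTROLS = [
    "share-labs-with-treating-clinicians",
    "share-meds-with-treating-clinicians",
    "share-notes-with-treating-clinicians",
    "share-deidentified-data-for-research",
    "share-data-with-insurers",
]

# Roll-up bundles, e.g. curated and published by a consumer advocacy group.
ROLLUP_BUNDLES = {
    "advocacy-group-recommended": {
        "share-labs-with-treating-clinicians": True,
        "share-meds-with-treating-clinicians": True,
        "share-notes-with-treating-clinicians": True,
        "share-deidentified-data-for-research": True,
        "share-data-with-insurers": False,
    },
    "minimal-sharing": {control: False for control in FINE_GRAINED_CONTROLS},
}

def apply_bundle(bundle_name: str, overrides: dict | None = None) -> dict:
    """Expand one understandable choice into fine-grained settings,
    while still letting the patient override individual controls."""
    settings = dict(ROLLUP_BUNDLES[bundle_name])
    settings.update(overrides or {})
    return settings

# One decision for the patient; the fine-grained detail is preserved underneath.
print(apply_bundle("advocacy-group-recommended",
                   overrides={"share-deidentified-data-for-research": False}))
```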

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:44):

Josh Mandel said:

The closest story I've heard about how to make this workable is to provide a substrate of fine-grained controls but allow common roll-up packaging, so you can sign on to a bundle of consent choices recommended by a consumer advocacy group, etc.

Hmm, that still relies on the patient having to consent, right? And what about those cases where patient-granted consent is unnecessary/redundant?

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:47):

GDPR does provide some legal grounds for processing data that the patient does not need to sign off on. I'm basing my requirements on that (and on the fact that consent may be flawed if it is not free or explicit).

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:49):

I'm thinking there are 2 different things:

  1. The patient consenting to some data exchange that needs their consent
  2. The patient still being able to see all the other exchanges that they did not need to consent to
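
For the second point, a minimal sketch of how a patient-facing app might list every exchange, consented or not, assuming a hypothetical FHIR server that records disclosures as AuditEvent resources searchable by patient:

```python
import requests

# Hypothetical FHIR server; a real deployment would also require authentication.
FHIR_BASE = "https://example.org/fhir"

def list_disclosures(patient_id: str) -> list[dict]:
    """Fetch AuditEvent resources for a patient so they can see every exchange,
    including those that did not require their consent."""
    response = requests.get(
        f"{FHIR_BASE}/AuditEvent",
        params={"patient": f"Patient/{patient_id}", "_sort": "-date", "_count": "50"},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()
    return [entry["resource"] for entry in bundle.get("entry", [])]

for event in list_disclosures("example-patient-id"):
    # 'recorded' and 'purposeOfEvent' are standard AuditEvent elements in FHIR R4.
    when = event.get("recorded")
    purposes = [coding.get("display") or coding.get("code")
                for concept in event.get("purposeOfEvent", [])
                for coding in concept.get("coding", [])]
    print(when, purposes)
```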

view this post on Zulip Josh Mandel (Feb 22 2020 at 16:51):

Makes sense. I'm specifically thinking about the subset of cases where consent is required, and how to provide a rich level of choice without providing an overwhelming->meaningless experience.

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:55):

Agree. Overwhelming eventually means meaningless - also legally meaningless.
For example, I am covered by GDPR, and I kind of trust any consent that is given to me to sign, because I know that if anything sketchy is given to me to sign, it is not valid consent.

view this post on Zulip Jose Costa Teixeira (Feb 22 2020 at 16:56):

And that is what I think would empower patients. Not having to be experts in privacy.

view this post on Zulip Abbie Watson (Feb 22 2020 at 17:39):

Having implemented a few consent management systems now (including one on hyperledger), I'm a little skeptical about the extent to which opt-in consent is even feasible.

In some quarters, there seem to be attempts to legislate good outcomes, as if good health itself is a right. i.e. I didn't consent to cancer. I didn't consent to being given the flu. I didn't consent to abuse or being assaulted. etc. As if the remedies that are applied in case of bad actors can also be applied to viruses, bacteria, fellow travelers, old age, etc. Which the cynic in me thinks is futile.

In the broader scheme of things, there's a counter argument to opt-in consent models that travel and entry into defined spaces can act as a type of consent in itself. You go to a bar, and there's some level of consent to be in the same space as others who are imbibing alcohol. If you travel by airplane, you're sort of consenting to listen to kids scream and maybe get this year's flu. And so forth.

So, I'm generally in favor of consent/privacy models where a) the patient maintains their own records (ID card, smartphone, dogtags, smart prosthetics, etc.), which b) throttle entire channels of data on/off (FHIR resources, perhaps streaming via websockets), and are c) location-aware. That is, the Apple model.

As such, I'm very interested in NFC and RFID and Bluetooth models of consent. Consent by proximity, as it were.

If done correctly, geofences and geotag perimeters and such could provide automated roll-up packaging for more fine-grained controls, like what Josh is describing above.
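
A toy sketch of that proximity idea, with entirely made-up geofences and channel names: presence inside a defined perimeter switches whole channels of data on or off, which is the kind of automated roll-up a geofence could provide.

```python
import math

# Hypothetical geofences: center, radius in meters, and the channels enabled inside.
GEOFENCES = {
    "general-hospital-campus": {
        "center": (41.8781, -87.6298),
        "radius_m": 500,
        "channels": ["Observation", "MedicationStatement", "AllergyIntolerance"],
    },
}

def distance_m(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Approximate great-circle distance (haversine), good enough for a geofence check."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def enabled_channels(current_position: tuple[float, float]) -> list[str]:
    """Return the data channels (here, FHIR resource types) enabled at this position."""
    channels: list[str] = []
    for fence in GEOFENCES.values():
        if distance_m(current_position, fence["center"]) <= fence["radius_m"]:
            channels.extend(fence["channels"])
    return channels

print(enabled_channels((41.8782, -87.6300)))  # inside the campus fence
print(enabled_channels((40.7128, -74.0060)))  # outside -> []
```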

view this post on Zulip John Moehrke (Feb 24 2020 at 14:26):

For clarity, we should not use "opt-in" and "opt-out" as terms of discussion, as these terms are indistinguishable in their noun vs verb meanings. Better to distinguish "Implicit Consent" vs "Explicit Consent", and the act of asserting some choice to Permit vs Deny.

view this post on Zulip John Moehrke (Feb 24 2020 at 14:28):

Context-aware consent is an interesting point, and the point that changed HIPAA into the "Implicit Consent" that we have at the federal level. This presumes that the individual is clear on what they are doing by their choice to put themselves into a specific context. This is a good presumption, but it often fails because the obvious context does not cover everything the healthcare organization wants the implied consent to enable (imagine an iceberg).

view this post on Zulip John Moehrke (Feb 24 2020 at 14:31):

The concept of consenting or dissenting to a set of "medical categories" is the next logical step, but it also fails around the edges of the category definition. That is to say, the individual's idea of what falls within a category is sometimes not exactly the same as the boundary the healthcare organization understands for that category - permissive vs restrictive interpretation of a category... (This is built into the Consent resource today, presuming some SLS can determine the category boundary.)
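
To make the category point concrete: in FHIR R4, a Consent provision can carry a security label, and some security labeling service (SLS) decides which data actually falls under that label. A minimal sketch, expressed as a Python dict, of a consent that permits sharing but denies data labeled with the PSY (psychiatry) sensitivity code from the v3 ActCode system; the patient reference is a placeholder:

```python
import json

# Minimal FHIR R4 Consent: permit sharing overall, but deny anything the SLS
# has labeled with the PSY sensitivity code. Whether a given Observation or
# Condition falls inside that category is the SLS's call - which is exactly
# where the "edge of the category" problem lives.
consent = {
    "resourceType": "Consent",
    "status": "active",
    "scope": {
        "coding": [{"system": "http://terminology.hl7.org/CodeSystem/consentscope",
                    "code": "patient-privacy"}]
    },
    "category": [{"coding": [{"system": "http://loinc.org", "code": "59284-0"}]}],
    "patient": {"reference": "Patient/example"},   # placeholder patient
    "policyRule": {
        "coding": [{"system": "http://terminology.hl7.org/CodeSystem/v3-ActCode",
                    "code": "OPTIN"}]
    },
    "provision": {
        "type": "permit",
        "provision": [{
            "type": "deny",
            "securityLabel": [{
                "system": "http://terminology.hl7.org/CodeSystem/v3-ActCode",
                "code": "PSY"   # psychiatry information sensitivity
            }]
        }]
    }
}

print(json.dumps(consent, indent=2))
```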

view this post on Zulip Mikael Rinnetmäki (Feb 27 2020 at 07:24):

@Abigail Watson your point about implicit consent is a good starting point. But on the other hand, one of the main reasons we have GDPR is that with today's tools, it is often not implicit enough for people. They take a quiz on Facebook without realizing that the information they give out will be weaponized and used to affect votes - the Cambridge Analytica case. With all the digital tools and data brokers, it is just not transparent enough.

view this post on Zulip Mikael Rinnetmäki (Feb 27 2020 at 07:26):

As for @Josh Mandel's point, I do believe some kind of labels, like we have for fair-trade items and Creative Commons, would be very useful and are needed in this space.

view this post on Zulip Mikael Rinnetmäki (Feb 27 2020 at 07:27):

It still does not fully answer @Jose Costa Teixeira's original concern of people being asked for their consent when they are vulnerable, and of the power balance issues. But even in those cases, I believe having a set of clear labels would give patient advocacy groups a better starting point for a conversation with a provider regarding their policies.

view this post on Zulip John Moehrke (Feb 27 2020 at 14:09):

I feel left out... what about my statements... :-)

view this post on Zulip Mikael Rinnetmäki (Feb 27 2020 at 17:35):

My most sincere apologies, @John Moehrke! The main reason was that I did not find an obvious angle to tie what you said to what I was advocating for: clear and recognizable labels describing data use. I could have used the iceberg... :)

If it helps make you feel any better, I also did not comment on René's statements. If that does not help, I can state that I fully agree with @John Moehrke on the above. If you still crave more, I can come up with some nitpicking about the terms "opt-in" and "opt-out" still having their place in some of the discussion because of their widespread use - especially in how the question for a consent or a permission is formulated, and especially with the defaults set in policies. ;-)

Happier?

view this post on Zulip Mikael Rinnetmäki (Feb 27 2020 at 17:50):

Regarding the labels, I know of https://www.me2balliance.org/ and a few similar initiatives.

view this post on Zulip Dave deBronkart (Feb 27 2020 at 19:09):

Seems to me this concept needs to be an article in JAMIA! The topic will certainly be eye-catching on social media, and for good reason. Who wants to lead and/or do the work?

view this post on Zulip Abbie Watson (Feb 27 2020 at 19:16):

Possibly me. I've been an on-again, off-again AMIA member, and I've been preparing some journal articles as I look to extend my conference participation beyond HL7. I'm not looking to be some sort of expert on consent issues (a role that people have previously tried to dump on me), but it's one of the topic areas we've covered in past projects and want to write about. Mostly I just want to cover the topic for peer-review purposes, and then let others own that space.

view this post on Zulip Jose Costa Teixeira (Feb 27 2020 at 22:58):

Good idea @Dave deBronkart and good initiative @Abigail Watson

view this post on Zulip Jose Costa Teixeira (Feb 27 2020 at 23:00):

I'm helping prepare a proposal for a permission approach that may or may not rely on consent and covers (the key areas of) GDPR.
If that helps, I would love to share ideas and content.

view this post on Zulip Jose Costa Teixeira (Feb 27 2020 at 23:02):

I will be looking at cases like court-ordered access to data, or relayed permission...

view this post on Zulip Jose Costa Teixeira (Feb 27 2020 at 23:05):

A key point (I will continue to discuss this in the Security stream) is the shift from "implicit consent" to "explicit permission":
When a hospital, without patient consent (but under general policy, legal requirements, or any other valid purpose), allows data to be shared with another party, that is a conscious, deliberate decision - someone has decided to open that flow.
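
A minimal, hypothetical sketch of what recording such a deliberate decision could look like; the field names are illustrative only, not Jose's proposal or any FHIR resource:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ExplicitPermission:
    """A record of a deliberate decision to open a data flow without patient consent,
    grounded in some legal basis other than consent (e.g. a legal obligation)."""
    data_controller: str      # who holds the data
    recipient: str            # who the data flows to
    legal_basis: str          # the legal basis relied upon
    purpose: str              # why the flow exists
    decided_by: str           # the person/role who decided to open the flow
    valid_until: date         # flows should not stay open forever

# Example: a reporting flow required by law, visible to the patient even though
# no consent was asked for.
permission = ExplicitPermission(
    data_controller="Hospital A",
    recipient="National cancer registry",
    legal_basis="compliance with a legal obligation",
    purpose="mandatory cancer case reporting",
    decided_by="Hospital A data protection officer",
    valid_until=date(2025, 12, 31),
)
print(permission)
```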

view this post on Zulip Jose Costa Teixeira (Feb 27 2020 at 23:06):

As a patient, I want more of those permissions in place, rather than being asked to "click here".

view this post on Zulip Virginia Lorenzi (May 15 2020 at 22:29):

I think that when I touch the healthcare system and they ask for consent, I shouldn't trust them. I feel they should just know to do what is right. You sign because you need the care, and then you are afraid of what you signed.

view this post on Zulip Jose Costa Teixeira (May 15 2020 at 22:35):

Yes, that is the other side of it:
IMO, the hospital does not need my consent if they need the data to treat me.
If they need the data, that is a very legitimate use of data - no consent needed.
If they don't need it but ask me for consent when I am suffering, that is not a legally acquired consent.
(Note: I am not a lawyer; this is my reading of the parts of GDPR that are sensible.)

view this post on Zulip Jose Costa Teixeira (May 15 2020 at 22:35):

So I don't trust them blindly, but I trust that they have to obey the law.

view this post on Zulip Mikael Rinnetmäki (May 18 2020 at 19:05):

We have an active discussion going on regarding this and patient-generated health data, i.e., the data that would be available from all the wearables you as a citizen/patient/person use. That data is not accessible to the healthcare system by default. And in some cases the healthcare system is interested in that data. How would they get access to it, and what would be the legal basis for accessing and processing that data? Consent seems the most viable option for this, but there are concerns over power balance. Any other views?

view this post on Zulip Jose Costa Teixeira (May 18 2020 at 19:56):

The GDPR legal bases for processing:

  1. the data subject has given consent
  2. performance of a contract
  3. compliance with a legal obligation
  4. to protect the vital interests of the data subject
  5. public interest or in the exercise of official authority
  6. legitimate interests of the controller
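
Those are the six legal bases in GDPR Article 6(1). A small sketch encoding them, so that a (hypothetical) record of each data flow can state which basis it relies on:

```python
from enum import Enum

class LegalBasis(Enum):
    """The six legal bases for processing in GDPR Article 6(1)."""
    CONSENT = "the data subject has given consent"
    CONTRACT = "performance of a contract"
    LEGAL_OBLIGATION = "compliance with a legal obligation"
    VITAL_INTERESTS = "to protect the vital interests of the data subject"
    PUBLIC_INTEREST = "public interest or exercise of official authority"
    LEGITIMATE_INTERESTS = "legitimate interests of the controller"

def requires_patient_choice(basis: LegalBasis) -> bool:
    """Only the consent basis requires an explicit choice by the data subject;
    the others still need to be documented and visible, but not signed."""
    return basis is LegalBasis.CONSENT

# Example: wearable data pulled into the record at the patient's request.
print(requires_patient_choice(LegalBasis.CONSENT))          # True
print(requires_patient_choice(LegalBasis.LEGAL_OBLIGATION)) # False
```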

view this post on Zulip Jose Costa Teixeira (May 18 2020 at 19:56):

I think from GDPR, the one that seems reasonable here is indeed consent

view this post on Zulip Terrie Reed (May 18 2020 at 20:29):

Could we also put a "pin" in the need to transmit the UDI-DI and UDI-PI of the wearable if it is a medical device? Right now, efforts to use the UDI of medical devices on the network are limited due to the proprietary nature of these transmissions. I am hoping that wearable technology groups would be more open with their data transmission and could demonstrate the value of standardized device identification, which is not always possible with traditional networked devices.
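
As a side note, FHIR R4 already gives the UDI a home: Device.udiCarrier carries the UDI-DI, and the UDI-PI components map to elements such as lotNumber and serialNumber. A minimal sketch with placeholder identifier values:

```python
import json

# Minimal FHIR R4 Device for a wearable that is a regulated medical device.
# All identifier values below are fabricated placeholders, not real UDIs.
device = {
    "resourceType": "Device",
    "status": "active",
    "udiCarrier": [{
        "deviceIdentifier": "00000000000000",                    # UDI-DI (placeholder)
        "issuer": "http://hl7.org/fhir/NamingSystem/gs1-di",     # GS1 as issuing agency
        "carrierHRF": "(01)00000000000000(21)PLACEHOLDER",       # human-readable form (placeholder)
        "entryType": "self-reported"
    }],
    # UDI-PI components are carried in these Device elements in R4:
    "lotNumber": "LOT-PLACEHOLDER",
    "serialNumber": "SN-PLACEHOLDER",
    "deviceName": [{"name": "Example wearable", "type": "user-friendly-name"}],
    "patient": {"reference": "Patient/example"}
}

print(json.dumps(device, indent=2))
```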

view this post on Zulip Lloyd McKenzie (May 19 2020 at 01:06):

I don't know that consent would be necessary if the patient pushed the data to the practitioner. (It would typically be needed if the provider were pulling it from another repository though.)

view this post on Zulip Mikael Rinnetmäki (May 19 2020 at 21:40):

In GDPR terms, you need to state the legal basis you have for processing the data - the ones listed by Jose above - even if the patient pushes the data. You might argue that the consent is implicit, but the language in GDPR is quite clear that explicit consent is required. It also means that the healthcare organization needs to inform the patient how the data will be used if the patient chooses to push the data.

