AI in medicine and HIPAA laws

I recently had my annual mammogram and when I received the results, at the bottom of the page it said that AI was used in the screening.
Nobody asked my permission to use AI. I didn’t sign any consent to use AI.
I am not happy about this. I am wondering where the data from my medical test is going and who might see it.

I feel that using AI without my consent is a violation of HIPAA. This was done at Johns Hopkins imaging... supposedly one of the best hospitals in the world. How do you all feel about this? Would you be bothered?
 

AI in these situations is believed to drastically reduce the risk of false positives. Humans, remember, are prone to making errors, regardless of how experienced they are.
 

Maybe a twist of irony here, but I asked your question to AI. The answer:
No, hospitals generally cannot use AI for analysis without patient consent, as this would likely violate patient privacy laws like HIPAA, which require explicit consent to collect, store, and use sensitive medical data, including when using AI technology for analysis; if de-identification is not possible, informed consent is usually required.
 
I spoke to them about it and they acted as if I were nuts. Nobody had any answers for me as to what happens to my data. They said nobody had ever questioned them about it before.
 
I’m not sure that I understand how using AI would change who sees or has access to the information.
AI learns by collecting data. I don’t know if the system they use is shared by other hospitals or if it all stays with them or what. I also read a recent article about the use of AI at Johns Hopkins and the article touted all that AI can do but said that it wasn’t ready yet to be used in a clinical setting.
 
I would be surprised if they haven’t been sharing the information for years in order to build a body of statistics.

I would assume that they would strip away identities as part of that process.
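For anyone curious what "stripping away identities" could look like in practice, here is a rough sketch in Python of the general idea, loosely modeled on HIPAA's Safe Harbor method (which lists 18 categories of identifiers to remove). All the field names below are made up for illustration; this is not any hospital's actual pipeline:

# Hypothetical identifier fields to drop, loosely following Safe Harbor.
IDENTIFIER_FIELDS = {
    "name", "address", "phone", "email", "ssn",
    "medical_record_number", "date_of_birth", "exam_date",
}

def deidentify(record: dict) -> dict:
    """Return a copy of a record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}

# Example: a made-up mammogram report record.
report = {
    "name": "Jane Doe",
    "medical_record_number": "123456",
    "date_of_birth": "1958-04-02",
    "exam_date": "2024-10-15",
    "modality": "mammography",
    "birads_score": 2,
    "ai_flagged": False,
}

print(deidentify(report))
# -> {'modality': 'mammography', 'birads_score': 2, 'ai_flagged': False}

The clinical fields stay, the identifying ones go, so the shared data can't easily be traced back to a particular patient.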

I wouldn’t be concerned but it would be interesting to understand exactly how it works.
 
I called another imaging facility and found out that they get consent to use AI in mammography, but they also charge $40 to use it because, they said, insurance doesn't cover it yet.
Hopkins didn't charge me anything (which only makes me more suspicious).
 
Yes, I might assume that too, but I would like to know, not just assume... And I might have consented to its use had I been asked and been able to get some info about my anonymity in the process.
 
At this juncture, I would never want AI to be used without my specific consent.

NOR would I want AI used to make a diagnosis on me, or anyone else for that matter!

Call me old school if you will, but I have to SEE a patient, be IN THE ROOM with them, feel their demeanor, see and touch their skin, see things up close, the whole package, before making life-changing judgements and diagnoses or ruling out conditions. I don't know how providers do it today... I really do not.

AI engineers can't even get AI to stop hallucinating, let alone get it to make a diagnosis for life-threatening issues! :rolleyes:

AI is only as good as the info going in. Remember "GIGO" (garbage in, garbage out) from the '70s and '80s?

As far as it being a HIPAA violation? As long as the AI used to manage your care is part of the larger system that is providing your health care, and it is using your information for that purpose, no, it is not a violation.

When I was actively working full time, one of my "extra duties" was serving as a HIPAA compliance medical officer, for both federal- and state-level entities.

I know: woo hoo! Not. I used to have nightmares about HIPAA! Fear that personnel might possibly be violating it.

I was like a commander in a "MASH" unit, the "Colonel Potter" if you will, except I was not only in MASH-type deployable units but also in rather big hospitals (>250 beds), civilian and military ones. We'd go to war or respond to national emergencies with resources of either size.

Sigh. I miss it so much!
 
I would NOT be happy if AI were used to analyze my mammogram, but I'm not sure it's a violation of HIPAA. But now your posts have me thinking...

In my opinion, they definitely should have let you know ahead of time, and not just in the fine print.
 
In medical circles over here, "AI" is looked upon as a wonderful thing; they say it can spot things that humans miss. I don't know if that is true, but it is how it is advertised!

Mike.
 

