Are you prepared for your doctor or nurse practitioner to be replaced by an algorithm? Well, H.R. 238, introduced by Rep. David Schweikert (R-Ariz.), could do just that. Here is the bill's stated purpose as introduced:
> To amend the Federal Food, Drug, and Cosmetic Act to clarify that artificial intelligence and machine learning technologies can qualify as a practitioner eligible to prescribe drugs if authorized by the State involved and approved, cleared, or authorized by the Food and Drug Administration, and for other purposes.
Taking the human element out of medical care may seem like a good idea: all of us, at one time or another, have fought with our medical insurance companies, and everyone has more than enough reason to distrust Big Pharma. But if you are tired of dealing with faceless entities that see you only as a number, wait until you try to appeal a decision made by an AI. Wait until you have to sue an AI for malpractice.
Just for kicks (and a little dose of irony), I asked ChatGPT whether an AI prescriber would cause any problems. Why not ask AI about AI? Well, ChatGPT did not think an AI doctor is everything Schweikert seems to think it's cracked up to be. Here is the response:
- **Lack of Human Judgment**: AI may not consider the nuances of a patient’s unique medical history, emotional state, or lifestyle factors that a human healthcare provider would.
- **Data Bias**: If the AI is trained on biased datasets, it may not make accurate recommendations for all patient populations, leading to health disparities.
- **Errors in Dosage or Interaction**: AI might miscalculate dosages or fail to recognize harmful drug interactions, potentially putting patients at risk. […]
— Read More: pjmedia.com