The use of artificial intelligence (AI) systems in health care, particularly those designed to combat the
rise in opioid poisonings, is raising concerns about transparency, bias, and unintended consequences in
patient care. These systems, such as Bamboo Health’s NarxCare, assess patients’ risk of misusing
controlled prescription medications and may influence providers’ prescribing decisions. Because the
algorithms behind these systems are not publicly disclosed, however, their accuracy and potential bias
are difficult to evaluate.

According to a KFF Health News article, patients and providers have reported negative impacts: some
patients feel stigmatized and cut off from needed pain relief, while some providers say data-driven
flagging has unfairly put their medical practice under threat. The Centers for Disease Control and
Prevention warns that these systems can lead to patients being dismissed from clinician practices,
potentially leaving them untreated or undertreated for pain. The stakes are high for pain patients
prescribed controlled medications, as rapid dose changes can increase the risk of withdrawal,
depression, anxiety, and even suicide.

There are growing calls for greater transparency, independent testing, and regulation to ensure these
technologies work as intended and do not harm patients. The American Medical Association has expressed
concern about providers having their prescribing privileges suspended, without due process, on the basis
of unreviewed algorithms, potentially harming patients in pain through delays and denials of care. As AI
continues to shape health care, balancing its potential benefits with ethical and regulatory challenges
remains an ongoing concern.

Read the KFF Health News article.