The widespread implementation of artificial intelligence (AI) in hospitals raises the question of whether humans or machines are making healthcare decisions. Academics and tech experts argue that AI is a powerful tool when used alongside humans in diagnosing, assessing, and treating patients. Nurses counter that these tools can be flawed and deployed without proper training or flexibility, putting patient care at risk. Some clinicians feel they must defer to the algorithm due to pressure from hospital administration.

In a survey conducted by National Nurses United, 24 percent of respondents reported that a clinical algorithm prompted them to make choices they felt were not in the best interest of their patients. Of these individuals, 17 percent said they were allowed to override the decision, while 31 percent were not permitted to, and 34 percent needed permission from a doctor or supervisor to do so. 

At UC Davis Medical Center, an oncology nurse revealed that hospital rules require nurses to abide by protocol when a patient is flagged for certain conditions, such as sepsis. UC Davis, however, stated that protocols prompted by technology tools are not required, but rather recommended, adding that nurses and physicians have the ultimate decision-making authority. 

Many AI applications are intended to ease the burden on healthcare workers by taking over tedious and time-consuming tasks. Yet whether a nurse can override an algorithm often depends on hospital policy, and clinicians who risk being penalized for an incorrect decision may defer to the computer more readily.
