Eticas

The gold standard in automated bias detection + responsible AI compliance.

NarxCare, an algorithm to predict the risk of narcotic abuse

Opioid abuse has, over the last few decades, grown into a crisis. Different responses have been developed to address it, and one of them, NarxCare, has a positive intention but a concerning implementation. Developed by Appriss and deployed in the United States, the algorithm combs through the multi-state Prescription Drug Monitoring Program (PDMP) database and generates a risk report indicating the likelihood that a person will misuse or abuse opioids. The score ranges from 000 to 999, with a higher number indicating a higher likelihood of abuse. The algorithm takes several factors into consideration, including the number of doctors who have prescribed the medication to the person, the amount of medication they are taking, the number of pharmacies where the person fills their prescriptions, and other factors such as overlapping prescriptions and combinations of medications likely to increase the risk of abuse or misuse (London, 2021). The result is presented in the form of a ‘Narx Report’, which includes the risk score, any red flags in the patient’s prescription history (which may put them in danger of an unintentional overdose) and other adverse events. The score is available to pharmacists, doctors and hospitals and is intended to assist them in identifying patients who might be at risk of substance abuse (Ray, 2021).
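To make the shape of such a score concrete, here is a purely illustrative sketch in Python. The real NarxCare model is proprietary and undisclosed; the function name, the factor weights and the daily-dosage input below are invented for demonstration, and only the general factor types (number of prescribers, number of pharmacies, overlapping prescriptions, risky drug combinations) come from the description above.

```python
# Purely illustrative: the real NarxCare model is proprietary.
# All weights and parameter names here are hypothetical.

def hypothetical_narx_score(num_prescribers: int,
                            num_pharmacies: int,
                            daily_dose: float,
                            overlapping_prescriptions: int,
                            sedative_coprescription: bool) -> int:
    """Combine prescription-history factors into a 000-999-style score.

    The weights are made up; they only illustrate how a risk score can
    rise with more prescribers, more pharmacies, a higher dosage, and
    risky medication combinations.
    """
    score = (num_prescribers * 60
             + num_pharmacies * 60
             + daily_dose * 2
             + overlapping_prescriptions * 80
             + (150 if sedative_coprescription else 0))
    return min(int(score), 999)  # cap at the scale's maximum

# A patient with many prescribers and pharmacies scores far higher:
low = hypothetical_narx_score(1, 1, 30.0, 0, False)
high = hypothetical_narx_score(4, 5, 90.0, 2, True)
```

Note how even this toy version shows the problem discussed below: a patient who legitimately visits several pharmacies (for example, after moving) is pushed toward a higher score by construction.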

The algorithm does not take into account several social and logistical factors that undermine its good intentions. To begin with, prescription drug monitoring can make physicians less willing to prescribe opioids where patients need them, for fear of facing repercussions. This means that patients cannot get adequate pain relief and endure a lower quality of life because they are unable to manage their pain. Secondly, the algorithm treats filling prescriptions at multiple pharmacies as a risk-increasing factor, so a patient who changes address or needs to seek cheaper care at different pharmacies is erroneously targeted (Szalavitz, 2021).

In addition, several patients have reported being denied prescription medication without knowing that the algorithm was being used against them. This raises problems of patient consent, as well as concerns about the groups this information is distributed to, which include pharmacies, physicians and law enforcement (Szalavitz, 2021).

Moreover, the algorithm is a black box: the company does not disclose the full set of factors used to determine the risk score, and has withheld several features of the model that would aid transparency and accountability, mentioning only that its predictive model not only draws from state drug registry data, but “may include medical claims data, electronic health records, EMS data, and criminal justice data.” At least eight states, including Texas, Florida, Ohio and Michigan, have signed up to incorporate the algorithm into their monitoring programs. While the risk score is not intended to supplant a medical professional’s determination of what a patient needs, both patient and doctor are put at risk of prosecution because the information is also available to law enforcement (Siegel, 2022).

Despite the promise of such a database, the risks associated with its deployment come at a heavy cost to patients who require opioids for pain relief. How exactly the risk scores are determined remains murky, and despite the sensitive medical information involved, they are available to many parties, including law enforcement officers (Szalavitz, 2021).

You can find information about NarxCare in its entry in the OASI Register, where as of now you’ll find almost 110 different algorithms. You can also read more about the social impact of algorithmic systems in the OASI pages. And you can tell us about an algorithm that we are missing by submitting a simple online form on our website.