As health care payers increasingly rely on artificial intelligence (“AI”) to speed up patient claim adjudication and prior-authorization determinations, providers should be on the lookout for algorithms designed to deny claims with minimal human oversight. Two putative class action lawsuits filed against Cigna and UnitedHealth in November 2023 allege that the payers’ respective AI models, “PxDx” and “nH Predict,” were used in place of real medical professionals to wrongfully deny medically necessary care.
The lawsuit against Cigna alleges that Cigna uses its PxDx (procedure-to-diagnosis) algorithm to automatically review claims without conducting the thorough medical review required by law and contract. [1] The lawsuit alleges, for example, that over a period of two months in 2022, Cigna used PxDx to automatically review and deny over 300,000 claims, spending an average of just 1.2 seconds per claim. [2] Despite insurance laws and regulations in many states requiring physicians to review claims before payers may deny them on the basis of medical necessity, the lawsuit suggests that physicians employed by payers to review insurance claims (often called “medical directors”) are signing off on denials without even looking at them.
The suit against UnitedHealth is specific to Medicare Advantage plans. It alleges that UnitedHealth’s AI model, nH Predict, has a known 90% error rate and denies claims by imposing unrealistic expectations for Medicare Advantage patients’ recovery in post-acute care settings. The algorithm allegedly overrides the opinions of patients’ treating physicians who deemed the underlying care medically necessary. [3]
Both lawsuits allege that the payers count on the fact that only a small percentage of patients appeal wrongfully denied claims. Indeed, with only about 0.2% of policyholders appealing denied claims, the majority end up paying out-of-pocket costs or forgoing the remainder of prescribed post-acute care. [4] In Medicare Advantage cases specifically, this can mean that elderly patients are forced out of care facilities or must deplete their savings to continue receiving medically necessary care when an algorithm has disagreed with their treating physician.
Not only are these algorithms being used to deny claims, they also are being used to change human behavior. The UnitedHealth suit alleges that the defendants have set employee performance targets to keep skilled nursing facility stays within 1% of the days projected by nH Predict, and that “[e]mployees who deviate from the nH Predict AI Model projections are disciplined and terminated, regardless of whether a patient requires more care.” [5]
While this behavior is concerning, patients and providers alike have recourse: they can appeal denied claims and proactively seek to negotiate contract terms with payers that prohibit these practices. Providers may also already have agreements in place that allow them to seek recourse through litigation against a payer for such use of AI. Thompson Coburn’s Health Care Group has a team of attorneys available to assist with these issues.
[1] See Amy Snyder, et al. v. The Cigna Group, Cigna Health and Life Ins. Co., and Cigna Health Mgmt., Inc., 3:23-cv-01451-OAW (D. Conn. Nov. 2, 2023); see also Patrick Rucker, et al., How Cigna Saves Millions by Having Its Doctors Reject Claims Without Reading Them, ProPublica (Mar. 25, 2023).
[2] Id.; see also Patrick Rucker, et al., How Cigna Saves Millions by Having Its Doctors Reject Claims Without Reading Them, ProPublica (Mar. 25, 2023).
[3] See Estate of Gene B. Lokken, et al. v. UnitedHealth Group, Inc., et al., 0:23-cv-03514 (D. Minn. Nov. 14, 2023).
[4] Karen Pollitz, et al., Claims Denials and Appeals in ACA Marketplace Plans in 2021, KFF (Feb. 9, 2023).
[5] See Estate of Gene B. Lokken, et al. v. UnitedHealth Group, Inc., et al., 0:23-cv-03514 (D. Minn. Nov. 14, 2023).