AI Will Soon Have a Say in Approving or Denying Medicare Treatments
Taking a page from the private insurance industry’s playbook, the Trump administration will launch a program next year to determine how much money an artificial intelligence algorithm could save the federal government by denying care to Medicare patients.
The pilot program, designed to eliminate wasteful, “low-value” services, represents a federal expansion of an unpopular process known as prior authorization. This process requires patients or their medical teams to seek insurance approval before proceeding with certain procedures, tests, and prescriptions. Starting January 1 and running through 2031, it will apply to Medicare patients and healthcare providers in Arizona, Ohio, Oklahoma, New Jersey, Texas, and Washington.
This initiative has raised eyebrows among politicians and policy experts. Traditionally, Medicare, which covers adults aged 65 and older and some individuals with disabilities, has largely avoided prior authorization. However, this practice is prevalent among private insurers, particularly in the Medicare Advantage market.
The timing of the announcement was surprising: it came just days after the Trump administration revealed a voluntary effort by private health insurers to revamp and reduce their own use of prior authorization, which can significantly delay care. Mehmet Oz, administrator of the Centers for Medicare & Medicaid Services, stated, “It erodes public trust in the healthcare system. It’s something that we can’t tolerate in this administration.”
Critics, including Vinay Rathi, a doctor and policy researcher at Ohio State University, have accused the administration of sending mixed messages. “On one hand, the federal government wants to borrow cost-cutting measures used by private insurance. On the other, it slaps them on the wrist,” he said.
Rep. Suzan DelBene, a Washington Democrat, echoed these concerns, stating, “It’s hugely concerning.” Patients, doctors, and lawmakers have criticized what they perceive as delay-or-deny tactics, which can impede access to care, potentially causing irreparable harm or even death.
“Insurance companies have made it their mantra to take patients’ money and then do their utmost to deny care,” said Rep. Greg Murphy, a North Carolina Republican and urologist. “That goes on in every insurance company boardroom.”
Insurers argue that prior authorization helps reduce fraud and wasteful spending while preventing potential harm. Public dissatisfaction with insurance denials surged in December when the shooting death of UnitedHealthcare’s CEO led many to view his alleged killer as a folk hero.
A July poll by KFF found that nearly three-quarters of respondents considered prior authorization a “major” problem. Oz noted during a June press conference that “violence in the streets” prompted the administration to address prior authorization reform in the private insurance sector.
Despite this, the administration is expanding prior authorization in Medicare. CMS spokesperson Alexx Pons stated that both initiatives aim to protect patients and Medicare dollars.
Unanswered Questions
The pilot program, known as WISeR—short for “Wasteful and Inappropriate Service Reduction”—will test the use of an AI algorithm for prior authorization decisions on certain Medicare services, including skin and tissue substitutes, electrical nerve stimulator implants, and knee arthroscopy.
The federal government claims these procedures are particularly susceptible to “fraud, waste, and abuse” and could be better managed through prior authorization. While other procedures may be added later, the AI model will not assess services that are inpatient-only or performed in emergencies, or those that would pose substantial risk to patients if delayed.
Historically, Medicare has used prior authorization sparingly, relying on contractors who lack incentives to deny services. Experts believe this federal pilot could change that dynamic.
Pons assured KFF Health News that no Medicare request would be denied without review by a “qualified human clinician,” and vendors are prohibited from compensation tied to denial rates. However, critics like Jennifer Brackeen, senior director of government affairs for the Washington State Hospital Association, warn that shared savings arrangements could create incentives to deny medically necessary care.
Rathi expressed concerns that the plan is “not fully fleshed out” and relies on “messy and subjective” measures. He noted that the model depends on contractors assessing their own results, which could lead to questionable outcomes.
Pons emphasized that the use of AI in the Medicare pilot will be “subject to strict oversight to ensure transparency, accountability, and alignment with Medicare rules and patient protection.” He added, “CMS remains committed to ensuring that automated tools support, not replace, clinically sound decision-making.”
Experts agree that AI has the potential to streamline a cumbersome process often marked by delays and denials that can harm patients. Insurers argue that AI can eliminate human error and bias while saving money. However, skepticism remains regarding the extent of human review in these decisions.
A 2023 report by ProPublica revealed that doctors at Cigna spent an average of only 1.2 seconds reviewing each payment request over a two-month period. Cigna has said it does not use AI to deny care or claims.
Class-action lawsuits against major health insurers have alleged that flawed AI models undermine doctor recommendations and fail to consider patients’ unique needs, forcing some to bear the financial burden of their care.
Meanwhile, a survey by the American Medical Association found that 61% of physicians believe AI is “increasing prior authorization denials, exacerbating avoidable patient harms and escalating unnecessary waste.”
Chris Bond, a spokesperson for the insurers’ trade group AHIP, stated that the organization is focused on implementing commitments made to the government, including reducing the scope of prior authorization and improving communication with patients regarding denials and appeals.
‘This Is a Pilot’
The Medicare pilot program highlights ongoing concerns about prior authorization while raising new ones. While private health insurers have been opaque about their AI usage and prior authorization practices, policy researchers suspect these algorithms are often programmed to deny high-cost care.
“The more expensive it is, the more likely it is to be denied,” said Jennifer Oliva, a professor at the Maurer School of Law at Indiana University-Bloomington. She explained that when a patient is expected to die within a few years, health insurers may rely heavily on algorithms, increasing the likelihood of denial as appeals drag on.
Oliva emphasized that the primary goal is to make it exceedingly difficult for patients to access high-cost services. As AI usage in healthcare expands, these algorithms represent a “regulatory blind spot” that requires closer scrutiny, according to Carmel Shachar, faculty director at Harvard Law School’s Center for Health Law and Policy Innovation.
Shachar called the WISeR pilot “an interesting step” toward ensuring Medicare dollars are spent on high-quality healthcare, but said the lack of details makes it difficult to assess the program’s potential effectiveness.
Politicians are grappling with similar questions. Rep. DelBene has raised concerns about how the pilot will be tested and how its effectiveness will be measured. Murphy, co-chair of the House GOP Doctors Caucus, acknowledged that many physicians worry the WISeR pilot could interfere with their medical practice if the AI algorithm denies recommended care.
Recently, House members from both parties supported a measure proposed by Rep. Lois Frankel to block funding for the pilot in the fiscal 2026 budget of the Department of Health and Human Services.
While AI in healthcare is here to stay, it remains uncertain whether the WISeR pilot will save Medicare money or exacerbate existing issues related to prior authorization. “This is a pilot, and I’m open to see what’s going to happen with this,” Murphy stated, “but I will always, always err on the side that doctors know what’s best for their patients.”
