Patients enrolled in Medicare Advantage (MA) plans will have more protection from the threat of AI-related bias under a new rule from the Centers for Medicare and Medicaid Services (CMS), the federal agency that oversees Medicare, Medicaid, the Children's Health Insurance Program, and the Health Insurance Marketplace.
In a policy memo sent to insurers on February 6, CMS states that health insurers may not rely solely on AI or algorithmic systems to make patient care and coverage decisions, or to shift coverage criteria over time. The agency defines AI as "a machine-based system that can make predictions, recommendations, and decisions that affect a real or virtual environment against a set of human-defined goals," and warns insurers against using algorithms that determine coverage based on larger datasets instead of individual patients' medical histories, physicians' recommendations, or clinical records.
In November, two patients filed a lawsuit against health insurance company Humana, alleging that its use of an AI model known as nH Predict to make care decisions was fraudulent, ignored doctors' recommendations, and unfairly harmed elderly beneficiaries. Both patients were covered by Medicare Advantage plans. A similar lawsuit involving the same AI model was filed against insurer UnitedHealth Group.
"Algorithms or software tools may be used to assist in determining coverage for Medicare Advantage plans," the agency explained in the memo. "However, it is the responsibility of the MA organization to ensure that the algorithm or artificial intelligence complies with all applicable regulations regarding how coverage is determined by the MA organization. An admission cannot be refused or downgraded to an observation stay solely on the basis of an algorithm or artificial intelligence; the individual circumstances of the patient must be considered."
Medicare Advantage is an alternative federal insurance option in which private companies approved by and contracted with Medicare provide health insurance benefits to Medicare-eligible individuals.
In November 2023, members of the House of Representatives sent an open letter asking CMS to monitor the use of AI and algorithms in guiding Medicare Advantage coverage decisions, citing ongoing problems with prior authorization under Medicare Advantage. The letter argues that the use of AI and algorithmic software is exacerbating these problems.
"Medicare Advantage plans are tasked with providing medically necessary care to their enrollees, and while CMS has recently made significant progress in ensuring this happens, more work needs to be done to curb the inappropriate use of prior authorization by MA plans, especially through AI and algorithmic software," the members of Congress wrote.
Across the industry, healthcare and insurance organizations have been exploring AI-powered tools to help patients find and purchase health plans, predict health outcomes for Medicare beneficiaries, and expedite payments and services. But concerns about bias and inconsistencies have led many observers to call for greater scrutiny.
"Furthermore, we are concerned that algorithms and many new artificial intelligence technologies have the potential to exacerbate discrimination and bias," CMS wrote in the memo. "We remind Medicare Advantage organizations of the non-discrimination requirements of Section 1557 of the Affordable Care Act, which prohibits discrimination on the basis of race, color, national origin, sex, age, or disability in certain health programs and activities. Before implementing an algorithm or software tool, Medicare Advantage organizations must ensure that the tool does not perpetuate or worsen existing bias, or introduce new biases."