Medicare’s Push for More “Skin in the Game”

Authors: 

Richard G. Stefanacci, DO, MGH, MBA, AGSF, CMD

Series Editor: Barney S. Spivack, MD, FACP, AGSF, CMD

Dr. Stefanacci served as a CMS Health Policy Scholar for 2003-2004, is associate professor of health policy, University of the Sciences, and a Mercy LIFE physician, Philadelphia, PA; and is chief medical officer, The Access Group, Berkeley Heights, NJ.

Dr. Spivack is associate physician editor of Clinical Geriatrics, and is Medicare Medical Director, OptumHealth Care Solutions, United Healthcare, Westport/Trumbull, CT; founder, Connecticut Geriatrics Society; and consultant in geriatric medicine, Greenwich Hospital, Greenwich, CT, and Stamford Hospital, Stamford, CT.

Although Medicare spending growth decelerated in 2010 from the 7.0% observed in 2009, spending still grew 5% that year, reaching $524.6 billion.1 The debate over Medicare cost-cutting that dominated 2011 and earlier years is likely to continue in 2012, as the program continues to face significant growth. This growth can be attributed to the increasing number of baby boomers reaching retirement age and to the proliferation of cutting-edge diagnostic tools and innovative treatments, which have increased longevity. While some of these new diagnostic and treatment tools have improved morbidity and mortality, others are often used without any evidence that they work better than existing, less expensive options.

Given the severe pressure on resources and the inability to simply raise revenue to cover exploding Medicare costs, the only possible routes to controlling Medicare spending are payment reductions and/or decreases in healthcare utilization. This is because Medicare spending is the product of payment rates and utilization; thus, Medicare is looking to have patients and providers reduce healthcare costs by controlling the use of services and by assuming more financial responsibility, or putting more of their “skin in the game.” The “skin in the game” idiom is credited to world-renowned investor Warren Buffett, who used it to describe a situation in which an individual makes a personal financial investment in the company he or she is running. In healthcare, “skin in the game” refers to paying more out of pocket for healthcare services.
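
The arithmetic behind this identity is simple. The sketch below uses purely hypothetical payment and utilization figures, none of them actual Medicare data, to show why trimming either factor lowers total spending by the same proportion:

```python
# Purely hypothetical figures to illustrate spending = payment rate x utilization.
# None of these numbers are actual Medicare data.

avg_payment_per_service = 100.0   # dollars paid per service (hypothetical)
services_per_beneficiary = 20     # services used per beneficiary per year (hypothetical)
beneficiaries = 47_000_000        # enrollment, hypothetical round number

baseline = avg_payment_per_service * services_per_beneficiary * beneficiaries
payment_cut = 0.90 * avg_payment_per_service * services_per_beneficiary * beneficiaries
utilization_cut = avg_payment_per_service * (0.90 * services_per_beneficiary) * beneficiaries

print(f"Baseline spending:         ${baseline / 1e9:.1f} billion")
print(f"After 10% payment cut:     ${payment_cut / 1e9:.1f} billion")
print(f"After 10% utilization cut: ${utilization_cut / 1e9:.1f} billion")
```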

Placing decisions in the hands, or pockets, of patients and providers forces them to make value determinations. This may be an easier option, at least politically, than the alternative of having Medicare make these determinations. The Clinical Guidelines Committee of the American College of Physicians2 has argued that efforts to control expenditures should focus on the value, not just the costs, of healthcare interventions. Determining whether an intervention provides high value requires assessing whether its health benefits justify its costs. High-cost interventions may provide good value if they are highly beneficial; conversely, low-cost interventions may have little or no value if they provide little benefit. Given the challenges of determining the benefit of certain interventions, it is often easier for Medicare to pass this responsibility on to patients and providers by increasing their skin in the game.


Not Rationing

Donald Berwick, MD, former administrator of the Centers for Medicare & Medicaid Services (CMS), was quoted as saying, “The decision is not whether or not we will ration care—the decision is whether we will ration with our eyes open.”3 The “we” in Berwick’s statement may not have been CMS. For both political and regulatory reasons, Medicare may be unable to reduce utilization through rationing, in part because Medicare must cover any and all services that are deemed medically necessary. The original statutory language that established Medicare in 1965 directed the program to pay for services that are reasonable and necessary for the diagnosis or treatment of an illness or injury or to improve the functioning of a malformed body member.4 This definition of medical necessity does not take into account the value of a particular service.

To make a coverage determination, CMS is required to use a formal and transparent process that relies on a review of the best available evidence about the effectiveness of the treatment or technology under consideration. CMS can also seek advice from the Medicare Evidence Development & Coverage Advisory Committee (MEDCAC), a roster of up to 100 experts in medicine, biological and physical sciences, public health administration, patient advocacy, healthcare data and information management and analysis, healthcare economics, and medical ethics. Over the past decade, however, CMS has issued only about a dozen National Coverage Determinations per year; political and other obstacles have prevented more widespread use of this approach.

Rather than Medicare taking the lead in controlling utilization, its approach has focused on having patients and providers bear more financial responsibility. This is an attempt to combat the overutilization that results from moral hazard, which economic theory defines as the tendency to consume more of a good or service when doing so carries little, if any, financial consequence for the individual. For example, when the eating habits of patrons at an all-you-can-eat buffet are compared with those of diners who must order à la carte, people have been found to consume significantly more at the all-you-can-eat restaurant. The same dynamic exists with insurance: the greater the coverage, the greater the use of covered services, a relationship Medicare has recognized for some time.

Because the Medicare Part D benefit was established with a predetermined allocation of $400 billion over 10 years, it was virtually impossible to provide uniform benefits to all Medicare beneficiaries, catastrophic coverage to those with the highest drug costs, and additional subsidies for low-income beneficiaries without leaving a significant gap in coverage. A prescription drug benefit that provided comprehensive coverage to all Medicare beneficiaries would have cost at least twice as much over the 10-year period. At the time of the budget debate, Congress and virtually every stakeholder knew that the $400 billion allocation would necessitate a significant gap in the benefit. This coverage gap created a period in which Medicare beneficiaries were responsible for 100% of their drug costs.5

The “skin in the game” expression became popular during the debate over Medicare’s Prescription Drug Benefit. The belief behind making beneficiaries responsible for 100% of their medication costs in the gap was that they would limit their medication use to only those medicines they truly valued or needed. The Medicare Payment Advisory Commission (MedPAC) noted that without significant cost-sharing, Medicare beneficiaries are insulated from prices, leading to higher use of resources, both necessary and unnecessary.6

Despite Medicare’s desire to require even more skin in the game from beneficiaries, the approach is tempered by political pressure. That pressure has at times reversed course, as demonstrated by the gradual filling in of the Medicare Part D coverage gap and the elimination of out-of-pocket payments for several preventive services, such as mammograms. The rationale behind these reversals was that increasing patients’ skin in the game could result in underutilization of needed services.

Because using cost-sharing to counter moral hazard requires a consumer who can actually pay for services, individuals without the ability to pay fall outside this equation. Medicaid shields the poorest Americans from skin-in-the-game requirements; for example, beneficiaries with both Medicare and Medicaid face no coverage gap in their Medicare Part D prescription drug coverage. This situation is ripe for overutilization, and, not surprisingly, utilization levels for dual eligibles are significantly higher than for other Medicare beneficiaries. Since out-of-pocket expenditures cannot be used to encourage more appropriate utilization in this population, payers are forced to rely on other techniques, such as prior authorization, step therapy, and gatekeeping, all of which often become burdensome to providers.


Patients’ Skin in the Game

One area with a particularly high level of moral hazard is Medicare Part B, which covers physician services, diagnostic testing, and medications purchased and administered by providers. Medicare pays 80% of the cost of Part B services, with the remaining 20% the responsibility of patients. This cost-sharing is often negated, however, by Medigap insurance, which many Medicare beneficiaries carry and which covers the remaining 20%, eliminating any out-of-pocket expense for Part B services. When patients incur no out-of-pocket expenses, they have no financial incentive to forgo a service. To put this issue into perspective, consider florbetapir, a diagnostic radiopharmaceutical indicated for positron emission tomography (PET) imaging to detect beta-amyloid aggregates in the brain. A negative florbetapir-PET scan is clinically useful in ruling out the presence of pathologically significant levels of beta-amyloid; thus, the test could be used to directly detect the hallmark pathology of Alzheimer’s disease. Although the scan costs thousands of dollars, it could be offered at no out-of-pocket cost to the millions of Medicare beneficiaries at risk for dementia, ultimately costing Medicare several billion dollars.
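
A back-of-the-envelope calculation shows how quickly such a scenario scales. In the sketch below, the per-scan price and the number of scans are assumptions chosen only to illustrate the order of magnitude; the 80/20 split is the standard Part B cost-sharing described above:

```python
# Hypothetical back-of-the-envelope estimate for amyloid PET imaging spending.
# The per-scan price and scan volume are assumptions, not actual Medicare figures.

cost_per_scan = 3_000          # dollars; "thousands of dollars" per the text
scans_performed = 2_000_000    # two million beneficiaries scanned (hypothetical)

medicare_share = 0.80          # Part B pays 80% of the approved amount
patient_share = 0.20           # 20% coinsurance, typically picked up by Medigap

total_cost = cost_per_scan * scans_performed
print(f"Total cost:        ${total_cost / 1e9:.1f} billion")
print(f"Medicare pays:     ${medicare_share * total_cost / 1e9:.1f} billion")
print(f"Patient/Medigap:   ${patient_share * total_cost / 1e9:.1f} billion")
# With Medigap covering the 20% coinsurance, the beneficiary's out-of-pocket
# cost is zero, so price plays no role in the decision to order the scan.
```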

To prevent such scenarios, several proposals to control the growth in Medicare spending are under consideration, most of which advocate higher cost-sharing for patients. One such proposal seeks savings by restricting coverage under Medigap plans, requiring enrollees to pay a larger share of the costs of Medicare-covered services.7 In 2008, about one in six Medicare beneficiaries (more than 7 million people) had purchased a Medigap supplemental insurance policy and had no other source of supplemental coverage. Medigap policies cover some or all of Medicare’s cost-sharing requirements, providing “first dollar” coverage, an insurance feature that pays the full value of a loss with no deductible. Some analysts contend that comprehensive first-dollar coverage from Medigap leads enrollees to obtain unnecessary services because of moral hazard and a lack of skin in the game, resulting in excess Medicare spending.8 The Congressional Budget Office (CBO) has proposed a plan that would prohibit Medigap policies from paying the first $550 of enrollees’ cost-sharing and require that they cover no more than half of Medicare’s additional required cost-sharing, up to a fixed out-of-pocket limit.9 CBO estimates that this approach would produce savings of $3.7 billion in 2013 and $53.4 billion over the 9-year period from 2013 to 2021.
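
A minimal sketch of how this option would change an enrollee’s liability is shown below. The $550 floor and the 50% limit come from the proposal as described above, while the out-of-pocket cap and the sample cost-sharing amounts are purely hypothetical:

```python
# Hypothetical sketch of the CBO Medigap option described above.
# The $550 floor and 50% limit come from the proposal; the out-of-pocket cap
# and the sample cost-sharing amounts are assumed for illustration only.

def enrollee_out_of_pocket(total_cost_sharing, oop_cap=5_500):
    """Return what the enrollee pays under the proposed Medigap limits."""
    # Medigap may not pay the first $550 of the enrollee's cost-sharing.
    first_tier = min(total_cost_sharing, 550)
    # Above $550, Medigap may cover no more than half, up to the out-of-pocket cap.
    second_tier = 0.5 * max(0, min(total_cost_sharing, oop_cap) - 550)
    # Beyond the cap, the sketch assumes the enrollee owes nothing further.
    return first_tier + second_tier

for cost_sharing in (300, 2_000, 8_000):
    print(f"Medicare cost-sharing of ${cost_sharing:,}: "
          f"enrollee pays ${enrollee_out_of_pocket(cost_sharing):,.0f}")
```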

Another way that CMS is giving patients more skin in the game is through the use of least costly alternatives (LCAs). The rationale behind LCAs is that Medicare should not pay more for a service when a similar service can treat the same condition and produce the same outcome at a lower cost.8 This has led Medicare to set a reference price for a particular product or service. Beneficiaries can still obtain the more costly item, but they must pay the difference between the approved payment amount for the reference item and the price of the item they choose. This approach enables Medicare to control utilization of items other than the reference item by requiring patients to have skin in the game when they select a more expensive service; without such a requirement, a patient would have no incentive to choose the reference item over a more expensive alternative.
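
A brief sketch of the reference-pricing arithmetic, using invented prices, illustrates how the patient’s share grows with the cost of the chosen item:

```python
# Hypothetical prices illustrating least costly alternative (reference) pricing.
# The mechanics follow the description in the text; the dollar amounts are invented.

reference_price = 800       # Medicare's approved amount for the least costly item
chosen_item_price = 1_300   # price of the more expensive item the patient prefers

medicare_payment = reference_price                      # Medicare pays only the reference amount
patient_payment = chosen_item_price - reference_price   # the patient's "skin in the game"

print(f"Medicare pays: ${medicare_payment:,}")
print(f"Patient pays:  ${patient_payment:,}")
# Standard coinsurance, where applicable, is omitted here to keep the
# illustration focused on the reference-pricing difference.
```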
