
Opinion: It’s time to rethink how we manage, pay for care for the sickest 10%

Health insurance has served different purposes over the years. It may be time to rethink today’s system.

Health insurance has served different purposes over the years. Congress established the U.S. Marine Hospital in 1798 as a way to care for U.S. seamen, who were required to contribute a portion of their wages in return for access to services. However, most health insurance policies developed over the next century were designed to protect against income loss due to accidents rather than covering medical services.

It wasn’t until near the end of the 19th century that companies from selected industries began providing coverage for medical services.

In the early part of the 20th century, progressive social reformers began advocating for a compulsory health insurance system. Reformers wanted to redistribute sick people's medical costs and wage losses among the rest of society through social insurance.

Until the 1950s, most health insurance plans were, by today's standards, mini-med plans. Benefits were limited to specific circumstances, such as a set number of hospital days, stays or episodes of care.

Major medical plans were later introduced to supplement basic coverage and protect against the costs of extended illness or injury.

Health insurance got a significant boost in the 1960s with the introduction of Medicare and Medicaid. Employer coverage also increased due to the tax exclusion conferred by Congress in the early 1950s.

Is it time to rethink the system?

It has been illegal for decades to vary employee contributions based on health status in group coverage. In 2010, the Affordable Care Act (ACA) extended similar protections to the individual market, banning medical underwriting (basing premiums on individual health status) and prohibiting discrimination against enrollees with pre-existing conditions. The ACA also did away with annual and lifetime limits on benefits. In effect, every health plan became an all-you-can-eat buffet.

As the benefits covered by health insurance grew more comprehensive, premiums rose. The average employer-sponsored health plan now costs $6,435 for individual coverage and $18,142 for family coverage, according to the Kaiser Family Foundation. To slow premium growth, employers and insurers have also increased cost-sharing. Between 2006 and 2015, the average deductible for employee coverage rose from $303 to $1,088.

About half of all workers now face deductibles of $1,000 or more. Ten years ago, only about 4% of those with employer coverage were in high-deductible health plans; today that figure is 29%, nearly one-third. Deductibles of $3,000, $4,000, $5,000 or higher are not uncommon in the individual market.

Americans are increasingly enrolled in health plans that are costly yet reimburse little of their routine care. Many day-to-day medical costs are paid entirely out of pocket. Indeed, about half the U.S. population incurs little, if any, medical expense in a given year, according to an Agency for Healthcare Research and Quality report. That half accounted for less than 3% of health expenditures in 2012, and the healthiest 80% of Americans consumed only 20% of healthcare dollars.

Health insurance makes it easier to finance costly medical interventions. A reliable funding source also stimulates the development of medical technology, and as technology advances, so does the cost of care.

Premiums continue to rise, yet they cover less and less of people's day-to-day medical needs. Will this ultimately fray the social safety net that policymakers want health insurance to create? Will a point ever come when Americans say, "Ration care for the sickest 1%; they're probably going to die anyway"?

Denying care to the sickest 1% would save nearly one-quarter of health expenditures. Denying care to the sickest 5% of Americans would cut the nation's healthcare tab in half. Rationing care to the sickest 10% would cut the cost of healthcare by two-thirds.

Of course, there is a caveat: we do not always know who the sickest 1%, 5% or even 10% will be in any given year, and they may be different individuals from one year to the next. One thing is clear, however: we need to better manage their care.

Devon Herrick, PhD, is a health economist and senior fellow at the National Center for Policy Analysis.
