Public Health News Watch Wednesday: Report for April 12, 2017
This week’s Public Health News Watch focuses on healthcare costs and provides a brief comparison of the US healthcare system to Canada’s healthcare system, highlighting a few pros and cons of each.
The US healthcare system has been in the political cross-hairs for decades. Beginning with Lyndon Johnson’s push for the creation of Medicare and Medicaid in the 1960s right through the passage of the Affordable Care Act in 2010—and the attempts to overhaul it this year—elected officials have sought the cure for what ails the delivery of medical care most in this country: the cost. According to a World Health Organization (WHO) analysis published in 2015, the United States spent 17% of its gross domestic product (GDP) on healthcare—by far the most of the 194 countries included. And, despite our reliance on private-pay insurance coverage, the federal government devoted 20% of its budget to healthcare that year.
Yet the United States ranked just 43rd in life expectancy in 2015. In fairness, mortality is only one measure of health-system efficacy, albeit an important one.
As advocates for a single-payer system frequently point out, most, if not all, of the countries that rank higher than the United States on life expectancy—while spending far less—have national healthcare services designed to provide all citizens with at least baseline care. Canada has such a system, and spent 10.9% of its GDP on healthcare in 2012 (healthcare represented 18.5% of its national government’s total spending that year), the WHO reports. The Great White North, for the record, ranks 18th in the world in life expectancy.
Yet as stark as this statistical contrast is between two nations that share a border and a close political and economic relationship, Canada’s Globe & Mail newspaper recently reported on something its home country’s health system currently lacks: financial incentives designed to encourage drug manufacturers to develop novel treatments for rare diseases, along with provisions for the national health service to cover the costs of those treatments for patients. The United States, of course, passed its own Orphan Drug Act in 1983, and, as the Globe & Mail notes, the benefits of the legislation are obvious in the numbers: in the decade prior to the law’s passage, the US Food and Drug Administration (FDA) approved just 10 drug treatments for rare diseases; since its enactment, the FDA has approved more than 600.
As usual, it all comes down to money: left to their own devices, drug companies would have little incentive to develop new products for rare diseases, given the perception that such products are difficult to profit from. As a result, as the Globe & Mail chronicles, treatments for diseases such as hypophosphatasia and X-linked hypophosphatemia are often unavailable, or financially out of reach, for many north of the border—due in large part to the government-run system’s reluctance to cover their high price tags.
Meanwhile, in the United States, drugs designed to treat dozens of diseases, including relatively rare infections associated with cystic fibrosis as well as necrotizing soft tissue infections, cytomegalovirus, tuberculosis, and others, have been brought to market in recent years. They are still expensive—and not all are covered by private insurance or Medicare/Medicaid—but they are available.
So is the US healthcare system perfect? Far from it. But it is not all bad either.
Brian P. Dunleavy is a medical writer and editor based in New York. His work has appeared in numerous healthcare-related publications. He is the former editor of Infectious Disease Special Edition.