Luk Arbuckle

Posts Tagged ‘news’

Misleading Americans about public health care

In news on 22 February 2009 at 7:45 pm

Canadians often wait months or even years for necessary care. For some, the status quo has become so dire that they have turned to the courts for recourse. Several cases currently before provincial courts provide studies in what Americans could expect from government-run health insurance.

At least that’s the story told by the Fraser Institute in an op-ed in the Wall Street Journal. “As we inch towards nationalized health care,” reads the subtitle, “important lessons from north of the border.” With a couple of dire tales and a couple of national averages, Americans are led to believe that introducing government-run public health insurance will drastically increase wait times in U.S. health care.

Where problems lie
Making an appropriate comparison between wait times in the U.S. and Canada is not trivial. How do you deal with those people who can’t get treatment in the U.S. because of inadequate or nonexistent medical insurance (infinite wait times)? Even comparing specific treatments is tricky because disease coding between the U.S. and Canada differs (ICD-9-CM is currently used in the U.S., and ICD-10-CA in Canada). And then you have to consider subgroups to see how population trends change across socioeconomic classes, say, and to ensure they aren’t reversed entirely (Simpson’s paradox).
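The Simpson’s paradox point is easy to demonstrate with made-up numbers. In the sketch below (entirely hypothetical counts, not real wait-time data), Country A has shorter average waits than Country B within every socioeconomic group, yet looks slower overall, simply because the two countries see different mixes of patients:

```python
# Hypothetical counts illustrating Simpson's paradox: within each
# socioeconomic group, Country A treats patients faster than Country B,
# yet A's aggregate average wait is longer, because A's caseload is
# dominated by the slow-to-treat group.

groups = {
    # group: (A patients, A mean wait in weeks, B patients, B mean wait)
    "low income":  (900, 20.0, 100, 22.0),
    "high income": (100,  4.0, 900,  6.0),
}

def overall_mean(country_index):
    """Patient-weighted mean wait; country_index is 0 for A, 2 for B."""
    n = sum(g[country_index] for g in groups.values())
    total = sum(g[country_index] * g[country_index + 1] for g in groups.values())
    return total / n

a_mean = overall_mean(0)  # (900*20 + 100*4) / 1000 = 18.4 weeks
b_mean = overall_mean(2)  # (100*22 + 900*6) / 1000 =  7.6 weeks

for name, (na, wa, nb, wb) in groups.items():
    assert wa < wb      # A is faster within every single group...
assert a_mean > b_mean  # ...yet slower in the aggregate
```

Any comparison that reports only the aggregate averages would get the ranking exactly backwards, which is why subgroup analysis matters.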

Take, for example, a study that found that “socioeconomic status and breast cancer survival were directly associated in the U.S. cohort, but not in the Canadian cohort.”  Also note that “this study replicated the finding of advantaged Canadian cancer survival in smaller metropolitan areas that had been consistently observed in larger metropolitan areas.”  Although it’s possible there are other (confounding) factors influencing these results, it shows that socioeconomic status needs to be considered when comparing medical treatment and outcomes in the U.S. and Canada.  And, therefore, it is likely to affect wait times as well.

Instead of dealing with technical details, however, the article in the WSJ uses stories in which Canadians wait months for treatment.  There’s nothing inherently wrong with this—it is, after all, an op-ed piece and not a journal article—but you have to ask yourself about the choice of stories.  Are they representative of public health care in Canada, or extreme cases?  Also, we don’t know whether the individual who “paid for surgery that may have saved his life”, rather than wait for treatment in Canada, was in immediate need of treatment.  These are, nonetheless, compelling stories that should not be disregarded—but they don’t prove a trend.

The basic argument put forward is that Canadians wait a long time for treatment under the public health care system.  But what’s considered a “long” wait time, and how does it depend on the condition and severity?  Notice that there’s no mention of wait times in the U.S., even for those that have appropriate health coverage.  Instead we’re given some specific average wait times, but why cataract surgery or hip and knee replacements, and not others?  How much do these wait times vary based on treatment, location, socioeconomic class, and how do they compare with U.S. figures?  We’re left with more questions than answers.

The real confounder
Ultimately, to consider how wait times would increase in the U.S. with the introduction of publicly run, universal health coverage—that is, health coverage for all, as in Canada—there is one factor that would need to be disassociated from wait times in Canada. This factor, not unique to Canada but certainly rare, is not stressed enough in the article.

The Supreme Court of Canada found that Canadians suffer physically and psychologically while waiting for treatment in the public health-care system, and that the government monopoly on essential health services imposes a risk of death and irreparable harm.

Disregarding the inflamed rhetoric, the important point here is that there’s a “government monopoly on essential health services” in Canada.  In other words, there’s no competing private system for health services deemed medically necessary, and the government funds and regulates the public health care system (although the government doesn’t operate it).  You could probably argue that this monopoly is equivalent to price fixing for those services the government decides it’ll pay for.  This is likely the main reason “care is rationed by waiting”—there is, after all, no alternative (besides paying for treatment in the U.S.).

It’s probably only a matter of time before Canada allows a parallel private system for most, if not all, health services. Private spending currently represents about 30% of the average province’s total health care spending (mostly for medications and services not covered by the public system, such as dentists, optometrists, and physiotherapists).  But until a parallel private system exists for all services in Canada, or the monopoly on essential services is taken into account, it’s disingenuous to suggest that wait times arise simply because “individuals bear no direct responsibility for paying for their care.”

Bottom line
Many factors impact health care and wait times.  You can’t look at just one aspect or descriptive statistic and know whether the system works as intended.  It would be like judging a person’s health based on blood pressure alone.  I agree with the author regarding comments he made in the past about improving Canada’s health care system.  But drawing inferences about a public health care system in the U.S. from a couple of average wait times in Canada, where other factors confound the results and make them unreliable to begin with, is inappropriate and misleading at best.


The sexy job in the next ten years

In news on 1 February 2009 at 6:10 pm

Hal Varian, Google’s chief economist and author of arguably the two most popular textbooks in microeconomics (one at the undergraduate level and the other introductory graduate), shared the following during an interview for The McKinsey Quarterly:

I keep saying the sexy job in the next ten years will be statisticians. People think I’m joking, but who would’ve guessed that computer engineers would’ve been the sexy job of the 1990s? The ability to take data—to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it—that’s going to be a hugely important skill in the next decades, not only at the professional level but even at the educational level for elementary school kids, for high school kids, for college kids. Because now we really do have essentially free and ubiquitous data. So the complementary scarce factor is the ability to understand that data and extract value from it.

I think statisticians are part of it, but it’s just a part. You also want to be able to visualize the data, communicate the data, and utilize it effectively. But I do think those skills—of being able to access, understand, and communicate the insights you get from data analysis—are going to be extremely important. Managers need to be able to access and understand the data themselves.

You always have this problem of being surrounded by “yes men” and people who want to predigest everything for you. In the old organization, you had to have this whole army of people digesting information to be able to feed it to the decision maker at the top. But that’s not the way it works anymore: the information can be available across the ranks, to everyone in the organization. And what you need to ensure is that people have access to the data they need to make their day-to-day decisions. And this can be done much more easily than it could be done in the past. And it really empowers the knowledge workers to work more effectively.

It’s nice to hear that your skills may become a hot commodity.  I came across the article from Gelman’s post on What should an introduction to statistics be like?  What I enjoyed most was the discussion that followed, such as the suggestion that “the time of playing with integrals, density functions and demonstrations is over”, and that we should “focus on coding and implementation”.

Bad blood at the American Red Cross

In news on 18 July 2008 at 11:31 pm

For years the U.S. Food and Drug Administration (FDA) has been after the American Red Cross to improve the way it collects and processes blood. Fifteen years ago they arrived at a settlement under court order that outlined how the American Red Cross would have to strengthen quality control and training, and improve its ability to identify, investigate and record problems. But, as detailed in the New York Times, only modest improvements have been made since that time.

As rightfully pointed out on the STATS blog, pharmaceutical companies would never be allowed the leeway that has been given to the American Red Cross. The organization failed to inform the FDA of its mistakes, let alone investigate their consequences, and as a result no one really knows how many serious health problems it is responsible for. As the old mantra goes: you can only manage what you measure.

Also disconcerting is that two-thirds of the American Red Cross’s revenue comes from blood services. But how much of that revenue goes back into blood services, and how much goes to subsidize disaster-relief efforts? They don’t know, because of antiquated financial systems.

Vampires in Canada
There are a lot of similarities between this story and Canada’s tainted blood scandal involving the Canadian Red Cross, which came to a head during public hearings fifteen years ago. The phrasing of the New York Times article suggests the Canadian Red Cross chose to break off its blood services operations, but in fact it was forced out. The only choice it was given was whether to stay involved in promoting blood donations, and it decided to break all ties given the bad press and mismanagement.

The old blood service run by the Canadian Red Cross was plagued with non-professional committees that “took a long time to absorb new information.” But when Canadian Blood Services took over collection and distribution of blood products, Health Canada was put to the task of “strengthen[ing] the regulatory role with better resources, integrated data collection and analysis, and being more aggressive in raising standards.”

Health Canada organized a working group “to develop a plan and design a program for a comprehensive blood surveillance system for Canada.” The emphasis is clear from their final report in 1999—data collection and management. Previous focus was on data collection for operational rather than surveillance purposes. As a result of this focus, and a lack of standardization and integration between organizations and jurisdictions, a lot of important data was not being collected.

It’s time Americans demand that their blood services be separated from the Red Cross and its bias towards disaster relief. Blood services should be managed by appropriately trained health professionals, with strong ties to federal regulators. Being at arm’s length from the government, yet closely tied to Health Canada, has allowed Canadian Blood Services to raise standards in ways that could never have been achieved through the Red Cross.

Bayesian and the brain

In news on 4 June 2008 at 3:48 pm

Researchers in computational neuroscience want to come up with a single theory to explain how the brain works—Bayesian statistics may provide the answer.  An article in New Scientist asks: Is this a unified theory of the brain? (a subscription to New Scientist is required to access the article, though the Mind Hacks blog found a link to a copy posted elsewhere).

Neuroscientist Karl Friston and his colleagues have proposed a mathematical law that some are claiming is the nearest thing yet to a grand unified theory of the brain. From this single law, Friston’s group claims to be able to explain almost everything about our grey matter. […]

Friston’s ideas build on an existing theory known as the “Bayesian brain”, which conceptualises the brain as a probability machine that constantly makes predictions about the world and then updates them based on what it senses.
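The updating at the core of the “Bayesian brain” idea is just Bayes’ rule: a prior belief about the world, revised in light of sensory evidence. A toy sketch, with entirely made-up numbers for a face-recognition scenario:

```python
# Toy sketch of Bayesian updating, the mechanism behind the
# "Bayesian brain" idea. All numbers here are hypothetical.

# Prior belief: probability that the object in view is a face.
prior_face = 0.2

# Likelihoods: how probable the sensed edge pattern is under each
# hypothesis (face vs. not a face).
p_data_given_face = 0.9
p_data_given_not = 0.3

# Bayes' rule: posterior = likelihood * prior / evidence.
evidence = (p_data_given_face * prior_face
            + p_data_given_not * (1 - prior_face))
posterior_face = p_data_given_face * prior_face / evidence

# The sensed pattern favours "face", so the belief strengthens
# (from 0.2 to roughly 0.43); the posterior then serves as the
# prior for the next glance.
assert posterior_face > prior_face
```

Iterating this update as new sense data arrives is what “constantly makes predictions about the world and then updates them” amounts to in probabilistic terms.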

The article goes on to explain the Bayesian brain and how it is a group of related approaches that use Bayesian probability theory to understand different aspects of brain function.  What Friston has done is introduce the framework for a “unifying theory”—a theory that ties everything together—using the idea of a prediction error (to minimize surprise) as “free energy”.  Friston describes the theory as follows:

In short, everything that can change in the brain will change to suppress prediction errors, from the firing of neurons to the wiring between them, and from the movements of our eyes to the choices we make in daily life.

Many researchers aren’t yet convinced that the theory will be unifying—although they aren’t denying the possibility—and concerns have been raised that the theory may not be testable, or may not be usable to build machines that mimic the brain.  But experiments are being proposed to help advance and test the theory, and many agree that it has tremendous potential.