Friday 26 August 2016

Where is the most desirable place to do a medical residency (2016 edition)?


There is a cognitive dissonance between selecting students for admission and then producing doctors. The traits that make you a slam-dunk admission are no longer all that important once you get into the program. You play violin in the symphony? You’re in! But you should really stop playing and study more. You’re an Olympic gymnast? Here’s your white coat! But there’s no time to practice, you should study more. You volunteer at an orphanage for victims of arson? Welcome to medical school! But you should really stop helping them and study more.

To some extent, medical schools don’t really care about what extra-curricular activities you salt your resume with as long as you breach a certain threshold. The signal they get from a padded resume is that you’ve been busy and you can remain busy and this is good because medical school is busy. Rather than having much of an interest in students with hobbies, they’re selecting for students who can tolerate a heavy work load. This theory seems far more consistent with what then happens in medical school, which as one staff physician put to me, was to “make all of you unique little snowflakes into snow”.

This homogenization process would manifest itself regularly in pre-rotation orientation sessions when the attending physicians would inevitably ask the group of senior medical students to tell the group about “one thing you like to do in your free time”. A common response was “I like to travel”, which I think was a reflection of how few extra-curriculars most of us had at that point, and of how most pleasurable thoughts revolved around getting the hell away from medical school. The attending physician would then say some generic pleasantry about traveling and free time and, without a trace of irony, pass out the call schedule. Welcome to internal medicine/pediatrics/surgery/obstetrics, you’re working for the next fourteen days straight!

This lack of intellectual diversity became more of a problem when it came to the residency match. Everyone had gotten into medical school based on their grades (which were uniformly good in undergrad) and their extra-curriculars (which no one had any more because of medical school). So almost everyone was a generic medical student to the higher-ups. Medical school in Canada is also unique in that most residency programs don't care what your grades were in medical school as long as you pass, so there was little to distinguish medical students on this basis. I was an enormous beneficiary of this policy so I'm not sure I have a whole lot of grounds to malign it, but it did induce some perverse incentives.

Specifically, it induced medical students to "work together as part of the medical team". This may sound like a good thing when one of the deans says it, but it is more colloquially known as ass-kissing. Marks and special skills mattered far less, so it became more about who liked you when you worked for them. Since rotations lasted only several weeks at a time, intense, adoring flattery was the best way to snag a good reference or to gain favor with whichever program director mattered. As it was another thing I wasn't very good at in medical school, I found the process extremely exhausting. It also gave a lot of power to people higher up the food chain in medicine. Medicine is a profession known for its abuses of learners, and sometimes you can see why when there are imbalances like the ones inherent in the match process.

Anyway, this culminated in the fourth year of medical school during the residency match, where universities across the country listed their preferred medical students. If you made the cut, you got into your program and you got the privilege of taking crap from senior physicians for a paycheque. About a year ago I had matched and they (probably) couldn’t fire me because I had a contract, so I decided to turn the tables on the medical schools and rank them. 
 
As I know too many people in medicine, this was my most popular blog to date. Not even the sex one beat it, despite it being WAY more interesting, better-written and full of dirty word-play. To capitalize on the upcoming match process and boost traffic to my blog, I am releasing a new and improved medical school ranking.

*********************************************************************************

My ultimate goal last year was to base the relative rank of a medical school on the desirability of the school to medical students. I didn’t want to base the rankings on journal citations, academic staff, or anything tangible at all. I wanted to use the wisdom of the crowds to determine ranking. Places that do a better job at making their residents happy for whatever reason would attract more medical students. That may be because they are located in nicer cities, they are more academically impressive, or they treat their residents well. I don’t care how or why they are more desirable, just whether they are more desirable.

It's a little more difficult to assess desirability in a system where there is a cap on demand for spots. There are only so many residency spots to go around, and when they fill up that's it. In a well-functioning market, desirability is something that can be revealed (arguably) by volume in the short term and by price in the long term. This doesn't work with residency spots, so my logic last year was to see which universities seemed to attract medical students from across the country. The argument was that a desirable school is universally desirable and should attract a high volume of applicants as well as the best applicants from other medical schools. Undesirable schools would not receive as diverse an array of applicants, and this would show up as a limited number of outside medical students going to those schools.

Given the data available, I still think that this is the best way to rank medical schools, but the methodology I used was incomplete. The previous statistic for a medical school was essentially the average percentage of other medical school classes that went to the medical school in question. Although I hinted at the problems with this, I couldn’t come up with a solution and so I really didn’t do anything about it. I was also going to the University of Toronto and it was number one, so it was a result that coincided with my prior expectations.

The major problem had to do with the size of the accepting medical school as compared to the size of the donating medical school. When you have a large pool to accept into, you can take a large number of people from other medical classes. Since the University of Toronto has some 300-odd residency spots, it can easily accept 10% of the medical class from Memorial University. But Memorial is a small medical school and its residency class is similarly small. Memorial University couldn’t really take 10% of the University of Toronto’s medical class even if it was the most desirable medical school in the country. This puts smaller medical schools at a disadvantage relative to larger medical schools.

This year I tweaked the methodology to try and account for this. My small insight was to imagine what the allocation of medical students would look like if every medical school in the country were equally desirable. Medical students would be indifferent between going to Memorial, Toronto, or any other medical school for a residency. In this utopian scenario they would all apply to all of the universities and get accepted at roughly equal rates to each university (assuming a roughly similar distribution of talent across the medical classes). The result would be that each medical class would allocate a percentage of its medical students in proportion to the size of the residency class at the accepting medical school. Under these conditions, if the University of Toronto has a residency class that comprises 10% of the total spots across the country, then it should take 10% of the medical class from Memorial (and 10% from UBC, and 10% from Western, etc.). If Memorial has a residency class that comprises 3% of the total residency spots across the country, then it should take 3% of the medical students from Toronto (and 3% from UBC and so on).
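As a rough numeric sketch of this utopian allocation (all spot counts and class sizes below are invented for illustration, not real CaRMS figures):

```python
# Hypothetical residency class sizes (invented for illustration).
residency_spots = {"Toronto": 300, "UBC": 290, "Western": 180, "Memorial": 80}
total_spots = sum(residency_spots.values())  # 850

# Under equal desirability, every donating class sends students to a
# school in proportion to that school's share of the national spots.
share = {school: n / total_spots for school, n in residency_spots.items()}

memorial_class = 80  # hypothetical graduating class at Memorial
expected_to_toronto = share["Toronto"] * memorial_class

print(f"Toronto's share of spots: {share['Toronto']:.1%}")  # 35.3%
print(f"Expected Memorial grads going to Toronto: {expected_to_toronto:.1f}")
```

With these made-up numbers, Toronto's share of the national pool sets exactly what fraction of every other school's class it would absorb in the equal-desirability world.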

The way to measure desirability is then to estimate this baseline scenario and see how far real life deviates from it for each medical school. Deviations above the baseline mean that the medical school is more popular than it would be in a scenario where all medical schools were equally popular. Deviations below the baseline mean that the medical school is less popular than it would be in that scenario.

The other little tweak to the model is to measure the effective number of spots in each residency class and the effective number of students each medical school contributes to the total pool. This acknowledges that a medical class consists of leavers and stayers. A medical student who stays at their home school for residency means one fewer spot at that school for outside medical students. A student who stays also means one fewer medical student in the pool of prospective applicants to other schools. I also added the unmatched spots into the pool of total residency spots available, which is something I did not do last year.

The statistic for ranking the medical schools is then based on the following set of equations. The utopian benchmark case, where every medical school is equally popular, is defined as:

B_a = (R_a - r_a + U_a) / Σ_k (R_k - r_k + U_k)

For a medical school a, the utopian benchmark value B_a is the ratio of the effective spots at that school to the total residency spots available across the country. The effective spots at school a are the total residency spots there (R_a), minus the medical students from school a who stay there for residency (r_a), plus the unmatched residency spots (U_a). The denominator sums the same quantity over every school in the country: all of the unmatched spots plus the open spots at each university, i.e. each residency class less the number of medical students who stay at their home school for residency.
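A tiny numeric check of the benchmark for a single school (every figure here is invented for illustration):

```python
# Hypothetical numbers for one school's utopian benchmark B.
R_a = 100  # residency spots at school a
r_a = 40   # school a students staying home for residency
U_a = 6    # unmatched spots at school a
total_effective = 500  # assumed sum of (R - r + U) over every school

effective_a = R_a - r_a + U_a  # open spots at school a: 66
B_a = effective_a / total_effective
print(B_a)  # 0.132
```

So with these made-up inputs, in the equal-desirability world school a would receive 13.2% of every other school's leavers.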

The actual case that we observe is described by the following relationship:

A_ab = r_ab / (M_b - r_b)

Here r_ab is the number of medical students from school b who go to school a for residency. Divide this by the number of medical students available to leave school b: the total number of medical students at school b (M_b) minus the number who stay at school b for residency (r_b).

The ranking statistic for school a is then the difference between the benchmark scenario, B, and the actual real-life scenario, A:

D_ab = B_a - A_ab

Add up these deviations over all of the medical schools in the country (excluding the medical school in question), take the average, and you get a measure of a medical school's desirability for residency.
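Putting the whole statistic together, here is a minimal sketch in Python. All match counts, spot totals, and school names are invented for illustration; the real rankings are computed from CaRMS match data.

```python
# match[b][a] = students from school b matching to school a (all invented).
match = {
    "Toronto":  {"Toronto": 120, "UBC": 30,  "Memorial": 5},
    "UBC":      {"Toronto": 25,  "UBC": 110, "Memorial": 4},
    "Memorial": {"Toronto": 10,  "UBC": 8,   "Memorial": 40},
}
residency_spots = {"Toronto": 310, "UBC": 290, "Memorial": 90}
unmatched = {"Toronto": 5, "UBC": 3, "Memorial": 12}
class_size = {b: sum(row.values()) for b, row in match.items()}

schools = list(residency_spots)

# Effective open spots at each school: total spots minus students who
# stayed home, plus spots that went unmatched.
effective = {a: residency_spots[a] - match[a][a] + unmatched[a] for a in schools}
total_effective = sum(effective.values())

# Benchmark B_a: school a's share of the national pool of open spots.
B = {a: effective[a] / total_effective for a in schools}

def actual(a, b):
    """A_ab: fraction of school b's leavers who went to school a."""
    leavers = class_size[b] - match[b][b]
    return match[b][a] / leavers

# Desirability score: average of (B_a - A_ab) over donating schools b != a.
# More negative = more desirable (real inflows exceed the benchmark).
score = {
    a: sum(B[a] - actual(a, b) for b in schools if b != a) / (len(schools) - 1)
    for a in schools
}
for a, s in sorted(score.items(), key=lambda kv: kv[1]):
    print(f"{a}: {s:+.3f}")
```

Sorting ascending puts the most desirable schools (most negative scores) first, matching the interpretation in the next section.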

***********************************************************************************

So, on to the results; some interpretation is warranted here. First, the more desirable a medical school is for residency, the more negative its desirability score will be. This is a result of taking the difference between the baseline utopian scenario and what we observe in real life. A more negative number means that medical students are going to these universities above what we would expect. These universities are punching above their weight. A university with a positive desirability score has a baseline utopian score above what we observe in real life. It is a less desirable place to do a residency. A desirability score of zero means that the utopian scenario is roughly what we observe in real life.



Using this new methodology for the 2016 match (and 2016 match data from CaRMS), UBC places first and Laval places last. My own university, Toronto, places second, while my old medical school, Manitoba, places second last. Now on to the speculation. Why do these rankings look the way they do?

There are a number of obvious reasons why people choose particular places for residency, and these probably influence the rankings. First is location. As with the 2014 rankings, the top schools are located in cities and towns with a reputation for being more exciting than the locations of other universities. They are also known for having decent research and clinical reputations. Conversely, the prairie universities and the more remote universities (NOSM, Memorial) are difficult to get to, cold, and not known for their night life. They also, probably because of scaling issues, have less robust research reputations. This makes it difficult to attract and keep residents in these places.

As per last year, Quebec schools have a hard time attracting residents from across the country, likely because of the language barrier. Montreal is the one exception here because it seems able to attract residents from other Quebecois medical schools. McGill, as the one major English medical school in Quebec, ranks pretty low because you still need to learn French to go there and pay in Quebec is (still) shockingly bad for residents. The difference between a first-year resident's salary in Quebec and in Ontario is over $10,000, and English-speaking residents have better alternative options outside of Quebec. As a result, McGill suffers disproportionately in the rankings. Residents who speak only French, on the other hand, don't have the option to leave the province, which is why Montreal does so well.

Now, how have these rankings changed over the last year? I went back and repeated the same exercise on 2015 data (which was my match year) to see how these rankings have fluctuated.


Since last year there have been a couple of major changes. Memorial and Alberta have both dropped in these rankings. I suspect this may be due to the ongoing fallout from the low price of oil. Previously, the provincial governments could throw money at their medical schools and residents could count on a decent salary over the long run. Both provincial governments have indicated a need to rein in expenditures. Health is one of the bigger portfolios and thus a target for cuts. I suspect residents are pricing this fact in for this year's rankings. The exception to this is Calgary (also being in the province of Alberta), which had a largely stable ranking. The University of Calgary has one significant thing going for it, which is that it is located in neither Edmonton nor St. John's. Nevertheless, its ranking may start to collapse over the next couple of years unless oil prices rebound.

Montreal, Queen's, and to a lesser extent, McGill, climbed in this year's rankings. Neither Montreal nor McGill did anything to warrant the bump, as their desirability scores stayed almost constant. They are beneficiaries of other universities falling in desirability. Queen's had a true, measurable increase in its desirability score for reasons that escape me.

So I leave you with this little disclaimer. There is a certain wisdom in crowds revealing desirability, but it's wrong to say that this ranking suggests that a medical school will be a better place for your own residency. A lot of decisions go into picking a medical school for residency and the wisdom of the crowds is only one small input into that decision process. A plurality of medical students find that their best option is their own medical school. For every medical school other than Queen's and Ottawa, over 35% of the medical class stayed for residency and Ottawa matched about 30% of its class to its residency program. Queen's, for the third year running, matched the lowest percentage of its own medical students to its residency program at a dismal 14.5%. This in itself should give you some pause about the accuracy of these rankings. If medical students who have been at Queen's for four years have a near-universal interest in leaving, you have to wonder about the desirability of that residency program no matter what these rankings say.

The last point that I'll make is that it really doesn't matter what medical school you match to, for a number of reasons. Thanks to accreditation, teaching and learning are largely standardized across programs. Each university has its own research strengths and agenda, but if you want to make research a part of your career, they will bend over backwards to accommodate you. Medical schools love research, even if most of what is done is basically useless.

Finally, if you're worried about a medical school based on its location or quality of life, just remember this: for those of you going into a specialty program, it doesn't really matter because you'll all be looking at the inside of a hospital for the next five years anyway. And if you're going into family medicine and you get stuck at an undesirable medical school, you can just move away after two wasted years. Don't sweat it!

But life is funny. I didn't get into any of the top three programs I applied to, and looking back this was a blessing in disguise. If I had gotten my top program I would be in a different city and in a specialty program entirely unsuited to my personality. Instead I got into a program that fits me perfectly. I didn't get the program that I thought I desired the most and it worked out for the best. It'll probably work out for you too.
