Tuesday, September 30, 2014

More about offshore med schools and residency prospects

Back in April, I blogged about the prospects for graduates of Caribbean medical schools matching in categorical surgical positions and estimated that graduates of two of the more prominent Caribbean schools, St. George's and Ross, had a 2.5 to 3% chance.

What about some of the other Caribbean schools? Hard data are difficult to obtain since most of the schools do not publish match statistics, and in particular, the number of graduates who don't match in any specialty.

Here is what one recent commenter on that April post had to say:

My girlfriend studied at University of Medicine and Health Sciences (UMHS)-St.Kitts in the Caribbean. She is a very hard worker and studied well. All of my savings are gone and extra bank loans add up. No match, no residency, and no more hope. Applied for medical lab tech and waiting. In my opinion, IMG is not an option, try local medical schools and if not try something else.

The UMHS website says 59 of its graduates matched in a specialty in 2014, 2 in preliminary surgery and 2 in general surgery, presumably categorical. The total number of UMHS graduates is not listed, although the school apparently holds three graduations per year, reflecting its three start dates for new students.

Another school, Medical University of the Americas on the island of Nevis, had about 90 matched graduates for 2014, 2 of whom obtained positions in surgery—both preliminary.

An additional commenter on my April post, who turned out to be the owner of a different Caribbean school, said this:

Caribbean medical school is best platform and nice and informative….. Successful communication is key in every successful business…. Understanding your subject and having good knowledge on your blog topic is always essential for a successful blog… Thanks for this post…..

Normally I would have blocked this comment as spam, but before I did so, I googled his school, the American Global University School of Medicine, located in the Central American country of Belize. The International Medical Education Directory lists its total enrollment as 100 students. The school's website does not provide any details about match results for its graduates or much of anything else, such as names of faculty or specific hospitals where students do clinical rotations in the US.

I found some other interesting links—too many to list here—about the school, its officials, and its standing in Belize. You would be wise to google it too, or you can see some links in my comment to the school's owner on my April post.

If you have any interest in attending this or any other school not accredited by the Liaison Committee on Medical Education (LCME), you should do a thorough Internet search before going ahead with an application. Do not send money unless you are certain that the school is legitimate and that most of its graduates are obtaining residency positions.

Keep in mind that the number of residency slots available for international graduates will decline even further over the next few years because several new US medical schools will be producing graduates, and many established schools have expanded their classes.

Friday, September 26, 2014

What is an acceptable rate of VTE prophylaxis?

According to the paper “Hospital Performance for Pharmacologic Venous Thromboembolism Prophylaxis and Rate of Venous Thromboembolism: A Cohort Study” that appeared online in JAMA Internal Medicine last month, a rate of 70% for all eligible patients is good enough.

The retrospective study looked at rates of prophylaxis for VTE at 35 Michigan hospitals.

Of the 20,794 eligible patients included in the analysis, 1,658 either died or were transferred to higher or lower levels of care, leaving 19,136 evaluable patients, 226 (1.2%) of whom suffered a VTE during either the hospitalization or the 90-day follow-up period.
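If you want to check the arithmetic yourself, a few lines of Python will do it (a minimal back-of-the-envelope sketch using only the figures quoted from the paper above):

    # Figures as quoted from the JAMA Internal Medicine paper
    eligible = 20794
    died_or_transferred = 1658
    vte_events = 226

    evaluable = eligible - died_or_transferred   # 19136 evaluable patients
    vte_rate = vte_events / evaluable            # about 0.0118
    print(evaluable, round(100 * vte_rate, 1))   # prints: 19136 1.2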

Tuesday, September 23, 2014

How to get the answers you want from a survey

This isn't about religion or politics, two subjects I tend to avoid. This is about surveys and how they can mislead.

I received this survey in the mail last week. It is from CatholicVote.org and is touted as the "largest survey of Catholics ever conducted on the issue of ObamaCare."

CatholicVote.org promises that the results will "send a strong and clear message to every politician running for election or reelection in the 2014 midterm congressional elections, that the overwhelming majority of Catholic voters demand ObamaCare be repealed."

Judging from the way the questions are framed, I think the message will be clear.

Here are a few examples:

From Section B "ObamaCare's War on Christianity and Morality"

Question #2: Do you think ObamaCare is violating the Constitution's First Amendment protections for freedom of religion and freedom of conscience by forcing pro-life Americans to purchase health coverage that includes abortion inducing drugs?

A) Yes, this is certainly a violation of the Constitution's First Amendment protections.
B) No, this is not a violation of the Constitution
C) Not Sure
D) Other

Question #4: As a state lawmaker in Illinois, Barack Obama voted twice to deny lifesaving medical care to babies born in botched abortions. What is your reaction to this fact?

A) I support President Obama on this.
B) I am horrified and angered by this.
C) Not Sure
D) Other

From Section C "ObamaCare's War on Freedom"

Question #5: Do you think President Obama knew about the crushing cost of ObamaCare for families across America, and was just lying about the cost to get ObamaCare passed into law? Or do you think he shares our shock and dismay at the staggering cost of ObamaCare?

A) I believe President Obama knew about the crushing cost of ObamaCare for families across America, and was just lying about the cost to get ObamaCare passed into law.
B) I think he shares our shock at the staggering cost of ObamaCare and was just unaware of it.
C) Not Sure
D) Other

Question #6: How do you think the mass exodus of doctors from medicine will impact your ability to see a doctor and get the medical treatments you need?

A) A doctor shortage on this scale will certainly drive healthcare costs up dramatically and make it far more difficult for me to see a doctor and get the medical care I need.
B) I don't think we'll see much impact from this doctor shortage.
C) Not Sure
D) Other

Had enough?

I look forward to seeing the results.

Monday, September 22, 2014

The $117,000 surgical assistant's fee

In a post a few months ago, I wondered why Medicare could not control its costs using the investigative power of the federal government instead of releasing physician payment data and relying on journalists to do the work.

Two stories that appeared within days of each other raise a similar question about the private insurance industry's methods.

An article in Modern Healthcare described the impending closure of the proton-beam therapy center at Indiana University, one of only 13 such facilities in the country. Proton-beam therapy, which is very expensive, has never been proven better than other types of treatment for prostate cancer.

Here's what Modern Healthcare had to say:

Blue Shield of California and Aetna last year said they would no longer cover proton therapy as a treatment for localized prostate cancer. Cigna Corp. does not cover proton-beam therapy in the treatment of prostate cancer either.

“I look at this closure as a sign that insurers are finally empowered to say this is a dubious medical technology” in the treatment of patients with prostate cancer, said Amitabh Chandra, director of health policy research at the Harvard Kennedy School of Government.


A couple of days later in the New York Times, a piece by Elisabeth Rosenthal related several anecdotes about patients who were saddled with large and unexpected bills from out-of-network physicians who were involved in their care.

A particularly egregious example was a $117,000 bill from the surgeon who assisted at a 3-hour cervical spine fusion operation. Just to put it in perspective, that's $39,000 per hour or $650 per minute—numbers a professional athlete might envy.

Although the procedure took place at a teaching hospital where residents are usually available to assist, the operative record apparently documented that no qualified resident was available.

The surgeon billed $133,000, but since he was in-network, he received only about $6,200.

Despite some pushback by the patient, the insurance company eventually paid the surgical assistant's $117,000 fee. If he's worth 19 times more than the operating surgeon, maybe he should be doing the operation instead of merely assisting.
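If you want to check those numbers yourself, here is the same sort of back-of-the-envelope math (a minimal sketch in Python using only the figures quoted from the Times story):

    # Figures as quoted from the New York Times story
    assistant_fee = 117000      # out-of-network assistant's bill, in dollars
    surgeon_payment = 6200      # approximate in-network payment to the operating surgeon
    operation_hours = 3

    per_hour = assistant_fee / operation_hours   # 39000 dollars per hour
    per_minute = per_hour / 60                   # 650 dollars per minute
    ratio = assistant_fee / surgeon_payment      # roughly 19 times the surgeon's payment
    print(per_hour, per_minute, round(ratio))    # prints: 39000.0 650.0 19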

Apparently this is not an isolated event. Quoting the Times, "J. Edward Neugebauer, chief litigation officer at Aetna, said the company had ... sued an in-network neurosurgeon on Long Island who always called in an out-of-network partner to assist, resulting in huge charges. The surgeons shared a business address."

The story in the Times related several other instances of insurance companies acquiescing and paying extremely high out-of-network charges.

If insurance companies can decide not to pay for proton-beam therapy, why do they agree to pay an assistant surgeon $650 per minute? I realize they didn't want to leave the patient holding the bag, but have they no recourse other than to pay?

On the home page of the Medical Society of the State of New York, its president responded to the Times piece by pointing out that New York's legislature just passed a law addressing surprise bills, and he correctly noted that some insurance companies do not pay in-network physicians enough to cover their expenses.

But he failed to acknowledge that many of the fees noted in the article are outrageous. Why not at least mention that issue? Doesn't he realize those fees make all doctors look bad?

Wednesday, September 17, 2014

Can Google Glass make you a better surgeon?

Advocates of Google Glass in surgery are apparently desperate to find some use for the device.

An article headlined "Google Glass makes doctors better surgeons, Stanford study shows" concluded that the study offered "compelling preliminary evidence that the head-mounted display can be used in a clinical setting to enhance situational awareness and patient safety."

Using an app capable of displaying vital signs on Google Glass in real time, 7 surgical residents recognized critical desaturation in simulated patients having procedures under conscious sedation 8.8 seconds faster than a control group of 7 residents relying on standard monitors. Glass-wearing residents also became aware of hypotension 10.5 seconds before the control group.

Not mentioned in the article, but present in a linked abstract of the paper, which has not yet been submitted for peer review, was this pearl: neither difference was statistically significant.

That is hardly compelling evidence. Even if the differences had been statistically significant, a gain of 9 or 10 seconds is surely not clinically important.

How seeing vital signs on Google Glass is better than relying on the simple alarms built into every monitor is not clear. Either way, you must stop the operation and look up to see the vital signs.

In a brief video accompanying the article, a surgeon can be seen rather clumsily activating and resetting the app on his Google Glass. The time required to perform these maneuvers apparently was not discussed.

The article, probably written directly from a press release, took a comedic turn with this sentence: "One test demanded that the resident perform a bronchoscopy, in which the surgeon makes an incision in the patient’s throat to access a blocked airway." But bronchoscopy does not involve making an incision in the throat or anywhere else.

If you would like to hear a different side of the Google Glass story, check out this video review from GeekBeatTV entitled "Google Glass is the worst product of all time." You can forward to the 3:45 mark to get past the woes of wearing prescription glasses with Google Glass and hear about the poor battery life, the balky commands, the system crashes, and more.

Tuesday, September 16, 2014

Aortic dissection leads to man's death in the ED: His wife's perspective

A woman wrote to me about the day her husband died. I have edited her email for length and clarity and changed some insignificant details to protect her anonymity as she requested.

Joe passed away outside in the parking lot while they were getting on a helicopter for transport to a hospital equipped to do his surgery.

He had presented to the ED in terrible pain with lots of thrashing and writhing. His right hand was very cold. His right arm tingled to the point of hurting bad. The vision in his right eye was cloudy, and his hearing was muffled on the right. This was in addition to being very pale and diaphoretic upon admission. This is when I felt a dissecting aorta should have been suspected.

I don’t recall the vitals in the beginning, but they were changing and his blood pressure was dropping very fast. As soon as they finished the EKG, in the first 5 minutes of the visit, I asked the doctor about John Ritter's death [the actor died of a dissecting thoracic aneurysm in 2003]. First I asked if he could check for the condition that caused John Ritter's death. I called it an abdominal aortic aneurysm. The doc corrected me and said that it wasn’t an AAA; it was a dissected aorta. I said OK, then check for that. This was 1 hour before the CT scan that led to his diagnosis.

Thursday, September 11, 2014

More ratings—this time it's residency programs

Can you really decide which surgical residency program is right for you using Doximity's Residency Navigator?

I don't think so, and here's why.

The rankings of residency programs were obtained by surveying surgeon members of Doximity, who were asked to name the five top programs for clinical surgery training. When the survey was announced in June, I predicted that most respondents would probably overlook the word "clinical" and focus on the usual famous academic institutions.

I also pointed out that anyone not intimately familiar with a program would be unable to judge whether it is good or not and suggested that reputation would be the main driver of results.

In fact, that is exactly what happened. Of the top 40 programs listed, all are based at university hospitals, as are 66 of the top 70. Back in June, I speculated about the top five programs and got the first two correct but in the wrong order.

A 2012 survey of surgical residents with over 4,200 responders (an 80% response rate) found that community hospital trainees were significantly more satisfied with their operative experience and less likely to worry about practicing independently after graduation. Wouldn't you then expect a few community hospital programs to be among the top 40 for clinical surgery training?

Proof that the survey's findings are not reliable is that every one of the 253 surgical residency programs in the country was mentioned by one or more of those who responded. This included one program that has been terminated by the Residency Review Committee for Surgery. At least it appears near the bottom of the list.

The number of voters who cited the lower-ranking programs must have been very small, meaning the difference between the 200th- and the 240th-ranked program is probably not statistically significant.

Some programs that were rated are so new that very few or no residents have graduated yet. How could anyone know if they are turning out competent clinical surgeons?

Board passage rates, which are available online, were omitted for some programs and were not clearly identified as the percentage of residents who passed both parts of the boards on the first attempt.

The percentile rankings of alumni peer-reviewed articles, grants, and clinical trials are displayed prominently. What do those data have to do with the research question—which residency programs "offer the best clinical training"?

So what's the bottom line?

You can put the Doximity Residency Navigator in with the other misleading ratings of hospitals and doctors. Applicants considering surgical residencies should not rely on it for guidance.

It has warmed the hearts of faculty and residents at highly rated programs, but I wonder how the OR lounge discussions are going at places where programs ranked lower than expected.