As an expat, I can't help pondering some basic questions about Americans and the health care system they claim is the best in the world:
Why hasn't any other advanced country decided to adopt an American-style health care system?
Why don't Americans accept that everyone has a right to health care, instead of seeing it as a luxury akin to a new car?
Why is American health care so much more expensive than that of any other country?
Why do so many babies die in America before their first birthday, compared to other wealthy countries?
Why do Americans see nothing "socialized" about free education for all, yet bristle at the idea of health care paid for by the government via taxes?
Why do they change their minds about government-paid health care when they reach age 65?
Why do they think the free market will function with health care the same way it does with hamburgers?
Why do Americans think their doctors ought to be paid so much more than similarly trained doctors in other countries?
Why is socialized medicine good enough for our veterans and active duty military but not for the rest of the country?
Why do Americans tend to brag about how much health care they get, whereas Britons do everything they can to avoid talking about their own health?
Why do some Americans want to let their boss decide what services their doctor can prescribe?
Why do Americans erroneously believe that most uninsured Americans are unemployed as well?
Why are seniors, who have Medicare, much less likely to support health care reform that addresses the needs of the uninsured?
Why do Americans assume that going to the emergency room is a good option for the uninsured?
Why are so many people unable to go to a doctor in the richest country in the world?