This is one of the things about America that has always seemed strange to me: the whole setup of employers covering healthcare like that, and how much everything hinges on insurance. I'm not really up to date with American politics, so could you answer this: would Obamacare stop this sort of thing from being such an issue?