monica e. hall
Member
Women doctors, oh dear. Well, it depends where you live. Most doctors in the former USSR are women, and so it is a lower-paid profession, despite the training required. Generally speaking, if a profession is dominated by men, it is highly paid. If it's dominated by women, it is not. People might think I'm just a feminist grouch here, but all I'd say is just check it out.
Nurses in the UK have traditionally been female. But over the last three decades men have moved into the profession - and I have nothing against that in principle. Except that they have taken charge. And I still wouldn't have anything against that, except for the fact that nurses don't actually nurse these days - too busy doing courses, inputting repetitive data into computers, filling in forms which tick political boxes. Doctors don't get you better, they just diagnose (which nurses can do anyway), prescribe and operate, which is a good thing. But nurses get you better, and out of the hospital. Or should be able to.