Just wondered what your experiences are with this. In the beginning, I avoided telling people b/c I thought it was bragging. But lately, when I can't avoid telling someone, it seems like they view it as a negative. We might have been having a great conversation, but as soon as they find out, they suddenly don't know what to say anymore!
I've also been offended by people's comments about doctors, before they know DH is one. I've had people say they'd never want to be married to a doctor b/c of the lifestyle, that a doctor they saw was too "green," that people should ditch evil doctors altogether and just take herbs, yada yada. It may be worse for OBs than for other specialties because of the rise in home births and other factors. But I think in general, people feel it's more acceptable to criticize doctors... maybe b/c "all that money" is supposed to make up for everything? (I'd like to see some of that money and give it a try! Just kidding.)
How do you deal with this, or avoid having it happen? I can imagine forever feeling isolated because of this. Does it get easier after you've actually lived in a place for a while and people get to know you??