How do you react?
I've recently been seeing people I know post and talk about how all doctors push medications/vaccines/etc. on patients instead of educating them and promoting lifestyle and diet changes, because "the doctors want you to be sick, that's how they make money." It's been getting worse and worse, and I keep seeing outrageous claims stated as fact. How do you handle seeing or hearing statements like this? I can mostly hold my tongue, but at times these bold statements, made without a clue of what it's like to work in healthcare, baffle me. They act like patient care is so black and white.