When Did Medicine Become Political?

September 21, 2022
When did medicine become political? Good question. When doctors became employees rather than independent practitioners. When medical schools went woke. When state medical boards threatened to pull licenses if doctors dissented from official dogma. That’s when.
