Thoughts on why natural health is so vilified
It almost feels as though most doctors are struggling to keep hold of methods they know to be the only way, and are increasingly frustrated with society's push to find answers that don't involve such toxic and deadly treatments. Like people who still think letting a baby cry it out and the methods of Dr. Spock are correct, or that formula is just as good as breast milk.

It just seems like science and research in other countries are evolving more quickly than the old-school American medical establishment, and we as patients have access to this information but no access to utilize the new methods. Thoughts?

I imagine that 25 or 50 years from now, people are going to look back and be horrified by how breast cancer was treated. I'm grateful to have this now instead of 30 years ago; my heart goes out to those who had to endure even harsher treatments back then.