"Most nutrition professionals agree that moving from an animal product-based diet to a plant-based diet is the single most important change Americans (and others who eat similarly) can make to improve their well-being. I personally have eaten vegan (totally vegetarian) for over 15 years and have raised my two children that way since birth."