Just watched What The Health on Netflix, which is basically a documentary arguing that animal-based diets are the root cause of diabetes, cardiovascular disease, and cancer.
Who here swears by a plant-based whole-food diet?
Anyone who thought they'd never switch from their animal-heavy diet but did, and has never felt better?
I'm not considering going vegan, but I do plan on shifting toward a more plant-based diet. Tips?