The Netflix documentary "What the Health" has people considering giving up meat for good. The film may overreach with its claim that eating meat is effectively killing us, but many viewers think it raises valid points. Others are quick to note that vegan diets haven't been proven healthier than any other diet, and meat-eaters argue that eating meat is perfectly fine; it's really about eating sensibly and in moderation. What do you think? 🍖
Is going vegan actually healthier than eating meat?