Do white people need to stop using the term 'woke'?
The term "woke" has been part of African-American vernacular for decades, referring specifically to a political consciousness of Black oppression and liberation. But many say its pro-Black political meaning has been diluted by mainstream misinterpretation and by mocking or clueless usage among white people. Others point out that language evolves, and that woke is "not the first AAVE word to be taken up by the greater public...and it certainly won't be the last." What do you think?