Unfortunately, some aspects of society can become normalized even though they can be harmful — and you may not even realize it.
So, Americans, we want to hear from you: What are some “norms” or standards in the US that you think are toxic?
Perhaps you think tipping culture has gotten out of hand, and employers should pay their employees what they deserve.
Maybe you feel the US has too much of a “hustle culture,” where people feel pressure to monetize their hobbies or overwork themselves to the point that they barely have any work-life balance.
Or perhaps you feel that beauty standards have become increasingly unrealistic due to anti-aging skincare trends, filters, and social media.
Or maybe you feel that US society still pressures younger generations to earn a certain degree and follow a specific path, or else be labeled “unsuccessful.”
Whatever it is, we want to hear your thoughts.