The viral social video game “smash or pass”, in which participants pass instant judgment on people based on appearance, carries emotional harms that cannot be ignored. Psychological research has highlighted the risks of this mechanism: a 2024 Pew Research Center survey found that 30% of American teenagers aged 13 to 17 had felt extremely disturbed, or even depressed, because of negative comments about their online photos or appearance. Social media algorithms often amplify such content: every day, more than 5 million videos under related tags circulate on platforms like TikTok, exponentially increasing the exposure of individual negative comments and creating the conditions for large-scale public humiliation.
Emotional harm can easily escalate into systemic cyber violence. The UK’s National Anti-Bullying Association reported that cyberbullying cases rooted in appearance mockery rose 17% in 2023 over the previous year, and that 42% of those cases could be traced to malicious comments on, or secondary dissemination of, “smash or pass” content. In one representative case, a Chinese-American high school student in California was maliciously edited into a “pass” compilation video and received more than 200 abusive private messages per day; he was eventually diagnosed with acute anxiety disorder. Victims of such collective abuse are more than three times as likely as ordinary users to develop depression (Cyberpsychology, Behavior, and Social Networking, 2023).
When the content touches sensitive attributes such as gender, race, or physical disability, the harm becomes structural. An analysis by the non-profit GLAAD found that in “smash or pass” videos targeting LGBTQ+ people, targeted insults made up as much as 35% of all comments, far above the 8% average for ordinary videos. In 2022, a disabled Brazilian athlete who took part in a fitness challenge was maliciously edited into such videos; within 72 hours his social media account lost 15% of its followers and was flooded with derogatory comments. Social psychology research confirms that exposure to such group-biased evaluation significantly lowers an individual’s self-esteem, with effects lasting more than six months.
Platform content-governance mechanisms have not yet effectively mitigated the harm. Current AI review systems correctly flag only about 65% of violating “smash or pass” videos (Stanford Internet Observatory, 2023), meaning that more than a third of malicious content goes unfiltered. Even on platforms such as Meta that have implemented “sensitive content flow control” strategies, user complaints still take an average of 48 hours to resolve, allowing the damage to keep spreading. The European Commission’s 2024 Digital Services Act report notes that, absent an immediate intervention mechanism, a single highly viral insulting video can trigger at least 500 direct verbal attacks on its victim. Platforms need to improve algorithmic transparency, allocate more human review resources, and shorten risk-assessment response times to within 90 minutes.
As a heavily publicized judgment game, “smash or pass” hides deep risks beneath the traffic frenzy. From individual psychological trauma to intensified group discrimination, the data reveals its corrosive effect on the online environment and on user well-being. Advancing content-safety technology and users’ digital literacy education is the key path to balancing freedom of expression with emotional protection.
