“This video may contain graphic or violent content,” is one of the first things that can greet someone when they click on Instagram Reels. It’s easy to come across uncensored car crashes, accidents, and gun violence. There are whole accounts dedicated to this type of content, and the comments are flooded with joking reaction photos to scenarios people could’ve easily died from. I’ve been seeing content like this for so long that I don’t react nearly as strongly as I might have a few years ago, and most people I know don’t either.

Being desensitized to graphic media seems to have numerous effects on us as a younger generation. We don’t care nearly as much about violence as we might have only a few decades ago. The language we’re using and the joking approach we often take to sharing graphic media is undoubtedly influencing this mindset. This made me wonder if we’re starting to lack empathy towards victims because of our unserious nature.

“There’s so much out there… you can say or post anything,” said Derek Cangello, a social studies teacher at Oyster River High School (ORHS). He himself chooses to show what may be considered graphic content in class, like videos from 9/11, but he uses it in an educational sense that works with the context of the class. On social media, it’s often unmoderated. According to The Wall Street Journal’s “‘The Corpse Bride Diet’: How TikTok Inundates Teens With Eating-Disorder Videos,” only 1% of videos uploaded and reported by users actually get taken down by TikTok.

Gavin Kenoyer (‘27) brought up the December 22nd, 2022 incident on TikTok when moderation went entirely down. Users took advantage of the system and posted extremely violent or sexually explicit content that took TikTok two to three days to remove. It’s a constant battle of 40,000 moderators versus 1.59 billion users as of 2025, with other platforms having similar numbers.

With the increased use of AI, it seems like it’s become easier for violations to occur and for apps not to pick up on them. Lucy Milutinovich (‘27) has recently noticed an influx of violent videos being covered up or labelled as AI. “Because it’s AI, TikTok is fine with keeping it up because it’s not ‘real’ people, even though the video is still really descriptive.” Milutinovich says it’s often obvious what’s going on in the videos, even with the AI filters.  

However, Kenoyer argues that some content is important to keep up for awareness with proper warnings, like the reality of the Palestine and Israel conflict. “We need moderation but not to the point where it’s censorship,” he says. This disagreement has spread to conversations around topics like political violence. Some claim that easily accessible videos of people being killed are insensitive, like the video of the political commentator Charlie Kirk being shot. Others are on the same side as people like Kenoyer, arguing that it’s necessary for some to see and be aware of. It’s clear there are conflicting opinions on this topic.

Despite this, censorship is still extremely prevalent, especially in the language we use. Words like “unalived” in place of murdered and “graped” in place of raped are commonly used amongst users discussing sensitive topics. There are two possible explanations for this that I’ve been told about. The first is that this kind of wording gets around TikTok’s censors, but I’ve seen numerous videos still up with hundreds of thousands of likes that use language that hasn’t been censored. The second may be the taboo in our culture around these topics. It’s true that there are two things coexisting here. It’s difficult to discuss murder and sexual assault, even if we’re being constantly exposed to this sort of graphic material. It’s become apparent to me as I’ve conducted these interviews and this research that we’re desensitized to witnessing it online, but not when having in-person discussions about it.

“By using nicknames… it lessens the impact of the story,” Milutinovich argues. This sort of behavior is most often observed in the true crime community, where creators share stories of real cases while eating or doing makeup, using censored language to describe terrible crimes. Milutinovich continues: “There are a lot of true crime podcasts where they’ll be talking about an extremely sensitive topic and will be laughing about it, obviously not taking the case seriously at all.”

Dr. David Tizzard told the publication Tasting Table, in the article “Here’s Why Mukbang Videos Attract So Much Controversy,” that “exposure to graphic or disturbing content, especially when paired with pleasurable stimuli like food consumption, may dull emotional responses over time.” This could also be applicable to social media, as most people now know how social media content can trigger dopamine and addiction, no matter what they’re seeing.

It has come down to us becoming used to and comfortable with graphic content. In my forensics class this year, we discussed multiple cases involving sensitive content in an educational manner, much as Cangello does for his students. It was noticeable, though, that a lot of people weren’t fully processing how severe some of these cases really were. To be honest, I was included in this sort of behavior myself. It feels easy to disconnect, as I wasn’t involved and didn’t know anyone who was.

According to the Sky News article “Teenagers exposed to ‘horrific’ content online – and this survey reveals the scale of the problem,” over 50% of teenagers report that this sort of content comes up without them even searching for it. This means the algorithm is the one pushing graphic posts onto For You Pages. There’s also been an increase in people posting these sorts of videos, meaning it’s flooding people’s feeds more and more, despite social media platforms that swear they’re working to remove it. “… [the content is] like going from drips from a faucet to a flood… it’s unhealthy,” Kenoyer says in relation to this influx.

It’s also evident that people feel comfortable joking about these topics. Dramatic edits and AI-generated videos of Jeffrey Epstein with millions of likes were created even before the official unsealing of the Epstein Files at the end of January, and an influx occurred after. Almost the entire American public knows about the heinous and despicable crimes committed by the numerous names mentioned in the files, yet many people have no problem making a mockery of it and the victims involved. “… it’s often just dismissed or joked about,” Kenoyer adds. This behavior makes it even easier for us not to take crimes and tragic events seriously.

Cangello also highlighted how we as a culture perceive violence compared to nudity, even when it isn’t inherently sexual. “We think we’re so progressive, but have no problem showing violence in media. But when it comes to nudity it’s like ‘oh no’… but instead you can see someone’s head get blown off,” he says. Our normalization of violence even extends to the way we talk about real-life sexual experiences. This circles back to the language we use, like the common terms “cracked” or “banged” for being intimate with someone.

Overall, it’s now obvious that we have a long way to go with our exposure to graphic media online, from the language we use to the humor we make out of it. If we work to better sympathize and learn to take the content we consume seriously, we can become more understanding towards the serious matters these posts often contain.

-Emily Taylor
