Into the Realm – Cashier Brooks
It began, as many crises do, with a seemingly harmless decision. I was procrastinating on a Sunday night, the kind of procrastination that convinces you watching something mildly educational is technically productive. My homework sat untouched on my desk, an indictment of my focus, when YouTube offered me an irresistible distraction: “Top 10 Creepiest Unsolved Mysteries.”
The thumbnail—a shadowy figure in the woods, possibly Bigfoot, possibly someone with sciatica—was absurd enough to click. The video delivered what I expected: ominous music, grainy footage, and theories barely held together by shaky editing and blind faith. It was the intellectual equivalent of eating a gas station burrito: unsatisfying, vaguely regrettable, but exactly what I needed.
But YouTube doesn’t see clicks as innocent curiosity. To YouTube, a click is a confession, a declaration of allegiance to a particular reality. By Monday morning, my recommendations were no longer the soothing mix of cooking tutorials and mildly pretentious TED Talks I had come to expect. Instead, my homepage had transformed into a dystopian circus: “The Moon Landing Was Staged,” “9/11 Was an Inside Job,” and, most disturbing of all, “Birds Aren’t Real: Wake Up, Sheeple!”
The bird video caught my eye. Its premise was as absurd as it was insistent: pigeons, the narrator claimed, are government surveillance drones. “Have you ever seen a baby pigeon?” he demanded, his tone implying that answering “no” would unravel my entire worldview. I hadn’t, but I’d also never seen baby squirrels, and I wasn’t accusing them of espionage. Still, the video sat there, daring me to click.
I resisted, determined to prove to the algorithm—and myself—that I wasn’t falling for its narrative. I clicked on sourdough recipes, a TED Talk about creativity, even a bike repair tutorial. But YouTube wasn’t fooled. A sourdough video was followed by “10 Foods the Government Is Poisoning Right Now.” A guide to fixing a flat tire segued into “How to Escape Surveillance Using Everyday Tools.” The algorithm wasn’t just misunderstanding me—it was remaking me, insisting I become its version of me.
And then came the onion man.
“Doctors HATE This One Weird Trick to Cure Everything!” screamed the title, accompanied by a thumbnail of a man cradling an onion like it was the Holy Grail. Against my better judgment, I clicked. He explained, with the intensity of someone banned from multiple medical forums, that rubbing a raw onion on your feet would detoxify your body. “The toxins,” he whispered solemnly, “are drawn into the onion overnight.”
I stared at the screen, horrified and, I admit, a little impressed. Here was a man willing to risk both his dignity and his circulation in pursuit of his beliefs. The algorithm, naturally, took my click as gospel. My homepage became a landfill of chemtrails, anti-vaxxer propaganda, and exposés about the Illuminati’s supposed monopoly on Starbucks.
It wasn’t long before I felt trapped, a character in a Kafka story where my crime was curiosity and my punishment was endless redefinition. Algorithms don’t punish you out of malice—they punish you out of efficiency. They don’t see nuance, only patterns to exploit. To the algorithm, I wasn’t a person—I was a series of clicks, malleable and infinitely marketable.
Desperate to reclaim my identity, I devised a plan: flood the algorithm with innocence. For three days, I binge-watched PBS Kids. Arthur, Dinosaur Train, and Martha Speaks became my allies, their cheerful jingles a desperate attempt to overwrite the chaos.
At first, it worked. My recommendations softened: “How to Draw a Cat,” “10 Best Bedtime Stories for Kids,” “Arthur’s Guide to Friendship.” But algorithms, like viruses, evolve. Soon, YouTube began suggesting conspiracies about PBS Kids. “The REAL Reason Arthur’s Parents Are Never Around” theorized they were operatives in an underground aardvark resistance. “Dinosaur Train: Propaganda for Big Oil?” made me question whether fossil fuels were behind it all.
This wasn’t just misunderstanding; it was erasure. The algorithm wasn’t reflecting me—it was rewriting me, one absurd click at a time. The scariest part wasn’t that YouTube thought I was a monster. It was how easily I started to see the monster, too.
I deleted my account that night, but the paranoia lingers. Algorithms don’t just collect data; they sculpt it into identities, flattening complexity into patterns they can monetize. They’re mirrors that distort and direct, showing us not who we are but who they need us to become.
We like to think of ourselves as solid, autonomous beings. But all it takes is a few careless clicks to unravel that illusion. Were we ever that solid to begin with—or are we just shadows flickering in the algorithm’s endless loop?