Riley deletes a suicide note every seventeen minutes.
Not the same note. Different people, different pain, all funneled through the content moderation queue at SocialSphere, where I spend my days deciding which human misery gets to stay online and which disappears into the digital void.
The company calls us "Community Safety Specialists." The rest of the world calls us the people who decide what you're allowed to think.
Three years of this work. Four thousand posts per day, eight seconds per decision. My productivity metrics are excellent — 99.7% accuracy according to the AI training models.
What they don't measure is how I've started checking my own posts obsessively before hitting send, self-censoring thoughts that might trip the same algorithms I enforce all day.
Remove or approve. Remove or approve. The rhythm becomes hypnotic after a while.
Today started differently. During the morning rush — teenagers posting before school, parents venting while their coffee cooled — the deleted posts weren't disappearing. They were accumulating like digital sediment in my peripheral vision.
A teenage girl's rage about her parents, flagged for "family conflict content," lingered at the edge of my screen even after processing fifty other violations. Her words hung there like smoke: "They don't even see me anymore. I could disappear and they'd just be relieved about the phone bill."
"Chris," I called to my cubicle neighbour, keeping my voice steady.
"Your deletion queue running normally?"
He glanced up from his monitor, eyes glazed with the thousand-yard stare we all develop after seeing humanity's unfiltered thoughts. Red-rimmed from staring at posts about self-harm, revenge fantasies, and casual cruelty delivered in Comic Sans font.
"Running smooth. Hit my targets by ten-thirty. Why?"
The teenage post flickered, then vanished. But somewhere in the server room's white noise, I swear I heard her voice still speaking.
Lunch brought a new problem: deleted posts accumulating faster than I could process new ones. A businessman's confession about embezzling money to pay for his daughter's cancer treatment. A mother's exhausted admission that she regretted having children, typed at 3 am while her infant screamed. A twelve-year-old's desperate plea for someone to notice the bruises, carefully worded to avoid triggering the self-harm detection algorithms.
All removed according to community guidelines. All supposedly scrubbed from the servers after seventy-two hours.
But nothing digital ever really dies.
When I pulled up the deletion logs during my break, muscle memory guiding my fingers through administrative backdoors, I found something that shouldn't exist.
Buried in routine maintenance folders: "Emotional Archaeology Project - Authorized Personnel Only."
The folder contained everything. Every deleted post from the past five years, cross-referenced with user behaviour patterns, IP addresses, device fingerprints. Not just the posts people published and regretted, but the ones they typed and deleted immediately — captured through keystroke logging and browser cache analysis.
We weren't just moderating content. We were archaeological diggers in the human psyche.
Jennifer Morrison, 34, marketing manager from Manchester. The system had tracked 2,847 deleted posts over three years, each one automatically analysed for emotional content, psychological markers, and commercial potential.
Her public Instagram showed a successful woman with a perfect life — champagne brunches, yoga retreats, inspirational quotes over sunset photos.
Her deleted posts revealed crushing loneliness, prescription drug abuse, and a meticulously planned suicide attempt involving her car and the Thames. The real Jennifer lived in her digital shadows, in the thoughts she'd typed but never dared to share.
Entry after entry painted the same picture: humans living double lives, their curated public selves hollow performances while their authentic thoughts festered in corporate databases.
"Fascinating reading?"
Dr. Patricia Walker appeared at my shoulder without warning, her reflection ghostlike in my monitor.
SocialSphere's Chief Data Scientist had been with the company since the beginning, back when they pretended social media was about connecting people instead of harvesting their psychological profiles for targeted advertising revenue.
My stomach clenched.
"This project. What exactly are you doing with deleted content?"
She pulled up a chair, settling beside me with the casual authority of someone accustomed to justifying the unjustifiable.
"Revolutionary research, really. Traditional psychology relies on what people are willing to admit about themselves. But deleted content reveals authentic thought patterns filtered through social anxiety. Pure psychological truth."
"And you're monetising this data."
"We're utilising it for predictive wellness interventions." Her tone carried the practiced smoothness of corporate doublespeak.
"Jennifer Morrison, for instance. Our algorithms identified her risk patterns months before any traditional screening would have caught them. We've been serving her targeted content for depression treatment, suicide prevention resources, therapeutic services."
"Without her knowledge. Without her consent."
"She gave consent when she agreed to our terms of service. Section forty-seven subsection twelve explicitly covers data analysis for user safety purposes." Dr. Walker's smile never wavered.
"We're helping people before they even know they need help."
My hands trembled as I scrolled through more files. Thousands of people, their darkest thoughts preserved, categorised, and fed into advertising algorithms.
"What about the people who don't want help?"
"Everyone wants help, Riley. They just lack the self-awareness to seek it appropriately."
That evening, alone in the humming server cathedral, I accessed my own deletion file. Three years of removed thoughts lay catalogued with clinical precision: hundreds of unsent resignation emails, deleted complaints about management, draft after draft of break-up texts I'd never sent to my ex-boyfriend.
The algorithm had mapped my discontent with surgical accuracy, tracking my growing disillusionment with my job, my relationship, my entire adult life. But at the file's end, something impossible: posts I'd never written, thoughts I'd never had.
The system had extrapolated from my deletion patterns to predict what I might delete in the future. It knew me better than I knew myself.
My phone buzzed against the desk. A notification from SocialSphere: "You seem stressed. Would you like to talk to someone? We've partnered with BetterHelp to provide confidential counselling services."
The camera mounted above my monitor showed a steady red light.
"No," I said aloud to the watching lens. "I would not like to talk to someone."
But my fingers were already typing a post: "Working late again. Sometimes I wonder if this job is destroying my faith in humanity one deleted thought at a time."
Cursor blinking. Delete button waiting.
If I pressed it, the confession would join thousands of others in the digital shadow realm, analysed and categorised and sold to the highest bidder. Another data point in the map of human desperation.
If I posted it, it would be the first authentic thing I'd shared in years.
Instead, I chose a third option. I powered down my workstation, watching the screen fade to black like a digital death.
My phone continued buzzing with notifications I would never answer — targeted ads for anxiety medication, job placement services, relationship counselling, all generated from analysis of thoughts I'd never shared.
Outside, the city was full of people typing messages they would never send, living lives they would never share, deleting thoughts they would never speak aloud. Their authentic selves locked away in corporate databases, their public selves hollow shells optimised for maximum engagement metrics.
We'd built a world where our real thoughts were too dangerous for public consumption, but too valuable for private disposal.
Where the most honest version of ourselves existed only in the spaces between what we typed and what we published.
Walking home in the rain, I kept my phone turned off in my pocket. For the first time in years, my thoughts were my own — unmonitored, unanalysed, unmonetised.
Tomorrow I'd be back at my desk, deleting other people's truths for healthcare benefits and a salary that barely covered my student loans.
But tonight, in the pause between digital heartbeats, I remembered what it felt like to think without an audience, to exist in the spaces between, where souls might still hide from algorithms.
The most honest version of ourselves might be the one we're too afraid to show the world.
But maybe that's exactly why it deserves to stay human…