Owning the Simulacrum: Big Tech, Friction, and the Disappearing Origin
How to Lose a Civilization in 10 APIs

Once upon a time, memory lived in people. Now it lives in endpoints. These days, what we call 'efficiency' often just means forgetting faster. Somewhere between SDK updates and go-to-market decks, we stopped telling stories and started shipping features.
They’re not buying your data anymore. They’re buying your mirror.
And you’re selling it — because that’s the only version of you that now earns.
Welcome to the simulacrum. Where Builder.ai raises half a billion dollars by promising to “code like a human,” and Microsoft and SoftBank smile in the background.
It’s a neat trick. Package South Asian labor as 'AI-enabled creativity,' raise global capital, and when things go sideways — say, when an algorithm mangles cultural nuance or plagiarizes Indigenous rhythm — shrug and point to the training data. Blame the Indians — not directly, of course. Symbolically. Coded into attribution-less datasets. Decorated in heritage filters. Just enough 'culture' to brand it, not enough to be accountable for it. They’re not selling code. They’re selling the idea of human genius — clean, brandable, frictionless.
But friction is where the soul used to live.
We used to have rituals. Repetitions. Awkward pauses. The stuff that made things sacred. Now, we optimize for smooth: swipe, skip, subscribe. Grandma told parables; your app sends push notifications.
Big Tech doesn’t steal culture. That’s too crude. It simulates it. It samples the vibe, repackages it with some Helvetica, and sells it back to you with a free trial. Stripped of history. Drenched in UX.
Baudrillard would call this sign-value: when things aren’t bought for their use, but for what they say about you. Your yoga mat isn’t for posture — it’s for Instagram. Your AI-generated temple chant? Not for prayer. For background vibes while you deep work.
And so it goes: identity becomes merchandise. Culture becomes content. Meaning becomes metadata — if you're lucky.
You are not the content.
You are the context.
Pause. Quick gut check: Where did you first learn about India? Textbook? Festival? Instagram reel?
The answer matters. Because whoever owns your memory owns your imagination.
This is what the paper Owning the Simulacrum is about. It’s not just about AI stealing stuff. It’s about the collapse of epistemic sovereignty — when knowledge no longer remembers where it came from. When machines remix culture without citing it. When your grandma’s lullaby becomes “calm_hindi_folk.mp3.”
We are entering an era where creation doesn’t require memory. Just training data. Where authenticity doesn’t matter — only aesthetic.
The Indus Valley had no search bar. But it had seals, street grids, water channels aligned with the stars. Memory lived in the environment. In repetition. In ritual.
Now we outsource it to APIs.
The paper doesn’t just rant. It prototypes. It imagines a world where metadata is cultural infrastructure — not a technical afterthought. Where remembering is a feature, not a bug.
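What might that look like in code? Here is a minimal sketch in Python, and it is entirely hypothetical: the ProvenanceRecord class, its field names, and its derive method are illustrative assumptions of mine, not anything specified in the paper. The idea is simply that attribution travels with the artifact instead of being stripped at ingestion.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProvenanceRecord:
    """Hypothetical cultural-provenance metadata that stays attached to an
    artifact through every remix, sample, or training run. Illustrative only."""
    title: str                   # the work's own name, not "calm_hindi_folk.mp3"
    community_of_origin: str     # who the work belongs to
    language_or_dialect: str
    custodians: List[str]        # people or institutions who can speak for it
    consent: str                 # terms under which reuse was granted, if at all
    lineage: List[str] = field(default_factory=list)  # every transformation, in order

    def derive(self, transformation: str) -> "ProvenanceRecord":
        """A derived work inherits the full history instead of erasing it."""
        return ProvenanceRecord(
            title=self.title,
            community_of_origin=self.community_of_origin,
            language_or_dialect=self.language_or_dialect,
            custodians=list(self.custodians),
            consent=self.consent,
            lineage=self.lineage + [transformation],
        )


# A lullaby keeps its name and its people, even after an AI remix.
lullaby = ProvenanceRecord(
    title="Nani's lullaby",
    community_of_origin="a family in Lucknow",
    language_or_dialect="Awadhi",
    custodians=["the grandmother who sang it"],
    consent="shared for listening, not for training",
)
remix = lullaby.derive("resampled as ambient background audio")
print(remix.lineage)  # ['resampled as ambient background audio']
```

The one design choice that matters in this sketch is that lineage is append-only: a derived work can gain history but never shed its origin, which is exactly what the filename "calm_hindi_folk.mp3" failed to preserve.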
Imagine walking into a museum. You scan a QR code. It shows a folk tale in your native tongue — but no author, no location. Just:
“Generated by LLM.
Source: Unknown. Enjoy!”
At the end of the room, you see a screen. On it: your voice. Your dialect. Telling a story you never told. The machine smiles. “Thank you for your contribution.”
This isn’t paranoia. It’s product strategy.
Now picture this: a glitch. A mismatch. An AI-generated post that misuses a sacred chant. A visual that merges tribal art with a completely unrelated brand logo. It's offensive. It circulates. There's backlash.
A press release appears: “We apologize. The output was based on publicly available datasets. We take your concerns seriously.”
If you squint, you’ll see what just happened.
The company didn’t apologize for what it simulated. It apologized that you noticed.
And behind that apology is the shield: That wasn’t us. That was the data.
This is the new deniability. Wrapped in inclusive fonts. Skinned with diversity dashboards. Big Tech no longer needs to erase you — it just needs you pre-labeled and deniable.
To critique the machine, we have to own our role in its fluency. We trained it. Fed it. Praised it for its smoothness.
But now it’s time to ask: What did we forget to remember?
This blog — and the paper it riffs on — is a small resistance. A friction. A refusal to let simulation become the default mode of knowing.
We don’t want to cancel the simulacrum. We want to own it.
Ethically. Collectively. Consciously.
Because if simulation is our fate — memory better be our weapon.
And friction? Let that speak first.
📎 Further Reading
For a deeper dive into the ideas explored here — epistemic sovereignty and the politics of simulation — check out the full paper: Owning the Simulacrum
To follow the conversation on AI, culture, and the future of memory, see the companion essay: