There was a time not too long ago when discovery was part of the work. You didn’t just arrive at an opinion. You traced it. You followed the breadcrumbs. You listened to elders, studied album liners, read the footnotes, watched interviews all the way through instead of cutting to a 15-second clip that told you what to think before you’d even formed a question.

Somewhere along the way, we stopped doing that.

Now, we scroll. We repost. We trust the caption. We trust the voiceover. We trust the algorithm. We trust the person with the biggest following, the cleanest grid, the best lighting, the most confident tone. We trust summaries more than sources, aesthetics more than context, and vibes more than verification. Increasingly, we’re trusting machines to tell our stories in our voice, with our faces, wearing our skin. That should scare us more than it does.

This isn’t about artificial intelligence alone; it’s about authorship and cultural custody. It’s a call to discuss how easily we’ve surrendered our memory, our image, and our meaning, and how disturbingly comfortable we’ve become consuming versions of ourselves that are no longer ours.

When We Stopped Asking Who Was Speaking

One of the most alarming shifts of the last decade is intellectual. We no longer ask who is telling the story. We ask whether the story is entertaining. Digestible. Shareable. On-brand. We rarely interrogate the narrator. When Black media is laying off Black writers in huge numbers, who is actually writing the stories?

That’s how we ended up here: circulating AI-generated images of Black people that feel “close enough,” sharing synthetic voices that mimic Black cadence, reposting summaries of Black history written by people—or systems—with no relationship to the culture beyond training data. We call it innovation. We call it efficiency. We call it the future. However, there is something deeply familiar about this dynamic.

Black culture has always been mined, mimicked, stripped of context, and resold. What’s new is the speed and the mass participation. We’re no longer just reacting to digital blackface. We’re helping it spread. We’re boosting it. We’re captioning it with fire emojis and saying “this ate” without asking who cooked the meal or whose kitchen it came from. We used to be more discerning than this, or at least, we pretended to be.


Digital Blackface Isn’t New

Digital blackface didn’t start with AI. We’ve had it with reaction GIFs, exaggerated avatars, and anonymous accounts using watered-down Ebonics. We’ve had it with voices that sounded like us but weren’t accountable to us. AI has scaled it. Now, the mimicry is cleaner, faster, and less traceable. A system can generate a “Black woman” who looks racially ambiguous enough to be palatable, spiritually vague enough to be marketable, and confident enough to feel authoritative—all without ever having to live inside a Black body or inherit its consequences. Now, we’re sharing it.

Why are we so comfortable circulating images and narratives about ourselves without caring who made them? Why are we less interested in truth than in tone? Why does something feeling “right” matter more than it being rooted? We wouldn’t accept a history book with no author. We wouldn’t accept a sermon with no theology, but we’re accepting cultural storytelling with no provenance. In a time where we’re being called to remember, we’re willingly accepting amnesia.

The Death of Media Literacy—and the Rise of Groupthink

We like to pretend this problem started with AI, but it didn’t. AI has just exposed how far we’d already drifted.

Media literacy has been in decline for years. We stopped reading full articles. We stopped checking primary sources. We stopped distinguishing between commentary and reporting. We replaced curiosity with consensus.

Now, instead of researching, we wait for “the smart ones” to tell us what matters.

What to think.
What to wear.
How to smell.
How to heal.
How to be spiritual.
How to be feminine.
How to fall in love.
How to be free.

Influencer culture has replaced much of our natural inquiry. Discovery has been outsourced. Interpretation has been centralized. Nuance and gray areas don’t survive the algorithm. We don’t ask questions anymore. We ask for breakdowns with the refrain of “enlighten me, queen!” We ask for threads. We ask for “someone to explain this to me like I’m five.” Then, we (via ChatGPT) regurgitate the explanation as if it were our own conclusion. Then we have the same generative AI package it into an e-book to sell.

It’s efficient and feels communal, but it’s also dangerous. When everyone is thinking together without thinking deeply, what you get is an echo chamber. We also begin to lose some of the differences that make us unique.

(Embedded TikTok from @mustbemaia, asking whether regional accents—and even cities—are losing their distinct characteristics.)

Are We Afraid of Sitting With Ourselves?

There’s an uncomfortable question underneath all of this, and it has nothing to do with technology.

Are we afraid of solitude? Are we afraid of sitting with ourselves long enough to discover what we actually think or wrestle with complexity? Is it too uncomfortable to not have an answer right away? Self-discovery isn’t always aesthetic. It’s slow, quiet, and doesn’t always translate well to carousel slides. It doesn’t always resolve into something inspirational. Maybe that’s why we avoid it.

It’s easier to let someone else name our journey than to chart it ourselves. It’s easier to buy a starter kit than to immerse ourselves in the histories and customs of things. It’s easier to adopt a ritual than to understand the cosmology it came from. It’s easier to repost “ancestral wisdom” than to read the documents our ancestors actually left behind.

Those documents exist.

They’re in oral histories.
In recorded interviews.
In journals and diaries.
In church programs and funeral pamphlets.
In album liner notes and handwritten lyrics.
In the archives of Black newspapers.
In the entire Ebony and Jet catalog.
In the art.
In the art.
In the art.

The songs people are suddenly asking about—“Was this spiritual?” “Was this coded?”—didn’t come out of nowhere. They came from a lineage of people who were documenting their lives in real time, leaving maps back to themselves.

We don’t lack access. We lack patience.

Preservation Is Creation’s Twin

Our duty isn’t just to make things. It’s to also hold things and sit with what already exists long enough to understand it before we remix it into something hollow. Preservation requires stillness and reverence. When we actually sit with old photographs, we see posture, styling, body language, context. We notice who’s centered and who’s cropped out. When we read old interviews in full, we hear uncertainty, contradiction, evolution. When we listen to music without rushing to interpret it for content, we hear longing, resistance, coded language that doesn’t announce itself. These are not just artifacts, but also instructions.

They are people who lived before us saying, Here’s how we survived. Here’s how we loved. Here’s what we hid. Here’s what we couldn’t say out loud. Instead of studying those maps, we’re asking AI to redraw them for us. We are asking people who haven’t proven their authority to speak on these things to give us a digest.

Why?

The Cost of Letting Machines Remember for Us

Here’s the thing no one wants to say plainly: when we allow machines to generate our images and summarize our histories without oversight, we’re handing over custody. Memory is power. Narrative is power. Whoever controls the archive controls the future. If our images are synthetic, our history summarized, our spirituality aestheticized, and our culture flattened into prompts, what happens when those systems decide which versions of us are most profitable, most palatable, or most acceptable?

We’ve already seen this movie. It never ends with us in control.

This isn’t an argument against technology, but an argument against abdication. Tools are only dangerous when the people using them stop asking questions. Right now, too many of us have stopped asking.

Remembering Is an Act of Resistance

There is something radical about slowing down and refusing to be spoon-fed meaning. Remembering—actively, intentionally—is a form of resistance. So is discernment. So is saying, “I don’t know yet, and I’m going to find out for myself.” Our elders and ancestors didn’t leave us riddles to be solved by algorithms. They left us breadcrumbs to be followed by people willing to walk. All we have to do is slow down, ask better questions, and remember that some things are meant to be kept. That is the work that helps us find our way back to what’s real.
