What if the next major threat to the community you serve isn’t a hostile politician or budget cuts, but an algorithm that no one in your organization understands?
It’s not a hypothetical. Right now, as you’re reading this, AI development is accelerating, and these systems are influencing every corner of our lives—from how information travels to whose stories rise to the surface.
And while it’s tempting to step back from all the hype, especially when the pace feels exhausting or the tools feel untrustworthy, this moment calls for something different. It calls for leadership from people who understand equity, culture, power, and community. People like you.
AI is going to shape the next decade of social change whether we participate or not. The real question is: Will justice-minded leaders help guide it, or simply deal with the fallout later? Because one thing is becoming increasingly clear: this technology will reflect the values of whoever shows up to shape it.
And historically? That’s rarely been us.
But it can be.
AI Isn’t Coming—It’s Already Here
Let’s start with the uncomfortable truth: AI’s inevitability isn’t a future prediction anymore. It’s present reality.
The algorithms deciding which social media posts about your campaign get amplified or buried? That’s AI. The systems determining which neighborhoods get investment dollars or which communities get flagged for over-policing? AI is increasingly part of those decisions. The platforms where you’re trying to tell authentic stories about the people you serve? They’re all running on AI-powered recommendation engines that decide who sees what, when, and why.
This isn’t about being a tech optimist or pessimist. It’s about recognizing that these systems are now embedded in the infrastructure of how we communicate, organize, and create change.
Ignoring AI because it feels overwhelming or outside your wheelhouse is like social justice leaders in the 1990s deciding not to learn about the internet because websites seemed too technical.
The question isn’t whether AI will impact your work. It’s whether you’ll help shape that impact or simply react to it.
The Double-Edged Sword We Need to Talk About
Here’s where it gets real: AI presents both genuine opportunities and serious risks for social justice work, often simultaneously.
On one hand, these tools can help under-resourced organizations punch above their weight.
Imagine being able to read through thousands of community comments and quickly spot what people really care about. Or sending personalized messages to supporters that still feel genuine and human. Or instantly translating your annual report so Spanish-speaking, Mandarin-speaking, and Arabic-speaking community members can all read it in their own language.
For small businesses and nonprofits running on tight budgets with tiny teams, AI could offer superpowers that once belonged only to huge organizations with millions of dollars.
But here’s the catch—and it’s a big one. These same tools can make inequality worse if we’re not paying attention.
Meredith Whittaker, the president of Signal (you might use it for private messaging), has been warning people about a serious problem: most AI is built by big tech companies whose entire business model is based on watching what we do online, collecting our data, and keeping us glued to our screens as long as possible.
Their goal, she says, isn’t to help communities or make our society more fair. It’s to make money by keeping us clicking and scrolling.
Whittaker’s warnings aren’t just theory—they’re about real problems happening right now. AI systems can bake discrimination right into their code, make unfair decisions affecting millions of people, and put more power into the hands of a few massive tech companies.
Here are some actual examples:
- Facial recognition technology that performs well on white faces but misidentifies Black and Brown faces at far higher rates
- Police departments using AI trained on historical arrest data to predict where crimes will happen, which sends more officers to the same neighborhoods that were already over-policed and feeds the bias back into the system
- Social media platforms where AI moderators repeatedly remove posts from activists and people of color discussing racism, while letting actual hate speech stay up
These problems aren’t accidents or glitches that will get fixed in the next update. They happen because the people building these AI systems didn’t include justice-minded people in the process.
When you design technology without diverse voices at the table, you end up with technology that protects some people and harms others.
Justice Leaders Bring What AI Needs Most
Here’s the part of the story we don’t celebrate enough: the people doing justice-driven work hold the wisdom AI desperately needs.
We know what fairness looks like. We understand how power works and why culture and community matter. We know that stories can shape how people see the world. We see who gets hurt when big institutions make poor decisions. And we care about more than money—we care about dignity, belonging, and basic humanity.
AI can cause the most harm when the people in charge don’t have this kind of perspective.
That’s why it’s so important for justice-minded leaders to get involved—to help question, design, and guide these systems so they don’t end up hurting the very communities we’re trying to protect.
AI doesn’t just need more engineers who rush to build things. It needs people who slow down and ask the important questions:
- Who could be harmed?
- Who gets left out?
- Who actually benefits?
- Who is deciding all this?
AI doesn’t need more billion-dollar ideas. It needs values grounded in real communities and real experiences.
AI doesn’t need more hype. It needs more heart and humanity. And justice-minded leaders already have that in abundance.
Showing Up Isn’t Optional
If you’re leading with justice at the center of your work, ignoring AI won’t protect you from its impact. But learning about it—engaging with it, shaping it, questioning it—will protect your people and strengthen your mission.
Because the truth is: AI will influence your narrative whether you use it or not. It will shape your audience’s expectations whether you like it or not. It will transform your field whether you actively adopt it or wait a decade. And it will impact your community whether you engage or opt out.
This is the moment to bring our values, our lived experience, and our storytelling power into the next chapter of technology—before someone else defines it for us.
Because when people who care about justice shape AI, it becomes something more than a tool. It becomes power in the hands of the people who’ve always deserved it.
And that is exactly why we can’t look away now.