Organizing Strategy and Practice

Power, Not Panic: Why Organizers Must Engage with AI to Build the Future We Deserve

Lee Anderson and Oluwakemi Oso

Arguing that disengagement from AI is not a strategy but a surrender, this piece challenges progressive organizers and funders to reclaim the technology, using frameworks like Afrofuturism to build people power instead of ceding the future to billionaires and autocrats.

AI-generated political ads. Deepfake disinformation. Predictive voter databases that track and target us without our consent. These are not futuristic threats. They are currently unfolding, reshaping our political terrain in real time. And the billionaires, autocrats, and tech oligarchs using AI to consolidate power are not waiting for our permission. They are moving fast, and unless we choose to intervene, they will continue to use this technology to undermine democracy, deepen inequality, and entrench patriarchy and white supremacy in service of authoritarianism. 

For our movements, opting out of AI is not a strategy. The danger is not just in how AI functions. It is in who controls it. Many of today’s tech futurists openly champion anti-democratic visions, from Curtis Yarvin’s authoritarian fantasies to Peter Thiel’s billionaire libertarianism. If we disengage, we cede the ground entirely to those who do not share our values. And philanthropy cannot opt out either. Funders shape what gets adopted, who innovates, and how quickly movements can adapt by resourcing the infrastructure, training, and imagination this work requires.

Today, the Right is leading the charge. Polling from the American Association of Political Consultants shows that conservative campaigns are using AI more consistently and confidently than their Democratic counterparts. They are using it to scale misinformation, target voters with precision, and flood the digital ecosystem with fear-based narratives.

Meanwhile, many progressive grassroots organizers remain overwhelmed, underfunded, and understandably hesitant to engage. And their reasons are legitimate. AI consumes massive amounts of energy, often in data centers built in Black and brown communities. It accelerates labor precarity, amplifies bias, and automates structural harm. As Black organizers and technologists, we know this all too well. AI systems are shaped by biased data and extractive design. They have been deployed without our input, often to surveil, criminalize, and exploit the very communities we fight to protect. And yet, our communities have always found ways to resist, adapt, and build.

Progressive grassroots organizers must treat AI the same way we treat policy, narrative, and strategy: as a site of struggle. As something we can study, contest, reimagine, and reshape. AI is not neutral; it is a terrain of power. And organizers need to wield it to build people power, not just react to it with panic. Disengagement does not protect us. It just reinforces the systems we are trying to dismantle.

We need a new story about AI. One that moves beyond the binary of savior or destroyer. AI is not inherently good or evil. Like any tool, it reflects the conditions in which it was created. And like any tool, it can be reclaimed, retooled, and redirected.

Frameworks like Afrofuturism and cyborg feminism show us how. Afrofuturism imagines futures where Black and Indigenous communities are not only surviving new technologies but designing them. Cyborg feminism invites us to reject purity politics and engage with the tools we have, even when they come from inside harmful systems. Both frameworks insist on imagination. Both call us to be architects of the future, not just survivors of it. And imagination isn’t abstract: the only way to retool technology like AI is to study it, test it, and work with it in practice.

Professors Lonny Avi Brooks and Reynaldo Anderson write, “In this reactionary moment of AI hype, digital displacement, incels and disaffected rightwing accelerationists, characterized by some as the Dark Enlightenment, Afrofuturism — and ancestral intelligence more broadly — offers our best chance to imagine futures worth living in.” That future is already being shaped. The question is whether we will shape it too.

Fortunately, movement organizations are already using AI to expand capacity, deepen strategy, and protect what matters most: human connection and trust.

At FairCount in Mississippi, organizers used AI to analyze voice memos from canvassers. The tool surfaced relational and emotional insights that traditional databases could not capture. As a result, their voter engagement became more predictive, more effective, and more human.

In New Mexico, a statewide coalition used a closed GPT model to streamline messaging across partner organizations. This allowed them to move faster, stay aligned, and support overstretched communications staff during peak campaign moments.

During a national mobilization effort, one grassroots group was flooded with emails from people asking how to get involved. The team did not have the capacity to respond to every message quickly. AI-generated replies helped connect people to local actions and resources, saving staff time for deeper, intentional engagement.

At re:power, where we provide movement technology training to grassroots organizers across the country, fellows developed Howdy.ai, a generative learning tool for organizers. It supports experimentation with data analysis, campaign planning, and message generation while centering liberatory values, meaning it is designed to flag bias, protect against surveillance, and uphold human dignity instead of replicating harm.

These tools are not about replacing people. They are about restoring capacity. AI can handle what is mundane, repetitive, and time-consuming so organizers can focus on what only humans can do: build relationships, move strategy, and grow power.

The concerns are valid. Bias is real, and so are the risks of surveillance and job loss. That is why movements must be at the table to shape guardrails, demand transparency, and push for protections that put people before profit. At the same time, AI is not just something to regulate; it is also something we can wield. Every technological shift in history has required organizing to win safeguards, but also creativity to put new tools to work for justice. AI should be no different. Movements can fight for strong protections while also using AI to expand capacity, freeing up time and reducing repetitive tasks so organizers can focus on what truly builds power: experimenting, strategizing, and deepening relationships.

Some point to the environmental toll of AI as a reason to step back. That cost is real. Yet innovation is already driving new ways to cut energy use and reduce harm. The solutions are not perfect, but the pace of change is undeniable. If grassroots organizers opt out, the future will be built without us.

In addition to our movements wielding the power of AI for good, funders must also step up. Their choices shape what gets adopted, who innovates, and how fast movements can adapt. Refusing to invest in AI literacy, experimentation, and infrastructure does not reflect caution. It reflects abandonment. If funders want movements to win, they must fund the conditions that make that possible. That includes infrastructure, training cohorts, open-source development, and new evaluation metrics that center values, not just outputs.

This is a moment for courage and clarity. We must build an AI Bill of Rights grounded in liberatory values. We must reject paralysis and fund ethical exploration. Philanthropy must continue to fund training spaces where organizers can learn, test, imagine, and lead.

As Black organizers, we draw strength from our ancestors and traditions. And as Black technologists, we look ahead with the visions of Afrofuturism and cyborg feminism, which remind us that technology can be key to our liberation. Let us master AI to amplify what has always been at the heart of organizing: human connection, collective strategy, and the imagination to build something better.

As movements take up this challenge, funders must resource the infrastructure, training, and imagination it requires. AI is here. The only question is whether we will shape it, or let it shape us.



About Lee Anderson

Lee Anderson (he/him) is the Director of the Movement Technology Program at re:power and a cultural strategist, storyteller, and trained vocalist. He works at the intersection of art, narrative, and social justice to shape advocacy, engage communities, and move movements forward. lee@repower.org

About Oluwakemi Oso

Oluwakemi Oso (she/her), Chief Program Officer at Higher Ground Institute and lead of re:power’s Data x Power Movement Tech Fellowship, works at the intersection of data, technology, and social justice — bridging the divide between organizing and innovation to make tools like generative AI accessible and actionable for progressive movements.