How Philanthropy Should NOT Respond to the Rise of Artificial Intelligence
Piecemeal approaches to #AI that focus narrowly on harm reduction miss the point entirely, overlooking the revolutionary potential of this emerging technology.
I've led a handful of philanthropic intermediaries at the intersection of emerging technology and community change. Below I consider how philanthropy should respond to artificial intelligence's promise and potential dangers today.
Foundations have a significant role in supporting movements to address uneven power dynamics in the tech sector. However, doing so effectively requires strategic coherence and longer-term thinking.
In a more democratic and equitable society, everyone would have access to the benefits of AI regardless of race, gender, or class. But we don't live in that society today. The temptation for philanthropy, therefore, is to direct resources toward the various symptoms rather than the root causes of these imbalances.
For example, if millions lose their jobs and inequality widens because of automation and AI, that is not technology's fault. That is because we refuse to reimagine the role of work and the distribution of wealth.
If AI is used to create fake news, manipulate images and videos, and spread disinformation, that is not AI's flaw. It is because we fail to promote critical thinking, media literacy, and democratic participation, and to crack down on the nefarious financing and propaganda that manipulate people.
If AI reinforces existing power imbalances and biases, it is not just a technology problem. It is because our governments fail to require transparency, accountability, and democratic community participation in the development and deployment of AI.
Philanthropy has the opportunity to fully harness the transformative power of AI for promoting inclusive economic development and democracy. To achieve this, it is important to explore new approaches rather than adhering to traditional patterns. Instead of providing limited funding to fragmented initiatives, philanthropy should consider coherent strategies and place bigger bets supported by a clear theory of change.
What is required from philanthropy is the courage to back visionaries pushing for a rethink, redesign, and reorganization of society. That requires systematic and intentional funding strategies: larger grants to fewer initiatives with breakthrough potential, grounded in political reality. This often means supporting place-based initiatives and/or national intermediaries that add value to those initiatives.
To realize this ambition, foundations must also move beyond their typical posture toward technology, which is harm reduction. The potential harms from AI depend on the broader societal context. For example, Sam Altman, co-founder and CEO of OpenAI, the company behind ChatGPT, has said that AI could exacerbate the problems of capitalism, such as inequities in wealth and power.
While Altman has praised capitalism for creating wealth unforeseen in prior human history, he has also acknowledged its shortcomings and spoken clearly about the need for humanity to consider a more inclusive and equitable post-capitalist future.
Philanthropic support for research and endless panel discussions dissecting the potential dangers of AI treats the problem as if it were a lack of awareness. The real problem is that no broad-based, organized social movement has enough power to carry forward an alternative vision for an equitable and democratic deployment of AI technology that benefits all humanity.
What is needed now is the emergence of new, broad-based, and inclusive social movements that are not afraid to challenge the dominant worldview and call for the necessary reorganization of society so that AI can work for and not against the interests of all.
Philanthropy's task is to help create the conditions for such movements to emerge and grow. This includes building broad alliances among diverse groups with common interests, supporting alternative media and institutions such as data cooperatives or community-owned AI platforms, and building a training infrastructure for nonviolent direct action and civil disobedience.
For these movements to be successful, they must include engineers and researchers responsible for the development of AI in their ranks. I have written elsewhere about the need for specialized technical assistance for movements aligned with the ultimate goals of community impact and lasting change.
It is essential to support strategies grounded in long-term thinking about the world we want to build rather than simply reacting to the threats posed by new technologies. Advances in AI will emerge so quickly that piecemeal and reactive grant-making approaches focused on specific issues of harm reduction will be obsolete before they ever have a chance to take root.
In the final analysis, philanthropy should remember that its resources and influence are necessary but not sufficient to promote equitable outcomes in the tech sector. AI technologies will evolve and change in the short term, but the long-term imperative for movement-building and infrastructure will not. The only viable long-term path for philanthropy is to leverage its resources to strengthen democratic participation, community engagement, and eventually alternative community-owned and operated models of technological development and deployment in partnership with the private sector.
I've been working with national foundations and the Media Democracy Fund on a philanthropic initiative, the Digital Equity and Opportunity Initiative, to strengthen the movement to close the digital divide and build lasting civic infrastructure. The strategic philanthropic framework behind that effort is equally relevant to the promise and peril of AI.
If you are interested in learning more about the role of philanthropy in meeting the opportunities that AI presents to humanity, I hope you will subscribe to this blog and stay tuned for more content.