
Technology, Human Spirit, and the Ethics of Progress: Charities and AI

December 11, 2025

Time to read: 5 minutes

A Growing Conversation About AI in the Sector

As charities adapt to a rapidly changing world, their relationship with technology, especially AI, has become increasingly complex. Across three years of survey responses, we’ve heard both enthusiasm and unease: excitement about streamlined operations alongside serious concerns about losing the human touch, copyright infringement, and the environmental impact of AI (CICP 1.03.14; CICP 1.10.42; CICP 2.10.39; CICP 3.05.16).

To help deepen our collective understanding, let’s turn to a powerful philosophical voice, one of the great thinkers of the 20th century: Ernst Cassirer, whose essay Form and Technology (1930/2012) explores what technology is and what it means for our spirit, culture, and humanity. His insights may offer us a language to think more clearly about the ethical and cultural questions AI poses – not just whether it works, but whether it works for us.

Technology as “Form”

Before we can evaluate technology, Cassirer argues, we must understand its essence – its forma formans, the inner idea that gives rise to its “form” – rather than only reacting to its visible outcomes (forma formata), or its “being”. He cautions against reducing technology to utility, and challenges us to see it instead as an expression of culture, spirit, and creativity – akin to Plato’s view of techne as guided by archetypes rather than mere imitation.

Cassirer doesn’t see technology merely as a tool or a set of inventions, but as a new way of shaping meaning – a cultural force with the power to transform how we relate to nature, to others, and to ourselves. He reminds us that even the most powerful technologies remain human achievements. They reflect our hopes, our fears, and our ideals. Thus, AI is not simply a new tool; it is a new chapter in the story of human meaning-making.

Key Conflicts Triggered by the Rise of Technology

Reading Cassirer today offers a powerful lens for reflecting on AI. He outlines three key conflicts that arise as technology becomes more central to modern life – conflicts that can help us understand the concerns many charities have about AI’s growing presence.

The Conflict Between Happiness and Technological Will

The first conflict, according to Cassirer, is that technology promises mastery over nature – but it also imposes its own laws, efficiency-driven logics, and utilitarian demands. In submitting to those demands, we risk losing the “organic unity” of existence – that sense of life as meaningful beyond output.

In the context of AI, this conflict plays out when charities adopt tools that save time or streamline tasks, but begin to question whether these gains serve their missions. Does using AI to write a grant proposal free up time for human connection – or distance us from it? Does automating a donor appeal improve engagement – or flatten it into formula?

These difficult questions point to the fact that technology, as Cassirer warns, can reframe our sense of what matters.

The Conflict Between Technological and Artistic Creation

Cassirer’s second insight distinguishes between technological work – which solves problems through precise design – and artistic creation, which merges form with deep inner expression. Both kinds of work require discipline and imagination. But art, for Cassirer, speaks from and to the soul. Unlike technology, its beauty is expressive, not just functional.

This distinction comes alive when charities raise concerns about AI-generated art or music. While generative AI can compose poems, design posters, or simulate voices, does it express anything truly human? One might say that technological creation builds, while artistic creation reveals.

For mission-driven organizations – especially those working in arts, culture, or education – this isn’t just philosophical. It goes to the heart of what they do. Can a chatbot teach empathy? Can a machine-generated story carry the same meaning for youth in crisis? These aren’t arguments against AI, but reminders of what must not be lost.

Freedom vs. Bondage: The Deepest Paradox of Our Time

Finally, Cassirer urges us to ask the most serious question: not what technology can do, but what it ought to do. He warns that when technology becomes untethered from ethical purpose, it risks serving consumption over conscience.

Here, his words resonate powerfully with some of the environmental and social concerns raised by charities in our survey. AI systems, especially large models, carry a significant carbon footprint and demand massive computational resources. Are these tools being developed and deployed in ways that serve the public good? Are their benefits equitably distributed?

Cassirer is clear: technology must fit into a larger moral horizon – one guided by values like justice, solidarity, and care. And it is precisely this ethical orientation that charities bring to the conversation.

Toward “Freedom Through Bondage”

Cassirer calls the ideal of technological culture “freedom through bondage.” We obey natural laws, learn from constraints, and in doing so, discover new possibilities. But the freedom we gain is not automatic. It must be cultivated – through reflection, ethics, and the collective will to use technology responsibly.

Charities have a unique role to play in the scrutiny of technology. Their work is not driven by profit but by purpose. Their accountability is not only to stakeholders but to communities. Charities’ everyday decisions – about adopting AI or opposing its use in certain circumstances – offer a kind of moral barometer for our digital age.

Author

Nguyen, Thi Kim Quy

Want to receive our blog posts directly in your inbox? Sign up for our newsletter at the following link, and follow us on social media for regular project updates:
