What Responsible AI Can Learn from Dune
Hot Take: AGI Won’t Just Disrupt Jobs—It’s Poised to Disrupt Humanity
We’re on the brink of AGI, the final tech wave, and here’s the reality check: even without a recession, jobs in automation-prone fields are already vanishing. Since the launch of ChatGPT, demand for these roles has fallen sharply, pointing toward a future where machines take over work we thought was untouchable. And here’s what most people aren’t seeing: this isn’t a blip; it’s a permanent shift in the job market, quietly eroding stability and security across every field. When the next recession finally hits, this trend won’t just continue; it will accelerate.
But let’s be clear: the real impact here isn’t just economic; it’s human. If we let AGI develop without a firm set of guiding principles, there’s a real risk that the social contract we’ve grown up with in the Western world will be torn apart. Technology becomes a force that doesn’t just take jobs; it erodes what makes us fundamentally human. Our creativity, empathy, and purpose are on the line here, qualities no machine can truly replicate. If we lose sight of them in the rush to innovate, we risk building a future where technology serves itself, not us. So what’s the right approach? It isn’t just about slapping on some regulations; it’s about embedding values and principles that ensure responsible AGI development and protect humanity at its core.
Consider this: Dune’s Orange Catholic Bible warned against creating machines in the likeness of a human mind. It called for respect for consciousness, humility in the face of the unknown, and integrity of the human soul. These principles weren’t just religious platitudes—they were hard-won lessons from a fictional society that had gone too far with technology and suffered the consequences.
Now, take the Harkonnens, one of Dune’s most powerful and feared families. In Frank Herbert’s universe, the Harkonnens embody a worldview that disregards compassion, sees humanity as expendable, and prioritizes control at any cost. Ruthless and oppressive, they treat people as resources to exploit, not lives to value. They show us what happens when power operates unchecked by responsibility and empathy: society becomes a machine for domination, dehumanization, and decay.
If AGI develops without a foundation of responsibility, it risks becoming a tool that, like the Harkonnens’ power, tears down society rather than builds it up. Our world needs the same kind of grounding the Orange Catholic Bible’s principles offer. AGI must be built on principles that uphold human dignity, purpose, and humility, or we risk repeating the mistakes of fictional and real societies alike, on a scale we can’t afford.
Here’s where we need to hold the line:
• Augment the Human Condition: AGI should enhance, not erase, human creativity and compassion.
• Respect Human Potential: Build AGI that amplifies, not stifles, human growth.
• Enforce Oversight on Dehumanization Risks: Ensure AGI dignifies, not diminishes, the human experience.
• Protect the Integrity of the Human Soul: Hold the line on what makes us fundamentally human.
• Preserve Consciousness and Humility: Respect the limits of technology and maintain reverence for the human spirit.
The path forward isn’t about sprinting toward AGI at any cost. It’s about steering this final tech wave in a way that keeps humanity in control. If we embed these principles, AGI can be an extension of our values, not a threat to them. Anything less, and we risk making ourselves obsolete—not because of AGI itself, but because we didn’t have the wisdom to build it with respect for what makes us human.