
When I read Jurassic Park for the first time, I was just a kid. I had seen the movie even earlier, and I was obsessed with dinosaurs as a child. In my early teens, my artistic side and my discovery of how they made the dinosaurs put me on a path in life that I will be forever grateful for. So you can imagine how the movie, and later the book, really hit home with me (it led me to a career in VFX and games). I still watch the movie at least once a year.
What I didn’t pick up until later was the moral of the story (I was too young and too distracted by dinosaurs). I picked up a copy of the book recently and had another read. I really love how the book differentiates itself from the movie with its much darker tone. Reading it again, it dawned on me that this story, written in the late 80s and published in 1990, is probably more relevant today than it has ever been.
But just to set expectations: this is not a book review. It’s a reflection on the moral behind the story and how well it fits what I see happening in our society today.
Replace commercial bio-engineering with AI
With everything happening in AI right now, the speed of change and the recklessness of the progress really got me thinking. John Hammond is basically the entire AI industry personified: blinded by the glory of being first.
Spare no expense – John Hammond
Sound familiar? It sure does to me.
He was so blinded by the magic of bringing back the dinosaurs that he pressed on no matter the cost, in both capital and human life. Even when things went horribly wrong, he still thought about moving forward.
This is basically the story we hear from OpenAI and the other AI providers. Most admit there is a chance that AI might go horribly wrong (some even strongly believe it could end humanity), but they keep saying the upside is too great, as if it doesn’t matter that entire industries get wrecked in the process.

The dangers and fight for survival
So InGen is manufacturing dangerous dinosaurs, thinking they can control them. Things inevitably go wrong, and now dinosaurs are loose and eating people.
Let’s look at what’s happening in the AI space. Powerful AI platforms have been created and released to the public, and now artists are losing their jobs, fake news and misinformation are being produced at scale, and more jobs are about to be destroyed as more industries get disrupted (transportation and customer support, just to name a few).
And just like the people on the island, people around the world all of a sudden have to find a way to survive this rapid change.
The issue is not the technology
The issue is the pace: change and innovation are happening too fast. Just like with the park, when something this disruptive is rushed, people’s lives are affected, and we rarely see any consequences for those responsible (I won’t spoil what happens in the book or movie, so I’ll leave it at that).
If we had a political system that could keep up with regulation, financial safety nets, and so on, maybe the pace would be more manageable. But the reality is that governments simply can’t move that fast.
Then we have the moral issues. The AI engineers (just like Dr. Wu in Jurassic Park) are so focused on making progress with their work that they’re blind to its consequences. This brings me to the iconic scene in the movie where the cast sit around the table for lunch after seeing the park’s dinosaurs for the first time, and Ian Malcolm (the chaos theorist) delivers his brilliant line:
Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should. – Ian Malcolm
This fits perfectly with what’s happening with generative AI, especially in the creative space. I must confess that at first I was amazed at how well Midjourney and DALL-E could produce “art”, but that feeling wore off and was eventually replaced by the sense that we had lost a bit of our humanity as a result. My cynicism grew even stronger when I started to think more critically about how these models were trained and then commercialized at the expense of great artists around the world, who are suddenly struggling even more in an industry that was already tough to survive in.
There is a lack of empathy and sense of responsibility from these leaders that I find astounding.

The political space
The pace is a hard problem to solve. On one hand, you have a free market and innovation, which I think are important. On the other hand, too much disruption leaves you with a broken economy where all the wealth flows in one direction.
So governments worldwide must find a way to restructure themselves so they can move with the times as the world continues to evolve. I think many of these challenges could be solved simply by politicians and lawmakers being more proactive about educating themselves on what’s happening in technology.
Governments could benefit from having more philosophers with a deep understanding of technology in the house.
Perhaps creating the technology is OK, but not scraping the entire internet and profiting from material that isn’t yours to profit from. You hear the arguments from the AI companies: “It’s all fair use”, which is currently being tested in lawsuits, and “But it wouldn’t be possible to license all the material” (I’m paraphrasing here).
The “it wouldn’t be possible to license all the material” line is a crazy argument. What they are really saying is that they should be able to spend their way out of the law. Should they get a pass because they have already spent billions and it would be inconvenient to retrain the models in a more ethical way? They can spend billions on hardware and engineers, but not on licensing and making sure the people who (unwillingly) provided the data get paid?
This should be a clear-cut case of what needs to happen. My suspicion is that political bodies just aren’t knowledgeable enough on the subject and are now spending a lot of time catching up.
What should/can we do about it?
For regular people, the best thing to do is to be aware of the technology and jump in: experiment with it, learn its capabilities, and see where it struggles. Pandora’s box is already open, and there’s no turning back now, so this is the best option we have.
All change is scary and painful, even when done with good intentions. When the world starts to adapt, perhaps people will thrive more than ever before, and not get eaten by that dinosaur.
It sounds bad, but…
This article might make it sound like I’m against AI. I’m not, actually, but it’s not that simple either. I think it’s a powerful technology that can do a lot of good. But the way we got here (especially with generative AI) is morally very questionable.
The kid in me still loves dinosaurs and big ambitions. And I still love how technology can benefit people and the world. But perhaps the most important lesson from Jurassic Park isn’t just about the dangers of unchecked innovation; it’s about respecting the power we’ve unleashed.
With that, I want to end this article with one last quote that I hope applies not only to the dinosaurs, but also to the people:
Life finds a way – Ian Malcolm