How much of my knowledge of this massive artificial intelligence shock should I share?
As the power, risks, and implications of rapidly emerging AI continue to accelerate, and as I continue to learn more, what is my responsibility?
A friend wonders what is happening.
I’ve been studying and researching, so I get it, but I’ve paused my regular communication with that friend and give him only vague reasons why.
Is this an all-or-nothing situation, where any mention of this turning point in history would require a full explanation to those I love – so I avoid bringing it up at all?
Most people will not hear of the impact of AI until it’s too late – because it’s already too late. Not until their jobs are affected, their companies reorganize, they mistakenly follow some deepfake, or they get absurdly wrong answers from ChatGPT.
How much am I responsible for sharing knowledge that will eventually change their lives? What is my AI responsibility?
How important is it that I tell them that human cognition, reason, and creativity are now obsolete and inferior, replaced by machines – rapidly evolving machines?
How do I share all my fears: that security is lacking, that the wide distribution of AI ‘in the wild’ will prevent “turning it off” and puts the technology in the hands of anyone with the very best and the very worst intentions, and that even the developers don’t understand how it works or the new skills it learns?
Do I describe how the world’s biggest companies have started a competitive arms race driven by greed and hubris, and how, because of that race, development is proceeding faster than anyone predicted?
Do I point out how the tech-bros and developers and CEOs brush off these risks and any others with platitudes, vague hopes, thin assurances, and naïve visions of do-gooderism?
For any audience that might read what I publish, even the potential benefits promise life-changing outcomes.
But I fear the most for the current and coming generations of good people:
- For coders and programmers, how many will program themselves out of a job, a career, and a livelihood, as AI becomes superbly good at writing its own code?
- For data experts, any analysis, synthesis, and reporting they do will become far more accurate and efficient with AI.
- For anyone with research and reporting skills, their tasks will be done more efficiently, many times faster, and at far lower cost with AI.
- For anyone working with man-machine interfaces and human-computer interactions, will AI completely redefine the relationship between people and technology?
- For artists, animators, illustrators, and creatives by the millions, MidJourney and many other visual AI tools are producing work and advancing at a rate and level of quality that blows the mind – and that makes many of these brilliant creators obsolete.
- For anyone in customer service, financial tasks, legal work, teaching, project management, and many other professions – how soon will AI take over those roles?
What happens to meaning, and to the human need for usefulness and belonging, when the definition of value goes away, and with it the marketplace and the professions that have been devoted to creating that value?
I want to be optimistic here, or perhaps I just need to be more realistic. This accelerating stream of big, scary headlines and alarmist claims about the end of civilization and doomsday machine-age scenarios may all be reflections of our human tendency toward drama, seriousness, self-importance, maximum buzz, more clicks and ads, and the relentless desire for attention.
So should I dial back my own drama while retaining a skeptical, critical eye on the positives and negatives?
Maybe, with some or all of the examples above, AI will make those jobs more efficient and productive, and all these smart, energetic people and their employers will learn how to put AI to effective use while retaining humans and their humanity.
Perhaps in the arts we’ll carve out categories, genres, and styles specifically for the digital and learn how to appreciate them, as we do with CGI in the movies, the amazing tools on our smartphones, and the beauty of synthesized music.
And perhaps all that will be driven by our collective humanity and actually designed and programmed into the AI. The people, their needs and desires, their money and use cases, and their own capacity to adapt and resist that dark night will anchor this next phase of civilization in a new kind of realism and authenticity.
Perhaps, most optimistically, the prospect and risk of losing our humanity will encourage and even force us to cherish it, cultivate it, and celebrate it.