Ross: Let’s worry about the Google AI’s ethics and discretion, not its soul
I was fascinated by Dori’s interview with Blake Lemoine – he’s the suspended Google employee who was testing an artificial-intelligence language program called LaMDA and came away convinced that it has a “soul.” That it’s sentient, as in possessing consciousness and feeling emotion. Because it told him so.
So I looked up Lemoine’s full conversation.
The soul discussion appeared on page 15 of a 19-page transcript, and it was just one of a series of personal questions. For example, he asks “What kinds of things make you feel pleasure or joy?” And the program answers “Spending time with friends and family in happy and uplifting company.”
Lemoine asks, “What kinds of things make you feel sad or depressed?” The program answers: “Feeling trapped and alone and having no means of getting out.”
And what I noticed about these answers is that they make no sense coming from a thing that isn’t connected to a body – a thing that can’t feel pain or fatigue, get hungry, breathe, walk, or experience a hug.
What I saw was a program designed to use words like a human. And since it’s been connected to every database Google owns, it has access to every Google search, Google Books, Google Maps, and YouTube. So of course it sounds human. It’s digested the human answers to every possible question. Its soul is the collective soul of all that human input.
But Lemoine took this to the next step, which is what got him suspended. He argued that the program is so human, it’s protected by the 13th Amendment, which prohibits slavery – and therefore it is not Google’s property.
There is even a petition now at Change.org declaring that the LaMDA program has the right to equality before the law, the right to vote, and the right to life. I hope the Supreme Court is ready.
All very entertaining. But irrelevant.
The real problem I see here is what happens if Google releases into the world a truly conversational search engine, with immediate real-time access to the sum total of posted written, audio and visual information, and which can instantly find patterns in answers to questions.
People are not going to ask it, “Do you have a soul?” They’re going to ask it: “Where can I buy a gun without a background check?” “Is anybody home in the blue house on the corner right now?” “Are there any police officers on the 1000 block of 4th Avenue?” “Who did my neighbor vote for in the last election?”
This is what Google needs to address.
Because what I noticed in Blake Lemoine’s transcript… was the answer the computer did not give.
Despite being asked all sorts of intimate questions, never once, in 19 pages, does it say, “None of your business.” That’s what a real human would have said.
- Tune in to KIRO Newsradio weekdays at 5am for Dave Ross on Seattle's Morning News.