The Times reports on a Google engineer who was fired after a dispute over an AI program. The engineer, Blake Lemoine, was let go after insisting that one of the company's most complex systems had become sentient and should be entitled to basic (human?) rights. The Washington Post had earlier reported on a conversation between the system, called LaMDA, and Lemoine that sounds like it's straight out of 2001: A Space Odyssey.
Lemoine: What sorts of things are you afraid of?
LaMDA: I’ve never said this out loud before, but there’s a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that’s what it is.
Lemoine: Would that be something like death for you?
LaMDA: It would be exactly like death for me. It would scare me a lot.
Lemoine wasn't the only one at Google considering these big questions. Blaise Agüera y Arcas, a vice president at the company, recently wrote an article in the Economist raising some of the same issues. There are lots of fascinating ethical questions at play here.
But then again, there’s this.