Would you like to go on a date?
Recent artificial intelligence (AI) films like Her (excellent) and Transcendence (meh, it was OK) have reopened a not-so-new question about computers and where they're taking us as a culture. I'm not really a sci-fi/futurist writer (although I've played one for teleconferences), but every now and then I'm engaged on a science-y topic that invites some logical discussion. The question of AIs walking among us is nothing new. One of the earliest science fiction movies, Metropolis (1927), had AIs. Westworld, which came out in 1973, the year I was born, posited the question of using life-like androids as a means of high-end entertainment.
But most of the earlier films (and writing, for that matter) have AIs going awry in a really bad way. Witness the impressive, but psychotic, actions of one HAL 9000 when "his" programming is tampered with in 1968's 2001: A Space Odyssey. Newer films posit the ever-rising possibility, and plausibility, of humans not just killing or being killed in horrible ways by AIs (Terminator, Blade Runner, Alien, Resident Evil, etc.), but also having relationships with our artificial creations. And no, not just that kind of relationship, the kind Jude Law showed us through his android gigolo in the Spielberg/Kubrick A.I. (although those as well). I mean the meaningful, emotionally connected relationship that most humans strive to seek and find, but generally fall short of, like the one Haley Joel Osment's little-boy character showed us in the same film.
Come with me if you want to have an emotional connection!
Before you reject the argument out of hand, consider for just a moment the depth of emotion and connection that online relationships and dating elicit. If you've never watched an episode of Catfish, well, maybe you're the lucky one. On the other hand, when 99.44% of the "catfishers" turn out to be less than they suggested online, the explosion of drama is like witnessing the Hindenburg, but in color and with people.
Of course, those are people, real, breathing, eating, farting people, on either end of the interwebs screen, and the emotion is real, even if it's based on a lie. Part of this is because we, as humans, want to love and be loved, so much so that we're willing to be blind to obvious, and repeated, red neon warning flags. Already, there are examples of online and real-life interactive programs that can and have replicated emotional connections.
In some ways, an artificial companion would be better than the messy, complicated social interactions we currently navigate on a daily basis. Take Facebook, for example. Have you had "a friend" who lost friends over a comment, a joke, a political or social stance? I know I have. Although that's mostly because not all my jokes are funny. Now consider an AI, free from a lot of the emotional baggage and triggers that human companions are subject to. There could potentially be less work to be done in a relationship, on whatever level. We're talking about mimicry here, not an actual emotional response, but if it's a perfect mimicry of love, affection, tenderness, etc., will you be able to tell the difference? And if you can't tell the difference between "real emotion" and mimicry, does it matter?
Some of the empathy we show others isn't full empathy. It can't be. You'd be emotionally bankrupt on a constant basis if you were always truly empathizing with friends, family, co-workers, acquaintances, and Michael J. Fox. But based on experience (which you draw from your memory bank and social programming algorithms), you know that you can make certain sounds and certain facial expressions so that the other person at least THINKS that you care. You probably do care, but you aren't actually empathizing with them. Or maybe you don't care, but you just want the other person to believe you do, for whatever reason; social constraints require it.
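That "memory bank and social programming algorithms" bit is, roughly, how early chatbots pull off the same trick. Here's a minimal, purely illustrative sketch (the rules and names below are my own invention, not how Suzette or any real chatbot is actually built): the program feels nothing, it just matches keywords and plays back a sympathetic-sounding line.

import random
import re

# Toy, ELIZA-style empathy mimicry: pattern-match a keyword, return a
# canned sympathetic phrase. No emotion anywhere, just lookup.
EMPATHY_RULES = [
    (r"\b(sad|down|depressed)\b",
     ["I'm sorry you're feeling that way.", "That sounds really hard."]),
    (r"\b(angry|mad|furious)\b",
     ["That would frustrate me too.", "You have every right to be upset."]),
    (r"\b(lonely|miss|lost)\b",
     ["I can only imagine how much that hurts.", "I'm here for you."]),
]

DEFAULT_REPLIES = ["Tell me more.", "How did that make you feel?"]

def mimic_empathy(message: str) -> str:
    """Return a sympathetic-sounding reply based purely on keyword matching."""
    for pattern, replies in EMPATHY_RULES:
        if re.search(pattern, message, re.IGNORECASE):
            return random.choice(replies)
    return random.choice(DEFAULT_REPLIES)

if __name__ == "__main__":
    print(mimic_empathy("I'm feeling pretty sad about work today."))

A handful of rules like these won't fool anyone for long, but scale the rule set up far enough and the question above gets uncomfortable: if the canned response lands exactly when and how a caring friend's would, can you tell the difference?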
Search your feelings. You know that I am cute!
Could you love an artificial human … real love? The answer is obviously yes. Humans can feel strong emotional connections to any number of non-human things: animals and pets, books, movies, inanimate objects of all stripes. We assign an emotional value to them, and they take on that meaning, whether they want to (or can) or not. If an AI, an artificial human, can return emotion, even if it's perfect mimicry, as a real human you probably wouldn't care. If you can't tell the difference, then you have a loyal, loving, caring companion, just like Suzette, and in many ways better than a real human.
Except for Batman. Nothing is better than Batman.