We always knew the computers would talk to us, and we would
talk to them. In one of the Star Trek
movies, the crew travels back in time. When asked to input something to a computer
of that earlier era, Scotty is handed a mouse.
He doesn’t know what to make of it, so he picks it up and talks to
it. We always knew.
In our science fiction, we thought about how they would talk,
but what we haven’t thought about as much is how they would listen. When you talk to Alexa, or Siri, or Cortana,
or Eliza, who or what do you think is listening? And what are they doing with the data? Who owns it?
You? Them? And what data are they getting? Voice recognition software is used today to
distinguish one person from another by reading subtle cues in your voice. What else can they tell about you? Whether you are happy? Anxious?
Annoyed? Guilty? And when are they listening? Who do they share your voice and your
thoughts with? Do they store it somewhere?
These are all questions that you probably can’t answer. And that’s my point.
November’s Atlantic
carried a great article by Judith Shulevitz titled “Alexa, How Will You Change
Us?” She describes the explosive growth
in sales of Alexa, Amazon’s personal assistant.
Over 40 million Alexas and similar devices were installed, worldwide, by
the end of 2017. Over 100 million are
expected to be plugged in and on alert for the sound of your voice by the end
of 2018. People love them. Literally.
Shulevitz describes how people confess their feelings to Alexa, tell her
they are lonely, bicker with her, turn to her for solace. Why? Because
she, or rather it, is non-judgmental.
“Why would we turn to computers for solace?” Shulevitz asks. “Machines
give us a way to reveal shameful feelings without feeling shame.”
The problem is, it’s not just a machine you are talking
to. Nor is it a companion or an
assistant, as Shulevitz variously describes these devices. It is an agent, permanently installed in your
home, of a company called Amazon. A
sales agent, listening in 24 hours a day.
And those 40 million devices installed by the end of 2017 were sold to people. Shulevitz notes that both “Amazon and Google
are pushing these devices hard, discounting them so heavily during last year’s
holiday season that industry observers suspect that the companies lost money on
each unit sold.” Wait a minute. People are paying to allow these companies to plant agents in their
homes. These companies aren’t losing
anything. They are investing in
sales. And we are paying part of their
sales costs.
People bought them because they love them. Responding to voices is a reaction built
into us. The human voice gets our
attention and then influences us in subtle ways, as printed information never could. Children, in particular, relate to the voices
from these devices instantly. They talk
with the device. It answers authoritatively. They try to argue with it, as they would with their parents.
Ultimately, they trust it. Shulevitz
cites some evidence that we respond to disembodied voices even more strongly
than to the voice of a real person, as if the voice were emanating from God. Shulevitz tells us how Google has vowed that
its Assistant “should be able to speak like a person, but it should never
pretend to be one.” For example, it will
not claim that it has a “favorite” anything.
This is so obviously disingenuous it is laughable. These “assistants” are so carefully tuned to
sound like a sympathetic, helpful human that their efforts to steer
clear of statements claiming other human abilities are merely legalistic. Children think it’s someone telling them the
truth. And so will you.
Much of the discussion in the article is about what these
“assistants” and “companions” can do for us, and the problems, as well as the
benefits, they can create when they do.
They can control the lights, the heat, the audio systems. They can provide us with a kind of
companionship, which I will admit can be a good thing, even if it’s from a
machine. But there’s little discussion
of what these devices in our homes are sitting there quietly doing for Amazon
and Google.
Antonio Garcia Martinez, in his book Chaos Monkeys
(2016), describes some of the things that Facebook does with data it collects
from you. He wrote the book after
helping develop the tools that Facebook uses to milk advertising dollars from
data it collects from you. How do these
work? Well, follow the money. The advertising algorithms analyze billions
of online behavioral indicators to identify patterns that suggest what you are
susceptible to buying. They do this on an
aggregate basis, accumulating associations and predictive behavior patterns
from the massed billions, updating constantly to follow trends. So then, when your online scrolling, looking,
and clicking behavior on things like beer, buxomosity, and proprietary
low-income indicators shows, by statistical analysis, a high likelihood of an
impulse buy of a signed, original Make America Great Again hat, the media platform
can sell advertising that is strategically placed on the screens of members
whose online behavior stacks up as impulsive for anything MAGA.
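The idea can be sketched in a few lines. This is only an illustration of aggregate behavioral scoring; the signal names, weights, and logistic form here are all invented for the example, not any platform's real system:

```python
import math

# Hypothetical weights, standing in for patterns learned from the
# "massed billions" of users' aggregate behavior.
WEIGHTS = {
    "clicked_political_merch": 2.1,
    "viewed_hat_listings": 1.4,
    "late_night_impulse_buys": 0.9,
}
BIAS = -3.0  # baseline: most users are unlikely to impulse-buy

def susceptibility(signals):
    """Logistic score in (0, 1): predicted chance of an impulse buy,
    given counts of behavioral signals observed for one user."""
    z = BIAS + sum(WEIGHTS[name] * count for name, count in signals.items())
    return 1.0 / (1.0 + math.exp(-z))

# One user's recent scrolling, looking, and clicking behavior:
user = {"clicked_political_merch": 1,
        "viewed_hat_listings": 2,
        "late_night_impulse_buys": 1}
score = susceptibility(user)  # high score -> show the MAGA-hat ad
```

A real system would use vastly more signals and continuously retrained models, but the shape is the same: behavior in, purchase probability out.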
So how do the media platforms charge for advertising, with
everyone competing for space on your screen? And how do they decide what, exactly,
goes on your screen? Here is where the real genius comes in. They charge by auction, on a real-time,
constantly updated basis. Advertisers
bid for advertising space, separately and continuously, for every user’s
eyeballs. They don’t sit there raising
their hands until they hear “going once, going twice…” The bidding is done by, you guessed it,
automated algorithms, which are constantly scanning your online behavior and determining
what your eyeballs are worth, moment by moment.
That prime piece of advert space on the upper screen goes to the highest
bidder in billions of blazing-fast auction deals, which may change from
day to day, or more often, depending on which algorithms have invaded your
computer after you clicked “I agree” seven months ago.
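Stripped to its core, one of these instantaneous auctions looks something like the sketch below. It uses the second-price rule often described for ad exchanges (the winner pays the runner-up's bid); the advertisers and bid values are invented for illustration:

```python
def run_auction(bids):
    """bids: advertiser name -> bid (dollars) for one user's ad slot.
    Returns (winner, price): highest bidder wins, pays the
    second-highest bid (second-price rule)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Each advertiser's algorithm has priced this user's eyeballs,
# moment by moment, from their scrolling and clicking behavior.
bids = {"hat_vendor": 0.42, "beer_brand": 0.31, "streaming_service": 0.25}
winner, price = run_auction(bids)  # hat_vendor wins, pays 0.31
```

Real exchanges run this logic millions of times per second, once per page load per user, with the bids themselves generated by the advertisers' own algorithms rather than anyone raising a hand.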
So as you sit sipping coffee, perusing the morning news sandwiched
between advertisements that prompt remarks like “How did they know?”, the social
media platform is compiling megabytes of data that tell them about your
preferences and processing it in an array of far-flung supercomputers so they can
allow their advertising customers to do combat for your eyeballs. At the
same time, those advertisers are constantly evaluating your behavior and
adjusting their bids to outcompete someone for your attention. Automatically. Impersonally. Amazing. That’s
the internet. Follow the money.
So now, think again about Alexa. Amazon doesn’t even pretend to be a social
media platform. They are upfront about
just wanting to sell you things. Now
they have a powerful tool that soothingly lulls you and your children into
sharing things you wouldn’t tell your most intimate companions, because they
might be judgmental. Before you next
speak to Alexa, think about how your voice will be parsed and spirited over
millions of miles of Ethernet, then fed into competing algorithms that will
grind it into grist for the auction of your eyeballs. And of your voice. And think about this: who owns my eyeballs? Who owns my voice?