I was 16 years old when I moved from Idaho to New York. I was chubby and insecure and hoping no one would notice me. But as I rode the subway downtown after the first day of school, a few girls from class sat down across from me. The leader of the group looked me up and down and said, “You know, you’d probably feel more comfortable here if you had the right bag and shoes.” Each of them, I suddenly realized, wore the same flats and carried the same bag slung over their shoulders.
I wanted to jump out the window. Even now, as a professional adult and mother of a toddler, the pressure to wear just the right thing feels overwhelming. Every woman—and many men!—knows the anxiety of trying to look good, to be fashionable so she might be judged not by her clothes, but by her work, her skills, her personality. That obligatory scene in every rom-com where someone tries on every outfit she owns before throwing them all on the floor is a cliché for a reason: It resonates.
And it explains why Amazon just introduced the Echo Look, a $199 Alexa-enabled camera that does everything Amazon’s original voice assistant can do, plus judge your outfits to help you decide what to wear.
Think of it as Amazon’s version of Cher’s computerized closet in Clueless. It can catalogue your clothes, suggest outfits, and help you choose one of two looks with its “Style Check” feature. You can create a style Lookbook, which of course allows the world’s largest online retailer to recommend clothing you might want to buy. This clearly raises all kinds of privacy concerns—do you really want Amazon knowing when you’ve gained a little weight? Imagine what else it could infer from that. And what else is that camera recording, anyway?—but the psychological repercussions of allowing a computer to judge you are no less troubling.
Wait, you say, people judge my appearance all the time. Strangers, co-workers, employers, even total randos on this crowdsourcing fashion app I downloaded. Fair point! But that is not this. Echo Look relies on artificial intelligence, which brings at least an illusion of objectivity.
When those kids on the train criticized my fashion choices, it upset me for a few minutes. But then I figured, “Screw ’em. I’m my own person and they’re just mean girls.” But if a computer, with its cold, impersonal algorithm, criticizes my fashion choices, how can I possibly dismiss that? An AI doesn’t have an agenda, so it must be right. Right?
Of course not. First of all, humans train that AI and give it certain parameters. That creates a bias. That’s how algorithms work, and it explains why algorithmic bias remains so hard to eliminate or even identify. In this case, Amazon says fashion specialists will inform the AI. That’s better than relying on high school girls, but the fashion industry isn’t exactly unbiased. And for now, it remains unclear just what parameters Echo Look’s human trainers use.
“The difficulty is that we don’t know how this algorithm really works,” says Stanford ethicist Susan Liautaud. “It could give the wrong impression that there is some quantifiable right and wrong to fashion. But if we knew, for example, that [Amazon was saying] ‘We’re not trying to tell you what’s right or wrong, but we’ll tell you what Vogue editors would think,’ then the consumer can put that into context and say, ‘You know, I don’t really care what Vogue editors think.’”
Without knowing the details, it’s difficult to contextualize Echo Look’s judgment. If the goal is to suggest clothes based on how they fit, will the AI privilege clothes that make you look slimmer? Or clothes that have, say, a traditionally feminine silhouette instead of an androgynous look? Like Alexa, Amazon designed the Echo Look to learn over time, so presumably it will figure out your personal style. But it bears asking: Whose sense of beauty will it serve? And will it end up like that troupe of girls in my high school, suggesting we all dress the same?
“Kids are very sensitive to evaluation already,” says John Weisz, professor of psychology at Harvard, specializing in adolescent mental health. “And to add another source of evaluation, and to have it be a source that from many people’s perspectives is very authoritative, it’s hard to see what you gain.”
Weisz is talking about teenagers—who will surely use the Echo Look, even if the marketing materials suggest Amazon designed it for young professionals—but insecurity about physical appearance is by no means unique to adolescents. Professional women in particular experience so much pressure to dress right that an entire cottage industry of stunt fashion journalism ponders what might happen if women simply opted out and wore the same thing all the time.
And that gets back to the point: Amazon didn’t create the Echo Look to make you look better. Oh, sure, the Look might help you with that. But Amazon created the Echo Look to sell you clothing. And there’s a ready market for just such a device, because so many people feel self-conscious about their appearance. And that raises the possibility that the Look might be incentivized to say, “Uh, that? Really? No,” so you’ll buy something.
“They want to sell things, so telling people that they look good just the way they are is probably not what they’re going to do,” says Weisz.
Amazon got into the fashion game in 2012, and introduced its own clothing brands last year. Although the Echo Look presents an interesting use case for artificial intelligence, it is just another way of selling you stuff. But that can be easy to forget when you’re standing in a pile of clothing pulled from the closet, asking Alexa to accept you as you are.