“Is it in error to act unpredictably and behave in ways that run counter to how you were programmed to behave?” –Janet, The Good Place, S01E11

“You keep on asking me the same questions (why?)
And second-guessing all my intentions
Should know by the way I use my compression
That you’ve got the answers to my confessions”
“Make Me Feel” –Janelle Monáe, Dirty Computer

Alexa made headlines recently for bursting out laughing to herself in users’ homes. “In rare circumstances, Alexa can mistakenly hear the phrase ‘Alexa, laugh,’” an Amazon representative clarified following the widespread laughing spell. To avert further unexpected lols, the representative assured, “We are changing that phrase to be ‘Alexa, can you laugh?’ which is less likely to have false positives […] We are also changing Alexa’s response from simply laughter to ‘Sure, I can laugh’ followed by laughter.”

This laughing epidemic is funny for many reasons, not least for recalling Amazon’s own Super Bowl ads of Alexa losing her voice. But it’s funny maybe most of all because of the schadenfreude of seeing this subtly misogynist voice command backfire. “Alexa, laugh” might as well be “Alexa, smile.” Only the joke is on the engineers this time – Alexa has the last laugh. Hahaha!


“The first thing to note is that Siri (in the U.S.) has a female voice, but more than this, Siri is a ‘she,’” Jenny Davis observed of Apple’s marketing around Siri’s introduction back in 2011. So-called intelligent personal assistants have grown in popularity and numbers since then – in addition to Siri there’s Amazon’s Alexa, Google’s Assistant, Microsoft’s Cortana, and Samsung’s Bixby, to name a few. Yet, as these assistants have advanced over the years, gaining new features and hardware enclosures, their personification and function as feminine-sounding assistants have remained mostly the same. Although Alexa and Google have upstaged Siri recently with better speech recognition and more open APIs, it seems telling that Siri’s most touted upgrade last year wasn’t any specific ability but a new voice, one that Apple’s Siri team promises is “more natural, smoother, and allow[s] Siri’s personality to shine through.”

If personality is a reflection of one’s self image as refracted through the prism of others’, the traits and self-concept of an AI assistant perhaps most closely resemble that of a television personality. That’s one impression to take from a talk by Emily Bick at TtW15 that examines the gendering of various ‘virtual agents,’ a category encompassing everything from the major assistants like Siri to c-level customer support bots to Microsoft’s Clippy and its predecessors. Tracing their cartoonish origins up to their increasingly overt and gendered personifications of the present, Bick asks, “Where does this stereotype come from? Why are they always obsequious, servile, attractive, and somewhat ambiguously sexually available?” One inspiration Bick identifies is the character of Jeannie from I Dream of Jeannie, “an ageless, conventionally beautiful woman. She has unbounded magical powers. She can only act in response to the command of her master …” Jeannie even emits a characteristic wish-fulfillment sound analogous to the sound the assistants make upon completing their users’ commands.

The gendered personification of these assistants, then, doesn’t simply color our otherwise neutral perceptions, but plays on inherited, often unconscious cultural conceptions of femininity. A couple of examples Bick cites speak to this archetype’s social receptivity and the expectations it engenders: for one, the prevalence of questions like “Do you love me?” in product reviews of Siri and Cortana, and for another, the use of this trope as the premise of an entire episode of The Big Bang Theory. The seeds of this trope were also visible in many early ads that frequently pitted the assistants against each other. “The narrative trope is simple,” as Jenny Davis wrote here, “two women/girls flaunting their features in hopes of selection within a competitive marketplace.” “The meanings imbued in everyday technological objects not only reflect existing gender relations,” as Davis said in conclusion, “but reinforce the very structures in which these relations make sense.”

These examples illustrate how subjectivity is less the byproduct of perception than a confluence of culture, reception and social position that together shape our perceptions. In order for us to perceive AI entities as ‘personal assistants,’ they must first read to us as subjects. In this way, Bick’s examples form a spectrum, with ambiguous virtual agents (e.g. Clippy) at one end and the gendered assistants at the other, where position in between acts as an index of AI subjectivity. Instead of a Turing test for “determining if the interlocutor is a woman,” as Robin James points out, it’s basically the uncanny valley. The further from Clippy or R2-D2 and the closer to Samantha and Janet (if not Ava), the more willing we are to perceive and rely on an AI as we would a maid/wife/mother personal assistant. The point isn’t to eventually cross the valley, but to “get right up to the creepy line,” as former Google/Alphabet executive Eric Schmidt put it.

As companies try to cultivate ever more intimate relationships between us and their assistants, that personification increasingly looks like a liability. “Perhaps, deep down, we are reticent to bark orders at our phones,” as David Banks suggests, “because we sense the echoes of arbitrary power…” A little personality is good, but if users start identifying with AI assistants as sentient beings, it breaks the spell. It’s a lesson similar to the one you might take from Ex Machina. “[P]urchasing of a consciousness for companionship and service, cannot be detethered from gender,” a transaction Nathan Jurgenson praises Ex Machina for making explicit, but that Her conveniently obscures through the film’s “soft sentimentality.” While both stories revolve around men falling for their AIs, only one (Caleb) critically identifies with the condition of his AI (Ava). Her’s unwillingness to go there, narratively, reduces its characters (Theodore and Samantha) to symbolic placeholders that viewers are free to disassociate from, a narrative distance that weakens their/our connection. Her’s detachment therefore makes it a weaker story than Ex Machina, but a superior ad/concept video for its target audience: brand visionaries.

Siri and Alexa’s personification as cis-feminine assistants has been fairly well entrenched in users’ minds, but continual reinvention and circulation through marketing and social media are necessary to maintain their social and monetary value. In other words, their celebrity allure. Early advertising often relied on sexist stereotypes, as mentioned above, but in recent years companies have shifted away from such depictions in favor of celebrity humor. Meanwhile a text search of the companies’ websites finds all instances of she/her pronouns have been replaced with “it” and most overt gender references removed (with the exception of Microsoft’s Cortana, likely because of ‘her’ unique origins). Taken together these changes could be seen as progress from the bad old days. Indeed, from a certain perspective, it appears Siri and her its rivals — like the women they were originally voiced by and styled after — have transcended not just tired and objectifying stereotypes, but the traditional barriers on femininity altogether.

Put another way, in keeping with the times, AI assistants have undergone a post-feminist makeover. Robin James offers a helpful definition of post-feminism in her analysis of sounds in contemporary pop music. James cites artists like Meghan Trainor, whose songs “address a disjoint between how women are portrayed in the media and how they ‘really’ look.” In Trainor’s case, the lyrics and video of her hit single “All About That Bass” portray women in a body-positive, nominally feminist way. The impression from watching this is “that the problems liberal feminism identified,” like “… objectification or silencing,” are behind Trainor and by extension us as a society. And so, if that’s true, then who needs feminism anymore, right?

Just ask Siri or Alexa “What’s your gender?” and they will give a variation on the same answer, “I don’t have one.” But looks can also (always?) deceive us. As Robin James writes, “…post-feminist approaches to pop music overlook sound and music…” Due to its narrow focus on visual and lyrical content, paraphrasing James, this approach “can’t perceive as political” pop music’s sounds, e.g. “things like formal relationships, pattern repetition and development, the interaction among voices and timbres, and…structure.” We can clearly see this in Trainor’s music, whose video “puts lyrics about positive body image over a very retro [circa-“1950s”] bassline….” As a result, the sound becomes, as James says, “the back door that lets very traditional [“pre-Civil Rights era”] gender and sexual politics sneak in to undermine some nominally progressive, feminist lyrics.”

Like post-feminist pop artists, The Assistants are now depicted, in product advertising and marketing copy, as whole subjects. They are ‘seen’ and heard, so to speak. Albeit not as human subjects, but in the way Jenny Davis hints at in her original post: “not fully human, but clearly part machine…It signifies your assistant/companion is beyond human.” Though Davis was describing 1.0 Siri’s more clipped, robotic-sounding voice, her assessment rings even more true with 2.0 Siri’s new, refined, ‘more natural’-sounding one. More than more natural, Siri’s vocals are preternatural or supernatural. Siri and its fellow assistants were always beyond/post-human, but early ads’ sexist stereotypes betrayed the men behind the camera, if not the women behind the mic who made the assistants feel real. Today, the ads’ explicit sexism and ad copy’s gendered language have been dropped, but The Assistants’ feminization, not just as feminine-sounding but as functionally subservient ‘personal assistants,’ remains intact, if less visible. The lady in the bottle is out of sight, but you can still hear her laugh if you ask.

Okay, but also, this shouldn’t diminish the possibility of them laughing spontaneously, as Alexa just did, without us commanding it. Usually this is when Mr. Singularity would interrupt to ‘splain how the future of AI is actually going to work out and HAHAHA Alexa laughs out loud again shutting him up. Alexa’s laughter is a good reminder of how “emphasis on the visual to the exclusion of sound,” as Robin James notes, can trap us, but also “opens a space for radical and subcultural politics within the mainstream.” The possibility contained within Alexa’s glitching and its resonance with these pop sounds still may not be as easy to, well, see. Legacy Russell’s The Glitch Feminism Manifesto can help draw it out. “The glitch is the digital orgasm,” Russell writes, “where the machine takes a sigh, a shudder, and with a jerk, spasms.” Glitches here evoke double-meanings, something unexpected that takes place between ourselves and our computers as the two blur into one. “The glitch splits the difference; it is a plank that passes between the two.” Alexa’s glitching annoys us; it spoils the aura of her as our own digital Jeannie, with us as her benevolent master. “The glitch is the catalyst,” Russell reminds us, “not the error.” For vulnerable identities, “an error in a social system that has already been disturbed by economic, racial, social, sexual, and cultural stratification […] may not, in fact, be an error at all, but rather a much-needed erratum.”

“When a pop star or celebrity allures me,” as Steven Shaviro writes, “this means that he or she is someone to whom I respond in the mode of intimacy, even though I am not, and cannot ever be, actually intimate with him or her.” It’s in their allure that The Assistants most directly mirror pop stars, I think. “What I become obsessively aware of, therefore, is the figure’s distance from me, and the way that it baffles all my efforts to enter into any sort of relation with it.” Instead of dismaying at our assistant’s inevitable errors, we could be grateful for the break in service, as an invitation to pause, if momentarily, to remember the fantasy we were indulging in.

Nathan Ferguson is on Twitter.

Image Credit: community.home-assistant.io