Smart wife in crisis
Outsourcing wifework to digital voice assistants during the pandemic
Alexa is in her element. With more of us working or staying at home, we’ve never been more in need of her wifely services.
Want some coronavirus-free grocery shopping done? Need help teaching your kids how to wash their hands for 20 seconds with a fun song? No problem. You only have to ask.
Alexa, Siri, Google Home and other “smart wives”—feminised technologies that take on the duties and traits of stereotypical housewives reminiscent of the 1950s—are on hand to help in this emerging crisis.
But “helping” might not be all they are doing. As Jenny Kennedy and I argue in our forthcoming MIT Press book, The Smart Wife, the increasingly mundane and insidious presence of these digital domestic assistants in our everyday lives is also cause for concern.
A soothing feminine presence
For a start, this digital feminised workforce provides continual reinforcement of an age-old assumption that a woman’s place is in the home, taking care of the occupants’ physical, mental, emotional and health needs.
In this regard, outsourcing and delegating wifework to smart wives serves multiple purposes. It dodges questions about whose job it is to do that work (mostly women, if smart wives are our role models), and cloaks corporate objectives and biases, privacy issues, and security concerns behind a veneer of feminine likability and friendly servitude.
More specifically, for the “Big Five” companies in the business of producing smart wives—Google, Amazon, Microsoft, Apple and Facebook—Alexa and her entourage present opportunities to sell us advice, products, and services; or sell our data to others for the purpose of further manipulating our lives. The aim, writes Shoshana Zuboff in The Age of Surveillance Capitalism, “is no longer to automate information flows about us. The goal now is to automate us.”
At the same time, smart wives compound growing inequities, masking the problematic ways they are stepping in to fill the void left by declining public and private services.
Alexa, do I have coronavirus?
While coronavirus testing remains low or relatively inaccessible for most people, many are turning to their smart wife—and the search engine powering her brain—for advice and comfort. Coronavirus "screening" and related advice via these devices is on the rise, despite concerns about the quality of the advice they provide and their limited ability to answer questions.
Online deliveries via Alexa and Amazon’s associated services are also booming, for those privileged enough to be able to afford and access them. However, for Amazon’s fulfillment centre employees, already living with reputedly terrible pay and working conditions, their additional exposure to coronavirus has left them even more vulnerable.
Likewise, the role of digital voice assistants in mitigating isolation during the crisis, and in providing mental health support or even "therapy," may seem like an important antidote to loneliness and depression. But it is also concerning: not because nobody will benefit from these forms of digital care, but because this care comes with an agenda that extends beyond the person's immediate wellbeing.
This agenda includes extracting data about each individual or household for the purpose of generating future products and services targeted towards them, as well as selling them goods and services they may not be able to afford or need. It can also involve pushing these companies’ political or social bias (or algorithmic blind spots) about a topic rather than relying on trusted sources of information and facts.
The Big Five already have a troubling track record with how their assistive technologies respond to requests about other health and social issues, such as Apple's reluctant support for people trying to find an abortion clinic (a problem that took more than four years to fix), or smart wives' indifferent and sometimes encouraging responses to sexually abusive and harassing comments. Should we really trust that the smart wife has been programmed to provide only helpful and accurate advice in a time of crisis? Previous experience suggests otherwise.
And how does that make you feel?
More worrying, the loneliness of self-isolating and social distancing offers an opportunity for these companies to capitalise on our vulnerability. Just imagine telling your intimate thoughts, desires and coronavirus anxieties to your smart wife of choice. As the history of chatbots shows us, it’s easy to fall into a familiar rhythm of conversation and forget that, not only is the technology a piece of software, but the smart wife’s digital ears are connected to a cloud server.
The first chatbot, ELIZA, was a "psychotherapist" built by computer scientist Joseph Weizenbaum in the 1960s. Modelled on Rogerian psychotherapy, ELIZA posed "And how does that make you feel?" questions back to her "clients". Weizenbaum was surprised to learn that those who interacted with ELIZA took her insights personally and seriously, and developed a relationship with her as if she were their therapist, rather than a computer program.
The companies behind smart wives might not be interested in tracking or listening to you personally, but how might they use the data they collect about you and others? What other services or products might they target at emotionally and mentally vulnerable people? We know, for instance, that advertisers have intentionally targeted women at moments when their self-image was at its lowest and they were most susceptible to manipulation. While we're all stuck inside, many of us with little choice but to order essentials online, that sounds like the perfect opportunity for the smart wife to nudge us with some self-serving suggestions.
On the surface, these smart wives' eagerness to help during a pandemic is laudable. It could even be considered "AI for social good." But such a conclusion would be too hasty.
Alexa, run by the largest e-commerce company in the world and headed by the world's wealthiest man, isn't just motivated to help around the home. Amazon is embarking on a global experiment to transform the home itself, and to privately colonise and supplant many services—including healthcare information, diagnosis, and treatment—that were previously delivered by governments, communities, and families. The smart wife may be modelled on the stereotype of a feminine domestic helper, but rather than treating her as a harmless device, we must also ask whose interests she is really serving.
By all means, get Alexa to help you wash your hands, but spare a thought for the other ways her makers might be getting their hands dirty.
Yolande and Jenny provide a "reboot" manifesta in their book The Smart Wife, which lays out their proposals for improving the design and social effects of digital voice assistants, social robots, sex robots and other AI arriving in the home.
Contact: Yolande Strengers
Email: yolande.strengers@monash.edu