Engaging girls and women in AI futures: Lessons from ‘Superbots’

By Associate Professor Yolande Strengers

In less than a decade, voice assistants and the AI they depend on have become a ubiquitous part of our lives. They help with daily tasks, provide en-route navigation, set reminders, connect us with others, provide companionship in their own right and tell bad jokes. Integral to their design and success is something that is vastly underplayed: their personalities.

By and large, digital voice assistants are designed with friendly, service-oriented personalities that we find comforting and non-threatening. This has led to a procession of strikingly similar assistants – Siri, Google Assistant and Alexa – that prioritise feminine voices, names and traits.

While this is slowly changing, leading brands continue to be criticised for creating subservient and demure personas, and for promoting gender stereotypes and harmful biases – which opens the assistants up to abuse.

This was the impetus for the Superbots Industry Immersion Program, an innovative collaboration between Monash Tech School, Monash University’s Faculty of IT, the Women in Voice (WiV) ANZ Chapter and voicebot company TalkVia.

Monash Tech School bridges the gap between emerging technologies and future innovators.

Superbots Industry Immersion Program

The Superbots program was aimed at girls in Years 7 to 9 from government schools in the Monash local government area. Our research identified this group as particularly susceptible to peer pressure, with IT stereotypes likely to deter them from careers in the field.

The early to mid-secondary school years are when girls’ career aspirations typically turn towards traditional gender roles. In 2017, Australian girls aged 14 to 15 aspired towards careers in teaching, nursing and beauty, with engineering and transport, ICT and science not even making their top 10. Low confidence in maths (regardless of ability) and higher confidence in humanities subjects have also been linked to girls’ declining interest in STEM during these years.

Combining socio-technical tutoring, interactive co-designed activities and mentorship, Superbots presented AI as a creative, social and technical practice. It placed social science at the centre of advanced AI to consider how we can design more meaningful and inclusive voice assistants that reflect diverse personalities.

Leveraging Monash Tech School’s expertise in using Design Thinking to foster student agency, the program had participants imagine and design their own voicebot. The design process centred on a ‘personality planner’ that prompted consideration of anthropomorphic traits – the voicebot’s name, ‘species’, gender, race and age – in relation to its intended purpose and audience.
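To make the idea concrete, the sketch below shows how a personality planner can be thought of as a small structured record that students fill in before any programming happens. It is a minimal, hypothetical illustration in Python – the field names and example values are assumptions, not the program’s actual worksheet or any TalkVia tooling.

```python
from dataclasses import dataclass


@dataclass
class PersonalityPlan:
    """Hypothetical sketch of a voicebot 'personality planner' record."""
    name: str      # what the voicebot is called
    species: str   # human, animal, robot, or something imagined
    gender: str    # any description, including non-binary or 'none'
    race: str      # optional cultural or ethnic identity
    age: str       # e.g. 'teenager', 'ageless'
    purpose: str   # what the voicebot is for
    audience: str  # who the voicebot is designed to help


# Example: a non-gendered study buddy designed for Year 7 students
plan = PersonalityPlan(
    name="Nova",
    species="friendly robot",
    gender="none",
    race="not specified",
    age="ageless",
    purpose="help with homework planning",
    audience="Year 7 students",
)
```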

Design Thinking is a highly collaborative and creatively driven process. It’s the ideal pedagogy to allow students to wonder ‘How might we’ in a safe, non-judgmental forum. Embedding Design Thinking in Superbots engaged girls who may self-identify as HASS-oriented (humanities, arts and social sciences) and who hold human-centred career ambitions.

The WiV Australia and New Zealand network mentored participants on their voicebot personalities, which were then programmed into conversational question-and-answer responses using the voicetech platform TalkVia One.
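TalkVia One’s own authoring tools aren’t reproduced here; as an illustrative assumption, the generic Python sketch below conveys the same underlying idea: matching a user’s question to a scripted answer written in the planned personality’s voice.

```python
# Hypothetical sketch of scripted question-and-answer matching,
# written in the voice of the 'Nova' personality planned above.
# This is not TalkVia One's API; it only illustrates the concept.

RESPONSES = {
    "who are you": "I'm Nova, a friendly robot here to help you plan your homework.",
    "what can you do": "I can help you break big assignments into small, doable steps.",
    "tell me a joke": "Why did the robot go back to school? Its skills needed a reboot!",
}


def reply(question: str) -> str:
    """Return the scripted answer whose key phrase appears in the question."""
    q = question.lower()
    for key, answer in RESPONSES.items():
        if key in q:
            return answer
    return "Hmm, I don't know that one yet. Can you ask me another way?"


print(reply("Hey Nova, who are you?"))
```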

The mentors’ feedback was kind and helpful, and enriched the Design Thinking process, in which students worked in small teams to research, communicate, negotiate and test their prototypes.

The mentors – leaders in their fields and businesses – also supported the content by highlighting their own creative pathways into voice-based AI, with bachelor’s degrees in Arts and Fine Art complementing their technical skills.

The results

Superbots demonstrated the critical overlap between Human, Technical and Innovation studies.

The program ran in November 2021 and April 2022, and involved a total of 39 students. Feedback was extremely positive, with most agreeing that the experience helped them explore broader ideas and learn new skills.

More importantly though, almost 80% of participants reported they were more likely (36%) or highly likely (already planning to, 42%) to study a VCE Science, Maths or Technology subject.

Across the two cohorts, intention to consider a STEM-related career after school was also very positive, with 76% reporting they were more likely (49%) or highly likely (already planning to, 27%).

Most students also reported increased confidence in their knowledge of IT and STEM after Superbots.

Why Superbots came to be

A familiar lament in the computing disciplines and industry is the underrepresentation of girls and women, who account for just 28% of enrolments globally in information and communications technology (ICT). Progress has been remarkably slow, and in some cases representation is falling further behind.

In advanced technology disciplines like AI, the data is even more concerning, prompting the AI Now Institute in 2019 to label the situation a ‘diversity crisis’. The Institute noted an extreme disparity in the representation of women in industry (e.g. comprising 15% of AI research staff at Facebook and 10% at Google). Black workers were even more poorly represented, and there is no data on trans or other gender minorities.

Last year, the Faculty of IT’s Equity, Diversity and Inclusion Committee researched the underrepresentation of girls and women in computer science. We found ongoing cultural associations between advanced technical disciplines and masculinity.

Geek stereotypes and computer whiz assumptions still circulate, perpetuated by pop culture and reinforced through all facets of society. While a proportion of girls and women will pursue technical disciplines and careers nonetheless, a bigger opportunity exists.

As well as programs that break down stereotypes and advocate for girls and women to pursue technical careers, we must also recognise how the creative and humanities industries (where women are represented in higher numbers) are integral to the successful development of AI.

Diversifying AI

One explanation for this underrepresentation is that we’re not paying enough attention to the industries and stereotypes that structurally exclude these groups.

What if we redefined the boundaries of technical disciplines by disrupting the categories that designate what’s ‘in’ and what’s ‘out’? What if we repositioned technical disciplines as socio-technical, with a broader and more diverse range of interests, concerns and approaches?

In other words, if AI – as a field of research and technical application – were more diverse, could that open up opportunities for a greater variety of people to contribute?

Of course, this isn’t just an inclusion issue. If AI is to realise ‘social good’ ambitions embedded in institutional and corporate visions, this discipline also needs to embark on a more extensive and radical interdisciplinary and collaborative effort with the humanities and social sciences.

The Emerging Technologies Research Lab – a joint initiative between the Faculty of IT and the Faculty of Art, Design and Architecture, with collaborators elsewhere across Monash and beyond – provides examples of what’s possible.

The Centre of Excellence for Automated Decision-Making and Society, for instance, is embedded in the social sciences and humanities, and delivers cross-disciplinary and interdisciplinary research to create knowledge and strategies for responsible, ethical and inclusive automated decision-making.

Where to next?

Superbots is an example of a larger opportunity to diversify how we think about AI and define the boundaries of this emerging discipline and industry.

While the program focused on girls and women, there’s a vision to involve more underrepresented and excluded groups – and to consider how the intersections of attributes such as race, gender and disability shape people’s interests and opportunities in AI and, more broadly, their experience of it.

Most importantly, the lessons from Superbots invite us to contend with the ways in which highly technical disciplines and industries have sought to gatekeep their knowledge and expertise.

Placing the onus on underrepresented groups to change in order to fit the discipline is one way forward, though so far it has had limited success. Another is to redefine the boundaries of AI and other technical fields themselves.

In order to develop an inclusive discipline that invites people in through multiple pathways, we must recognise that AI is a social science as well as a technical one.