AI will be used to create, curate, personalise, and translate stories, fiction, film scripts, textbooks, blogs, social posts, knowledge hubs, and adverts, using algorithms and data models rooted in gendered representations of the world.
AI increasingly powers virtual assistants that face consumers, customers, and workers. Many bots, virtual assistants, and domestic robots are given an implied feminine gender, perpetuating gender stereotypes, while most virtual assistants are developed and trained by predominantly white, male teams.
AI-powered applications will not only propose decisions but execute them. Systemic bias and stereotypes risk being inherited from datasets, embedded deep in code, and reinforced by learning loops, perpetuating or, worse, amplifying gender bias. Applications at risk: