Categories range from binary examples, like ‘safe vs. dangerous’, to more complex forms of categorical discrimination. Not only are they fundamental to the existence of all living beings; they are also necessary to maintain the illusion of stability on which society is based. However useful in everyday life, internalised categories also create inequality and injustice, often for those already marginalised and discriminated against, by pushing people into categories they do not identify with. When trying to combat strong stereotypes in design, designers are often unable to identify and challenge their own internalised categories, and so fail in their ambitions to create equal, inclusive and equitable futures.
Aiming to tackle our unconscious tendencies to categorise and discriminate, smart technologies like AI are being utilised in decision-making processes. Whilst such paternalistic approaches to technology use could be productive in an ideal world, the reality is that structures of power and prejudice are baked into the data we feed our algorithms, perpetuating our biases and increasing injustice. A more reflexive conversation with artificially generated information is needed. I thus present a collection of queer ambiguous toys, each of which highlights a new reflexive designer-AI collaboration. I define these reflexive interactions as: ‘a form of human-machine collaboration, where the AI is responsible for triggering and assisting the designer’s process of identifying and challenging bias and collective imaginings, rather than proposing the ideal solution itself.’
Check out my website through the button below, and find out how we can use AI to challenge our collective imaginings of ‘femininity’ and ‘masculinity’ in children’s toys.