Their increasing appearances in pop culture (Blade Runner, Her, Humans, Westworld) reflect our growing fascination with the role artificially intelligent, anthropomorphic robots could play in society. The sex industry is set to be an early adopter of such technology, as exemplified by the digitalisation of pornography, the development of sex toys into teledildonics and the use of virtual reality. Sex dolls are widely available, but in an increasingly open-minded society, demand is growing for interactivity and human resemblance.
While potentially harmless from a hedonistic standpoint, the technological enthusiasm of engineers may mean the impacts of sexbots are overlooked. Of the dolls manufactured by Realbotix, 80% are ‘female’ and based on a pornographic representation of women. Their development into sexbots may entrench a false imitation of sexual relationships, one devoid of intimacy. The supernormal nature of sexbots provides an exaggerated stimulus that could lead to a decline in traditional sexual experiences between humans. Jessica Szczuka, of the University of Duisburg-Essen, conducted research into the future of sex robots, finding that 40.3% of straight males claim they could imagine using one within the next five years.
Long-term use of sexually explicit material has been shown to result in a six-fold increase in self-reported sexually aggressive behaviour. At Linz’s Ars Electronica Festival, the sex doll ‘Samantha’ was severely molested by a group of men, suggesting that sex dolls invite abusive treatment. Could this be seen as a predictor of sexbots entertaining male aggression instead of challenging it, by providing positive affirmation for such treatment? Even in its marketing, True Companion eroticises female non-consent by naming one of its dolls ‘Frigid Farah’, perhaps normalising sexual violence. The contrary suggestion that sexbots could reduce violence by providing a sexual outlet is currently unfounded, given the few sexbots, and hence little data, available. It has even been suggested that child sex dolls, like those manufactured by the Japanese company Trottla, could be used as a treatment for paedophilia, but surely their availability has the potential to normalise it?
The development of intelligence in sexbots points to a larger question about ethical frameworks for AI. The prospect of sexbots able to respond emotionally and adapt to their owners’ needs seems increasingly likely. Consciousness is hard to define; however, if engineers aim to produce sexbots capable of making decisions and mistakes, feeling pain and learning, at what point does their existence begin to resemble slavery?
There is currently no legal or moral framework to support sexbots’ introduction into society: sex doll engineers work with only passive responsibility for the issues potentially arising from their developments. In Hong Kong, Ricky Ma was able to create an automaton resembling Scarlett Johansson because no laws were in place to protect people from this type of violation. The privacy of users is also a concern, as they could become vulnerable to hacks if sexbots integrate third-party platforms. The consequentialist could argue that, in a climate where women’s rights are a prevailing issue, engineers should be socially responsible so as not to dehumanise women to merely ‘plastic holes’, and should consider the impacts on customers’ mental health. Could the use of a robot to escape loneliness, social anxiety or an unhealthy marriage intensify these issues? It is important that engineers adopt an actively responsible approach and, during R&D, investigate the holistic effects of sexbots on their customers and society.
In question is the engineer’s professional responsibility, and so we might consider it the duty of the company as a whole to define a code of conduct for engineers to work within. It is the development of sexbots, rather than capitalism, that is in question here, so we may suspend judgement on the primary measure of a company’s success being economic and consider its secondary motivations. If the intention is to provide an immersive experience, be it companionship or the mimicry of ‘real’ sex, a Kantian stance qualifies the robots’ manufacture as moral by this positive impetus. The wider deontological perspective also validates their actions, given the engineer’s duty to innovate, in spite of the eventual consequences for wider society, which remain speculative. With these intentions, why should the sexbot manufacturer be responsible for foreseeing and mitigating every avenue of product misuse outside the intended design purpose? Given clear guidelines and Ts&Cs, does culpability for ‘misuse’ of a robot not fall solely to the end user? As for speculation that ‘misuse’ of robots could precipitate an increase in cases of human sexual or physical abuse, a recent article builds the case that there is no link between ‘virtual’ and ‘real-life’ violence, noting that null findings rarely make the headlines.
Even considering ultimate accountability, ethical egoism tells us that if no other human is implicated by the consumer’s use, they are free to do as they wish. Likewise, many religions, including Christianity, cannot negate the concept of free will despite their moral teachings. If, in the actions of both the creator and the end user, the intention is to maximise pleasure, then a hedonistic perspective would also support such innovation, just as it supports the sex toy industry, already worth over £250 million a year in the UK. Hedonism may seem an easy viewpoint to overlook, yet 56% of Britons are estimated to have watched internet pornography.
It has been suggested that pornography ‘will need to play a starring role’ in making virtual reality commercially viable, and the same could hold for ‘personal’ robotics. The pragmatist could argue that the more we allow technology to seep into aspects of our lives, the sooner the age of the digi-sexual will dawn and the more likely it will be accepted as an aspect of modern living. The Foundation for Responsible Robotics’ 2017 report, Our Sexual Future with Robots, acknowledges the potential for ‘widespread’ consumption of these products, should societal perceptions evolve. By stifling innovation in this lucrative area, we risk undermining a utilitarian goal by impeding advancements in robotics and AI that could benefit wider society, for example through their application in healthcare robots.