An interdisciplinary collaboration between the Faculties of Arts and Medicine & Health at the University of Leeds
On Wednesday 2nd November, some members of the Augmenting the Body team—Stuart Murray, Amelia DeFalco, Ray Holt, and Sophie Jones—took a field trip to Sheffield Robotics to get a closer look at the research going on there. We were warmly welcomed by Tony Prescott, Michael Szollosy, and an array of resting (for now) robots, including the spiky-haired Zeno, the lab’s expressive humanoid robot. Sitting quietly in a corner, Zeno’s piercing eyes seemed to follow me around the room. I eyed the robot nervously: were those bright pink lips frowning or smiling? Was Zeno happy to see me?
This is the trouble with robots: it’s hard not to interpret a smile as a sign of happiness. If Zeno finds its way into school classrooms, as Prescott suggests in the video above, then these uncanny interactions will be part of the fabric of everyday life. Is it a problem if we ascribe human motivations and emotions to them? Szollosy told us that, when the robots are out in public, he regularly has to remind onlookers that the machines can’t be sad, angry or tired. At the same time, the instinctive sense that it’s wrong to hurt a robot might not be wholly irrational. Are the boundaries of acceptable moral behaviour altered by our interactions with robots?
The work of Sheffield Robotics spans not only the design and application of robotics technologies, but also their cultural and ethical implications. At his Dreaming Robots blog, Szollosy explores public perceptions of robots in the news and on TV (Westworld is live-blogged every week). One of the aims, say Prescott and Szollosy, is to prevent robotics going the way of GM foods, which are still haunted by the ‘frankenfoods’ label. They ask: why are we afraid of robots that have the potential to enhance education, act as companions for dementia sufferers, support children with autism, and generally “make things better”? In response to the age-old fear of robots stealing jobs, Prescott and Szollosy point out that most people don’t much like their jobs anyway. Robots call for new socio-political solutions, they say: universal basic income, for example.
What’s more, such dystopian fantasies divert attention from more nuanced debates about these new technologies. Augmenting the Body’s visit brought to the fore a set of questions about design, disability and norms in robot design. Our visit ended with a demonstration of the iCub, an advanced humanoid that acts as an open-source platform for research into learning, cognition, and AI. Smooth-faced and “bald”, with deep black eyes and eyebrows illuminated in red, the iCub is less uncannily realistic than Zeno at first glance. Even so, as we watch it learn to identify, point at and pick up two small rubber toys—a chicken and a panda—it reminds me more and more of a child. “Do you know this is a chicken?”, the iCub keeps asking, like a toddler flaunting its new skills.
Sheffield’s iCub is currently legless, but its virtual avatar on screen stands on two legs. After a thought-provoking discussion of symmetry at our first Sadler Seminar on dance, disability and prosthetics, we wonder: do we want our robots to have symmetrical bodies? If we were to look at a robot with one arm, would we see a broken machine? The notion of humanoid robots raises the question: what does a human look like? If humanoid design proceeds from a set of norms about the human body, where does difference—of disability, age, race, and gender—fit in? We explored many of these questions at our ‘Redesigning the Human’ seminar on November 7th, at which we heard presentations by Prescott and Andrew Cook from Dundee’s Hands of X project.
The next seminar, on Disability and the Dishuman, takes place at 2pm on December 5th at the Leeds Humanities Research Institute. See you there?