Andy Wynne from Ignite Exponential looks at the evolution of robotic pets and asks if we will grow to adopt and love them.
But, can you actually love a robot?
There have been many studies on human interaction and robotics showing that we have a natural empathy for moving objects that appear to ‘live’. As children we knew our dolls and action figures weren’t alive, but we loved them anyway; if you are one for movie references, think of Wilson the volleyball in Cast Away. More recently, if like me you watched the film Ron’s Gone Wrong over Christmas, you will have seen a vision of what a future collaborative robot, or ‘cobot’, could look like: connected devices as companions.
Robotic companions have also been found to reduce stress, anxiety and depression in the same way real pets do. They empower those who are normally the ones being taken care of: loving something they feel in control of, where usually they are the vulnerable ones. These wellbeing benefits come without the demands and high maintenance of a real pet, including the attention, daily exercise and yearly cost. A rechargeable battery and interaction on demand, without the hours of training needed: what an amazing asset!
Many years ago, when I worked in the toy industry, our design team explored new products, technologies and play experiences, including the emergence of early robotics. But the technology and motor control of the time weren’t up to the task, leaving many children and adults dissatisfied with the experience. During those early years of robotics, though, something quite spectacular was produced by Sony: its robotic dog, Aibo. We were lucky enough to have purchased this toy (£3,300, if I remember correctly), with the objective of discovering whether a price tag of this nature could fulfil its promise as a robotic pet and capture all those emotions we as humans associate with man’s best friend.
Aibo had sensors all over its body, so it knew when it was being patted, and cameras that helped it learn and navigate the layout of its surroundings. It also had ears with microphones to hear voice commands, processed by deep-learning algorithms that interpreted the instructions. Sony had succeeded in creating a robotic toy dog with “real emotions and instinct”, albeit an expensive one.
Aibo could bark and perform a few tricks, as well as capture photos with the camera in its nose. Unlike a real dog, Aibo didn’t need to be fed or groomed, and it would signal its mood with a colour-coded lighting system on its head: green for happy, orange for angry. And like a dog, it could be ‘trained’ through actions or affectionate taps on its head.
Not long after, Panasonic started work on a robotic cat, while Tiger Electronics, the Furby-maker, introduced a discount version of Aibo called the Poo-Chi, which sold more than 10 million units in its first eight months.
The Furby was the hottest toy of 1998. While children understood that toys like the Furby weren’t ‘alive’, they still formed emotional attachments and felt empathy. But when a Furby suddenly stopped working, it could easily be swapped out for another. In one experiment, researchers asked children to hold a hamster, a Furby and a Barbie doll upside down for as long as they could. The children righted the hamster almost immediately, and the Furby after about 30 seconds, when it started to shake and say things like ‘me scared’. They held the Barbie upside down until their arms got tired. When designing a cobot, then, there is scope to incorporate features that deliberately create these emotional triggers, should they be required.
Even when we know that a “pet” like Aibo can’t think or feel, that doesn’t stop us from treating it like something that does. Think how adults and children with disabilities, or the elderly, could benefit from easy, “hassle-free” companionship, particularly through long periods without social interaction such as we experienced during the pandemic. A robotic pet could satisfy these emotional needs, responding to touch and movement, stimuli proven to increase cognitive activity. Trials are already underway to treat conditions like dementia and Alzheimer’s with assistance from robotics.
It’s now more than 20 years since Aibo made its first appearance. With all the latest advances in smartphone technologies, compact cameras, algorithms and miniature processors to crunch data, surely a successor to Aibo is on the horizon.
But what will this deliver as an experience, and will it replace our beloved best friend? You can create a social robot that combines all of the emotional triggers mentioned above and leads people to enjoy interacting with it, even though it’s a set of components, cameras and whirring noises. Interestingly, the cobot character ‘Ron’ in the film ‘Ron’s Gone Wrong’ is described as a ‘walking talking digitally connected device’. The film quite rightly highlights that many younger people already have an attachment to their mobile phones and, having accepted a ‘relationship’ with technology, would find moving to a simple-looking robot an easy transfer of emotion.
Older generations, who may need support for healthcare needs, might find building a trusting relationship with technology much harder, and may need softer design touches: for example, the appearance of an organic being.
With the array of sensing technologies we have in our hands today, combined with creativity and innovation, we can open up more opportunities commercially, domestically and within the realms of play. As a product designer, I’ve been discussing with my robotics team here how to harness these opportunities and deliver great experiences for people at different stages of their lives. If you would like to chat about this with us, please do get in touch.