The blog of Tokyo-based photographer and photojournalist Damon Coulter

Hardware Honeymoon


It appears that Honda’s Asimo robot has been having some troubles in his new job as a museum tour guide.

Despite being the most advanced robot on the planet, it is apparently not yet able to interact fully with unpredictable humans: it mistakes waves for other gestures, or gets confused by questions it was not expecting.

Indeed, some have called it an expensive toy due to its lack of real-world usefulness. A version of the robot (actually very different, and more traditionally mobile on its caterpillar tracks) did work inside the Fukushima nuclear plant to inspect the reactors, but this strikes me as more of a PR stunt to redeem the Honda Asimo brand after Asimo and other robots available at the time of the tragedy of March 11th, 2011 proved completely unsuitable for any of the jobs one might have expected them to be designed for.

I must admit though, having watched Asimo do some of its work at the Miraikan Museum of Emerging Science and Innovation last week, it is quite impressively nimble in the right environment. You have to remind yourself that this is still a lump of heavy metal and plastic when it is running, jumping and interacting mutely with people. There is an unmistakable human quality to it that makes you forget that it is, or at least should be, just a tool for doing difficult jobs for us. It is about the size of a ten-year-old and seems to have some of that juvenile, impulsive energy to its actions too.

This creates a definite feeling of empathy for the machine in its current woes. You actually feel sorry for it, as if it might be tired or sad, and you wish it luck in improving its skills and avoiding any future embarrassments, even though it cannot feel any of those emotions.

Asimo may be useless really, but I think that is not its point. It is a robot that even twenty years ago was the stuff of science fiction only. It looks and moves like a person, and in another twenty years the technology will have improved even more; we might well have truly humanoid robots that look and act so much like us that we treat them as humans, with rights and less ironic empathy.

We may even fall in love with them. Many people already think fictional robots like Wall-E, or real ones like Asimo, are cute. Could we actually get to a point where robots are programmed to react to anything our human psyche throws at them, not just the 100 questions Asimo can currently answer when questioned through a touch-pad? Might there come a time when they would be able to respond emotionally in ways convincing enough to allow us to build romantic and even sexual relationships with them?

That is the science of Lovotics, which Professor Adrian Cheok of Keio University is working on and which I have mentioned in passing before.

I interviewed Professor Cheok a few months ago and wrote up a couple of articles. One is quite a long read but very illuminating on the future of augmented-reality computing (using our five senses to communicate) and very interesting on the future of human-robot relationships, of which Asimo is the vanguard.

I will post the article I wrote about Professor Cheok below: if you want to pay me to publish it please get in contact.




More images of Asimo and other robots at my archive here:

Hardware Honeymoon

Interview with Professor Adrian Cheok of Keio University about Lovotics and Future Developments in Digital Communications.

If you have ever been lost you will know there are many different emotions that come with that momentary lack of direction. A sense of panic perhaps, if late for a meeting, or a private embarrassment because you are in a place you ordinarily know well. There could even be some fear that an unwise wander will lead you into danger. It is exactly at times like this, when we feel most alone, that we need another human to help us. In the past we might have found that support by asking for directions. These days we are more likely to seek assistance from the maps on our smart-phone; and though it can show us the way more succinctly than a stranger pointing out turns and landmarks, it cannot quite calm the other feelings getting lost creates.

Digital technology is ubiquitous; many of us now carry around powerful computers in our pockets that aid and inform our lives in ways that even our younger selves could not have imagined. The arrival of the internet heralded a change in the way humans deal with information. We have access to more data than ever before, and remember and use it differently. The evolution of the web into a mobile technology that is constantly accessible might even reshape the way we communicate in our human relationships.

Because the medium is limited to the computer screen at present we are only able to express ourselves using two of our senses: sight and sound, even though those we share with might want more from us. Lost in an unfamiliar city in the middle of the night for example, we might want a hug to cheer us up and calm our nerves. Hugs could also transmit love to family far away. In the inter-connected world of the 21st century people important to us can often be scattered across the globe and though we might take our relatives’ love as a given wouldn’t it be good if we could actually email someone a real hug every now and then; or send them the smell and flavour of their favourite food at a special time?

The ability to use all five of our senses for communication across the internet is something that Professor Adrian Cheok of Keio University Media Labs in Tokyo is passionate about. Born in Adelaide in Australia to Greek and Malaysian parents, Professor Cheok embodies the realities of the modern diaspora, which is perhaps why he is determined to find ways for us to communicate our deeper feelings more widely. At a recent lecture in Seoul’s Media City he predicted that the next stage of the internet will use all five senses to move us from what he calls “the age of information to the age of experience”: the age of mixed reality.

What will those experiences be? Basically, mixed reality is the merging of the real and virtual worlds; a kind of augmented existence that moves the virtual experience away from the screen into the three dimensions we inhabit. This could mean something as simple as finding the information we need about the environment in searchable pop-ups, viewable on a wearable computer like the new Google Glass spectacles. A walk-through Wikipedia, if you like. But it could be more immersive; blurring our reality by over-laying it with the virtual worlds of imagination; an idea Professor Cheok brought wonderfully to life in 2005 with his mixed-reality Human Pac-Man game, which allowed gamers to assume characters and run from ghosts (played by other gamers) in the streets of a real city. Unlike the world of a games console, however, the laws of physics apply in mixed reality: gamers cannot fly or jump over buildings even while they can see monsters, aliens or Pac-Men running through their local neighbourhoods.

Such playful technology has practical applications too: imagine being able to carry out such simulations as learning aids in the classroom, boardroom or army barracks. Or want to know how that new car or sofa set will feel and look before buying it, or how the food will taste before ordering it in a restaurant? How all these sensory forms can be modelled and digitised, and thus communicated remotely, is what the Mixed Reality labs headed by Professor Cheok are all about. What he is clear about is that these new forms of communication are the key to the future internet and will fundamentally adjust our reality. According to Professor Cheok, our reality might already be changing anyway.

“I think we already live in a kind of virtual reality. The first time I saw a lion, I was pretty excited, but now I don’t think my young daughter is so excited [at seeing a lion in the zoo] because she’s already seen a lion 100,000 times on the Discovery Channel, in HD, in its African environment, and so there is just not that excitement there was when we were growing up.”

Of course all our realities are filtered through our individuality. That old metaphysical conundrum of wondering if we all see the colours the same way is only half the question for Professor Cheok.

“Why do we only see colour, why don’t we hear colour? They’re all electromagnetic waves. What do we call reality? It’s not so much a blurring between the content we create digitally and the content in the analogue world. Because finally all of it is being seen through a kind of prism of our own senses. Other creatures and animals see the world differently.”

Smell and taste are what the 41-year-old professor, who spends his time commuting between positions in labs in Singapore and Tokyo, hopes will be the next improvements in digital human communications. Students in both places are working on ways to transfer them digitally and, more importantly, to develop meanings for them that can be understood emotionally in conscious and perhaps even unconscious ways.

“In my lab in Singapore, one of my PhD students just finished with an electric taste actuator. Using electric currents at different frequencies on your tongue to stimulate your taste sense receptors to have artificial taste. So we can have virtual taste. Of course we could do this for sending a virtual taste, but I think once we have this ability virtually then it doesn’t necessarily just have to be looking at sugar on a screen and getting a sweet taste but it can be another form of communication. I was thinking that you can send a message on your phone but it’s also got a taste. So, you know; sweet messages, maybe romantic, and if you’re angry maybe you send a bitter message. These tastes can also develop new kinds of communication.”

Smell is one of the most emotive of senses but there are difficulties with creating virtual smells that are much harder to overcome than those of taste. Even though the senses may be linked in our bodies, according to Professor Cheok, recreating them digitally involves a whole new method and technology.

“Taste is much easier because you have just five fundamental tastes and also the tongue is much easier to actuate because you can just put, like my idea, an electric lollipop on to your tongue and generate electric currents. With the nose I’m working on a device where you have some magnetic coils on your nose and using magnetic actuation and magnetic fields it will create a current in your olfactory bulb. Because we don’t want to directly stick anything in [the nose].

“Once we have all five senses communicating in mixed reality [it] will allow much more sensing of real presence with other people, with other surroundings. But I think also, just like with the social networks and the internet of course where we had email which was a kind of electronic letter, we’ll develop completely new forms of communication, such as Facebook, which, when you think about it, it would have been impossible before the internet to be able to have, in real time, updates from all your friends wherever they are in the world. And so I think similarly we will develop new kinds of communication once we can communicate with all of our senses through the internet.”

Some of the technology his students are working on in the Cutie Lab at Keio’s Hiyoshi campus in Tokyo takes this idea of communicating emotion remotely towards distinctly marketable products that could be future must-haves. Haptic technology, or machines that transfer a sense of touch, can turn cushions into game consoles or let us operate robot arms that relay feelings of weight, temperature and texture to remote operators. There are also devices that can provide that virtual hug, such as Huggy Pajamas, which transfer the intensity and warmth of a hug given to a pressure-sensitive doll to pyjamas, worn by a child whose parents are away for example, using air pockets and heating elements incorporated into the cloth. More practical perhaps is the RingU: a ring worn on the finger that we can squeeze to transfer a “squeeze” to a partner ring. The squeezes can be set to have different meanings: “I love you”, “I’m sorry”, “I miss you” for example, and the ring can be set with an electronically lit “stone” that shows a colour also imbued with emotional meaning.

Perhaps the most interesting of the telepresence devices developed so far at the Keio labs is the Kissinger. This small device can transfer that most personal of messages, a kiss, to someone with a partner device. By literally kissing the machine, the movements of your lips are mirrored in the other machine and your kiss is given to whoever has their mouth against it. The days of the emoticon may well be numbered. Though, as Professor Cheok says, it may take a bit of time to become accepted.

“The thing is it’s just like the mobile phone, in the early days there were these businessmen with these bricks right, and you thought that was so geeky and who’d ever want to use that? Initially some technologies are a niche market. But once enough people use it you have a kind of band-wagon effect, it’s a network effect. Now, sure you can choose not to have a mobile phone, but because everyone else has got it, it’s become the new social norm. So I think a lot of these technologies maybe will become like that, including robotics and mixed reality and all these things that people initially might find a little bit worrying or scary.

“I think Google Glass is showing us that, finally after all these years of research, it’s really going to become a real product. With a lot of technology you see, there’s a lot of early players and companies but what really will bring it to market is something they can scale. A company like Google can really scale this: they can make millions of the glasses. Also the other thing is, they’ve got all the content. Already, they can visually recognise a face; and from that they do a visual search of who that person is. Almost everyone now has a lot of data on Google or Facebook or whatever, and you can basically just meet someone in real time and know everything about them.”

For many people the pervasiveness of the likes of Facebook and Google is a worry. Privacy fears are routinely analysed in the media and campaigners keep tabs on what such companies, and governments, do with the data they collect on us. But for Professor Cheok the need to socialise is innate in humans; nobody, he points out, forces people to share on Facebook. Indeed he wonders if our need to share information about ourselves with others is a fundamental desire instilled by some evolutionary imperative. For some, the ‘getting to know you’ part of a new relationship is a stress that could be alleviated by pulling up, in real time, information already out there somewhere anyway.

“I think that historically, maybe western culture has been more private: they’re more separate with their private life and their, let’s say public life, work life. And this maybe because physically there is more space. Like in America people have a lot of space and privacy became more important. Whereas I think in Asia privacy was not as important. So I think what’s happening is that actually the whole world is becoming more Asian in some respects. We’ve always had examples of cultures where there hasn’t been so much concept of privacy and it’s not just been Asian. In ancient Greece they didn’t really have any concept of privacy. Basically there was no real separation between public and private life. So I think actually what the internet is doing is definitely changing the new norms but I’m not sure it is something that is totally new because we’ve already had cultures which have relatively little privacy. And if you really go back, real, real far back, like when we’re in the savannah, we’re all walking around naked and we didn’t have any privacy right? So it may be going back to the more natural state of humanity.”

The big question of course is: if our personalities can become a collection of data that is secret to no-one, and our senses and feelings can be digitised and transferred, can they be put into machines that are able to act human by copying and reacting to us? And if so, should we treat these machines as alive? Professor Cheok is not so sure.

“It is a program fundamentally, [] if the robot looks human enough; acts human enough then for almost all intents and purposes we will interact with that as if it’s a human. Because that’s just how we get on with the world. We don’t always logically analyse the situation. We always make rough approximations in life otherwise we just couldn’t live in the real world. It goes back to when you’re in the jungle; if it sounds like a tiger coming up behind, you just run, you don’t logically think it through until it’s clear. We’re programmed to just take rough approximations and that is why we don’t need a really intelligent robot. We can ignore the whole argument about is it really intelligent.

“I think what’s happening in these different fields of augmented reality and robotics is that the technology is becoming advanced enough so we’re seeing a convergence. [] We have always thought about robots as machines but they are also a form of communication. They don’t necessarily have to be machines that do something but they could be like a telephone which is just a communication system. So the robot allows you have a physical tele-presence or to feel the presence of someone else remotely.”

Though Professor Cheok thinks that the United States is now the world leader in robotics, primarily because it can centralise research and pump money into it (particularly from the military, who are keen to develop battlefield robotics), Japan is still uniquely accepting of living with robots. Perhaps this is due to religious ideas in Shinto that imbue non-living things with a soul. Robots are being developed in Japan that not only work for us, perhaps as carers in a rapidly ageing society for example, but also as companions. Perhaps even lovers.

“She’s quite nice for a 3D girl” is a saying among the otaku (geeks) of Akihabara in Tokyo that hints at the direction the future of human-robot relations (Professor Cheok calls this Lovotics) may take. The otaku are famously socially inept and prefer spending their time obsessing over technology and cute manga characters. For many, the stresses and unpredictability of a real relationship with a real “3D” partner might all be a bit much, and many prefer to live in fantasy worlds with love interests that are necessarily passive due to the fact they do not actually exist. Already game companies like Nintendo have games that cater to this need with virtual romances, a kind of uber-Tamagotchi (though she looks like a high-school girl, as otaku are usually men), which returns the care and compliments the player gives her with ego-massaging messages of love. To many otaku the fact that it isn’t real love doesn’t seem to matter; in fact, for some, that is the point. Just as our attitudes changed to adapt to the realities of the internet, Professor Cheok sees a time when human-robot relationships will be much closer and more intimate than we imagine possible now.

“My personal prediction is that initially it will be a small group of people. Like even right now, people fall in love with their Nintendo [girlfriends] on that dating game. And we think that’s weird. Of course then there will be that with robots. Now how mainstream this will go I’m really not sure. It’s going to be hard to predict. I do know that at least some of the population will really feel love for a robot. But I think we always have to remember that the robot itself is not truly loving us as in the way we feel love. We are still creating a virtual reality. Maybe very very realistic but finally I think there will always be a difference between human to human love and human to robot love.”

Of course what that robot will do, and what it will look like, will depend on the kind of love we expect from it. There are many kinds of love: from the love we have for our parents and children to romantic or sexual love. Professor Cheok also points out that we don’t necessarily need something to look human to love it. Our love for our pets is another kind of love and easily understandable to us. Initially robots may be created, indeed are already being created, that will be able to answer this need in us.

“For a friendship robot, maybe looking like Wall-E would be fine because physical attraction is not part of friendship love really. So you can imagine friend robots which look like R2-D2, this kind of thing; and maybe we’ll find that comforting. Of course then if you want the romantic love then I can imagine robots will be created to look very human-like.”

A lot of the research into people’s reactions to such robot relationships has been negative. As this is a new technology there is, unsurprisingly, little to compare it with aside from science fiction. The problem is that popular culture usually sees robots, especially humanoid robots, as a potential danger. From movies like Terminator or A.I., to Blade Runner, there is usually a feeling of disgust or distrust of such robots. If they do not resemble us enough there is the Uncanny Valley to cross. This phrase was coined by Professor Masahiro Mori to describe the revulsion we feel when something is not quite human enough to be believable. If they resemble us perfectly we may not trust their motives or logic, as it is different from ours. Jealousy, for example, is an essential component of the emotion of love. If a robot can love us, people worry that it can also feel jealous if we then fall out of love with it, and could end up hating us. Professor Cheok doesn’t think so.

“Other researchers have different views but my fundamental belief is that we’re modelling intelligence. I don’t think it really is an intelligence. [] I don’t think we could ever blame the robot for murdering [someone]; we would ultimately blame the manufacturer or whoever programmed that robot. So I think similarly we wouldn’t give the ultimate, let’s say responsibility, to a robot: if the robot got angry it would be part of its program. So sure, we can program the robot to be like a human, so that if you’re with another man or woman (or another robot) it would act angry. But it’s been programmed like that and you could easily flip the program and switch that off.”

The key to the differences in human relationships and robot relationships, he thinks, is empathy. While a robot could provide the support, care and even an appearance of understanding we need from it, and as argued before for some people that approximation of a real relationship may be satisfying enough, there is little chance of it developing empathy for us. More importantly for the future of robots in society, he argues, is our ability to develop empathy for them.

“That is something that we still have to overcome, [the fact] all these robots and devices we make have an off switch. Whereas you can’t turn off a dog for example. Can we feel love or true empathy for things which cannot die? That probably needs much more research. But that maybe something that we have to look at carefully. Is empathy related to the fact that it’s living, which means that it can die? If you can just turn off your thing and store it for five years and nothing is going to change, maybe that decreases our empathy.

“And the other thing is reproduction. We know that even with the very simplest creature it is very difficult to perfectly clone something. You might have a pet beagle and sure you can go to the shop and get another beagle, but we all feel sad when our pet beagle dies. Because we know when we go to the pet shop, although [all those pet beagles] may look exactly the same, we know that it’s not a perfect reproduction. Whereas if [my] CD is cracked I don’t care because I know I can just buy another one from Amazon. And it’s a perfect reproduction. I think there is a fundamental thing about death which is related to empathy. But as an engineer, I don’t know what that relationship is.

“But I think that is a very important point: things that have the ability to be perfectly reproduced and cannot die; maybe that’s a fundamental barrier for us to feel the true empathy. But I could be proven wrong because I think people adapt over time to technology and maybe our concept of empathy will change.”

Perhaps this change in attitude is part of a process that has been going on for centuries anyway; as Professor Cheok points out, there may come a time when we think differently about machines that we take for granted at best these days, and even fear.

“I think that, if you look at the progression of society, though we have lots of ups and downs, in general I think we’re evolving to a much higher level of empathy for each other. All this thinking before about how black people were almost sub-human. Now we find that terrible. How did we treat each other like that? It’s inconceivable now. And that’s not really so long ago, that’s only in the mid-twentieth century. Things really change fast but in general society is evolving, hopefully, up and up. We’ll have variations but [I think] robots will have rights, legal rights like humans have rights, and we will feel empathy for robots. And we may get to the stage when we think, yeah sure the robot can’t die but so what? That’s a kind of discrimination, I mean why do we discriminate against digital beings? So the way we don’t discriminate against black people now; we may evolve to that point where we feel; why should we discriminate because that robot was not biologically born?”

