The article’s a year old, and they’ve apparently got the thing up and running, since I heard a radio interview with it the other day.
I was at the table next to it at last year’s AAAI robot competition. Neat gimmick, using PKD. So, here’s the thing: it’s pretty impressive as an animatronic entity, but there’s very little A.I. involved. I’m pretty sure it was using the Ainebot chatbot software for conversation, since it gave some of the exact same stock responses our robot did; other responses were hard-coded scripts. They had their patter nailed, too: it seems as though no one even noticed that they had to reboot the computer (or perhaps just the face-tracking and chat software) every 5-10 minutes. And if you watched it long enough, you could see that there was very little coordination among the motors controlling the expression.
However, since the skin material is Hanson’s thing (he was getting his PhD from UT Austin for that work), the above is sorta nit-picking. The skin itself was pretty amazing, and the number of motors controlling it was enough to give it some real expressive looks. I’m curious how much progress the Hanson team has made in the past year…
I dunno. I wish I could find it on the web, but I’ve searched and searched and haven’t been able to turn anything up. It would have been either a BBC or NPR/PRI program, but nothing shows up on their sites. :mad:
I assume you’ve visited Hanson Robotics?
Yup. That’s where the link in the OP comes from.