You can tell your future domestic assistive robot to “go to the kitchen and get a bottle of water from the table,” but how will it decide what that means and how best to execute this request?
The robot needs to understand the grammar of such a sentence — that “go” means robot movement, that “to” is a preposition indicating a path somewhere, and that “the kitchen” is a noun phrase referring to a specific space.
How does the robot figure all that out and correctly execute the requested action? Because people like Juan Fasola are programming it to.
Fasola is a graduate student in the lab of USC Viterbi School of Engineering computer science Professor Maja Matarić, an internationally recognized socially assistive robotics expert.
He recently won the Cognitive Robotics Best Paper Award at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). The paper outlines Fasola’s biology-inspired framework for interpreting commands that a robot can then execute.
Among the most notable aspects of the paper is the way his framework allows for constraints, such as “go to the kitchen but walk along the wall.”
To successfully fulfill such a constraint, the robot needs to know precisely what “along” means.
“I had to look at different literature in linguistics and cognitive psychology to get at the root of what ‘along’ refers to,” Fasola said.
For his program, he specified that “along” means proximity to the reference object and a path that travels parallel to it. These definitions translate into numerical calculations the robot performs to reach the kitchen while satisfying the user’s along-the-wall constraint.
“The key to the framework,” Fasola said, “is that the meaning of spatial prepositions can be represented within the robot by computational models called spatial semantic fields, which can then be used by the robot to follow user commands appropriately.
“The robot can combine the execution of the path of getting to the goal, but also with the constraint, which is how you execute the task,” he explained. “The robot can combine them very easily using my framework. That was one of the really cool things about the paper and probably the most novel.”
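To give a flavor of the idea, here is a minimal, hypothetical sketch (not Fasola’s actual code) of how a spatial semantic field for “along” could be expressed: a function that scores a candidate waypoint by its proximity to a wall and by how parallel the robot’s heading is to that wall. All names and parameters here are illustrative assumptions.

```python
import math

def along_score(point, heading, wall_start, wall_end, prox_scale=1.0):
    """Score in [0, 1] for how well a waypoint satisfies 'along the wall'.

    Hypothetical sketch of a spatial semantic field: combines a proximity
    term (near the reference object) with a parallelism term (moving
    parallel to it), the two properties Fasola identifies for 'along'.

    point:      (x, y) candidate robot position
    heading:    (dx, dy) robot's direction of travel
    wall_start, wall_end: endpoints of the wall segment (reference object)
    prox_scale: distance (map units) at which proximity decays
    """
    wx, wy = wall_end[0] - wall_start[0], wall_end[1] - wall_start[1]
    wall_len = math.hypot(wx, wy)

    # Perpendicular distance from the point to the line through the wall.
    px, py = point[0] - wall_start[0], point[1] - wall_start[1]
    dist = abs(wx * py - wy * px) / wall_len

    # Proximity term: 1 at the wall, decaying exponentially with distance.
    proximity = math.exp(-dist / prox_scale)

    # Parallelism term: |cos angle| between heading and wall direction,
    # so traveling in either direction along the wall counts equally.
    h_len = math.hypot(*heading)
    parallel = abs(heading[0] * wx + heading[1] * wy) / (h_len * wall_len)

    return proximity * parallel
```

A planner could then prefer paths whose waypoints score highly under this field: a point hugging the wall while moving parallel to it scores near 1, while a distant or perpendicular-moving point scores near 0.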
The paper contains results from simulation tests, but Fasola has since started running experiments with robots such as the PR2 and Bandit. In the 3-D simulation, the program would plan and execute paths, but with full knowledge of the placement of obstacles in the environment.
Fasola used Bandit in the past for his work in socially assistive robots for the elderly, and he plans to use Bandit again to test this framework with the same target users. He plans to run a pilot study this spring at a senior living facility.
“Hopefully they can give it natural language speech commands,” he said, “and the robot will understand it.”