In the basement of Powell Hall, a robot arm works busily 24 hours
a day, moving from side to side over a cordoned-off sandbox,
sometimes dipping down and blowing away debris with a built-in
compressed-air jet to reveal buried artifacts.
The signals to control the arm come from operators all over the
world via the Internet. Conceived and programmed by an
interdisciplinary team from the departments of Computer Science
and Anthropology, the Mercury Project is the Internet’s first
public access to a tele-operated robot.
The project officially went on-line Sept. 15, though it had been
in use and undergoing testing since Aug. 1. According to Kenneth
Y. Goldberg, assistant professor of computer science, the arm has
already been run by net subscribers working from computers in
Australia, Europe, South America and Asia, as well as all over
the United States and Canada.
As of Sept. 8, more than 100 users a day were signing on to
explore or use the system from locations in virtually all
countries in which Internet linkages exist.
“This system combines robotics, archaeology and interactive art,”
said Michael Mascha, adjunct professor of anthropology, who
collaborated with Goldberg and a team of graduate students to
bring the project into being.
Mascha and graduate student Nick Rothenberg described the system
in relation to the human body: Most Internet sites provide
access only to digital information stored on the hard drives of
connected computers. In the last six months, several sites have
connected a camera to a computer, thus adding an eye whereby
users can observe a remote environment. With the Mercury Project,
a third level has been introduced. An arm now allows users to
“reach beyond the digital domain, and physically alter a remote,
real environment,” Rothenberg said.
The system uses multimedia Mosaic software to provide access to
the World Wide Web (WWW), a part of the Internet that allows for
the transmission of visual, sound and other data in addition to
text. Users who navigate to the system’s electronic address –
https://www.usc.edu/dept/raiders/story/index.html – arrive at an
orientation window that explains the project and gives basic
operating information about the system.
Would-be operators must first take a test proving they have
learned enough about the system to operate it safely. Once
certified and given computerized “operator’s permits,” users can
proceed to run the robot.
Operators see a display consisting of a still video image of the
area directly under the robot’s hand plus a schematic map of the
area the robot arm traverses. To move the arm, operators use a
computer mouse to select a desired location on the schematic map.
Clicking on the spot signals the arm to move to that position
over the sandbox.
Buttons located in the display allow the user to move the arm up
or down and, when it is down, to send a blast of compressed air
into the sand, uncovering objects buried there.
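The click-to-command flow described above can be sketched in code. The following is a hypothetical illustration, not the Mercury Project's actual software; the function names, coordinate ranges, and command vocabulary are all assumptions made for the example.

```python
# Hypothetical sketch of the operator-interface logic: a mouse click on
# the schematic map is translated into an arm position over the sandbox,
# then wrapped in a move / down / blow / up command sequence.
# All names and dimensions here are illustrative, not the real system's.

def click_to_arm_target(px, py, map_w=320, map_h=240,
                        sandbox_w=600.0, sandbox_h=450.0):
    """Map a click on the schematic map (pixels) to an (x, y) position,
    in assumed millimeters, over the sandbox workspace."""
    if not (0 <= px < map_w and 0 <= py < map_h):
        raise ValueError("click outside the schematic map")
    x = px / map_w * sandbox_w
    y = py / map_h * sandbox_h
    return x, y

def command_sequence(px, py, blow=True):
    """Build one operator action: move over the clicked spot, lower the
    arm, optionally fire the compressed-air jet, then raise the arm."""
    x, y = click_to_arm_target(px, py)
    cmds = [("move", x, y), ("down",)]
    if blow:
        cmds.append(("blow",))
    cmds.append(("up",))
    return cmds
```

For example, a click at the center of a 320-by-240 map would target the center of the sandbox, and `command_sequence(160, 120)` would yield a move there followed by down, blow, and up steps.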
The objects, including matchbooks, bits of paper, dollhouse
miniatures and other items, were carefully chosen to create a
consistent milieu – in this case, one suggesting the 19th century
– like that found in an archaeological dig.
“The installation encourages a collaborative exploration, with
each user posting their discoveries in the log, so that the
common threads emerge gradually,” said Goldberg. “The artifacts
tell a story as users uncover them.”
This whimsical side of the project reflects Goldberg’s interest
in mixing robotics with other disciplines. In 1992, he built a
robot system to create images as part of an art exhibition
mounted at the Fisher Gallery.
The two inventors said their installation is a prototype for
systems that could be useful in many existing applications.
“A version of this system could be installed in a museum,” said
Goldberg, “allowing scholars to view historical artifacts over
the network while the artifacts remain secure in the museum’s
archives. It would be better than a simple image in a catalog,
because a viewer would be able to explore the object in three
dimensions, selecting from a myriad of viewpoints.”
The system also has great educational potential, according to
Mascha. “Students could use it to remotely explore a medical
dissection or an architectural model that is difficult to
transport,” he said. “The system opens the door for students from
around the world to enter a ‘virtual classroom.’”
Creation of the system presented significant technical problems.
The team used off-the-shelf technology for both the robot arm (an
old commercial unit built by IBM around 1980) and the video
camera (an EDC 1000 unit, made by Electrim Inc.).
The WWW computer linkage limited the flow of both command
information from operators to the robot and sensory information
from the robot to the operator. By making adjustments, however,
the team tried to “balance functionality with user interests.”
Goldberg and Mascha credited graduate student team members –
including Rothenberg, Steven Gentner, Jeff Wiegley and Juergen
Rossmann, along with Carl Sutter and Rick Lacy from the Center
for Scholarly Technology – with working out effective
compromises, “and generally making this something robust enough
to function in the full light of the Internet community.”
Users from all over the world have recorded reactions to their
experience in the logbook to which they are invited to contribute
after finishing their turn at the controls. A French operator
called it a “truly remarkable experiment; easy to handle, even
from overseas.” A user from Australia wrote, “This is without a
doubt the most amazing thing I’ve ever encountered on the net,
bar none. After 18 years out here, I didn’t expect anything to be
this surprising. It will be interesting to see where things go.”
Goldberg and Mascha plan to give an extensive presentation on the
system at the Oct. 17-20 WWW conference in Chicago.
[Photo:] Guided by an Internet user, the robot arm unearths a
miniature globe as Michael Mascha, a member of the team that
designed the system, looks on.