The debut of a presidential bust might not seem like big news.
But the latest depictions of President Barack Obama are more than they seem. Not sculptures or plaster casts, they’re the first-ever 3-D portraits of a sitting president — exact duplicates created with digital technology from the USC Institute for Creative Technologies.
The USC institute is part of a Smithsonian Institution-led team that created the digital and 3-D printed bust and life mask. Both will be on view in the Commons gallery of the Smithsonian Castle through Dec. 31.
The team scanned Obama earlier this year, using two distinct 3-D documentation processes. Experts from ICT used their Light Stages face scanner to capture high-resolution shape and reflectance properties of the president’s face in seconds. Next, a Smithsonian team used handheld 3-D scanners and traditional SLR cameras to record peripheral 3-D data to create an accurate bust.
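Capturing shape and reflectance under controlled illumination, as the Light Stage does, builds on ideas like classic photometric stereo: photograph a surface under several known lighting directions and solve for the surface orientation at each pixel. The sketch below is only an illustration of that underlying principle, not the actual ICT pipeline, which is far more sophisticated; the function name and synthetic data are for demonstration.

```python
import numpy as np

def recover_normals(intensities, light_dirs):
    """Recover per-pixel surface normals via least squares.

    intensities: (num_lights, num_pixels) observed brightness values
    light_dirs:  (num_lights, 3) unit lighting directions
    Returns (num_pixels, 3) unit surface normals.
    """
    # Lambertian model: I = L @ (albedo * n); solve for g = albedo * n
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)
    g = g.T                                    # (num_pixels, 3)
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    return g / np.clip(norms, 1e-8, None)      # normalize to unit length

# Synthetic check: one pixel with a known normal and albedo 0.9
true_n = np.array([0.0, 0.6, 0.8])
lights = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [0.577, 0.577, 0.577]])
observed = (lights @ true_n).reshape(-1, 1) * 0.9
estimated = recover_normals(observed, lights)
print(np.allclose(estimated[0], true_n))  # True
```

With a dense array of calibrated lights, the same least-squares idea scales to millions of pixels at once, which is why controlled-illumination rigs can recover skin detail down to the pore level in seconds.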
The data and the printed models are part of the collection of the Smithsonian’s National Portrait Gallery.
“The Smithsonian’s 3-D presidential portrait project represents the first deployment of a Light Stage system designed for mobile use and the fastest scanning session ever conducted by ICT’s Graphics Laboratory,” said Light Stages inventor Paul Debevec, ICT’s chief visual officer and a professor of computer science at the USC Viterbi School of Engineering. “The Smithsonian Institution had an ambitious vision to create the first-ever 3-D printed model of a president, and it was an honor to contribute our technology to the process.”
The Graphics Lab’s involvement in the Smithsonian’s presidential scanning project represents several technical breakthroughs. It provides a way to obtain data for a new form of presidential portrait, one that digitally recreates every skin pore and fine line and presents them in 3-D. It also demonstrates that the Light Stage system is portable, which opens up opportunities to scan people and objects all over the world.
“This collaboration is a great symbol of the imagination and innovation that the government, academia and industry can accomplish by working together,” said Randall W. Hill Jr., executive director of ICT. “The final result is amazing and shows the power that Army-sponsored university research can have in developing technologies to preserve our present and past.”
The ICT Graphics Lab has refined its facial-rendering techniques in collaboration with Hollywood’s visual effects industry, helping digitize the stars of such movies as Avatar, Gravity and Maleficent to create computer-generated characters with the appearance of real people.
The Light Stages process has been used to help create believable digital characters for the Emergent Leaders Immersive Training Environment, developed with the U.S. Army Research Lab as a state-of-the-art training platform for interpersonal communication and leadership skills. The U.S. Army funds much of the basic research that goes into the development of the Light Stage systems.
Current projects include recording and projecting life-size 3-D depictions of heroes and historical figures, including recent Medal of Honor recipient Ty Carter and Holocaust survivor Pinchas Gutter (a collaboration with the USC Shoah Foundation, in partnership with Conscience Display).
When combined with artificial intelligence algorithms from ICT’s natural language group, these interactive projections can answer people’s questions about their lives and experiences.
In 2010, Debevec and his collaborators received a Scientific and Engineering Academy Award for the development of the Light Stage devices.
In addition to Debevec, key ICT contributors to the scanning project were Graham Fyffe, Xueming Yu, Paul Graham and Jay Busch. The data captured was post-processed by 3-D graphics experts at the software company Autodesk to create final high-resolution models. The life mask and bust were then printed using 3-D Systems’ Selective Laser Sintering printers.