HISTORY
1980-1983: Simon Fraser University — Goniometers
1982-1983: MIT — Graphical Marionette
1988: deGraf/Wahrman — Mike the Talking Head
1988: Pacific Data Images — Waldo C. Graphic
1989: Kleiser-Walczak — Dozo
1991: Videosystem — Mat the Ghost
1992: SimGraphics — Mario
1992: Brad deGraf — Alive!
1993: Acclaim
Today: Many players using commercial systems
Biomechanics labs began to use computers to analyze human motion, and the techniques and devices used in these studies began to make their way into the computer graphics community. In the early 1980s, Tom Calvert, a professor of kinesiology and computer science at Simon Fraser University, attached potentiometers to a body and used the output to drive computer animated figures for choreographic studies and clinical assessment of movement abnormalities. The analog output of the potentiometers was converted to digital form and fed to the computer animation system, which used the motion capture apparatus together with Labanotation and kinematic specifications to fully specify character motion.[1]
Commercial optical tracking systems such as the Op-Eye and SelSpot systems began to be used by the computer graphics community. In the early 1980s, both the MIT Architecture Machine Group and the New York Institute of Technology Computer Graphics Lab experimented with optical tracking of the human body.
An optical tracking system operates by equipping the tracked object with markers that are either covered with passive retroreflective surfaces or are powered, active, infrared LEDs. Tracking cameras are positioned so that their fields of view intersect over the capture volume; at least two cameras must see a marker to triangulate its position. Data from each tracking camera is sent to a central PC for final processing.
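The triangulation step can be illustrated with a minimal sketch. This is not the code used by any of the systems named here, just a common modern formulation: each camera observation defines a ray from the camera center toward the marker, and because noisy rays rarely intersect exactly, the marker position is estimated as the midpoint of the shortest segment between the two rays.

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Estimate a marker position from two camera rays.

    Each ray is given by a camera origin and a direction toward the
    marker. With measurement noise the rays rarely intersect, so the
    midpoint of their closest approach is a standard estimate.
    """
    da = dir_a / np.linalg.norm(dir_a)
    db = dir_b / np.linalg.norm(dir_b)
    w0 = origin_a - origin_b
    a, b, c = da @ da, da @ db, db @ db
    d, e = da @ w0, db @ w0
    denom = a * c - b * b            # zero only if the rays are parallel
    s = (b * e - c * d) / denom      # closest-point parameter on ray A
    t = (a * e - b * d) / denom      # closest-point parameter on ray B
    p_a = origin_a + s * da
    p_b = origin_b + t * db
    return (p_a + p_b) / 2.0
```

With more than two cameras, a least-squares version of the same idea combines all the rays that see a given marker, which is how occlusion of a marker from one camera can be tolerated.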
The technology is limited by the speed at which the markers can be examined (thus affecting the number of positions per second that can be captured), by occlusion of the markers by the body, and by the resolution of the cameras—specifically, their ability to differentiate markers.
In 1988, deGraf/Wahrman developed "Mike the Talking Head" for Silicon Graphics to show off the real-time capabilities of their new 4D machines. To create the original face to work with, a real person, Mike Gribble, was used as a model. His face was scanned in using a 3D digitizer to get about 256,000 points of digital data. These points were converted to polygon data, which made shading of the image possible. The talking component of Mike was achieved by scanning in the real Mike as he mouthed each phoneme—the elemental sound units from which spoken words are built. To simulate speech, the implementors developed code to interpolate between phoneme positions.
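The interpolation idea can be sketched in a few lines. The phoneme names and vertex data below are purely hypothetical (the original system's data and code are not available); the sketch only shows the core technique of blending every vertex of the face between two scanned phoneme poses.

```python
import numpy as np

# Hypothetical phoneme library: each phoneme maps to a full set of facial
# mesh vertex positions (only 3 two-dimensional vertices here, for brevity).
phoneme_shapes = {
    "AA": np.array([[0.0, 0.0], [1.0, -0.6], [2.0, 0.0]]),  # open mouth
    "MM": np.array([[0.0, 0.0], [1.0, -0.1], [2.0, 0.0]]),  # closed lips
}

def blend(shape_a, shape_b, t):
    """Linearly interpolate every vertex between two phoneme poses.

    t = 0 gives shape_a, t = 1 gives shape_b; sweeping t over time
    produces the in-between mouth shapes that simulate speech.
    """
    return (1.0 - t) * shape_a + t * shape_b

# Halfway between the open "AA" pose and the closed "MM" pose:
mid = blend(phoneme_shapes["AA"], phoneme_shapes["MM"], 0.5)
```

Playing back a sequence of such blends, one pair of phonemes at a time, is enough to make a digitized face appear to talk in real time.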
Mike was performed live in that year's SIGGRAPH film and video show. The live performance clearly demonstrated that the technology was ripe for exploitation in production environments.[3]
Since 1985, Jim Henson Productions had been trying to create computer graphics versions of their characters. They met with limited success, mainly due to the limited capabilities of the technology at that time. In 1988, with the availability of the Silicon Graphics 4D series workstation, and with the expertise of Pacific Data Images, they found a viable solution: Waldo C. Graphic was born. Waldo is the world's first computer-generated Muppet. Waldo's strength as a computer-generated puppet was that he could be controlled in real time in concert with real puppets.
In 1989, Kleiser-Walczak produced Dozo, a (non-real-time) computer animation of a woman dancing in front of a microphone while singing a song for a music video. The company used motion capture techniques to give the dancing a realistic effect. Kleiser chose an optically based solution from Motion Analysis that used multiple cameras to triangulate the images of small pieces of reflective tape placed on the body.[6]
In 1991, Videosystem developed a real-time character animation system whose first success was Mat the Ghost. Mat was a friendly green ghost who interacted with live actors and puppets on a daily children's show called Canaille Peluche.
In 1992, SimGraphics developed a facial tracking system called a “face waldo.” The importance of this system was that an actor could manipulate all the facial expressions of a character by just miming the facial expression himself—a perfectly natural interface.
One of the first big successes with the face waldo, and its affiliated VActor animation system, was the real-time performance of Mario from Nintendo's popular videogame for Nintendo product announcements and trade shows. Driven by an actor behind the scenes wearing the face waldo, Mario conversed and joked with audience members, responding to their questions and comments. Because of this success, SimGraphics has devoted its efforts to live performance animation, developing characters for trade shows, television, and other live entertainment, and improving the reliability and comfort of the face waldo technology.
In 1992, Brad deGraf developed a real-time animation system now called Alive!. Soon after deGraf joined Colossal Pictures, he used Alive! to animate Moxy, a computer-generated dog who hosts a show for the Cartoon Network. Moxy is performed in real time for publicity, but post-rendered for the actual show. The actor's motions are captured by an electromagnetic tracking system with sensors on the hands, feet, torso, and head of the actor.
At SIGGRAPH '93, Acclaim astonished audiences with a realistic and complex two-character animation done entirely with motion capture. Acclaim primarily uses the system to produce character motion sequences for video games.
Animation software vendors, such as SoftImage, have integrated these systems into their products, creating "off-the-shelf" performance animation systems. While human motion capture, like any technology, is not perfect, the practice is now well established as a viable option for computer animation production. As the technology develops, there is no doubt that motion capture will become one of the basic tools of the animator's craft.