Other virtual humans / humanoid agents (seen from KIP)

This is an open list which will probably be expanded in the future.





AUTHOR: Gerd Döben-Henisch
FIRST DATE: May 15, 1995
DATE OF LAST CHANGE: May 22, 1996




  • Prof. Dr. Burghard B. Rieger, Lehrstuhl für Computerlinguistik, FB II: Linguistische Datenverarbeitung, Universität Trier
    --> Development of a 'Semiotic Cognitive Information Processing System' (SCIP) within the project 'Language Learning and Meaning Acquisition' (LLAMA). Reconstruction of those processes which constitute meaning.

  • Prof. Paul Rosenbloom, University of Southern California
    --> Prof. Rosenbloom's AI research activities include responsibility for the Soar Project at USC. Soar has been under development since 1983, and is a multi-disciplinary, multi-site attempt to build a general cognitive architecture. A current application is Soar IFOR (Intelligent Forces), the ultimate intent of which is to develop automated pilots whose behavior in simulated battlefields is nearly indistinguishable from that of human pilots. A prototype was deployed with some success in the STOW-E exercise in 1994, probably the first occasion on which an AI system was a direct participant in an operational military exercise.

  • Prof. Dr.phil. Roland Posner, FB Kommunikations- u. Geschichtswissenschaften, Institut f. Linguistik, Arbeitsstelle f. Semiotik, Berlin
    --> Gesture recognition within the project 'Gesture Recognition by a data-glove'. Exploration of gesture codes and development of symbolic systems to represent them.

  • Michael Mauldin, Carnegie Mellon University
    --> The software Julia mimics a user in a text-based virtual environment (MUD). She is a robot user with the ability to conduct apparently intelligent conversations with human users, many of whom are unaware that she is not human. Developed over a period of five years by Michael Mauldin, she is currently the most advanced example of what were originally called Maas-Neotek robots, after the corporation in William Gibson's novels. Julia analyses the structure, meaning, and context of what is said to her, distinguishes between comments, questions, etc., accesses an encyclopedic database of response components, and assembles plausible conversational English responses, employing humor, sarcasm, politeness, impatience, and diplomacy as appropriate. A rough, hypothetical sketch of this response-assembly idea is given after this list.

  • Kristinn Thorisson, MIT Media Lab
    --> When people talk to each other they generally use a wealth of gesture, speech, gaze and facial expressions to communicate the intended content. Complex information is combined in a concise manner and representational styles are chosen in real-time as the conversation unfolds. Kris Thorisson has been a researcher at the MIT Media Lab since 1990. His recent work centers on humanoid interface agents, and in particular on capturing elements that are critical to multimodal dialogue between a real and a virtual human. Techniques such as eye tracking, speech recognition, etc. are used to generate responses, including speech and gesture, from the virtual human in real-time.

  • Marc Raibert, Boston Dynamics Inc.
    --> Marc Raibert, founder of Boston Dynamics, was formerly Professor of Electrical Engineering and Computer Science at MIT. In previous work, he developed laboratory robots that used control systems for balance and for coordinating their motions. These legged robots ran, jumped, traveled on simple paths, ran fast (13 mph), climbed a simple stairway, and performed simple gymnastic maneuvers. Raibert's approach to automated computer characters is to adapt control systems from robotics and to combine them with physics-based simulation, allowing the creatures to move with physical realism without an animator specifying all the details. Boston Dynamics creates automated computer characters and engineering simulations for things that move.

  • Prof. Jessica Hodgins, Georgia Institute of Technology
    --> Computer animations and virtual environments both require a source of motion for their characters. Prof. Hodgins's group is exploring one possible solution to this problem: applying high-level control algorithms to physically realistic models of the systems to be animated. The goal is to allow the animator to control the system at a high level, without an understanding of the underlying forces and torques or of the motion of the individual joints. Her current research focuses on the control of dynamic physical systems, both natural and human-made, and explores techniques that may someday allow robots and animated creatures to plan and control their actions in complex and unpredictable environments.

  • Prof. Norman Badler, University of Pennsylvania, Center for Human Modeling and Simulation
    --> Prof. Badler has been engaged for more than 20 years in human body modeling and simulation. Much of his work at the University of Pennsylvania has centered on the Jack software, widely regarded as the world's most advanced and versatile commercially available human modeling system. Jack's capabilities include complex articulated motion with balance-aware motion modification; collision avoidance; gesture and facial expressions; goal-based tasking; natural language processing; and many other features. It is used for a wide range of applications, including industrial ergonomic testing, military simulation and training, and human factors research.

  • Prof. Nadia Magnenat Thalmann, MIRALAB-CUI, University of Geneva
    --> Prof. Nadia Thalmann has pioneered European research into Virtual Humans for over 15 years, and enjoys an outstanding international reputation both for her spectacular state-of-the-art demonstrations, and for the rigorous and intensive academic research programs which make them possible. One of her most celebrated projects was the creation of a lifelike real-time 3D computer graphics articulated model of Marilyn Monroe. The current focus of her work is the development of realistic virtual humans with characteristics such as emotions, clothes and hair.

  • Prof. Daniel Thalmann, Swiss Federal Institute of Technology
    --> MARILYN is a powerful and versatile virtual human simulation system. It was developed during a five-year project funded by the European Union, and has now been released commercially. It includes facial animation, body animation with deformations, grasping and walking, and hair and clothes simulation. It also supports autonomy and perception, and can be used to create simulations in which virtual humans move around in complex environments they may know and recognize, and in which they can, for example, play ball games based on their visual and tactile perception and react to other virtual humans and to real humans. Prof. Thalmann will speak on the subject of autonomous and perceptive virtual humans, and will demonstrate some of the work which has been carried out using the MARILYN software.

  • Jeff Kleiser / Diana Walczak, Kleiser/Walczak Construction Co.
    --> Jeff Kleiser's and Diana Walczak's background and credits in the computer animation and special effects fields range from 'Tron' and 'Flight of the Navigator', via 'Stargate', to 'Clear and Present Danger' and 'Honey I Shrunk the Theater'. Their ground-breaking human animation work on 'Judge Dredd', based around a 3D full-body scan of Sylvester Stallone, received international acclaim, and is an example of the 'synthespian' concept, created (and trademarked) by Kleiser/Walczak in the late 1980s. The company recently opened Synthespian Studios, a production facility designed specifically to create computer-generated characters. Jeff Kleiser lectures widely on the subject of computer animation, to both academic and commercial audiences.

  • Linda Jacobson, Silicon Graphics Inc.
    --> Linda Jacobson, Silicon Graphics's 'Virtual Reality Evangelist', has for some years also been a leading figure in the field of performance animation. This typically involves a human performer, equipped with anything from a face tracker to a full body motion capture system, controlling in real-time the movement, gestures and speech of a computer-generated graphical creature. Performance animation has been widely used at marketing events, entertainment venues, and in TV shows. With the advent of avatar worlds on the Internet, a wide range of performance animation skills is likely to be required by professional hosts and performers, as well as by both active and passive visitors and participants.

  • Dr Jonathan Waldern, Virtuality Group plc
    --> The Virtuality Group has been the market leader in the field of Virtual Reality entertainment systems throughout the 1990s. In recent years their games and experiences have incorporated increasingly versatile autonomous creatures and avatars. The company's range of activities and developments has now broadened to include consumer products, including an Internet-compatible immersive VR system currently under development.

  • Mitra, Paragraph International
    --> Avatars for on-line Internet communities have to be designed to operate within very tight processing and network bandwidth constraints. The widespread adoption of VRML and Java will further define the boundaries of achievable avatar appearance, motion and behavior. At the same time, however, avatars on the net are expected to constitute the vast majority of the world's virtual humans, and considerable ingenuity will be applied to maximising performance within these constraints. Mitra was the principal architect of the avatar worlds developed by Worlds Inc. and of VRML+, Worlds Inc.'s VRML superset. He was a respected and leading contributor to the VRML 2.0 standardisation process, in the course of which his former company WorldMaker Inc., jointly with Silicon Graphics and Sony, formulated the Moving Worlds specification.

  • DOGZ: Simple learning dogs (application)
    --> The first pets to live on a computer, giving you the joys of owning a dog without needing a pooper scooper. Dogz live on your desktop, where they can scamper across applications, play games like keep-away, fetch, and chase, or nap in the corner of your screen while you're working.
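
A rough, hypothetical sketch of the response-assembly idea described in the Julia entry above (this is not Mauldin's actual code; the utterance classes, patterns, and response database are illustrative assumptions): classify the incoming utterance, then assemble a reply from a small store of response components.

    import random
    import re

    # Illustrative "database" of response components, keyed by the kind of
    # utterance detected in the user's input (assumption: three kinds only).
    RESPONSES = {
        "question": ["Good question.", "I am not sure, but I can look it up."],
        "greeting": ["Hello there!", "Hi, nice to meet you."],
        "comment":  ["Interesting.", "Tell me more about that."],
    }

    def classify(utterance):
        """Very rough classification of an utterance as question, greeting, or comment."""
        text = utterance.strip().lower()
        if re.match(r"(hi|hello|hey)\b", text):
            return "greeting"
        if text.endswith("?") or re.match(r"(who|what|where|when|why|how)\b", text):
            return "question"
        return "comment"

    def reply(utterance):
        """Assemble a plausible reply by picking a stored response component."""
        return random.choice(RESPONSES[classify(utterance)])

    if __name__ == "__main__":
        print(reply("Where is the library?"))  # picked from the 'question' components
        print(reply("hello everyone"))         # picked from the 'greeting' components

A real conversational robot of this kind would of course use far richer pattern matching, track conversational context, and draw on a much larger database of components, but the classify-then-assemble loop above is the basic shape described in the entry.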

Comments are welcome at doeb@inm.de.



INM




Daimlerstrasse 32, 60314 Frankfurt am Main, Germany. Tel: +49-(0)69-941963-0, Tel (Gerd): +49-(0)69-941963-10