3dcgi.com - Your guide to cutting-edge 3D graphics technology

Siggraph 2002 in San Antonio continued

Reachin Display

3dcgi has covered the Reachin display before, but now I got to use it. The system is designed so the user looks at the angled monitor indirectly, via a mirror. LCD shutter glasses make the objects look 3D. Beneath the mirror is a SensAble Freeform haptic input device.

The combination of software and hardware provides a remarkable modeling medium; it actually felt like I was touching the objects being displayed. As I moved the Freeform pen over a ball, I could feel that it was smooth and hard. The next object in the scene had a rougher texture, like sandpaper. The Reachin display could do wonders for training doctors. The level of realism that can be simulated will allow doctors to learn to perform surgeries on a virtual patient, where the consequences of making a mistake are forgivable.

For years ray tracing has been acknowledged as a terrific way to render accurate lighting, including reflections, refraction, and shadows. The problem has been how long it takes to render complex scenes. Now along comes a new company, InTrace. InTrace claims its ray tracer offers real-time interactivity, and to prove it the company was demoing its software, InLight, in the AMD and RackSaver booths. In the RackSaver booth InLight was running on a cluster of 64 dual-Athlon machines. The image being rendered was a car headlight, something that would be very hard to fake with typical real-time rasterization methods. InTrace's target customers are designers who need to see what their designs will look like in real life. While the demo was interactive, it was only barely so: as I rotated the camera there was a short delay before the scene updated, and it took a second or two for the image to resolve to full quality. Still, if you need to look at ray-traced images and you don't want to wait long, this setup might be for you. Oh yeah, you'll need a lot of money too. Planned improvements include interactive global illumination, GPU hardware acceleration, and RenderMan support.
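To get a feel for why ray tracing is so expensive, here is a minimal sketch (in Python, purely illustrative and nothing to do with InTrace's actual code): every pixel fires at least one primary ray, and each ray has to be tested against the scene geometry, so cost grows with resolution times scene complexity, before you even add the extra rays needed for reflections, refraction, and shadows.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance along the ray to the nearest hit, or None if it misses."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t
    # (direction is assumed to be normalized).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Render a 32x32 ASCII image of a single sphere: one primary ray per pixel.
WIDTH = HEIGHT = 32
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
tests = 0
image = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a ray direction through a simple pinhole camera.
        px = (x + 0.5) / WIDTH * 2.0 - 1.0
        py = 1.0 - (y + 0.5) / HEIGHT * 2.0
        length = math.sqrt(px * px + py * py + 1.0)
        d = (px / length, py / length, -1.0 / length)
        tests += 1
        hit = intersect_sphere((0.0, 0.0, 0.0), d, sphere_center, sphere_radius)
        row += "#" if hit else "."
    image.append(row)

print("\n".join(image))
print("intersection tests:", tests)  # WIDTH * HEIGHT rays x 1 object
```

Even this toy scene needs 1,024 ray-object tests for a 32x32 image with one sphere and no secondary rays; scale that to a full-resolution car headlight with glass, chrome, and bounced light and the appeal of a 64-node cluster becomes obvious.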

Facial animation is always a difficult and time-consuming process, so naturally there are companies trying to make it easier. Version 2 of Eyematic's FaceStation software was announced at Siggraph. FaceStation 2 uses computer vision and speech analysis technology to automate facial animation; no motion-capture markers are necessary. It can even process and apply the animation to a character in real time using only a camcorder and a standard PC.

On my way out of the convention center I ran across this character at the Viewpoint booth: none other than Dr. Evil of Austin Powers fame. If you look carefully you'll even notice he's holding a stuffed Mr. Bigglesworth. The actor's impressions were actually pretty good, and he had the audience laughing and taking pictures.

Dr. Evil
