Virtual Data, Virtual Worlds

Bridget M. O’Brien
May 01, 2000

Imagine hundreds of thousands of molecules whizzing around you. You are standing inside a chemistry experiment, zapping the molecules with lasers and watching the results occur. Another scientist joins you. Even though your labs are thousands of miles apart, you are both surrounded by the same experiment. The limiting distances of the real world are irrelevant.

Sound futuristic? Technology is quickly making this kind of dancing with data a reality. The first step is the ImmersaDesk, or Idesk: five-foot by five-foot monitor "walls" attached to powerful computer workstations. Through a grant from the National Science Foundation, seven Penn State departments—computer science and engineering, electrical engineering, mathematics, aerospace engineering, mechanical and nuclear engineering, chemistry, and physics—are purchasing three Idesk machines; a fourth currently resides in the Center for Academic Computing. These machines can be linked—over long distances or short—by Internet2, the nationwide high-speed networking initiative. (To imagine the speed of Internet2, think of the Encyclopedia Britannica. A standard modem would require 49 hours to transmit the encyclopedia from one machine to another. Over Penn State's Internet2 connection, the whole thing zips through in just 62 seconds.)
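The article's two transfer times imply both the encyclopedia's size and the Internet2 link rate. As a rough back-of-the-envelope sketch—assuming a then-typical 56 kbps dial-up modem, a figure the article does not give—the arithmetic works out like this:

```python
# Back-of-the-envelope check of the article's transfer-time comparison.
# The 56 kbps modem speed is an assumption, not stated in the article.
MODEM_BPS = 56_000        # assumed dial-up modem rate, bits per second
MODEM_HOURS = 49          # modem transfer time, from the article
I2_SECONDS = 62           # Internet2 transfer time, from the article

# Implied size of the encyclopedia, in bits
size_bits = MODEM_BPS * MODEM_HOURS * 3600

# Implied Internet2 link rate, in bits per second
i2_bps = size_bits / I2_SECONDS

print(f"encyclopedia ~ {size_bits / 8 / 1e9:.1f} GB")
print(f"Internet2 link ~ {i2_bps / 1e6:.0f} Mbps")
```

Under these assumptions the encyclopedia comes out to roughly 1.2 GB and the link to roughly 159 Mbps—close to the 155 Mbps OC-3 circuits common on Internet2 campuses at the time, which suggests the article's numbers are internally consistent.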

For your study of molecular dynamics, you and your collaborator across the continent each don goggles—to create a 3-D illusion out of the giant displays' stereoscopic images. You each wield a wand that "acts like a three-dimensional mouse," says Paul Plassmann, assistant professor of computer science and engineering. "With the wand you can poke data, change conditions and parameters, and see what happens in real time." A tracker monitors your locations in space—after all, no one wants to get lost in a virtual reality.

The Idesk was first developed at the Electronic Visualization Laboratory (EVL) of the University of Illinois at Chicago. The next step up is a "cave": monitor walls and possibly a ceiling and floor. "'CAVE,' the name selected for the virtual reality theater developed at Chicago, is both a recursive acronym (CAVE Automatic Virtual Environment) and a reference to 'The Simile of the Cave' found in Plato's Republic," says Plassmann, "in which the philosopher explores the ideas of perception, reality, and illusion. Plato used the analogy of a person facing the back of a cave alive with shadows that are his or her only basis for ideas of what real objects are."

In a virtual reality "cave," adds Lyle Long, a professor of aerospace engineering at Penn State and principal investigator on the Idesk grant, "since you are completely surrounded by the graphics, you have more freedom to turn and look all around you."

A cave-type "Immersive Projection Display," or IPD, is currently being installed at Penn State's Applied Research Lab. "We don't call it a 'cave,'" notes ARL's Richard Stern, though he adds that Plato's analogy still holds. Designed and built by MechDyne Corporation, the IPD is "a projection-based, room-sized, high resolution, 3-D video and audio synthetic environment display"—that is, a room measuring 10 by 10 by 9 feet into the middle of which an experiment can be projected. The IPD will be used for ARL research on defense-related projects, communications, materials, and manufacturing, as well as on computational mechanics, electromagnetics, acoustics, information science and technology, and simulation and training. All colleges within the University will have access to the IPD.

But, notes Long, "A cave is not something you can just walk across campus and simply use for a couple of hours." There's a learning curve involved. "Our effort," he says, meaning the Idesk project, "will put a lot of virtual reality equipment in front of a lot of faculty and students so they can learn how to use it and develop software. They then might be able to use a cave."

Currently, complex computations for problems in molecular chemistry, physics, acoustics, and aerodynamics are performed on clusters of PCs "that are as fast as parallel supercomputers," Long notes. But it is difficult to visualize and analyze the results. Idesks allow you to watch this complex data unfold in 3-D or 4-D (3-D plus time) simulations. Says Long, "You are learning how to use the data." Says Plassmann, "You can tweak the data and the parameters that control the experiment. You learn by doing things. You build intuition and get a better understanding.

"Previous systems can display 3-D problems, but it's like looking at the inside of a house from the outside, through a window," Plassmann explains. "Virtual reality allows you to go into the house and rearrange the furniture. It really helps you understand what the computations mean."

Lyle Long, D.Sc., is professor of aerospace engineering in the College of Engineering, 233 Hammond Bldg., University Park, PA 16802; 814-865-1172; lnl@psu.edu. Paul Plassmann, Ph.D., is assistant professor of computer science and engineering, 309 Pond Laboratory; 814-865-0193; plassman@cse.psu.edu. Richard Stern, Ph.D., is deputy director of the Applied Research Lab; 814-865-6344; rstern@psu.edu. Virtual reality technology is being provided through grants from the National Science Foundation and to the Applied Research Lab from the Department of Defense (Office of Naval Research) and the University. Writer Bridget M. O'Brien is majoring in biology and writing at Juniata College.