Huge marketing events were an everyday occurrence during the 1990s. One night, I was invited to a Casino Night on a party boat in San Francisco Bay. Sure to be a good time, I joined the fun with a bunch of my friends from Sense8.
I soon learned that one of the major prizes for the evening would be a pair of Virtual IO i-glasses. According to VideoGameKraken, these had a list price of $499. Even for a well-paid marketing professional, this was a crazy extravagance, so I was determined to win the prize.
The prizes would be awarded during an end-of-evening auction funded by tickets won during various casino games throughout the night. I did quite well at the tables, but recall augmenting my winnings by asking various participants for their unused tickets. By the end of the night, I was comfortably in the lead and ultimately won the grand prize.
While I treasured my new virtual reality headset, I am pretty sure I used it a total of zero times. I suspect I tried to use it with WorldToolKit, but don’t recall any success. The i-glasses have sat wrapped in a blanket since 1995 while technology…and VGA displays…moved on.
I can’t believe I didn’t find all of these use cases until now…
We had a great team supporting the F-5 Avionics upgrade program at Rockwell and we became great friends. Again, orchestrating this picture seemed sort of corny at the time, but I am glad to have it now.
I got my introduction to 3D graphics when I joined Rockwell International’s F-5 Upgrade program in 1991. I was part of the IR&D (internal research and development) team building a Systems Integration Facility (SIF) for prototyping F-5 (and, later, F-4) advanced avionics for sale to other (friendly) countries.
The centerpiece of the SIF was a man-in-the-loop flight simulator featuring a full-sized cockpit, upgraded cockpit displays and a large, wrap-around “out the window” (OTW) display. The OTW display was powered by a Silicon Graphics Reality Engine running Gemini’s GVS software. The wrap-around display was generated with 3 projectors. The cockpit displays were powered by Virtual Prototypes’ VAPS running on other SGI workstations.
As you can see in the image above, we were also able to simulate the Heads Up Display (HUD) by overlaying the VAPS graphics on the GVS graphics. The multifunction displays (MFDs) in the cockpit were driven by VAPS on special monitors powered by SGI workstations.
In the image above, you can see the flight stick with upgraded HOTAS (“hands on throttle and stick”) functionality. We were able to pick up all of the HOTAS functions and MFD buttons in the simulation software.
We also had a “portable” version of the simulation that could function at a workstation (Portable Cockpit Simulator or PCS). The HOTAS was the same as in the cockpit. We could monitor the “shared memory” interfaces on the VT100 monitors on top of the cabinet. (That is me dramatically pointing to the out-the-window display.)
In the cage below the monitors was a VME enclosure which brought in signals from the various hardware components and also allowed us to integrate real 1553 devices (like an Inertial Navigation System – INS).
This is the onboard 1553 architecture we were trying to simulate, including all of the flight instruments…as well as the weapon systems interfaces for things like pylon-mounted Mark 82 bombs, rockets, Hellfire, Maverick and AIM missiles. Also, a “Gun Control Unit”.
The diagram above illustrates the architecture of the SIF computers, including the SGI workstations, some VAXstations for software development, and the VME, 1553 and “reflective” (shared) memory interfaces.
The SGI VGX was our “poor man’s” Reality Engine. I am pretty sure we later replaced it with a full refrigerator-sized Reality Engine.
Here are some additional pictures of the SIF:
This unit was a yet again scaled-down workstation for prototyping cockpit displays. (Basically, just an SGI workstation in a military-style cabinet.)
These are some PR illustrations of the overall SIF…but we never really built more than the facility in the upper left corner (the Cockpit Simulation Facility and the Avionics Integration Laboratory).
Many of these photos were taken for press releases and other documentation. They are all staged and posed. I remember it being very corny at the time. For example, George, my manager, spent very little time in the lab and certainly didn’t know how to operate a VT100.
The photo above (me!) was “mounted” on a foam core block and was on display as part of a collage in the Rockwell Anaheim facility’s entrance lobby. When it was finally taken down, I was able to retrieve the photo as a souvenir.
On the display of “Gazoo”, one of the lab’s VAXstations, is another architecture diagram. It appears I am using “CDA Viewer” which was just a simple way of getting an impressive image on the screen. At the time, I think we were using Interleaf for creating such graphics.
This is probably the most iconic of the Sense8/WorldToolKit demos. I think it was developed originally by Dave Hinkle and was often the first application compiled and tested after each update to WTK. As a result, it was ported to almost every hardware platform of the time.
Here are some notes from the readme:
This is a simulation of a proposed Mars rover platform. WorldToolKit was used to show problems with this design. The mouse can be used to fly around the terrain without the rover.
This simulation was built by NASA and includes the following features:
- Terrain following
- Hierarchical object attachments
- Physically-based motion
- Support for head-mounted displays and tracking
(Ambient music was added to the YouTube video; the original was silent.)
I had to find a somewhat slow modern (2022) machine to capture this demo, as the app was not designed to maintain a constant framerate. On my laptop (with three screens) it was running at something like 250 fps.
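As an aside, the runaway speed on fast hardware is the classic symptom of per-frame (rather than per-second) motion updates. This is just an illustrative sketch of that pitfall, not the demo’s actual code; the function names are my own:

```c
#include <assert.h>
#include <math.h>

/* Hypothetical sketch: moving a fixed step each frame ties speed to
   framerate, so a 250 fps laptop runs the simulation far faster than
   the 30 fps hardware it was tuned for. Scaling by elapsed time (dt)
   keeps apparent speed constant regardless of framerate. */

/* per-frame update: distance covered depends on how many frames ran */
double move_per_frame(double pos, double step, int frames) {
    for (int i = 0; i < frames; i++)
        pos += step;
    return pos;
}

/* time-based update: distance depends only on elapsed seconds */
double move_per_second(double pos, double speed, double dt, int frames) {
    for (int i = 0; i < frames; i++)
        pos += speed * dt;
    return pos;
}

/* One simulated second of travel:
   move_per_frame(0.0, 0.1, 30)  covers ~3 units at 30 fps,
   move_per_frame(0.0, 0.1, 250) covers ~25 units at 250 fps,
   while move_per_second(0.0, 3.0, 1.0/fps, fps) covers ~3 units
   at either framerate. */
```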
Here is the README.1ST file:
To run the rover from this directory, do the following:

```shell
# set model env var to point to the rover parts
setenv WTMODELS ./terrnn:./josh
# set image env var for textures
setenv WTIMAGES ./josh
# set the env var to point where the WTKCODE file is
setenv WTKCODES ../../
# to start execution type
rover
```
The following keys affect the rover motion:
f - rover moves forward with all wheels moving
b - rover moves backward with all wheels moving
s - rover stops
q - exit program
p - print frame rate
r - reset the rover
v - half the velocity of the rover
V - double the velocity of the rover
F1 - starts rover on preprogrammed path
Each of the six wheels has three states: forward, backward, and stop. To toggle the state of a particular wheel, press 1-6. The wheels are arranged as 1, 2, & 3 on one side and 4, 5, & 6 on the other. For example, to turn the rover, press the 1 key. To continue forward, press the f key. To turn the other way, press the 4 key.
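The per-wheel states amount to a small skid-steer model: run the two sides at different speeds and the rover turns. This is my own reconstruction of that idea, not the actual ROVER.C logic, and the state-cycle order is an assumption:

```c
#include <assert.h>

/* Illustrative sketch (not the actual ROVER.C code): six wheels, each
   cycling forward -> backward -> stop when its number key is pressed.
   Wheels 0-2 are one side, 3-5 the other; running the two sides at
   different speeds skid-steers the rover. */

enum wheel_state { STOP = 0, FORWARD = 1, BACKWARD = -1 };

static enum wheel_state wheels[6];

/* toggle one wheel through its three states, as the 1-6 keys do */
void toggle_wheel(int i) {
    switch (wheels[i]) {
    case STOP:     wheels[i] = FORWARD;  break;
    case FORWARD:  wheels[i] = BACKWARD; break;
    case BACKWARD: wheels[i] = STOP;     break;
    }
}

/* the f and b keys set every wheel at once */
void set_all(enum wheel_state s) {
    for (int i = 0; i < 6; i++)
        wheels[i] = s;
}

/* net turn rate: difference between the two sides' summed speeds;
   zero means the rover tracks straight */
int turn_rate(void) {
    int left  = wheels[0] + wheels[1] + wheels[2];
    int right = wheels[3] + wheels[4] + wheels[5];
    return left - right;
}
```

So after pressing f (all wheels forward, `turn_rate()` is 0), toggling wheel 1 reverses one wheel on the left side and the rover pivots toward that side.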
Fly around the database and press the middle mouse button every once in a while. This will place flags. After you have placed a few flags, press the f key. The rover will go from flag to flag and then to a "goal".
The object is to get to the goal without turning over.
Have fun
The ROVER.C file makes reference to “prior work” from InWorld VR, Inc. From what I can find, they were the manufacturers of the CYBERWAND. Online, I found a reference to a 1994 trademark registration for this “computer peripheral, namely a joystick”. Sadly, I cannot find any images of the CyberWand, nor can I see any direct connection to the rover demo other than some timer code.
UPDATE:
After an exchange with Kurt Schwehr and the review of a paper he shared, it seems clear that the ROVER demo was based on the Russian “Marsokhod” rover.
Prototypes of the Marsokhod rover were taken from Russia to the NASA Ames Research Center, where they were jointly developed by the US and Russia. This led to the development of a 'virtual environment control system', which meant the rover could be controlled remotely via an interface on a PC.
One of Sense8’s classic “business” demos was a proposed 3D user interface on top of Computer Associates’ flagship Unicenter product. Branded as CA-Unicenter TNG, the demo would allow you to “fly” from a spinning map of the globe, down to a map of your IT network and ultimately inside servers to address issues or add additional software.
I was never sure if this interface actually shipped with Unicenter or was just a technology demonstrator. From what I can tell from this article in ITPro Today, it was available as the “WorldView” 3D interface.
And this Xerox brochure includes an image of the WTK GUI.
Later in my career, while selling HP OpenView, CA-Unicenter would be a perennial competitor. I must say, the 3D interface never came up in customer discussions so I suspect the metaphor did not become a major differentiator. But it was a good story at the time.
“Sailing” is another classic WorldToolKit demo. While it seems like a simple demo, it has quite a bit of calculation going on behind the scenes.
From the demo’s readme:
The goal of this demo is to sail around the bay and see some of the sights (GG bridge, city, buoys, etc).
This simulation was built in about 5 weeks by Sense8 and includes the following features:
- Physically based wind simulation
- Vertex manipulation in the sail due to wind forces
- 3D sound (on platforms supporting it)
- Support for head-mounted displays, Spaceball, and trackers (when deployed)
Normally when running this demo, I use the number keys to turn the boat around so that I'm sailing for the Golden Gate Bridge. When heading straight into the bridge the sail will begin luffing. This shows the vertex manipulation feature along with the physically-based wind calculations.
It was quite the go-to demo over the years, ported to virtually every available platform of the period, but I believe it was originally developed as the centerpiece of Sense8’s presence at SIGGRAPH 1995. At that show, it would have been set up to leverage a Fakespace Boom.
According to the University of Washington HIT Lab website:
Fake Space Labs have developed a full 6 DOF tracking stereoscopic display that is not mounted on an individual’s head. Using mechanical tracking, computations of the viewer’s position occur faster. Larger displays can be used because their weight and size are not noticed due to the counterbalance of the boom, therefore allowing higher resolution. Viewers can just walk up to the boom, place their face into the optics and move around, like a pair of binoculars attached to a skyscraper. Buttons are provided for interaction with the virtual world. With the boom it is easy to go from real to virtual and back again because there is no head gear to put on or take off.
When I picked up the Sunday NY Times in my driveway on November 7, 2015, I believed the world was about to change. That morning, the Times, in partnership with Google, distributed 1.3 million Cardboard VR headsets. A generation of people would be exposed to VR and the world would never be the same.
Between 2020 and 2022, Meta/Facebook sold 15 million Quest 2 headsets. We are squarely in the middle of a new generation of VR usage. (We have 3 Quest 2 headsets in our house.)
This upsurge of interest in virtual reality has me nostalgic for my time in the first VR wave, back in the 1990s (technically, this might have been the second or third wave). But when I started to research on the web, I would see names like Jaron Lanier, Ivan Sutherland, John Carmack and even the Nintendo Virtual Boy, and I was upset that the articles didn’t mention my corner of the universe (Sense8, Gemini, etc.).
I realized that a good reason these stories didn’t cover Sense8 and Gemini was that there are very few artifacts available on the web. No images. No videos. No stories. So the goal of this blog is to rectify that situation. I found a collection of old DVDs with source code, executables, demos and notes. I am going to cull through this info and post what I remember…and whatever I can actually get running on a computer 25 years in the future.