Pangea interview

While scouring the internet for Sense8 and WTK references, I came across some 1997 press coverage of our Pangea product, a networking/multi-user extension to WorldToolKit. I have to say, I had totally forgotten that branding (I remember it as “World2World”). That is even more embarrassing given that one of the articles I found is one of my few published interviews as Director of Product Marketing!

Interview:

Los Angeles, CA -- At SIGGRAPH '97, Sense8 Corporation, a provider of commercially viable 3D/Virtual Reality (3D/VR) tools and solutions, announced the upcoming fourth quarter release of a networking product named Pangea. The product, which supports Windows NT, SGI and Sun platforms, will reportedly allow developers to quickly and efficiently create multi-user 3D/VR simulation applications for intranet and Internet deployment. To explore the enterprise-wide capabilities of this new product, HPCwire interviewed Tom Payne, Sense8's director of product marketing.

---

  HPCwire: Please explain the Pangea 3D multi-user server product and how it came about.

  PAYNE: "What Sense8 sells is software for doing virtual reality applications. Primarily the applications people have done with our products in the past are solitary, single user kind of things. You're immersed in a world, but you are alone. What we've had a number of customers ask for is a way to make building multi-user applications easier. We've always had a network capability in our software, but it's been low level. It's kind of intimidating to start out with, and we felt it was something that would stop customers from using it. So what we decided to do is develop a new product, a second generation product, that's a lot easier to use, a lot more aware of what the end user was going to try to do with the software, and then hide as much of the networking stuff as we could from them. We also added the capability of working over the Internet. Right now our networking capability is designed primarily for doing things in the intranet world, like in the lab. It has a very high efficiency for lab uses, but if you want to go out on the Internet, the same techniques are not applicable, so we had to change the way we were working. Now we've got both: we've got Internet capability as well as the high speed local area network things you can do.

 "Basically what Pangea does is take all the information about your WorldToolKit application and literally share it with another version of that software. So you just say, I want the color and position of this object to be shared; so that all the other WorldToolKit applications in your simulation now know about the color and position of that object. If I change the color of it, all the other participants will see the color change. So it's just much easier now at the object level, where a lot of our users deal with things. For example, say I want to load in a model of that boat and I want to load in a model of that car. Many customers don't know anything about polygons, or textures or things like that. They just want to know about the boat and the car. If they want to share the boat's position, that's all they have to do instead of having to write low level packets and talk about TCP/IP and routers and all those things. It's a much easier way to deal with the whole multi-user aspect of things.

  "When you're going out over the Internet you're concerned about bandwidth, so Pangea will only send information over the network when it changes. We aggregate data together, so if the color and the position can fit into the same packet, we'll combine them so we only have to send out one packet rather than two separate ones. That way there's not excess data going over the net."
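The send-on-change and packet-aggregation behavior Payne describes can be sketched in a few lines of modern Python. This is purely an illustration: Pangea was a C-era product, and every name below is invented, not taken from its actual API.

```python
# Illustrative sketch of "send only on change, aggregate changes into
# one packet." All class and function names here are hypothetical.

class SharedObject:
    def __init__(self, name):
        self.name = name
        self._current = {}    # property -> value as set by the app
        self._last_sent = {}  # property -> value last put on the wire

    def set(self, prop, value):
        self._current[prop] = value

    def pending_changes(self):
        """Return only the properties that changed since the last send."""
        return {p: v for p, v in self._current.items()
                if self._last_sent.get(p) != v}

def build_packet(obj):
    """Aggregate all of an object's pending changes into a single packet."""
    changes = obj.pending_changes()
    if not changes:
        return None  # nothing changed -> nothing goes on the network
    obj._last_sent.update(changes)
    return {"object": obj.name, "changes": changes}

car = SharedObject("car")
car.set("color", "red")
car.set("position", (10, 0, 5))
print(build_packet(car))  # one packet carrying both changes
print(build_packet(car))  # None: nothing changed, so nothing is sent
```

The point of the sketch is the two filters: unchanged properties never leave the client, and whatever did change travels as one aggregated packet rather than one packet per property.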

  HPCwire: Could you expand on Pangea's corporate features?

  PAYNE: "We have a lot of corporate customers and they require having firewalls set up. We're one of the only products that lets you go through those corporate firewalls in a secure way. The product is ideal for collaborative engineering activities, collaborative type experiences, Mutual Reality as people are calling it. For instance, NASA wants to use our products to do astronaut training. They'll have a guy in Houston suit up in a VR spacesuit and a guy in Florida suit up in the same thing, and they will be able to work on the space shuttle together in the same environment. We have a network management tool Computer Associates uses for their UniCenter product. They use WorldToolKit for the front end of that. We're envisioning in the future that while a systems administrator is flying around inside the 3D world, he can see other systems administrators working on other projects. It's effective in keeping them working only on the problem they need to be working on.

  "Primarily the corporate stuff is the bandwidth issue, the aggregation issue, so that it works over a number of different kinds of infrastructures like dial-in networks, a number of ATPs, ATM networks run over T1s, T3s etc. We made sure we tested all those different configurations. We work over a standard infrastructure. Game people out there require that you dial in to their server and you have to use their technology. Our products are designed to work on any kind of infrastructure that's available, current legacy kind of systems. So it makes it a lot easier for you to integrate it into your world."

  HPCwire: Can you explain the interaction of the server with applications such as WorldToolKit or WorldUp?

  PAYNE: "Basically what you'll do is start up the application, it will log into the server and the server will track how long you've been on. That's another one of the features...in the external world, people that want to do virtual reality stuff want to be able to do things like billing, see how long people have been on, make sure only authorized users are on... the server has that kind of stuff built into it. It's able to do the acknowledgement back and forth between the applications.

  "The way that the server is set up is, it's a query/response kind of system so that if new features are added to the server, it all depends on what questions the server asks you and what your responses are. It's very easy to grow functionality on the server and maintain backwards compatibility with older clients. The client will run on all of our platforms: Sun, DEC, SGI and NT. I think at this point the servers will only be NT machines and then the next one will probably be Sun, because there's a lot of people out there using Sun now.

  "Once your application is up and running, you log into a server manager which will then hand off your application to a particularly suited server that will then do most of the communications. The server manager will know you are doing the engineering management application and this server over here is the one assigned to do that, it's configured properly for your application. The servers are configured by a text file that will have all the information about how many users can log in effectively, what kind of bandwidth you're going to provide, what kind of data groups you're going to provide/set up for different people. You can have your servers tuned for different applications. Once you establish communication with the server, your application will register interest data...it'll say I want to share the color and location of this car, but I'm also interested in the color and location of all the other cars in the network. So then the server will send you color and location, and if you're also sharing orientation, but I'm not interested in that, the server will know that and it won't bother sending the orientation information. It filters that kind of stuff out.

  "Once you've established all that stuff you just start moving your objects around like it was a local application and everything else is basically handled for you. The server will transparently start changing the data, sending it back and forth between the client and the server. All the data is maintained on the server so if one of the clients goes down, it doesn't really matter because it's like a repository, it keeps all the simulations in sync. It's got a heart beat so if one of the simulations falls out, the server will know about it and notify the other participants if they've registered interest in knowing about it."
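The register-interest flow Payne walks through (clients declare what they share and what they want to hear about, and the server filters out everything else) can be sketched as follows. Again, this is a hypothetical modern-Python illustration; none of these names come from Pangea.

```python
# Hypothetical sketch of the register-interest / server-side filtering
# flow described above. The server also acts as the repository that
# keeps the latest shared state, as Payne notes.

class Server:
    def __init__(self):
        self.state = {}      # (object, prop) -> latest value (the repository)
        self.interests = {}  # client -> set of (object, prop) it wants

    def register_interest(self, client, obj, prop):
        self.interests.setdefault(client, set()).add((obj, prop))

    def publish(self, sender, obj, prop, value):
        """Store the update, then return only the clients that asked for it."""
        self.state[(obj, prop)] = value
        return [c for c, wants in self.interests.items()
                if c != sender and (obj, prop) in wants]

server = Server()
server.register_interest("client_b", "car", "position")

# client_b registered interest in position but not orientation, so the
# server forwards the first update and filters the second out entirely:
print(server.publish("client_a", "car", "position", (1, 2, 3)))      # ['client_b']
print(server.publish("client_a", "car", "orientation", (0, 90, 0)))  # []
```

Because the server keeps the latest value of every shared property, a client that drops out and reconnects can be brought back in sync from the repository, which is the fault-tolerance behavior Payne describes.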

  HPCwire: Is there anything else out there comparable to this system?

  PAYNE: "The things that are out there that people are working with are primarily in the game space; there's r-time, and the defense industry has their DIS (distributed interactive simulation), but that's specifically designed for military applications. Right in the protocol it talks about missile hits and explosions, so it's very focused on that application space. We're the only ones working in the business space offering general purpose simulations where customers can do engineering etc. A lot of our customers are really excited about it, they can't wait to get their hands on it."

  HPCwire: Can you provide an example of how a large corporation might employ a multi-user 3D application?

  PAYNE: "A good example is here at a tradeshow. We have offices in Europe, the east coast and in San Francisco. We were trying to describe to people what the booth would look like, what orientation it would have, where the competitors would be etc. So what you can do very easily is build a 3D mockup of the booth, bring it up during meetings, have the people participate from Europe and the east coast, see the object, make comments interactively; they can say, 'Let's move this thing over here.' It's a lot more efficient than sending faxes.

  "If you were Weber bar-b-que and you wanted to put your latest bar-b-que up on your website, you could have a salesperson interactively show it off and have thousands of other people logged into the server watching, asking questions and interacting with the salesperson. It can enhance the way you do sales on the Internet as well."

-------
Steve Fisher is associate editor of HPCwire.

https://www.hpcwire.com/1997/08/15/sense8s-pangea-supports-multiplatform-3dvr-dvpmnt/

https://techmonitor.ai/technology/sense8_has_3dvr_multi_user_development_tool_for_businesses_1

Pangea was clearly another Sense8 technology that was ahead of its time (1997), with use cases only being realized now (2022). I am sure a lot of people worked on it, but in my mind it will always be Arvind Suthar’s baby.

Posted in Sense8

George Rickey Kinetic Sculpture

WorldUp Simulation

When I was at university (Rensselaer Polytechnic Institute), there was a controversial sculpture on campus. Some people hated it, but many engineers found it amazing to watch the huge aluminum slabs sway gently in the breeze.

The sculpture disappeared from campus and, years later, I conducted a search for the Chrinitoid. Ultimately, I found it (in Switzerland) and documented my quest in the RPI Alumni magazine.

There is also a page on my personal site:

https://paynecentral.com/tompayne_old/chrin

As well as its own mini blog: www.paynecentral.com/chriniblog

The animated GIF above was a quick WorldUp demo that I made while working at Sense8. It probably took 5 minutes to create and animate given its basic features…but it may be the only WorldUp artifact I have left.

The Chrinitoid – from RPI Polytechnic
Posted in Demos, Sense8

The Caves of Lascaux

The Caves of Lascaux…or simply LASCAUX…was not so much a WTK demo as it was a prominent art exhibit developed, in part, with the Sense8 libraries. It was the brainchild of Benjamin Britton, then an associate professor at the University of Cincinnati.

Visual artist Benjamin Britton took a lot of flack from the French culturati when he proposed creating a virtual-reality exhibition based on the prehistoric Lascaux caves of France. “They thought that I would put Mickey Mouse ears on the bison,” recalls Britton.

Wired, Feb 24, 1997

Ben describes the development experience on his own website.

While Ben’s work was already sort of legendary during my time at Sense8, my hands-on introduction was during Siggraph ’97, when Sense8 planned to exhibit a CAVE VR system at the show, featuring content from LASCAUX as well as recent data from the Mars rover mission. (Honestly, at the time, I was much more excited about the Mars data, but came to greatly appreciate the Lascaux work.)

Press Release: 

SENSE8 TO HOST 3D/VR WALK-THROUGH OF MARTIAN LANDSCAPE
Mill Valley, CA -- SENSE8 Corporation will host an immersive 3D/Virtual Reality (3D/VR) walk-through of the Martian landscape and the caves at Lascaux in the SENSE8 booth at SIGGRAPH 97. Participants will take part in a fully immersive stereoscopic experience on the surface of the Red Planet from within a Cave Automatic Virtual Environment (CAVE), engineered by Pyramid Systems. Guests will also have the opportunity to explore a 3D/VR simulation of the world-renowned caves and ancient wall paintings at Lascaux, France.

Pyramid's CAVE is a 10'x10'x10' structure consisting of four rear-projected screen walls and a front-projected floor displaying a simulated Martian environment or prehistoric art gallery. The simulation will be controlled by an application built on top of SENSE8 WorldToolKit (WTK) and will run on a Silicon Graphics Onyx supercomputer. With a pair of LCD stereoscopic glasses from StereoGraphics Corporation, participants will be treated to a truly otherworldly experience. The movements of the virtual explorers will be tracked by electronic sensors which allow the CAVE to continuously update its displays.

The CAVE (Cave Automatic Virtual Environment) system was itself an impressive setup: a 10’x10′ room with rear-projected screens on each wall and a top-down projector depicting the floor.

Multiple people could participate in the simulation simultaneously and only needed to wear lightweight LCD shutter glasses to experience the 3D effect.

LCD Shutter glasses

This YouTube video captures Ben’s experience. While somewhat crude by today’s standards (my son notes it has a major Gorilla Tag vibe), the effect was astounding at the time. It helped me start to understand the value of content beyond the technology.

At Siggraph, we would have groups of people enter the darkened CAVE to experience either the Mars landscape or LASCAUX (perhaps we offered both). The participants would stand in place looking around while the tour guide (myself included) slowly moved the world around them. To this day, I distinctly recall the odd sensation of claustrophobia as the walls of LASCAUX drew closer and closer although we remained in the same 10’x10′ space.

Ben documented some of his thoughts in the Siggraph proceedings.

Ben clearly taught the culturati a thing or two. Since his LASCAUX experience, there have been at least two more attempts to recreate the cave in VR, as well as a “virtual tour” of the cave on its official website.

Lascaux-caves

While I cannot find a full Oculus Quest-compatible experience, Geoffrey Marchal has shared a model of a portion of the cave on Sketchfab.

Geoffrey Marchal’s model

And it does appear Google has worked to recreate other ancient cave art such as Chauvet.

UPDATE:

I found these pictures of Siggraph ’97, where we set up the CAVE system:

Posted in Demos, Sense8

Virtual I-O i-glasses

Huge marketing events were an everyday occurrence during the 1990s. One night, I was invited to a Casino Night on a party boat in San Francisco Bay. Sure to be a good time, I joined the fun with a bunch of my friends from Sense8.

I soon learned that one of the major prizes for the evening would be a pair of Virtual I-O i-glasses. According to VideoGameKraken, these had a list price of $499. Even for a well-paid marketing professional, this was a crazy extravagance, so I was determined to win the prize.

The prizes would be awarded during an end-of-evening auction funded by tickets won during various casino games throughout the night. I did quite well at the tables, but recall augmenting my winnings by asking various participants for their unused tickets. By the end of the night, I was comfortably in the lead and ultimately won the grand prize.

Virtual I-O i-glasses

While I treasured my new virtual reality headset, I am pretty sure I used them a total of zero times. I suspect I tried to use them with WorldToolKit, but don’t recall any success. The i-glasses have sat wrapped in a blanket since 1995 while technology…and VGA displays…moved on.

I can’t believe I didn’t find all of these use cases until now…

Posted in Sense8

Rockwell F-5 SIF Engineering Team

We had a great team supporting the F-5 Avionics upgrade program at Rockwell and we became great friends. Again, orchestrating this picture seemed sort of corny at the time, but I am glad to have it now.

I will try to identify as many people as I can:

  1. Marilyn Aglubat
  2. Mindy Dinh
  3. James Lawson
  4. Eileen Joyce
  5. Chris Ryan
  6. George Mason
  7. Oliver Watson
  8. Tom Payne
  9. Nick Cory
  10. John Dillon
  11. ???
  12. ???
  13. Jeff Purchatzke
  1. Dan Davidson
  2. Wes Imada
  3. Jim McDermott
  4. Carlos Trillo
  5. Dieter Trost
  6. Debbie Kirste
  7. Monte Betts
  8. Charlie “C-MAC” McCormack
  9. ???
  10. Sudhir Shah
  11. ???
  12. Jerry Hill
  13. Rosemary Valasco

Posted in Rockwell

F-5 Flight Simulator

I got my introduction to 3D graphics when I joined Rockwell International’s F-5 Upgrade program in 1991. I was part of the IR&D (internal research and development) team building a Systems Integration Facility (SIF) for prototyping F-5 (and, later, F-4) advanced avionics for sale to other (friendly) countries.

The centerpiece of the SIF was a man-in-the-loop flight simulator featuring a full-sized cockpit, upgraded cockpit displays and a large, wrap-around “out the window” (OTW) display. The OTW display was powered by a Silicon Graphics Reality Engine running Gemini’s GVS software. The wrap-around display was generated with 3 projectors. The cockpit displays were powered by Virtual Prototypes’ VAPS running on other SGI workstations.

As you can see in the image above, we were also able to simulate the Heads-Up Display (HUD) by overlaying the VAPS graphics on the GVS graphics. The multifunction displays (MFDs) in the cockpit were driven by VAPS on special monitors powered by SGI workstations.

In the image above, you can see the flight stick with upgraded HOTAS (“hands on stick and throttle”) functionality. We were able to pick up all of the HOTAS functions and MFD buttons in the simulation software.

We also had a “portable” version of the simulation that could function at a workstation (the Portable Cockpit Simulator, or PCS). The HOTAS was the same as in the cockpit. We could monitor the “shared memory” interfaces on the VT100 monitors on top of the cabinet. (That is me dramatically pointing to the out-the-window display.)

In the cage below the monitors was a VME enclosure which brought in signals from the various hardware components and also allowed us to integrate real 1553 devices (like an Inertial Navigation System, or INS).

This is the onboard 1553 architecture we were trying to simulate, including all of the flight instruments…as well as the weapon systems interfaces for things like pylon-mounted Mark 82 bombs, rockets, and Hellfire, Maverick and AIM missiles. Also, a “Gun Control Unit.”

SIF Architecture

The diagram above illustrates the architecture of the SIF computers, including the SGI workstations, some VAXstations for software development, and the VME, 1553 and “reflective” (shared) memory interfaces.

The SGI VGX was our “poor man’s” Reality Engine. I am pretty sure we later replaced it with a full refrigerator-size Reality Engine.

Here are some additional pictures of the SIF:

This unit was yet another scaled-down workstation for prototyping cockpit displays. (Basically, just an SGI workstation in a military-style cabinet.)

These are some PR illustrations of the overall SIF…but we never really built more than the facility in the upper left corner (the Cockpit Simulation Facility and the Avionics Integration Laboratory).

Many of these photos were taken for press releases and other documentation. They are all staged and posed. I remember it being very corny at the time. For example, George, my manager, spent very little time in the lab and certainly didn’t know how to operate a VT100.

The photo above (me!) was “mounted” on a foam core block and was on display as part of a collage in the entrance lobby of the Rockwell Anaheim facility. When it was finally taken down, I was able to retrieve the photo as a souvenir.

On the display of “Gazoo”, one of the lab’s VAXstations, is another architecture diagram. It appears I am using “CDA Viewer” which was just a simple way of getting an impressive image on the screen. At the time, I think we were using Interleaf for creating such graphics.

Posted in Rockwell

WTK Mars Rover Demo

This is probably the most iconic of the Sense8/WorldToolKit demos. I think it was developed originally by Dave Hinkle and was often the first application compiled and tested after each update to WTK. As a result, it was ported to almost every hardware platform of the time.

Here are some notes from the readme:

This is a simulation of a proposed mars rover platform.  WorldToolKit was used to show problems with this design.  The mouse can be used to fly around the terrain without the rover.

This simulation was built by NASA and includes the following features:
	- Terrain following
	- Hierarchical object attachments
	- Physically-based motion
	- Support for head-mounted displays and tracking 

(ambient music was added to the YouTube video. The original was silent)

I had to find a somewhat slow modern (2022) machine to capture this demo as the app was not designed to maintain a constant framerate. On my laptop (with three screens) this was running at like 250fps.

Here is the README.1ST file:

To run the rover from this directory do the following.


# set model env var to point to the rover parts
setenv WTMODELS ./terrnn:./josh
# set image env var for textures
setenv WTIMAGES ./josh
# set the env var to point where the WTKCODE file is
setenv WTKCODES ../../
# to start execution type
rover

The following keys affect the rover motion:

  f  - rover moves forward with all wheels moving
  b  - rover moves backward with all wheels moving
  s  - rover stops
  q - exit program
  p - print frame rate
  r - reset the rover
  v - half the velocity of the rover
  V - double the velocity of the rover
  F1 - starts rover on preprogrammed path
  
Each of the six wheels has three states: forward, backward, and stop. To toggle the state of a particular wheel, press 1-6. The wheels are arranged as 1, 2, & 3 on one side and 4, 5, & 6 on the other. For example, to turn the rover, press the 1 key.

To continue forward then press the f key.  To turn the other way press the 4 key.


Fly around the database and press the middle mouse button every once in a while. This will place flags.  After you have placed a few flags then press the f key.  The rover will go from flag to flag and then to a "goal"

The object is to get to the goal without turning over.

Have fun
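The differential turning scheme the readme describes (stop the wheels on one side and the rover turns) can be modeled with a toy skid-steer calculation. This is an illustrative Python sketch, not the demo's actual physics code.

```python
# Toy skid-steer model of the README's turning scheme: six wheels,
# three per side, each in state +1 (forward), -1 (backward), or 0
# (stopped). Purely illustrative.

def rover_motion(wheels):
    """wheels: dict of wheel number (1-6) -> state (+1, -1, or 0).
    Wheels 1-3 are one side, 4-6 the other.
    Returns (forward_speed, turn_rate)."""
    left = sum(wheels[w] for w in (1, 2, 3)) / 3.0
    right = sum(wheels[w] for w in (4, 5, 6)) / 3.0
    return (left + right) / 2.0, right - left

all_forward = {w: 1 for w in range(1, 7)}
print(rover_motion(all_forward))      # (1.0, 0.0): straight ahead

turning = dict(all_forward)
turning[1] = 0                        # press "1": stop wheel 1
print(rover_motion(turning)[1] > 0)   # True: the rover now turns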

The ROVER.C file makes reference to “prior work” from InWorld VR, Inc. From what I can find, they were the manufacturers of the CYBERWAND. Online, I found a reference to a 1994 trademark registration for this “computer peripheral, namely a joystick”. Sadly, I cannot find any images of the CYBERWAND, but I also cannot see any direct connection to the rover demo other than some timer code.

UPDATE:

After an exchange with Kurt Schwehr and the review of a paper he shared, it seems clear that the ROVER demo was based on the Russian “Marsokhod” rover.

According to Wikipedia:

Prototypes of the Marsokhod rover were taken from Russia to the NASA Ames Research Center, where they were jointly developed by the US and Russia. This led to the development of a 'virtual environment control system', which meant the rover could be controlled remotely via an interface on a PC.

The Players

Christian Bauer – chrisbauer.com | LinkedIn

Dr. Butler Hine – LinkedIn

Laurent Piquet

Michael Wagner

Ben Disco – may have added the “tracks” (quite innovative at the time)

Posted in Demos, Sense8

WTK CA-Unicenter Demo

One of Sense8’s classic “business” demos was a proposed 3D user interface on top of Computer Associates’ flagship Unicenter product. Branded as CA-Unicenter TNG, the demo would allow you to “fly” from a spinning map of the globe, down to a map of your IT network and ultimately inside servers to address issues or add additional software.

https://techmonitor.ai/technology/ca_uses_explorer_as_unicenter_tng_web_interface

I was never sure if this interface actually shipped with Unicenter or was just a technology demonstrator. From what I can tell from this article in ITPro Today, it was available as the “WorldView” 3D interface.

https://www.itprotoday.com/compute-engines/unicenter-tng

And this Xerox brochure includes an image of the WTK GUI.

Later in my career, while selling HP OpenView, CA-Unicenter would be a perennial competitor. I must say, the 3D interface never came up in customer discussions so I suspect the metaphor did not become a major differentiator. But it was a good story at the time.

Posted in Demos, Sense8

WTK Sailing Demo

“Sailing” is another classic WorldToolKit demo. While it seems like a simple demo, it has quite a bit of calculation going on behind the scenes.

From the demo’s readme:

The goal of this demo is to sail around the bay and see some of the sights (GG bridge, city, buoys, etc).

This simulation was built in about 5 weeks by Sense8 and includes the following features:

- Physically based wind simulation
- Vertex manipulation in the sail due to wind forces
- 3D sound (on platforms supporting it)
- Support for head-mounted displays, Spaceball, and trackers (when deployed)

Normally when running this demo, I use the number keys to turn the boat around so that I'm sailing for the Golden Gate Bridge.  When heading straight into the bridge the sail will begin luffing.  This shows the vertex manipulation feature along with the physically-based wind calculations.
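The luffing behavior the readme describes (the driving force on the sail vanishing as the boat points into the wind) can be modeled very roughly. This is a hypothetical Python sketch under a crude sine-law assumption, not the demo's actual wind code.

```python
import math

# Toy model of luffing: the sail's driving force depends on the angle
# between the boat's heading and the direction the wind comes from.
# Heading dead into the wind leaves no driving force and the sail luffs.
# Hypothetical illustration only.

def sail_drive(heading_deg, wind_from_deg):
    """Rough driving force: zero pointing into the wind, max on a beam reach."""
    angle_off_wind = abs((heading_deg - wind_from_deg + 180) % 360 - 180)
    return math.sin(math.radians(min(angle_off_wind, 90)))

def is_luffing(heading_deg, wind_from_deg, threshold=0.3):
    return sail_drive(heading_deg, wind_from_deg) < threshold

print(is_luffing(0, 0))    # True: heading dead into the wind
print(is_luffing(90, 0))   # False: beam reach, sail full
```

In the actual demo the same idea drove the vertex manipulation: as the drive force fell off near head-to-wind, the sail geometry was deformed to show the flapping.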

It was quite the go-to demo over the years, ported to virtually every available platform of the period, but I believe it was originally developed as the centerpiece of Sense8’s presence at the 1995 Siggraph. At that show, it would have been set up to leverage a Fakespace Boom.

According to the University of Washington HIT Lab website:

Fakespace Labs have developed a full 6 DOF tracking stereoscopic display that is not mounted on an individual’s head. Using mechanical tracking, computations of the viewer’s position occur faster. Larger displays can be used because their weight and size are not noticed due to the counterbalance of the boom; therefore higher resolution is allowed. Viewers can just walk up to the boom, place their face into the optics and move around, like a pair of binoculars attached to a skyscraper. Buttons are provided for interaction with the virtual world. With the boom it is easy to go from real to virtual and back again because there is no head gear to put on or take off.

HIT Lab website

While not the same application, this picture illustrates the Fakespace Boom in action:

From further references, it appears the demo was also used at I/ITSEC, the military simulation conference.

Posted in Demos, Sense8

Absent from VR history

When I picked up the Sunday NY Times in my driveway on November 7, 2015, I believed the world was about to change. That morning, the Times, in partnership with Google, distributed 1.3 million Cardboard VR headsets. A generation of people would be exposed to VR and the world would never be the same.

Between 2020 and 2022, Meta/Facebook sold 15 million Quest 2 headsets. We are squarely in the middle of a new generation of VR usage. (We have 3 Quest 2 headsets in our house.)

This upsurge of interest in virtual reality has me nostalgic for my time in the first VR wave, back in the 1990s (technically, this might have been the second or third wave). But when I started to research on the web, I would see names like Jaron Lanier, Ivan Sutherland, John Carmack and even the Nintendo Virtual Boy, and I was upset that the articles didn’t mention my corner of the universe (Sense8, Gemini, etc.)

Here are some of those histories:

I realized that a good reason these stories didn’t cover Sense8 and Gemini was that there are very few artifacts available on the web. No images. No videos. No stories. So the goal of this blog is to rectify that situation. I found a collection of old DVDs with source code, executables, demos and notes. I am going to cull through this info and post what I remember…and whatever I can actually run on a computer 25 years in the future.

Posted in Uncategorized