Absent from VR history

When I picked up the Sunday NY Times in my driveway on November 7, 2015, I believed the world was about to change. That morning, the Times, in partnership with Google, distributed 1.3 million Cardboard VR headsets. A generation of people would be exposed to VR and the world would never be the same.

Between 2020 and 2022, Meta/Facebook sold 15 million Quest 2 headsets. We are squarely in the middle of a new generation of VR usage. (We have 3 Quest 2 headsets in our house.)

This upsurge of interest in virtual reality has me nostalgic for my time in the first VR wave, back in the 1990s (technically, that might have been the second or third wave). But when I started to research on the web, I would see names like Jaron Lanier, Ivan Sutherland, John Carmack, and even the Nintendo Virtual Boy, and I was upset that the articles didn’t mention my corner of the universe (Sense8, Gemini, etc.).

Here are some of those histories:

I realized that a good reason these stories didn’t cover Sense8 and Gemini was that there are very few artifacts available on the web. No images. No videos. No stories. So the goal of this blog is to rectify that situation. I found a collection of old DVDs with source code, executables, demos and notes. I am going to cull through this material and post what I remember…and whatever I can actually get running on a computer 25 years in the future.

Posted in Uncategorized | 1 Comment

VR in the 90s

Clearly, I am not the only one recalling the world of VR in the 90s. Ben Delaney released this book back in 2017.

https://www.vrinthe90s.com/

Bob Stone referenced this in a LinkedIn post where he provides an overview of VR in the 90s from a UK perspective. The writeup he posted can be found here, too.

Posted in Bob Stone | Leave a comment

Bob Stone’s VRS memories

More from Prof Bob Stone

Can’t believe it’s 30 years to the month that Andrew Connell and I launched the world’s first collaborative VR initiative – VRS (Virtual Reality & Simulation) from our base at the time, Advanced Robotics Research Limited (the operating company behind the UK’s Advanced Robotics Research Centre in Salford). This was the first VR programme in the world that was fully funded by the industrial sector in an attempt to share experiences before committing to adoption. The “try-before-you-buy” Initiative was officially launched by the Lord Wade of Chorlton and was initially supported by Bell Northern Research (Europe), British Nuclear Fuels plc, GEC Alsthom Engineering Systems Limited, Hunting Engineering Limited, ICI Chemicals & Polymers Limited, M W Barber Group Limited (an SME involved in surveying and civil engineering), Multi-Design Consultants Limited (architectural design SME), North West Water Group, Rolls-Royce plc, United Kingdom Nirex Limited, University of Salford, Vickers Shipbuilding and Engineering Limited (today part of the BAE Systems empire) and Westlakes Research Institute. Later collaborators included the NHS and Sainsburys. Some amazing concept projects were delivered in that time, with a good number convincing the sponsors to go on to adopt VR in their businesses. Real pioneering days!

from LinkedIn 7/23/23
Posted in Bob Stone | Leave a comment

Some 90s memories from Bob Stone

On LinkedIn

Posted in Bob Stone | Leave a comment

SpaceRocks Lives

SpaceRocks was my ongoing Sense8 WorldToolKit demo app back in the 90s. While I no longer have access to the WTK libraries, I have been working to bring it back to life. My latest version (Dec 2021), for the Oculus Quest 2 and built with Unity, can be found here:

https://paynecentral.com/files/SpaceRocksVR.apk (42MB)

Here is a link to my original SpaceRocks demo, built back in 1997 when I was working for Sense8. The idea was to create a tribute to the classic Asteroids game, built in 3D and using photos from the Hubble Space Telescope as textures for the background and asteroids.

Instructions for “sideloading” an APK file to your Quest can be found with a quick internet search, but here is one option:

https://uploadvr.com/sideloading-quest-how-to/

Here are some screenshots:

Lots of Space Rocks!
Nebulae
Annoying Alien
Posted in Uncategorized | Leave a comment

Virtual History

Professor Bob Stone recently found a 1999 paper on VR for UNESCO. I was excited to see that the paper referenced the Stonehenge work he had done for the UK heritage folks with WTK.

https://www.linkedin.com/posts/prof-bob-stone-21b86918_unesco-vr-paper-1999-activity-7025394499233140736-NkcG?utm_source=share&utm_medium=member_desktop

Here is the paper stored locally:

http://paynecentral.com/vr90/wp-content/uploads/sites/7/2023/02/UNESCO_VR_1999.pdf

Posted in Sense8 | Leave a comment

3dfx

I never worked for 3dfx but do recall working with a number of their people, especially Ross Smith.

Other people mentioned:

This video clearly provides an alternative point of view:

Tech Tales

3Dfx Demo reels:

I do have one of their boards in my collection/museum. Not sure what I can do with it these days.

tom’s 3dfx board
Posted in Other Players | Leave a comment

OpenGVS Demos

I found a bunch of OpenGVS demos on a YouTube page curated by Gary Donovan. He appears to have been either involved with or a fan of 3Dfx and Quantum3D (the company that bought Gemini Technologies).

Posted in Demos, Gemini | 1 Comment

Radiosity in Real Time Graphics

I found a number of Sense8/WTK demos in a directory labeled “radiocity”. A number of the files were in NFF format while others were in WRL (VRML) format. The NFF files were referenced in a readme noting they were developed in “Real Light from ATMA Rendering Systems srl in Milan, Italy”. I am pretty sure the WRL files were examples from Lightscape.

hallV1.wrl

Radiosity involves pre-calculating lighting effects and storing them in the model. In the NFF models, this was captured in the color of the individual polygons.

This sounded quite similar to what is now known as “baking in” the lighting. I asked chat.openai.com to explain the difference:

Radiosity was one of the first techniques used in baked lighting, although it has been updated and improved over time. The radiosity method was first developed in the 1980s as a way of simulating the diffuse global illumination of a 3D environment. It was initially used in computer graphics and animation, and it was later adapted for use in game development.

In the early days of game development, radiosity was used to calculate the static lighting of a scene, which was then baked into the game engine. This allowed games to have more realistic lighting effects, without the need for real-time lighting calculations, which were computationally expensive and would have slowed down the game.

However, as game technology has advanced and become more powerful, more advanced lighting techniques and algorithms have been developed, such as photon mapping, path tracing, and voxel cone tracing. These methods can produce more accurate and realistic lighting effects, but they also require more computational power. As a result, radiosity is still used in some game engines, but it is not the only method used for baked lighting.

chat.openai.com explanation
bathroom.nff
labv1.wrl
wireframe lab

You can see in the wireframe rendering of “labV1.wrl” how the lighting is captured directly into the geometry model (particularly in the upward wall sconces and the reflected sunlight around the windows).
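
To make the idea concrete, here is a minimal sketch of the modern “baked lighting” equivalent, assuming nothing about the original demos: a simple diffuse (Lambertian) term is pre-computed per vertex and folded into the vertex colour, so the model can later be drawn with real-time lighting switched off. True radiosity goes further and also accounts for light bounced between surfaces; the function names and values below are hypothetical.

    import math

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def bake_vertex_colour(normal, base_colour, light_dir, ambient=0.15):
        # Lambertian diffuse term, clamped to [0, 1], folded into the stored colour
        n = normalize(normal)
        l = normalize(light_dir)
        diffuse = max(0.0, sum(a * b for a, b in zip(n, l)))
        intensity = min(1.0, ambient + diffuse)
        return tuple(int(c * intensity) for c in base_colour)

    # Example: an upward-facing vertex, base colour (200, 180, 160), lit from above
    print(bake_vertex_colour((0, 1, 0), (200, 180, 160), (0.3, 1.0, 0.2)))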

Roy Latham’s Real Time Graphics newsletter from April/May 1998 featured one of the radiosity models, touting the capabilities of a new ELSA board:

Real Time Graphics

It was great to find this same model in the radiocity directory!

chamberv1.wrl

I recall this being a pretty popular “benchmark” model at the time. Surprisingly, the only other current reference to this model I can find is in a 1996 VRML whitepaper.

Posted in Demos, Sense8 | Leave a comment

Virtual Stonehenge

The following is courtesy of Professor Robert Stone, who led this project for VR Solutions/Virtual Presence in the mid-1990s.


English Heritage: Virtual Stonehenge

“…the largest and most challenging PC-based heritage reconstruction carried out to date”

Virtual Heritage Conference & Exhibition, December 1996

In 1995, English Heritage completed the most intensive survey of the Stonehenge area ever undertaken, generating a large database of information.  It is the nature of databases that, whilst they contain much information that is significant or useful, this information is difficult to differentiate.  English Heritage saw VR as a possible solution to their problem. The brief to VP Group was to produce a high quality and accurate record of the stones and their environs in their present state.  Whilst not designed to replace the real experience, the visualisation was to be detailed enough to allow people to “walk” amongst the stones and inspect the different textures in 3D – something the general public is no longer allowed to do.

During the initial project review stage, Intel Corporation (UK) approached English Heritage with an offer to co-sponsor the Project, through their Community Liaison Programme. In conjunction with Intergraph (UK), Intel selected the Pentium Pro-based TDZ/GLZ Workstation series, on which the model was to be developed and finally demonstrated.

Before the team could begin the time-consuming process of inputting all the information from English Heritage’s digital survey into Sense8’s VR package, WorldToolKit, a surface representation of each stone was manually built up from point data extrapolated from hundreds of stereo photographs. Around 60,000 points made up each of the 80 or so surveyed stones. This figure had to be reduced to allow real-time rendering to take place on the target Intergraph platforms. A painstaking manual process gradually “decimated” these point data. The result was 10 separate models of each stone, the level of detail on each chosen to correspond to a variety of end user viewing distances – the further away, the lower the level of detail. At run-time the software selects the 5 most appropriate levels of detail, based on the characteristics of the computer being used.
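
That run-time behaviour amounts to a distance-based switch between the pre-decimated meshes. Here is a rough Python illustration of the idea only; WorldToolKit had its own level-of-detail mechanism, and the ranges and polygon counts below are invented, not taken from the project.

    from dataclasses import dataclass

    @dataclass
    class StoneLOD:
        max_distance: float   # hypothetical switch-in range, in metres
        polygon_count: int    # size of this version of the stone mesh

    # Several versions of one stone, finest first, coarsest last
    stone_lods = [
        StoneLOD(10.0, 6000),
        StoneLOD(25.0, 3000),
        StoneLOD(50.0, 1500),
        StoneLOD(100.0, 700),
        StoneLOD(250.0, 300),
        StoneLOD(1e9, 100),
    ]

    def select_lod(viewer_distance):
        # Return the first (most detailed) version whose range covers the viewer distance
        for lod in stone_lods:
            if viewer_distance <= lod.max_distance:
                return lod
        return stone_lods[-1]

    print(select_lod(35.0).polygon_count)   # -> 1500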

One from each of a stereo pair of photographs was then digitised, processed and texture mapped onto the geometry of the relevant stone. Even small surface features such as cracks, lichens and fungi are clearly visible. The full version of the Stonehenge model requires 80 Mb of texture RAM. Lower resolution versions (26 Mb and 8 Mb) have, however, been produced. Stonehenge’s virtual landscape was created from digital topographic information derived from aerial photography and boasts all the features contained within the real area – barrows, ditches, roads, the Avenue and the current Visitors’ Centre. In geometric terms, the entire model contained 50,000 polygons – 40,000 of these described the stones and immediate terrain, the remaining 10,000 occupying the more distant terrain (area: 2.5 x 2.5 kilometres).

Other historical features – ditches, banks and the like – have been geometrically exaggerated, otherwise they would not be visible to the user at normal eye height in the virtual world. It took four developers six months to complete the Project. Virtual Stonehenge was launched at the London Planetarium on June 20, 1996, a few hours before the actual Summer Solstice. Following a description of the Solstice by the renowned astronomer and celebrity Patrick Moore, English Heritage’s Chairman, Sir Jocelyn Stevens, donned a VR headset and set off to explore Virtual Stonehenge, pausing to view the night-time sky (see below), the real-time sunrise (see below) and to remove all 20th Century man-made artefacts, returning the site to a near-“virgin” condition, as is planned for the year 2000.

Astronomical Mapping. The basic source data for the star positions were originally downloaded from the Internet using Right Ascension and Declination for stars with a greater Apparent Visual Magnitude than 3.55. This form was chosen, as it was not practical to represent the stars according to their “real world” positions (in this case the virtual world would have had a bounding box measured in light years!). Right Ascension is measured in hours (24) and had to be converted to degrees, and Declination was measured in degrees (-90° to +90°). These can be thought of as the longitude and latitude lines that span the Earth. The star positions were then projected onto a sphere surrounding the Stonehenge model from the celestial equator (i.e. centre of the earth). Once the stars were spherically projected, they were scaled according to their Apparent Visual Magnitude. The position of Stonehenge from the centre of the Earth then had to be taken into consideration as the “star sphere” was being projected from the celestial equator. This involved shifting the whole star sphere and then spinning the sphere around an axis close to the North Star.
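
A quick Python sketch makes that projection explicit. This is not the original code: the sphere radius and scaling constants are made up, and the final shift of the sphere for Stonehenge’s position and the spin around the polar axis are omitted.

    import math

    SKY_RADIUS = 1000.0   # radius of the virtual star sphere (arbitrary value)

    def star_position(ra_hours, dec_degrees, radius=SKY_RADIUS):
        # Right Ascension: 24 hours span 360 degrees, so 1 hour = 15 degrees
        ra = math.radians(ra_hours * 15.0)
        dec = math.radians(dec_degrees)       # Declination: -90 to +90 degrees
        x = radius * math.cos(dec) * math.cos(ra)
        y = radius * math.cos(dec) * math.sin(ra)
        z = radius * math.sin(dec)
        return (x, y, z)

    def star_scale(magnitude, cutoff=3.55):
        # Lower magnitude = brighter star = larger marker; scaling is hypothetical
        return max(0.2, (cutoff - magnitude) * 0.5 + 0.2)

    # Example: Polaris is at roughly RA 2.5 h, Dec +89.3 degrees, magnitude ~2.0
    print(star_position(2.5, 89.3), star_scale(2.0))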

Sunrise Effect. Various methods of achieving a real-time sunrise effect were discussed. It was decided that a method based on using smooth shaded ellipses would take full advantage of the Intergraph hardware and Sense8’s WorldToolKit. For the sunrise effect to work, there were two objects primarily interacting with each other – the hemisphere surrounding Stonehenge (the sky) and a “virtual” sun. The sun had various parameters which could be set. The actual sun object can be thought of as a number of bands of differing circumference (each representing a different fixed colour) centred around a point. If one imagines a point travelling from the centre of the bands to the outside, the colour of the point would gradually change. If the distance between bands is increased and the point is travelling at the same speed, then the perceived change in colour would be less obvious. This is useful when one wishes to fade gradually from night to day over a long period, and for the extreme effects when the fade changes from red to yellow over very short distances. The interpolation between the different colour bands was computed and stored in a colour look-up table to optimise execution time. Each point in the hemisphere (sky dome) was individually coloured according to its distance from the sun. This was achieved through interaction between the sun object and the sky dome. The virtual sun was initially placed at its furthest band distance from the dome and then gradually moved inwards. This created the essence of the sunrise effect. To further enhance the effect of a genuine sunrise, the spheres (bands) were changed to ellipses, thus recreating the atmospheric refraction that is seen on Earth. Through the use of an ASCII text file, the developers were then able to experiment with various parameters for each band (e.g. the number of bands, the colour, individual radii, the number of colour interpolations and the three-dimensional elliptical shape used). Using this method meant the demonstration takes advantage of Gouraud shading, thereby using a minimum amount of texture memory and keeping run-time efficiency at an optimum level.
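
The banded-sun / look-up-table approach is easy to sketch. The fragment below is only an illustration of the idea described above, not the original WorldToolKit code: the band radii and colours are invented, and the real demo used ellipses rather than circles and applied the result through per-vertex Gouraud shading.

    import math

    # (radius, RGB colour) pairs: yellow near the sun, fading through red to night sky
    BANDS = [
        (0.0,   (255, 240, 180)),
        (50.0,  (255, 170,  60)),
        (120.0, (180,  40,  40)),
        (400.0, ( 10,  10,  40)),
    ]

    def build_lut(steps=256):
        # Pre-interpolate the band colours into a look-up table, as the demo did
        # to keep the per-vertex work cheap at run time
        lut = []
        max_r = BANDS[-1][0]
        for i in range(steps):
            r = max_r * i / (steps - 1)
            for (r0, c0), (r1, c1) in zip(BANDS, BANDS[1:]):
                if r <= r1:
                    t = (r - r0) / (r1 - r0) if r1 > r0 else 0.0
                    lut.append(tuple(int(a + t * (b - a)) for a, b in zip(c0, c1)))
                    break
        return lut

    def vertex_colour(vertex, sun_pos, lut):
        # Colour a sky-dome vertex by its distance from the virtual sun
        d = math.dist(vertex, sun_pos)
        max_r = BANDS[-1][0]
        idx = min(len(lut) - 1, int(d / max_r * (len(lut) - 1)))
        return lut[idx]

    lut = build_lut()
    print(vertex_colour((0.0, 80.0, 0.0), (0.0, 0.0, 0.0), lut))

Moving the virtual sun steadily closer to the dome and re-colouring the vertices each frame gives the sunrise; stretching the circular bands into ellipses approximates atmospheric refraction near the horizon.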

Posted in Bob Stone, Sense8 | Leave a comment

WTK Stonehenge

In 1996, Intel and English Heritage sponsored a virtual reality simulation of Stonehenge built with WorldToolKit.

According to Robert Stone’s YouTube post, this was “the first ever Virtual Stonehenge demo created for English Heritage and presented by the late Sir Patrick Moore at the 1996 Summer Solstice (June) at the London Planetarium”, while a subsequent post noted this was “merely to publicize Intel’s Pentium Pro-based TDZ/GLZ Workstations”

From what I can tell from examining the various data files included in the demo, a lot of attention was paid to recreating the star fields above the ancient monument along with the sunrise…all intrinsic elements of the monument.

Getting the application to work again was pretty challenging. Many of the file locations were hardcoded to specific directories; I actually needed to use a hex editor on the executable to find some of these locations. And while the monument was modeled with 4 “levels of detail”, for some reason the stones would mysteriously disappear as I approached the monolith.

And while I have always been fascinated by Stonehenge, what really intrigues me about this model is the recreation of the Visitor Center. Access to the monument has been greatly curtailed. While the monument has remained the same for centuries, the visitor center from 1996 is long gone, replaced sometime in 2013.

I am not sure if these turnstiles are from this picture, but since you are no longer allowed to physically approach the stones, I am certain they have been permanently removed.

The simulation includes a model of the walkway/ramp that allowed visitors to safely walk under the nearby A360 motorway. The walkway includes an image of the complete monument, while the recreated stones can be seen in the distance.

Proposed A360 Walkway

UPDATE

According to Professor Bob Stone, the creator of the experience, the project was primarily educational, but part of the reason for it was to explore the idea of removing the old Visitor Center.

With and without the old Visitor Center

The Players:

  • Prof Bob Stone: “Yep, I led this project with the VR Solutions/Virtual Presence lot, liaising with English Heritage to get the stone circle and historical elements right (before it was hijacked by Intel to show off their Pentium Pro chipset). Still have the project details, images and grainy video of the project too!”
  • Andrew Connell: “That was one of the last Sense8 apps I wrote myself. I remember late night hacking to get the sunrise and star effects working for the planetarium launch. We used the phrase ‘mathematically accurate’ on the press info at the time, but mostly I just fiddled with it until I liked the effect! But it was an achievement to get it all working back in 95 using laser scan models and decent resolution unique image sets for every stone.”
  • Glenn Johnson: “I recall ‘starting’ to work on those cloud meshes for Andrew Connell. Not enough hours in the day :)”
Posted in Bob Stone, Demos, Sense8 | 1 Comment