
Professor Alan Chalmers

Alan Chalmers is working to create Real Virtuality - true high-fidelity, multi-sensory virtual environments - at Warwick Digital Lab. An expert in human perception in virtual environments, Professor Chalmers uses novel multi-sensory techniques to render 'as there' virtual reality environments, delivering a high-fidelity, multi-sensory experience in real time on existing technology.

By exploiting this understanding of human perception, Professor Chalmers and his team are able to substantially reduce the amount of computing power needed to deliver a wide range of complex, physically-based, multi-sensory environments.

He said: "Human perception is good, but it's not that good. For example, when people look at an image, they aren't seeing the whole picture all at once: they are concentrating on parts of the image related to the task they are doing. This means that we don't have to use valuable computing power to deliver in high quality what they are not attending to."

Professor Chalmers builds up a pattern of where the human is attending based on:

  • what we know about the perceptual importance of objects in a scene: motion, certain smells, the intensity of sounds, and so on;
  • what can be discovered about what the subject is doing in the environment: e.g. a fire warden would notice fire extinguishers along a corridor before anything else; and
  • how the sensory stimuli are propagated within the environment.

It is then possible to deliver only a portion of the scene at high quality and the remainder at a much lower quality, and the result will be indistinguishable, to the user, from the environment delivered entirely at high quality. The savings in computing power can be used to ensure a real-time experience so that, as a user, say, turns their head, they see the corresponding view immediately, without loss of quality and with no delay.
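The idea of spending the rendering budget where attention is focused can be sketched in a few lines. This is a hypothetical illustration, not Professor Chalmers' actual system: objects are scored by simple saliency cues (motion, task relevance, stimulus intensity), and only the top-scoring fraction is rendered at high quality. All names, weights, and the scoring formula are illustrative assumptions.

```python
# Hypothetical sketch of attention-driven quality allocation. The cues and
# weights below are illustrative assumptions, not the published method.

def attention_score(obj, task_objects):
    """Combine simple saliency cues into one importance score."""
    score = 0.0
    score += 2.0 if obj.get("moving") else 0.0            # motion attracts gaze
    score += 3.0 if obj["name"] in task_objects else 0.0  # task relevance (e.g. fire warden)
    score += obj.get("stimulus_intensity", 0.0)           # e.g. loudness of a sound source
    return score

def allocate_quality(scene, task_objects, high_quality_fraction=0.2):
    """Render the top-scoring fraction of objects at high quality, the rest low."""
    ranked = sorted(scene, key=lambda o: attention_score(o, task_objects),
                    reverse=True)
    cutoff = max(1, int(len(ranked) * high_quality_fraction))
    return {o["name"]: ("high" if i < cutoff else "low")
            for i, o in enumerate(ranked)}

scene = [
    {"name": "fire extinguisher", "moving": False},
    {"name": "poster", "moving": False},
    {"name": "person walking", "moving": True},
    {"name": "door", "moving": False},
    {"name": "alarm bell", "stimulus_intensity": 1.5},
]
# A fire warden's task makes the extinguisher salient; motion makes the
# walking person salient; the rest can be rendered at low quality.
quality = allocate_quality(scene, task_objects={"fire extinguisher"},
                           high_quality_fraction=0.4)
```

In a real renderer the "quality" label would drive level of detail, sampling rate, or shading complexity per object; the saving comes from everything assigned to the "low" tier.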

Professor Chalmers adds: "What's really important is that these images are based on real things - they are not artists’ impressions. In order for this to be used in industry, we have to be able to prove that we are simulating the exact multi-sensory environment as it would be in life, otherwise the results will be meaningless."

The project has applications in a number of fields:

  • in training simulators: pilots or drivers can have an authentic experience, such as glare off displays, the smell of burning cabling, or realistic dust flows as they land a helicopter;
  • in industry: new products can be designed as if they were real to experience how users might react to them in the context for which they are intended;
  • in architecture: building comfort levels, including the temperature of the offices, the smell from the toilets on a hot day, and the sound from the stairwell, can be tested knowing that the result will be true to reality.

The key to this research is its multi-sensory nature. Professor Chalmers explains: "The real world is multi-sensory. The interaction of our senses, the so called crossmodal effects, can significantly influence our perception of an environment and thus possibly our behavior. A classic example is the ventriloquism effect in which we are fooled into thinking that the sound source emanates from the dummy, when of course it doesn't."

Another important aspect is the need to deliver real-world lighting to the virtual environment. This is achieved through High Dynamic Range (HDR) techniques. Professor Chalmers elaborates: "HDR visuals introduce a step change in image quality compared to traditional imaging techniques, enabling the wide range of real-world lighting to be captured and delivered digitally. If a human eye can see something, HDR technology can capture and display it, without any over- or under-exposure, even on conventional displays. The impact is enormous, for example, the ability to clearly see an object when it moves from shadow into bright sunshine. Furthermore, HDR provides depth perception, without the need for any 3D glasses."
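A common way to capture the wide range of real-world lighting is to merge several bracketed exposures of the same scene into one radiance estimate per pixel. The sketch below assumes a linear camera response and uses a simple "hat" weight that trusts mid-range pixel values over near-black or near-white ones; real HDR pipelines (including camera-based ones like the HDR video work mentioned here) also recover the camera's response curve, which this toy example omits.

```python
# Minimal sketch of merging bracketed exposures into an HDR radiance value,
# assuming a linear camera response. Values are illustrative.

def hat_weight(v):
    """Trust mid-range values; distrust near-black and near-white pixels."""
    return min(v, 1.0 - v)

def merge_exposures(pixels, exposure_times):
    """pixels: the same scene point at several exposures, each in [0, 1].
    Returns a weighted average of per-exposure radiance estimates."""
    num = den = 0.0
    for p, t in zip(pixels, exposure_times):
        w = hat_weight(p)
        num += w * (p / t)   # radiance estimate implied by this exposure
        den += w
    return num / den if den > 0 else 0.0

# A bright scene point: nearly clipped at 1/30 s, well exposed at 1/250 s,
# dark at 1/2000 s. The merge leans on the well-exposed middle frame.
radiance = merge_exposures([0.98, 0.55, 0.12], [1 / 30, 1 / 250, 1 / 2000])
```

The recovered radiance is in arbitrary linear units; displaying it on a conventional screen then requires a tone-mapping step to compress the range back into what the display can show.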

To achieve Real Virtuality, Professor Chalmers has brought together experts in virtual environments, neuroscience, perception, engineering, computer science, and systems modeling. This unique combination of complementary skills enables the team to capitalize on the intrinsic properties of the human perceptual system (limitations, resolution, illusions, crossmodalities) to deliver in the virtual environment, in real time and at high quality, only those parts of the real scene to which the user is attending. This high-quality delivery will be perceptually equivalent in fidelity to the real scene being depicted. The remainder will be delivered at a much lower quality, without the user being aware of the difference.

Alan Chalmers has an MSc with distinction from Rhodes University (1985) and a PhD from the University of Bristol (1991). He has published over 190 papers in journals and international conferences on multi-sensory perception, high-fidelity graphics, High Dynamic Range (HDR) imaging, virtual archaeology, and parallel rendering. He is Honorary President of Afrigraph and a former Vice President of ACM SIGGRAPH. Together with SpheronVR, a high-precision German camera company, he was instrumental in the development of the world's first HDR video camera, completed in July 2009. He is the Founder and a Director of the spin-out company goHDR Ltd., which aims to be a leader in software that enables HDR technology.