You can make realistic buildings with VR, but what about the people who populate your presentations? New facial and eye tracking tools aim to bring avatars to life.
Virtual reality has transformed the way architects experience their creations, but even the most advanced systems struggle to mimic the intimacy of real-life human interaction. Effective collaboration and immersion in a 3D world are harder to achieve when you’re surrounded by blank-faced avatars looking like characters in a 1980s video game.
VR device maker HTC is changing all that with new facial and eye tracking tools intended to help users more easily convey emotion when plugged into its business VR headset, the Vive Focus 3.
According to a release from the company, the Focus 3 Facial Tracker device attaches to the Vive headset via a USB port and features a mono tracking camera that accurately captures facial expressions and mouth movements across the lips, jaw, cheeks, chin, teeth, and tongue. These can be mapped onto avatars in real time in VR collaboration software to convey emotions more naturally and help other users read intentions.
Fabian Nappenbach, director of product marketing at HTC Vive, said: ‘Capturing 38 different blend shapes across the lower face enables users to enhance their avatars in VR collaboration tools like VIVE Sync and precisely capture true-to-life facial expressions and mouth movements.’
The 60Hz tracking rate and ‘optimised runtime for facial tracking’ synchronise lips to voice with minimal latency, so when a client tells you you’ve ruined their vision for the pool house you’ll feel that sense of crushing despair even more immediately.
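To show roughly how those tracked blend shapes might end up driving an avatar, here is a minimal C++ sketch. It assumes a hypothetical tracker interface that delivers 38 normalised weights per 60Hz frame; the names are illustrative placeholders, not HTC’s actual SDK.

```cpp
// Illustrative sketch only: assumes the tracker delivers per-frame weights for
// 38 lower-face blend shapes (lips, jaw, cheeks, chin, teeth, tongue) at 60Hz,
// and shows how those weights would be copied onto an avatar's morph targets.
// All names are hypothetical placeholders, not HTC's actual SDK identifiers.
#include <array>
#include <cstddef>
#include <iostream>

constexpr std::size_t kBlendShapeCount = 38; // figure quoted by HTC for the lower face

// One tracked frame: a normalised weight (0.0-1.0) per blend shape.
struct FacialFrame {
    std::array<float, kBlendShapeCount> weights{};
};

// Minimal stand-in for an avatar whose face is driven by morph targets.
class AvatarFace {
public:
    void applyFrame(const FacialFrame& frame) {
        // In a real integration each weight would be forwarded to the matching
        // morph target in Unity or Unreal; here we simply store the values.
        morph_weights_ = frame.weights;
    }
    float weight(std::size_t i) const { return morph_weights_.at(i); }
private:
    std::array<float, kBlendShapeCount> morph_weights_{};
};

int main() {
    FacialFrame frame;
    frame.weights[0] = 0.8f;  // e.g. a "jaw open" weight reported by the camera
    AvatarFace avatar;
    avatar.applyFrame(frame); // called once per 60Hz tracking update
    std::cout << "First morph target weight: " << avatar.weight(0) << '\n';
}
```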
The Focus 3 Eye Tracker is a lightweight module that attaches magnetically to the headset, using a dual camera to capture realistic eye movements and blinks, which are projected onto avatars. This helps better convey collaborators’ non-verbal cues and the focus of their attention – are they directing it to your beautifully detailed brickwork or simply gazing into virtual space?
The technology captures data on gaze origin and direction, pupil size and position, and eye openness, which we assume could help designers understand how clients experience a project in VR and hone their designs in response. It could also prove useful for designers looking to measure and adapt their presentation skills.
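As a rough illustration of how that gaze data could be put to work, the sketch below accumulates how long a viewer dwells on each named element of a scene. The data layout, the raycast helper and the element names are hypothetical stand-ins, not part of HTC’s SDK.

```cpp
// Illustrative sketch only: assumes the eye tracker reports gaze origin and
// direction, pupil size and eye openness each frame, and tallies dwell time
// per scene element so a designer can see where a client's attention goes.
#include <iostream>
#include <map>
#include <optional>
#include <string>

struct Vec3 { float x, y, z; };

struct GazeSample {
    Vec3  origin;        // where the gaze ray starts, in scene coordinates
    Vec3  direction;     // normalised gaze direction
    float pupil_size_mm; // reported pupil diameter
    float eye_openness;  // 0.0 (closed) to 1.0 (fully open)
};

// Stub scene query for illustration: a real implementation would raycast the
// gaze ray against the engine's geometry and return the element it hits.
std::optional<std::string> raycastScene(const Vec3& /*origin*/, const Vec3& direction) {
    return direction.z > 0.0f ? std::optional<std::string>{"brickwork_facade"}
                              : std::nullopt;
}

// Accumulates how long the viewer dwells on each named scene element.
class AttentionLog {
public:
    // Call once per tracking frame; dt is the frame interval in seconds.
    void addSample(const GazeSample& sample, float dt) {
        if (sample.eye_openness < 0.5f) return; // skip blinks
        if (auto hit = raycastScene(sample.origin, sample.direction)) {
            dwell_seconds_[*hit] += dt;
        }
    }
    const std::map<std::string, float>& dwellTimes() const { return dwell_seconds_; }
private:
    std::map<std::string, float> dwell_seconds_;
};

int main() {
    AttentionLog log;
    GazeSample sample{{0.0f, 1.6f, 0.0f}, {0.0f, 0.0f, 1.0f}, 3.5f, 1.0f};
    for (int frame = 0; frame < 60; ++frame) log.addSample(sample, 1.0f / 60.0f);
    for (const auto& [element, seconds] : log.dwellTimes())
        std::cout << element << ": " << seconds << " s\n";
}
```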
According to HTC, eye tracking can also lighten GPU loads by prioritising rendering in the areas where users are actually focusing.
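Conceptually this is foveated rendering: detail stays high where the eye is pointed and drops in the periphery. The sketch below is a made-up tiering of shading rates by angular distance from the gaze point, intended only to illustrate the idea rather than how the Vive runtime actually allocates GPU work.

```cpp
// Illustrative sketch only: reduce shading detail as a screen region moves
// further from the tracked gaze point. Thresholds and divisors are invented.
#include <cmath>
#include <iostream>

// Angular distance (in degrees) between the gaze point and a screen region,
// both given in normalised screen coordinates; a crude linear approximation.
float eccentricityDeg(float gaze_x, float gaze_y, float region_x, float region_y,
                      float fov_deg) {
    const float dx = region_x - gaze_x;
    const float dy = region_y - gaze_y;
    return std::sqrt(dx * dx + dy * dy) * fov_deg;
}

// Pick a shading-rate divisor: full detail at the fovea, coarser in the periphery.
int shadingRateDivisor(float eccentricity_deg) {
    if (eccentricity_deg < 5.0f)  return 1; // foveal region: full resolution
    if (eccentricity_deg < 15.0f) return 2; // near periphery: half rate
    return 4;                               // far periphery: quarter rate
}

int main() {
    const float fov_deg = 100.0f; // illustrative horizontal field of view
    // Gaze at the screen centre; query a region towards the edge of the image.
    const float ecc = eccentricityDeg(0.5f, 0.5f, 0.9f, 0.5f, fov_deg);
    std::cout << "Eccentricity: " << ecc << " deg, shading divisor: "
              << shadingRateDivisor(ecc) << "\n";
}
```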
Commenting on the benefits of the tools, David Munro, senior consultant at development consultancy Cityscape Digital, said: ‘From a planning point of view it helps us to validate the visual impact of a newly proposed building from a human perspective. From a vision and early design viewpoint it helps us to connect with our end user and appreciate what is important to them. This leads to real engagement with the audience and ultimately enables us to build better places.’
HTC Vive’s eye and facial trackers offer Unity, Unreal and native SDK support and cost £216 and £83 respectively.