

November 20, 2018
XR strategies to reduce simulation sickness
By Dave Arendash, Accenture Extended Reality (XR) Developer, and Brent Blum, Accenture Extended Reality (XR) North America Capability Lead

In simple terms, VR-related nausea or "simulation sickness" (sim-sickness for short) is caused either by a poorly designed immersive experience or by under-powered equipment that cannot deliver a believable one.

"Simulation sickness" can be a problem with #VR technology. But through #XR, we see solutions.

 
 

While a very small portion of the population may always experience discomfort (in much the same way motion sickness while flying is an issue for some people), sim-sickness is largely a legacy technology problem. It is generally caused by less-than-optimal computers, outdated video cards with low refresh rates or head-mounted displays that present choppy and unnatural movements.

Fortunately, both VR viewer technologies and developer skillsets are advancing quickly. But companies still need to prioritize a high-quality user experience from the beginning, whether they are building enterprise-grade VR solutions for immersive learning, product design simulators for employees or VR product purchasing apps for customers.

The two main approaches to minimize sim-sickness are creating a “sense of presence” in the VR experience itself and paying attention to the technical details during development.

Embedding presence into the experience

Presence can be described as the emotional and cognitive effect of a truly immersive VR experience, and it offers many benefits when done well.

The primary method is to give participants avatars so they do not feel disembodied. Depending on the requirements of the immersive environment, these can range from virtual hands all the way up to full-body avatars. A good starting point is to develop hand avatars (better than bare controller visualizations) and expand to full arms or the upper body if needed. Head avatars are generally only necessary in multi-user environments, or when the environment contains mirrors.
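As a concrete illustration, here is a minimal sketch of a hand-avatar setup using three.js and WebXR; it assumes three.js's bundled example hand-model helpers and a WebXR-capable browser, and is not tied to any particular enterprise toolchain.

```ts
import * as THREE from 'three';
import { XRHandModelFactory } from 'three/examples/jsm/webxr/XRHandModelFactory.js';

// Hand avatars via WebXR hand tracking (a sketch, not a production setup).
// The 'mesh' profile renders a skinned hand rather than bare controller
// geometry, which helps participants feel embodied.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;

const scene = new THREE.Scene();
const handFactory = new XRHandModelFactory();

for (const index of [0, 1]) {
  const hand = renderer.xr.getHand(index); // tracked joint group for each hand
  hand.add(handFactory.createHandModel(hand, 'mesh'));
  scene.add(hand);
}
```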

Another option is to use inverse kinematics to estimate where the elbow, shoulder, torso, waist and legs should be from the head and hand positions that are actually tracked. As for developing avatars, the trend is toward open specifications so that a person’s avatar can be used across multiple VR systems and experiences, such as what Oculus is providing with its cross-platform support.1
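To make the inverse-kinematics idea concrete, the sketch below shows the standard two-bone (analytic) solve for an elbow angle via the law of cosines; the function and parameter names are illustrative, not from any particular SDK.

```ts
import * as THREE from 'three';

// Two-bone analytic IK: given tracked shoulder and hand positions plus the
// upper-arm and forearm lengths, recover the elbow bend angle with the law
// of cosines. Illustrative only; real avatar rigs add joint limits and a
// heuristic for the elbow's swivel direction.
function elbowAngle(
  shoulder: THREE.Vector3,
  hand: THREE.Vector3,
  upperArm: number, // shoulder-to-elbow length in meters
  forearm: number   // elbow-to-hand length in meters
): number {
  // Clamp the reach so a fully outstretched target stays solvable.
  const reach = Math.min(shoulder.distanceTo(hand), upperArm + forearm - 1e-4);
  // Law of cosines: reach^2 = a^2 + b^2 - 2ab*cos(theta)
  const cos = (upperArm ** 2 + forearm ** 2 - reach ** 2) / (2 * upperArm * forearm);
  return Math.acos(THREE.MathUtils.clamp(cos, -1, 1)); // interior elbow angle, radians
}
```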

In a similar vein, developing a VR experience with a “virtual nose” creates a constant anchor for the participant, in much the same way that staring at the horizon provides a stable reference for someone on a boat. Wherever the participant looks, the nose follows.
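A virtual nose can be as simple as a small mesh parented to the camera so it stays head-fixed; in three.js that might look like the following sketch, where the geometry, color and placement values are rough assumptions to tune per headset.

```ts
import * as THREE from 'three';

// A crude "virtual nose": a small cone parented to the camera so it stays
// fixed in the participant's view, like a real nose. Size, color and offsets
// are guesses to tune per headset; subtlety matters more than realism.
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.01, 100);
const nose = new THREE.Mesh(
  new THREE.ConeGeometry(0.008, 0.03, 8),
  new THREE.MeshBasicMaterial({ color: 0x6b4a3a })
);
nose.rotation.x = -Math.PI / 2;      // tip points forward, away from the eyes
nose.position.set(0, -0.035, -0.07); // just below and in front of the view
camera.add(nose);

const scene = new THREE.Scene();
scene.add(camera); // the camera must be in the scene graph for its children to render
```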

Adding ambient sounds appropriate to the environment is another smart choice. These subtle cues, such as a bird chirping in an outdoor experience or a scraping noise when picking up a tool, help assure users that the experience is real, whereas silence or a sanitized audio environment is noticeably wrong. Sounds should also be spatialized correctly according to the room model or virtual environment, replicating the echo and filtering effects of natural human hearing.
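For spatialization on the web, three.js’s PositionalAudio (a wrapper around the Web Audio PannerNode) handles the distance and panning model; a minimal sketch follows, with 'chirp.ogg' as a placeholder asset.

```ts
import * as THREE from 'three';

// A spatialized ambient cue: a looping bird chirp attached to an object in
// the scene. PositionalAudio wraps the Web Audio PannerNode, so volume and
// panning follow the listener's head pose. 'chirp.ogg' is a placeholder asset.
const camera = new THREE.PerspectiveCamera();
const listener = new THREE.AudioListener();
camera.add(listener); // the listener rides on the participant's head

const bird = new THREE.Object3D(); // stand-in for a bird model placed in the scene
const chirp = new THREE.PositionalAudio(listener);
new THREE.AudioLoader().load('chirp.ogg', (buffer) => {
  chirp.setBuffer(buffer);
  chirp.setRefDistance(2); // distance at which the volume begins to fall off
  chirp.setLoop(true);
  chirp.play();
});
bird.add(chirp);
```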

A final note on presence is to navigate the participant through the experience in a human-like way, mirroring the motions people perform in daily life. Striding forward or backward, jumping up onto a box or climbing down a ladder are all normal movements, provided they are not replicated in the VR experience in a jarring way. Also take care to avoid turning the user sideways, and minimize rolling or rotational movement.
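One widely used way to minimize rotational movement is “snap turning,” which rotates the player rig in discrete steps on a thumbstick flick instead of smoothly. A sketch of the core logic follows; the rig object and input plumbing are assumptions, as every engine wires these differently.

```ts
import * as THREE from 'three';

// Snap turning: rotate the player rig in discrete steps instead of smoothly,
// avoiding the sustained rotational motion that commonly triggers sim-sickness.
// `rig` is whatever Object3D parents the camera; input wiring is app-specific.
const SNAP_ANGLE = THREE.MathUtils.degToRad(30);
const DEADZONE = 0.7;
let stickCentered = true; // require the stick to recenter between snaps

function applySnapTurn(rig: THREE.Object3D, stickX: number): void {
  if (Math.abs(stickX) < DEADZONE) {
    stickCentered = true;
    return;
  }
  if (stickCentered) {
    rig.rotateY(stickX > 0 ? -SNAP_ANGLE : SNAP_ANGLE); // one discrete turn
    stickCentered = false; // ignore input until the stick recenters
  }
}
```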

Technical tricks in development

Based on tried-and-true approaches, companies should consider these tips in the VR development phase.

Start by reducing the field of view while a participant is moving through the VR world, more closely mirroring a person’s natural field of view during motion. This minimizes peripheral vision and avoids triggering the “fight or flight” response. Several XR developer platforms offer this feature as a plug-in, and a few now automate it in their software.
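This FOV reduction is commonly implemented as a speed-driven vignette fixed to the camera. The sketch below animates the opacity of a camera-attached ring based on locomotion speed; the geometry and tuning constants are assumptions, not values from any vendor plug-in.

```ts
import * as THREE from 'three';

// Speed-driven comfort vignette: darken the periphery while the user moves,
// effectively narrowing the field of view. A camera-fixed ring whose opacity
// tracks locomotion speed; all constants here are tuning assumptions.
const vignetteMaterial = new THREE.MeshBasicMaterial({
  color: 0x000000,
  transparent: true,
  opacity: 0,
  depthTest: false, // always draw over the scene
  side: THREE.DoubleSide,
});
const vignette = new THREE.Mesh(new THREE.RingGeometry(0.35, 2.0, 32), vignetteMaterial);
vignette.position.z = -0.4; // just in front of the eyes

function attachVignette(camera: THREE.Camera): void {
  camera.add(vignette);
}

// Call once per frame with the rig's locomotion speed (m/s) and frame delta (s).
function updateVignette(speed: number, dt: number): void {
  const target = THREE.MathUtils.clamp(speed / 3, 0, 0.85); // stronger when faster
  vignetteMaterial.opacity = THREE.MathUtils.lerp(vignetteMaterial.opacity, target, 8 * dt);
}
```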

Keep the frame rate high, with an advanced graphics card and processing unit, while keeping latency low. In this context, latency is the time from when a command is given via the hand controller, or the head-mounted display moves, to when the VR experience responds. Latency depends partly on hardware, but more on software. In most cases, Wi-Fi or Bluetooth provides enough speed; however, companies developing a VR experience delivered over the Internet, or a multi-participant VR solution, may prefer a cabled connection.
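During development it helps to watch frame-time headroom directly; a minimal monitor along these lines (assuming a 90 Hz target, typical for tethered headsets) can flag regressions early.

```ts
// Minimal frame-budget monitor: at 90 Hz a frame has roughly 11.1 ms. Logging
// frames that overrun the budget surfaces performance regressions early.
// The threshold is an assumption; match it to the target headset's refresh rate.
const FRAME_BUDGET_MS = 1000 / 90;
let last = performance.now();

function tick(now: number): void {
  const delta = now - last;
  last = now;
  if (delta > FRAME_BUDGET_MS * 1.5) {
    console.warn(`Frame overran budget: ${delta.toFixed(1)} ms`);
  }
  requestAnimationFrame(tick); // a real XR app would use the XR session's loop
}
requestAnimationFrame(tick);
```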

Another good idea is to optimize scene geometry and lighting to get the fastest possible performance from the hardware being used, and to employ multi-threading as much as possible. Until recently, companies had to pre-render objects or manually group related ones to keep the VR experience performant. But new optimization techniques are breaking these boundaries. Unity, for example, recently demoed the MegaCity VR environment, which uses a high-performance engine and prefab workflows containing 4.5 million rendered objects.2
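On a web stack, one simple static-scene optimization is merging many small meshes that share a material into a single draw call. A hedged three.js sketch follows; note that `mergeGeometries` lives in three.js’s example utilities and was called `mergeBufferGeometries` in older releases.

```ts
import * as THREE from 'three';
import { mergeGeometries } from 'three/examples/jsm/utils/BufferGeometryUtils.js';

// Merge many static meshes that share one material into a single draw call,
// a common way to cut per-object CPU overhead in dense VR scenes.
function mergeStatic(meshes: THREE.Mesh[], material: THREE.Material): THREE.Mesh {
  const geometries = meshes.map((m) => {
    m.updateMatrixWorld();
    const g = m.geometry.clone();
    g.applyMatrix4(m.matrixWorld); // bake each mesh's world transform
    return g;
  });
  return new THREE.Mesh(mergeGeometries(geometries), material);
}
```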

Finally, consider developing for six degrees of freedom (6DOF) headsets, which track both position and orientation, rather than 3DOF headsets, which track orientation only; this applies even to seated VR experiences. 3DOF headsets cannot reproduce parallax (the effect whereby the position or direction of an object appears to differ when viewed from different positions), so even slight body movements can trigger sim-sickness. The newer tethered head-mounted displays from Oculus, Vive and Windows Mixed Reality offer 6DOF, while the Lenovo Mirage Solo, Google’s stand-alone Daydream and the Pico Neo provide mobile 6DOF headsets, with 6DOF hand controllers coming soon.
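On the web, the 6DOF/3DOF difference shows up directly in the pose data: a 6DOF headset reports a changing position as well as orientation. A sketch, assuming an active immersive WebXR session and WebXR type definitions:

```ts
// Reading the viewer pose in WebXR: on a 6DOF headset, `position` changes as
// the participant leans or steps; on 3DOF hardware it stays essentially fixed.
// Assumes an active immersive session and a 'local-floor' reference space.
async function logPoses(session: XRSession): Promise<void> {
  const refSpace = await session.requestReferenceSpace('local-floor');
  const onFrame = (_time: number, frame: XRFrame): void => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const p = pose.transform.position;    // 6DOF translation (meters)
      const o = pose.transform.orientation; // rotation quaternion
      console.log(
        `pos=(${p.x.toFixed(2)}, ${p.y.toFixed(2)}, ${p.z.toFixed(2)}) ` +
        `quat=(${o.x.toFixed(2)}, ${o.y.toFixed(2)}, ${o.z.toFixed(2)}, ${o.w.toFixed(2)})`
      );
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```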

Looking into the future, VR innovations to reduce sim-sickness may include galvanic vestibular stimulation, which sends small electrical signals to the vestibular nerve in the inner ear that helps maintain a person’s balance. This approach has been used in medical applications to treat motion sickness. Now researchers and VR developers, such as VMocion, are exploring ways to use galvanic vestibular stimulation in VR headsets to induce a sense of movement that matches the motion presented in the visuals.

Overall, these “human experience revolution” approaches will help fast-track your enterprise VR app development process while minimizing inadvertent side effects. To learn more, visit the Accenture Extended Reality (XR) site.


1. https://developer.oculus.com/blog/whats-next-for-oculus-avatars/

2. https://www.mcvuk.com/development/unity-unleashes-megacity-demo-millions-of-objects-in-a-huge-cyberpunk-world
