Simulator Sickness


  • “Simulator sickness” refers to symptoms of discomfort that arise from using simulated environments.
  • Conflicts between the visual and bodily senses are the cause.
  • Numerous factors contribute to simulator sickness, including:
    • Acceleration—minimize the size and frequency of accelerations
    • Degree of control—don’t take control away from the user
    • Duration of simulator use—allow and encourage users to take breaks
    • Altitude—avoid filling the field of view with the ground
    • Binocular disparity—some find viewing stereoscopic images uncomfortable
    • Field of view—reducing the amount of visual field covered by the virtual environment may also reduce simulator sickness
    • Latency—minimize it; lags/dropped frames are uncomfortable in VR
    • Distortion correction—use Oculus VR’s distortion shaders
    • Flicker—do not display flashing images or fine repeating textures
    • Experience—experience with VR makes you resistant to simulator sickness (which makes developers inappropriate test subjects)
  • Locking the background to the player’s inertial reference frame has been found to be effective at reducing simulator sickness.
  • Various methods are currently being explored for greater comfort in VR.
  • The SSQ can be used as a means of gathering data on how comfortable your experience is.


Simulator sickness is a form of induced motion sickness, which differs from your everyday motion sickness. Whereas the motion sickness with which people are most familiar results from actual motion (such as the bobbing of a boat that causes seasickness), the primary feelings of discomfort associated with simulator sickness occur when visual information from a simulated environment signals self-motion in the absence of any actual movement. In either case, there are conflicts among the visual, vestibular (balance), and proprioceptive (bodily position) senses that give rise to discomfort. Furthermore, simulator sickness includes symptoms that are unique to using a virtual environment, such as eye strain/fatigue (though not necessarily for the same reason as bodily discomfort). Some users might experience some degree of simulator sickness after a short period of time in a headset, while others may never experience it.

Simulator sickness poses a comfort problem to users and developers alike. No matter how fundamentally appealing your content is or how badly a user wants to enjoy it, almost no one wants to endure the discomfort of simulator sickness. Therefore, it is extremely important to understand its causes and implement strategies to minimize its occurrence. The exact causes of simulator sickness (and in fact all forms of motion sickness) are still being researched. Simulator sickness has a complex etiology of factors that are sufficient but not necessary for inducing discomfort, and maximizing user comfort in the VR experience requires addressing them all.

Simulator sickness has a constellation of symptoms, but is primarily characterized by disorientation (including ataxia, a sense of disrupted balance), nausea (believed to stem from vection, the illusory perception of self-motion) and oculomotor discomfort (e.g., eyestrain). These are reflected in the subscales of the simulator sickness questionnaire (SSQ),[1] which researchers have used to assess symptomatology in users of virtual environments.

Factors Contributing to Simulator Sickness

It can be difficult to track down a particular cause for simulator sickness. Different users will have different experiences, sensitivity to different types of stimuli can vary, and the symptoms can take a while (anywhere from minutes to hours) to manifest. As a VR designer, you will be spending long periods of time immersed in VR, and long exposure to virtual environments can train the brain to be less sensitive to their effects.[2] As such, dedicated VR developers will be less susceptible to simulator sickness than most users. Objectively predicting whether a user will experience discomfort from your content without obtaining feedback from inexperienced users can be difficult.

Motion sickness susceptibility varies in the population and correlates with the intensity of simulator sickness experiences.[3] This means users who know they tend to experience motion sickness in vehicles, rides, and other contexts should approach using VR carefully, and you should alert users to this point in your warnings and instructions. Applying the recommendations throughout this manual can help reduce the possibility that users will experience simulator sickness.

The following section lists factors that have been studied as potential contributors to simulator sickness. Some factors are less under the designer’s control than others, but understanding them can help you minimize user discomfort. Also note that some of this information overlaps with other sections, but this section offers more detailed explanations of their role in simulator sickness.

Speed of Movement and Acceleration

Speed of movement is directly proportional to the speed of onset of simulator sickness, but not necessarily the subsequent intensity or rate of increase.[4] Although slower movement speeds will generally feel more comfortable, the most important issue is acceleration, which is the stimulus to which the inner ear vestibular organs respond. Acceleration (linear or angular, in any direction) conveyed visually but not to the vestibular organs constitutes a sensory conflict that can cause discomfort. An instantaneous burst of acceleration is more comfortable than an extended, gradual acceleration to the same movement velocity.

Discomfort will increase as a function of the frequency, size, and duration of acceleration. Because any period of visually-presented acceleration represents a period of conflict between the senses, it is best to avoid them as much as possible.
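The tradeoff above can be made concrete with a quick calculation. The following hypothetical sketch (illustrative only, not SDK code) counts how many frames of visually presented acceleration—and therefore of visual-vestibular conflict—each locomotion profile exposes the user to at a 90 Hz refresh rate:

```python
# Sketch: two locomotion profiles reaching the same target speed.
# Every frame in which the camera's speed is still changing presents
# acceleration to the user, i.e., a frame of sensory conflict.

def frames_of_acceleration(target_speed: float, ramp_time: float,
                           dt: float = 1 / 90) -> int:
    """Count frames during which the camera's speed is still changing."""
    frames = 0
    speed = 0.0
    while speed < target_speed:
        speed = min(target_speed, speed + (target_speed / ramp_time) * dt)
        frames += 1
    return frames

# A one-frame "instant" velocity step yields a single frame of conflict;
# a one-second ease-in yields roughly 90 frames of conflict at 90 Hz.
instant = frames_of_acceleration(3.0, ramp_time=1 / 90)
gradual = frames_of_acceleration(3.0, ramp_time=1.0)
```

This is why a near-instantaneous change to the target velocity is generally more comfortable than an extended, gradual ramp to the same speed.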

Degree of Control

Taking control of the camera away from the user or causing it to move in ways not initiated by the user can lead to simulator sickness. Some theories suggest the ability to anticipate and control the motion experienced plays a role in staving off motion sickness,[5] and this principle appears to hold true for simulator sickness as well. Therefore, unexpected camera movement (or cessation of movement) outside the user’s control can be uncomfortable. Having an avatar that foreshadows impending camera movement can help users anticipate and prepare for the visual motion, potentially improving the comfort of the experience.[6]

If you have a significant event for the user to watch (such as a cut scene or critical environmental event), avoid moving their gaze for them. Instead, encourage users to move their own gaze, for example by having non-player characters (NPCs) look toward the scene or event, by cuing users to events with sound effects, or by placing some task-relevant target (such as enemies or pick-ups) near it.

As stated previously, do not decouple the user’s movements from the camera’s movements in the virtual environment.

Duration

The longer you remain in a virtual environment, the more likely you are to experience simulator sickness. Users should always have the freedom to suspend their game, then return to the exact point where they left off at their leisure. Well-timed suggestions to take a break, such as at save points or breaks in the action, are also a good reminder for users who might otherwise lose track of time.
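One way to act on this advice is to track session time and surface a break suggestion only at natural pauses, so the reminder never interrupts play. The helper below is a hypothetical sketch (the class and parameter names are illustrative, not part of any SDK):

```python
# Sketch: suggest a break at save points or lulls in the action once
# enough session time has elapsed, instead of interrupting gameplay.

class BreakReminder:
    def __init__(self, interval_s: float = 30 * 60):
        self.interval_s = interval_s   # suggested play time between breaks
        self.elapsed_s = 0.0

    def tick(self, dt: float) -> None:
        """Call once per frame with the frame's delta time in seconds."""
        self.elapsed_s += dt

    def should_suggest_break(self, at_natural_pause: bool) -> bool:
        # Only surface the suggestion at save points / breaks in the action.
        if at_natural_pause and self.elapsed_s >= self.interval_s:
            self.elapsed_s = 0.0       # reset the timer after suggesting
            return True
        return False
```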

Altitude

The altitude of the user — that is, the height of the user’s point of view (POV) — can be an indirect factor in simulator sickness. The lower the user’s POV, the more rapidly the ground plane changes and fills the user’s FOV, creating a more intense display of visual flow. This can be uncomfortable for the same reason that moving up a staircase can: both produce an intense visual flow across the visual field.

Binocular Display

Although binocular disparity is one of the Rift’s key and compelling depth cues, it is not without its costs. As described in Binocular Vision, Stereoscopic Imaging and Depth Cues, stereoscopic images can force the eyes to converge on one point in depth while the lens of the eye accommodates (focuses itself) to another. Although you will necessarily make use of the full range of depth in VR, it is important to place content on which you know users will be focusing for extended periods of time (such as menus or a 3rd-person avatar) in a range of 0.75 to 3.5 Unity units (meters) away.

Some people find viewing stereoscopic images uncomfortable, and research has suggested that reducing the degree of disparity between the images (i.e., reducing the inter-camera distance) to create a monoscopic[7] (i.e., zero-inter-camera distance) or microstereoscopic[8] (i.e., reduced inter-camera distance) display can make the experience more comfortable. In the Rift, it is important that any scaling of the IPD is applied to the entire head model.

As stated elsewhere, you should set the inter-camera distance in the Rift to the user’s IPD from the config tool to achieve a veridical perception of depth and scale. Any scaling factors applied to eye separation (camera distance) must be also applied to the entire head model so that head movements correspond to the appropriate movements of the virtual rendering cameras.
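The requirement above can be sketched as a single function that applies one uniform scale to both the eye separation and the head model. This is an illustrative sketch only; the names and the simplified head-model representation are assumptions, not an actual SDK API:

```python
# Sketch: any scale applied to the inter-camera distance must also be
# applied to the whole head model, so that head movements correspond to
# the appropriate movements of the virtual rendering cameras.

def scaled_rig(ipd_m: float, head_model_m: tuple, scale: float):
    """Return (left_eye, right_eye, head_model) with one uniform scale.

    ipd_m:        user's interpupillary distance from the config tool
    head_model_m: (x, y, z) offset from the neck pivot to the eye midpoint
    scale:        1.0 for veridical depth and scale perception
    """
    half_ipd = 0.5 * ipd_m * scale
    head_model = tuple(c * scale for c in head_model_m)
    left_eye = (-half_ipd, 0.0, 0.0)   # offsets from the eye midpoint
    right_eye = (half_ipd, 0.0, 0.0)
    return left_eye, right_eye, head_model
```

Scaling only the eye separation while leaving the head model untouched would make head movements inconsistent with the rendered disparity, which is exactly the mismatch this rule exists to prevent.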

Field of View

Field of view can refer to two distinct things: the area of the visual field subtended by the display (which we call “display FOV” or dFOV in this guide), and the area of the virtual environment that the graphics engine draws to the display (which we call “camera FOV” or cFOV).

A wide dFOV is more likely to contribute to simulator sickness primarily for two reasons related to the perception of motion. First, motion perception is more sensitive in the periphery, making users particularly susceptible to effects from both optic flow and subtle flicker in peripheral regions. Second, a larger display FOV, when used in its entirety, provides the visual system with more input than a smaller display FOV. When that much visual input suggests to the user that they are moving, it represents an intense conflict with bodily (i.e., vestibular and proprioceptive) senses, leading to discomfort.

Reducing display FOV can reduce simulator sickness,[9] but also reduces the level of immersion and situational awareness with the Rift. To best accommodate more sensitive users who might prefer that compromise, you should allow for user-adjustable display FOV. Visibility of on-screen content should not be adversely affected by changing display FOV.
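A user-adjustable display FOV might be implemented as a vignette that narrows the visible region while keeping HUD content inside it. The sketch below is hypothetical (function names, the 100° full FOV, and the 30° floor are illustrative assumptions):

```python
# Sketch: a user-adjustable FOV restrictor for sensitive users.

FULL_FOV_DEG = 100.0   # assumed full display FOV for illustration

def vignette_radius(comfort_fov_deg: float) -> float:
    """Fraction of the display radius left visible (1.0 = unrestricted)."""
    fov = max(30.0, min(comfort_fov_deg, FULL_FOV_DEG))  # clamp to sane range
    return fov / FULL_FOV_DEG

def clamp_hud_to_fov(hud_angle_deg: float, comfort_fov_deg: float) -> float:
    """Keep a HUD element's angular offset inside the visible cone, so
    shrinking the display FOV never hides important on-screen content."""
    half = 0.5 * min(comfort_fov_deg, FULL_FOV_DEG)
    return max(-half, min(hud_angle_deg, half))
```

The second helper addresses the caveat above: whatever restriction the user chooses, visibility of essential content should not be adversely affected.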

Having a cockpit or vehicle obscuring much of the vection-inducing motion in the periphery may also confer a similar benefit for the same reasons. Note also that the smaller the user’s view of their environment, the more they will have to move their head or virtual cameras to maintain situational awareness, which can also increase discomfort.

Manipulating camera FOV can lead to unnatural movement of the virtual environment in response to head movements (for example, if a 10° rotation of the head creates a rotation of the virtual world that would normally require a 15° rotation in reality). In addition to being discomforting, this can also cause a temporary but maladaptive condition known as vestibular-ocular reflex (VOR) gain adaptation.[10] Your eyes and vestibular system normally work together to determine how much the eyes must move during a head movement in order to maintain stable fixation on an object. If the virtual environment causes this reflex to fail to maintain stable fixation, it can lead to an uncomfortable re-calibration process both inside the Rift and after terminating use.

Latency and Lag

Although developers have no control over many aspects of system latency (such as display updating rate and hardware latencies), it is important to make sure your VR experience does not lag or drop frames on a system that meets minimum technical specifications. Many games can slow down as a result of numerous or more complex elements being processed and rendered to the screen. While this is a minor annoyance in traditional video games, it can have an uncomfortable effect on users in VR.

Past research findings on the effects of latency are somewhat mixed. Many experts recommend minimizing latency to reduce simulator sickness because lag between head movements and corresponding updates on the display can lead to sensory conflicts and errors in the vestibular-ocular reflex. We therefore encourage minimizing latency as much as possible.

It is worth noting that some research with head-mounted displays suggests a fixed latency creates about the same degree of simulator sickness whether it is as short as 48 ms or as long as 300 ms,[11] whereas variable and unpredictable latencies in cockpit and driving simulators create more discomfort the longer they become on average.[12] In other words, people can eventually adapt to a consistent, predictable lag, but not to one that fluctuates unpredictably.

Still, adjusting to latency (and other discrepancies between the real world and VR) can be an uncomfortable process that leads to further discomfort when the user adjusts back to the real world outside of VR. The experience is similar to getting on and off a cruise ship. After a period feeling seasick from the rocking of the boat, many people become used to the regular, oscillatory motion and the seasickness subsides; however, upon returning to solid land, many of those same people will actually experience a “disembarkment sickness” as the body has to readjust once again to its new environment.[13]

The less you have to make the body adjust to entering and exiting VR, the better. Developers are urged to use the Performance HUD and Oculus Debug Tool to measure motion-to-photon latency to ensure it is as short and consistent as possible. Further documentation on its use is available in the SDK.

Distortion Correction

The lenses in the Rift distort the image shown on the display, and this is corrected by the post-processing steps given in the SDK. It is extremely important that this distortion be done correctly and according to the SDK’s guidelines and the example demos provided. Incorrect distortion can “look” fairly correct, but still feel disorienting and uncomfortable, so attention to the details is paramount. All of the distortion correction values need to match the physical device—none of them may be user-adjustable (the SDK demos allow you to play with them just to show what is happening behind the scenes).

We carefully tune our distortion settings to the optics of the Rift lenses and are continually working on ways of improving distortion tuning even further. All developers must use the official Oculus VR distortion settings to correctly display content on the Rift.

Flicker

Flicker plays a significant role in the oculomotor component of simulator sickness. It can be worsened by high luminance levels, and is perceived most strongly in the periphery of your field of view. Although flicker can become less consciously noticeable over time, it can still lead to headaches and eyestrain.

Although they provide many advantages for VR, OLED displays carry with them some degree of flicker, similar to CRT displays. Different people have different levels of sensitivity, but the 90 Hz display panels of the Rift are fast enough that the majority of users will not perceive any noticeable flicker. This is more or less out of your hands as a developer, but it is included here for completeness.

Your responsibility is to refrain from creating purposely flickering content. High-contrast, flashing (or rapidly alternating) stimuli can trigger photosensitive seizures in some people, as can high-spatial-frequency textures (such as fine black-and-white stripes). The International Organization for Standardization (ISO) has been studying photosensitive seizures and image safety and is in the process of developing a standard to reduce the risk of photosensitive seizures from image content. The standard addresses potentially harmful flashes and patterns. You must ensure that your content conforms to standards and best practices on image safety.

Experience

The more experience users have had with a virtual environment, the less likely they are to experience simulator sickness.[14] Theories for this effect involve learned—sometimes unconscious—mechanisms that allow the user to better handle the novel experience of VR. For example, the brain learns to reinterpret visual anomalies that previously induced discomfort, and user movements become more stable and efficient to reduce vection. The good news is that developers should not be afraid to design intense virtual experiences for more experienced users; the bad news is that most users will need time to acclimate to the Rift and the game before they can be expected to handle those experiences.

This has a few important ramifications. First, developers who test their own games repeatedly will be much more resistant to simulator sickness than a new user, and therefore need to test the experience with a novice population with a variety of susceptibility levels to simulator sickness to assess how comfortable the experience actually is. Second, new users should not be thrown immediately into intense game experiences; you should begin them with more sedate, slower-paced interactions that ease them into the game. Even better, you should implement the recommendations in this guide for user-controlled options to adjust the intensity of the experience. Third, games that do contain intense virtual experiences should provide users with warning of the content in the game so they may approach it as they feel most comfortable.

Combating Simulator Sickness

Player-Locked Backgrounds (a.k.a. Independent Visual Backgrounds)

The simulator sickness research literature has provided at least one purely visual method of reducing simulator sickness that can be implemented in VR content. Experimenters put people in a virtual environment that either did or did not contain what they called an independent visual background.[15] This constituted a simple visual backdrop, such as a grid or skybox, that was visible through the simulator’s primary content and matched the behavior of the stable real-world environment of the user. For example, a driving simulator might indicate movement through the environment via the ground plane, trees, and buildings passing by; however, the skybox, containing a few clouds, would remain stationary in front of the user, even when the car would turn.[16] Using a virtual environment with an independent visual background has been found to significantly reduce the experience of simulator sickness compared to a virtual environment with a typically behaving background.

This combats the sensory conflict that normally leads to discomfort by allowing the viewer’s brain to form an interpretation in which the visual and vestibular senses are consistent: the user is indeed stationary with the background environment, but the foreground environment is moving around the user. Our particular implementation has used a player-locked skybox that is rendered at a distance farther away than the main environment which the player navigates. A variety of backdrops appear to be effective in our preliminary testing, ranging from realistic (a sea, horizon line, and clouded sky above) to artificial (a black, grid-lined box). As soon as the player begins any locomotion or rotation in the foreground environment with a controller or keyboard, they will notice that the distant backdrop remains stationary, locked to their real-world body’s position. However, they can still look around the backdrop with head movements at any time. The overall effect is that the player feels like they are in a gigantic “room” created by the backdrop, and the main foreground environment is simply moving around them.
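The core of this implementation is that the background camera ignores virtual locomotion and responds only to real head tracking. The sketch below illustrates the idea in 2-D (yaw and planar translation only) with hypothetical names; it is a simplification, not engine code:

```python
import math

# Sketch: a player-locked ("independent visual") background. The
# foreground is drawn with the full locomotion transform composed with
# the tracked head pose; the distant backdrop uses the head pose only,
# so it stays locked to the player's real-world inertial frame.

def compose(yaw_a, pos_a, yaw_b, pos_b):
    """Apply transform B (e.g., the head pose) on top of transform A."""
    c, s = math.cos(yaw_a), math.sin(yaw_a)
    x = pos_a[0] + c * pos_b[0] - s * pos_b[1]
    y = pos_a[1] + s * pos_b[0] + c * pos_b[1]
    return yaw_a + yaw_b, (x, y)

def camera_poses(locomotion_yaw, locomotion_pos, head_yaw, head_pos):
    # Foreground camera: virtual locomotion plus tracked head movement.
    foreground = compose(locomotion_yaw, locomotion_pos, head_yaw, head_pos)
    # Background camera: tracked head movement only; locomotion ignored.
    background = (head_yaw, head_pos)
    return foreground, background
```

Because the head pose still drives the background camera, the player can look around the backdrop freely with head movements, while controller-driven locomotion moves only the foreground environment around them.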

This method has been found to be effective in reducing simulator sickness in a variety of technologies, and the Rift is no exception. However, this method is not without its limitations. The sickness-reducing effect is contingent upon two factors: the visibility of the background, and the degree to which it is perceived as further out from the player than the foreground environment. Not all virtual environments will be outdoors or otherwise somewhere where a player-locked background will be readily visible and intuitively make sense.

These practical limitations motivated us to attempt applying our grid-lined room pattern to all virtual environments as a translucent overlay, using binocular disparity and aerial perspective (i.e., fog) as depth cues that the grid is far off in the distance. Although this generally felt effective, this can potentially reduce the user’s suspension of disbelief. In addition, we found that any cues that cause the player to perceive the grid as positioned between their eyes and the foreground environment (such as making the grid opaque) abolish any benefits.

Still, employed properly, this method holds promise for allowing developers to provide a wider variety of experiences to players with less impact on comfort. Furthermore, it can also serve as a means of helping users get acclimated to the virtual environment; players might turn the locked background on when first engaging your content, then have the option to disable or attenuate the effect with time. Even the most compelling VR experience is useless if almost no one can enjoy it comfortably. Player-locked backgrounds can broaden your audience to include more sensitive users who might otherwise be unable to use your content. If an effective form of independent visual background can be implemented in your content, consider including it as a player-configurable option.

Novel Approaches

Developers have already begun exploring methods for making conventional video game experiences as comfortable in VR as they are on a computer screen. What follows are descriptions of a few of the methods we have seen to date. Although they may not be compatible or effective with your particular content, we include them for your consideration.

Because locomotion leads to vection and, in turn, discomfort, some developers have experimented with using various means of teleporting the player between different locations to move them through a space. Although this method can be effective at reducing simulator sickness, users can lose their bearings and become disoriented.[17]
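One common variant wraps the instantaneous position change in a brief fade to hide the discontinuity. The helper below is a hypothetical sketch (names and callback structure are illustrative), not a description of any particular title's implementation:

```python
# Sketch: teleport locomotion wrapped in a fade. Fading to black hides
# the instantaneous position change; because teleportation can still be
# disorienting, keeping targets within the user's current view helps
# them retain their bearings.

def teleport(player_pos, target_pos, fade_out, fade_in):
    """Move the player instantly, bracketed by fade callbacks."""
    fade_out()                      # hide the discontinuity
    player_pos = tuple(target_pos)  # no intermediate motion, so no vection
    fade_in()                       # reveal the new viewpoint
    return player_pos
```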

Some variants attempt to reduce the amount of vection the user experiences through manipulations of the camera. An alternative take on the “teleportation” model pulls the user out of first-person view into a “god mode” view of the environment with the player’s avatar inside it. The player moves the avatar to a new position, then returns to first-person view from the new perspective.

Yet another approach modifies the way users turn in the virtual environment. Rather than smoothly rotating, pressing left or right on a controller causes the camera to immediately jump by a fixed angle (e.g., 30°) in the desired direction. The idea is to minimize the amount of vection to which the user is exposed during rotation, while also generating a regular, predictable movement to prevent disorientation.
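This "snap" turning scheme reduces to a one-line rotation step. The sketch below uses an assumed 30° increment for illustration:

```python
# Sketch: snap turning. A press of left/right instantly rotates the
# view by a fixed increment instead of rotating smoothly, minimizing
# the duration of visually presented rotation (and thus vection) while
# keeping each turn regular and predictable.

SNAP_ANGLE_DEG = 30.0   # fixed increment; an illustrative choice

def snap_turn(current_yaw_deg: float, direction: int) -> float:
    """direction: +1 for a right turn, -1 for a left turn."""
    return (current_yaw_deg + direction * SNAP_ANGLE_DEG) % 360.0
```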

Measurement and Testing

A wide variety of techniques have been used in the measurement and evaluation of simulator sickness. On the more technical side, indirect measurements have included galvanic skin response, electroencephalogram (EEG), electrogastrogram (EGG), and postural stability. Perhaps the most frequently used method in the research literature, however, is a simple survey: the simulator sickness questionnaire (SSQ).

Like any other questionnaire, the SSQ carries some inherent limitations surrounding the validity of people’s self-reported insights into their own minds and bodies. However, the SSQ also has numerous advantages. Unlike indirect, physiological measures, the SSQ requires no special equipment or training—just a pen-and-paper and some arithmetic. Anyone can deliver the questionnaire, compute scores, and interpret those scores based on past data. For respondents, the questionnaire is short and simple, taking only a minute of time out of a playtest. The SSQ therefore provides a lot of informational value for very little cost to the tester, and is one potential option for assessing comfort in playtesting.
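As a concrete illustration, the arithmetic behind the SSQ can be sketched as follows, using the subscale weights from Kennedy et al. (1993).[1] This sketch assumes you have already summed the 0–3 symptom item ratings into the three raw subscale totals (items can contribute to more than one subscale); verify the item-to-subscale assignments against the original paper before using it in a playtest:

```python
# Sketch: SSQ scoring from raw subscale sums (Kennedy et al., 1993).
# Each of the 16 symptom items is rated 0-3 and summed into the Nausea,
# Oculomotor, and Disorientation subscales before weighting.

def ssq_scores(nausea_raw: int, oculomotor_raw: int,
               disorientation_raw: int) -> dict:
    nausea = nausea_raw * 9.54
    oculomotor = oculomotor_raw * 7.58
    disorientation = disorientation_raw * 13.92
    # The total score is weighted from the combined raw sums, not from
    # the already-weighted subscale scores.
    total = (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74
    return {"N": nausea, "O": oculomotor, "D": disorientation, "TS": total}
```

Comparing these scores across builds, or against published baselines, gives a cheap quantitative signal of how comfortable your experience is.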

[1] Kennedy, R. S., Lane, N. E., Berbaum, K. S., & Lilienthal, M. G. (1993). Simulator sickness questionnaire: An enhanced method for quantifying simulator sickness. The International Journal of Aviation Psychology, 3(3), 203-220.

[2] Kennedy, R., Stanney, K., & Dunlap, W. (2000). Duration and exposure to virtual environments: Sickness curves during and across sessions. Presence, 9(5), 463-472.

[3] Stanney, K. M., Hale, K. S., Nahmens, I., & Kennedy, R. S. (2003). What to expect from immersive virtual environment exposure: Influences of gender, body mass index, and past experience. Human Factors, 45(3), 504-520.

[4] So, R.H.Y., Lo, W.T., & Ho, A.T.K. (2001). Effects of navigation speed on motion sickness caused by an immersive virtual environment. Human Factors, 43(3), 452-461.

[5] Rolnick, A., & Lubow, R. E. (1991). Why is the driver rarely motion sick? The role of controllability in motion sickness. Ergonomics, 34(7), 867-879.

[6] Lin, J. J., Abi-Rached, H., & Lahav, M. (2004, April). Virtual guiding avatar: An effective procedure to reduce simulator sickness in virtual environments. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 719-726). ACM.

[7] Ehrlich, J.A. & Singer, M.J. (1996). Simulator sickness in stereoscopic vs. monoscopic helmet mounted displays. In: Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting.

[8] Siegel, M., & Nagata, S. (2000). Just Enough Reality: Comfortable 3-D Viewing. IEEE Transactions on Circuits and Systems for Video Technology, 10(3), 387–396.

[9] Draper, M.H., Viire, E.S., Furness, T.A., & Gawron, V.J. (2001). Effects of image scale and system time delay on simulator sickness within head-coupled virtual environments. Human Factors, 43 (1), 129-146.

[10] Stoffregen, T.A., Draper, M.H., Kennedy, R.S., & Compton, D. (2002). Vestibular adaptation and aftereffects. In Stanney, K.M. (ed.), Handbook of virtual environments: Design, implementation, and applications (pp.773-790). Mahwah, New Jersey: Lawrence Erlbaum Associates, Publishers.

[11] Draper, M.H., Viire, E.S., Furness, T.A., & Gawron, V.J. (2001). Effects of image scale and system time delay on simulator sickness within head-coupled virtual environments. Human Factors, 43(1), 129-146.

[12] Kolasinski, E.M. (1995). Simulator sickness in virtual environments (ARTI-TR-1027). Alexandria, VA: Army Research Institute for the Behavioral and Social Sciences.

[13] Reason, J.T. & Brand, J.J. (1975). Motion Sickness. Academic Press, Inc.

[14] Welch, R.B. (2002). Adapting to virtual environments. In Stanney, K.M. (ed.). Handbook of Virtual Environments: Design, Implementation, and Application. Lawrence Erlbaum Associates, Publishers: Mahwah, NJ.

[15] Prothero, J.D., Draper, M.H., Furness, T.A., Parker, D.E., & Wells, M.J. (1999). The use of an independent visual background to reduce simulator side-effects. Aviation, Space, and Environmental Medicine, 70(3), 135-187.

[16] Lin, J. J.-W., Abi-Rached, H., Kim, D.-H., Parker, D.E., and Furness, T.A. (2002). A “natural” independent visual background reduced simulator sickness. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 46, 2124-2128.

[17] Bowman, D., Koller, D., & Hodges, L.F. (1997). Travel in immersive virtual environments: An evaluation of viewpoint motion control techniques. Proceedings of the Virtual Reality Annual International Symposium, pp. 45-52.