# It might not be compatible with tridef 3d software

Ok, this is a great bit of misconception about how VR renders. The commonly repeated refrain (not from you, necessarily) is that you need to do twice the work to render "two frames." This isn't true.

To begin with, during the rendering pipeline, many parts of your scene can be reused wholesale for each viewport. Example: past about 20' you lose the ability to discern stereoscopy and fine parallax, meaning, in layman's terms, that things that are really far away essentially appear flat in our vision. My left eye is not seeing things very differently from my right eye at 100' away.

So you render in multiple passes to keep your load down: render everything far away once, copy it to both viewports, then render only the near-view objects in stereoscopy twice. This is still a significant performance hit, but it's not the same as rendering two frames. Further, you literally aren't rendering two frames; you have one single frame buffer at all times.

As for sickness: "VR sickness," sometimes erroneously called motion sickness, is actually an umbrella term for several similar sicknesses caused by different things. Some people get vertigo from playing VR - technically not a fault of the tech (it's acting as intended) but rather of poor software design. I can easily induce vertigo in a demo by placing someone high up, just as you would get vertigo doing the same thing in real life.
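The "flat past ~20 feet" point is just geometry: the angular disparity between the two eyes shrinks roughly in proportion to distance. A minimal sketch, assuming an average adult interpupillary distance of about 65 mm (my assumption, not a figure from the post):

```python
import math

# Assumed average adult interpupillary distance (IPD), in metres.
# ~65 mm is a commonly cited figure; used here only for illustration.
IPD_M = 0.065
FOOT = 0.3048  # metres per foot

def disparity_deg(distance_m: float) -> float:
    """Angular disparity (degrees) between the two eyes' lines of sight
    to a point straight ahead at the given distance."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

for feet in (1, 5, 20, 100):
    print(f"{feet:>4} ft -> {disparity_deg(feet * FOOT):.3f} deg")
```

At 20' the disparity is already down to a fraction of a degree, and at 100' it is several times smaller still, which is why distant geometry can be shared between the eyes with no visible loss.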
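The multi-pass idea above can be sketched as simple draw-call accounting: far geometry is rendered once and shared by both viewports, while only near geometry is rendered per eye. The 20' cutoff and the object distances below are made-up illustration values, and this deliberately ignores the cost of the copy/composite step and engine-specific tricks like single-pass instanced stereo:

```python
# Beyond roughly this distance, stereo disparity is negligible,
# so geometry can be rendered once and shared (illustrative cutoff).
NEAR_CUTOFF_FT = 20.0

def draw_calls_naive(object_distances_ft):
    """Brute-force stereo: every object rendered once per eye."""
    return 2 * len(object_distances_ft)

def draw_calls_multipass(object_distances_ft):
    """Far objects rendered once (shared by both viewports),
    near objects rendered once per eye."""
    near = [d for d in object_distances_ft if d < NEAR_CUTOFF_FT]
    far = [d for d in object_distances_ft if d >= NEAR_CUTOFF_FT]
    return len(far) + 2 * len(near)

# A scene dominated by distant geometry benefits the most.
scene = [3.0, 8.0, 15.0, 40.0, 120.0, 300.0, 1000.0]
print(draw_calls_naive(scene))      # 14: all 7 objects drawn twice
print(draw_calls_multipass(scene))  # 10: 4 far drawn once + 3 near drawn twice
```

The savings grow with how much of the scene sits past the cutoff, which is the point being made: stereo is not automatically "twice the work."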