A small project done at the Virtual Reality Lab at CPDM, IISc. The slides describe an idea for mapping reality and virtual reality using electromagnetic position trackers and a stereoscopic head-mounted display. The project is a pure Virtual Reality implementation and does not depend on camera-based Augmented Reality techniques.
3. Field of View (FoV) - Eye vs HMD
[Diagram: Eye vs HMD FoV - eye angles of 62º, 20º, 15º and a full span of ~120º marked, with a central "Text" zone, against the HMD's 40º]
Wednesday, 4 June 14
4. THX HDTV Setup Guidelines - 40º FoV
Differentiable areas of the eye's FoV - things that can be done:
✓ Identify text written at a far distance
✓ Identify shapes at a medium distance
✓ Identify color at a near distance
5. Challenges [1]
Due to the limited FoV of the HMD, we expect challenges in the following tasks:
• Identification of text and shapes at a near distance
• Identification of color and contrast at a far distance
6. Variable FoV of Eye
[Diagram: the eye's FoV is wide or narrow depending on the eye's look-at point]
8. Challenges [2]
• The human eye's FoV (viewport) varies according to the point the eye is looking at.
• In graphics, the camera viewport is fixed, since the point the eye is looking at is not known.
• Challenges are therefore expected in mapping the real world to the virtual world because of this dynamic FoV/viewport.
9. Setup
• Design a tabletop environment for simple interaction tasks that map the physical world to the virtual world.
• A purely geometric approach to colocating the real and virtual worlds - no Augmented Reality!
• Using an nVis SX60 HMD, Polhemus trackers and a 100 cm × 80 cm table - and vector algebra!
10. The problem of Colocation
Physical World
• The tracker world
• Gives the position and orientation of the table and the head.
• Each tracker receiver has its own frame of reference.
• The trackers have their own frames of reference.
Virtual World
• The OpenGL world
• Graphics has its own frame of reference.
• The position and orientation of the camera w.r.t. the head must be identified.
11. Table Frame to GL Frame
• Coordinates obtained from the table receiver are converted to OpenGL world coordinates via the Tab2GL transform.
[Diagram: receiver frame and GL frame, each with its own x-y-z axes and (0,0,0) origin, related by Tab2GL]
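The receiver-to-GL conversion can be sketched as a 4×4 homogeneous transform. The axis convention (tracker z-up vs OpenGL y-up) and the offset values below are illustrative assumptions, not the lab's actual Tab2GL calibration:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical Tab2GL calibration: the table receiver is z-up while OpenGL is
# y-up, with the table origin offset below the GL origin (placeholder values,
# not the lab's measured calibration).
R_tab2gl = np.array([[1.0, 0.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [0.0, -1.0, 0.0]])
t_tab2gl = np.array([0.0, -0.75, 0.0])
Tab2GL = make_transform(R_tab2gl, t_tab2gl)

def tab_to_gl(p_tab):
    """Map a point from table-receiver coordinates to OpenGL world coordinates."""
    p = np.append(np.asarray(p_tab, dtype=float), 1.0)  # homogeneous point
    return (Tab2GL @ p)[:3]
```

Keeping the transform as a single 4×4 matrix means it composes with the later Head2Tab and Head2LCam/Head2RCam transforms by plain matrix multiplication.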
14. Head Tracker to Table
[Diagram: table frame (x, y, z, origin at (0,0,0)) and head-tracker frame, related by Head2Tab]
• To nullify the effect of the head tracker's mis-orientation
• Align the tracker frame with the table frame
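Nullifying the mis-orientation amounts to reading the tracker's orientation in a known reference pose and multiplying later readings by its inverse. A minimal sketch, with a hypothetical 90º mounting offset standing in for the real calibration reading:

```python
import numpy as np

# Hypothetical calibration reading: with the head tracker placed in a known
# reference pose on the table, its reported orientation should be the identity;
# here it is mounted 90º off about z (placeholder, not the lab's measurement).
R_meas_at_calib = np.array([[0.0, -1.0, 0.0],
                            [1.0,  0.0, 0.0],
                            [0.0,  0.0, 1.0]])

# Correction that nullifies the mounting mis-orientation: the inverse of the
# calibration reading (for a rotation matrix, the inverse is its transpose).
R_corr = R_meas_at_calib.T

def head_to_table(R_raw):
    """Express a raw head-tracker orientation in the table frame (Head2Tab)."""
    return R_corr @ R_raw
```

Applying `head_to_table` to the calibration reading itself must return the identity, which is a cheap sanity check on the alignment.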
17. Head Tracker to L/R Camera (Eye)
[Diagram: table frame and left-camera frame, related by Head2LCam and Head2RCam]
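The Head2LCam / Head2RCam transforms can be sketched as fixed eye offsets from the head-tracker origin, rotated by the head's orientation. The interpupillary distance and offset values below are assumptions for illustration, not the nVis SX60's actual geometry:

```python
import numpy as np

IPD = 0.064  # assumed interpupillary distance in metres (a typical adult value)

# Fixed offsets of the left/right camera (eye) from the head-tracker origin,
# expressed in the head frame. Values are illustrative placeholders.
HEAD2LCAM = np.array([-IPD / 2, -0.08, -0.10])
HEAD2RCAM = np.array([+IPD / 2, -0.08, -0.10])

def eye_positions(R_head, p_head):
    """Given head orientation (3x3) and position in the table frame,
    return the left and right camera positions in the table frame."""
    left = p_head + R_head @ HEAD2LCAM
    right = p_head + R_head @ HEAD2RCAM
    return left, right
```

Rendering the scene once from each returned position gives the stereo pair for the HMD's left and right displays.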
18. Head Tracking: Yaw, Pitch, Roll
• Based on the position and orientation of the head, the view of the scene can be changed.
• Look around - look closer
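Turning the tracker's yaw/pitch/roll into a view direction can be sketched as composed axis rotations. The rotation order below is one common convention and may not match the Polhemus tracker's; OpenGL's convention of the camera looking down −z is assumed:

```python
import numpy as np

def ypr_to_matrix(yaw, pitch, roll):
    """Compose a rotation from yaw (about y), pitch (about x) and roll (about z),
    applied in the order Rz(roll), then Rx(pitch), then Ry(yaw). This is one
    common convention; the tracker's own convention may differ."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    Rz = np.array([[cr, -sr, 0.0], [sr, cr, 0.0], [0.0, 0.0, 1.0]])
    return Ry @ Rx @ Rz

def look_direction(yaw, pitch, roll):
    """Forward vector of the camera (OpenGL cameras look down -z)."""
    return ypr_to_matrix(yaw, pitch, roll) @ np.array([0.0, 0.0, -1.0])
```

Feeding the head pose through this each frame is what lets the rendered view follow "look around - look closer".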
19. Colocation
[Figure: left-eye and right-eye views of the real world and the virtual world; the actual 3D view may differ based on head orientation]
20. Issues
• Limited HMD FoV: 40º
• Increasing the FoV beyond 40º skews the scene and impedes proper depth perception.
• The present working area is therefore restricted to a 40º FoV for a realistic view and depth perception.
• Variable viewport
• The eye's viewport/FoV changes dynamically according to the point the eye is looking at; this is not possible in the virtual world, as we do not know where the real eye is looking in the scene.
• Binocular eye trackers solve this problem; they are in the process of being acquired.
21. Issues
• Non-smooth EM tracker data
• Position/orientation data obtained via the electromagnetic trackers is noisy.
• Scene rendering is highly dependent on tracker data - from multiple trackers at that.
• The rendered scene appears jittery.
• Solution: use Kalman filters to smooth the data.
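A scalar Kalman filter with a constant-position model is a minimal version of the proposed smoothing, applied independently to each position/orientation channel of each tracker. The noise parameters below are illustrative tuning values, not the lab's:

```python
def kalman_smooth(measurements, q=1e-4, r=1e-2):
    """Scalar Kalman filter (constant-position model) over one tracker channel.
    q: process-noise variance, r: measurement-noise variance (tuning values
    here are illustrative placeholders)."""
    x, p = measurements[0], 1.0      # initial state estimate and covariance
    out = []
    for z in measurements:
        p = p + q                    # predict: covariance grows by process noise
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with the measurement residual
        p = (1.0 - k) * p
        out.append(x)
    return out
```

Lower `r` trusts the tracker more (less lag, more jitter); higher `r` smooths harder at the cost of lag, which is the trade-off to tune against the visible jitter in the rendered scene.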