DK2 and Latency Mitigation
Cass Everitt
Oculus VR
Being There
• Conventional 3D graphics is cinematic
– Shows you something
• On a display, in your environment
• VR graphics is immersive
– Takes you somewhere
• Controls everything you see, defines your environment
• Very different constraints and challenges
Realism and Presence
• Being there is largely about sensor fusion
– Your brain’s sensor fusion
– Trained by reality
– Can’t violate too many hard-wired expectations
• Realism may be a non-goal
– Not required for presence
– Expensive
– Uncanny valley
Oculus Rift DK2
• 90°-110° FOV
• 1080p OLED screen
– 960x1080 per eye
• 75 Hz refresh
• Low persistence
• 1 kHz IMU
• Positional tracking
Low Persistence
• Stable image as you turn - no motion blur
• Rolling shutter
– Right-to-left
– 3ms band of light
– Eyes offset temporally (see the timing sketch below)
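To make the "3ms band of light" concrete, here is a small timing sketch of when a given column (in the user's view) lights up under a right-to-left rolling shutter at 75 Hz. The constants, the columnLitTime helper, and the assumption that the sweep spans the frame period minus the persistence window are illustrative, not taken from DK2 firmware or the SDK.

```cpp
// Hypothetical timing sketch (not DK2 firmware/SDK): when does a column in the
// user's view light up on a right-to-left, low-persistence rolling shutter?
#include <cstdio>

constexpr double kFramePeriodSec = 1.0 / 75.0;   // ~13.3 ms per refresh
constexpr double kPersistenceSec = 0.003;        // ~3 ms band of light
constexpr int    kViewWidth      = 1920;         // both eyes, side by side

// Time (relative to the start of scanout) at which column x begins to emit.
// Assumption: the sweep covers the frame period minus the persistence window,
// and the rightmost column fires first.
double columnLitTime(int x)
{
    double sweep = double(kViewWidth - 1 - x) / double(kViewWidth - 1);
    return sweep * (kFramePeriodSec - kPersistenceSec);
}

int main()
{
    // Left eye occupies columns [0, 959], right eye [960, 1919]; the two eyes
    // are therefore offset in time by roughly half a scanout.
    std::printf("rightmost column lights at %.2f ms\n", columnLitTime(1919) * 1e3);
    std::printf("leftmost  column lights at %.2f ms\n", columnLitTime(0) * 1e3);
}
```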
Positional Tracking
• External camera, pointed at user
• 80° x 64° FOV
• ~2.5m range
• ~0.05mm @ 1.5m
• ~19ms latency
– Only 2ms of that is vision processing
Position Tracking
[slide images: technology + magic]
The good news: You don’t need to know.
Image Synthesis
• Conventional planar projection
– GPUs like this because
• Straight edges remain straight
• Planes remain planar after projection
• Synthesis takes “a while”
– So we predict the position / orientation
– A long-range prediction: ~10-30 ms out (see the sketch below)
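A minimal sketch of what a long-range prediction can look like under a constant-angular-velocity model: integrate the head's angular velocity over the prediction horizon and compose it with the current orientation. The Quat/Vec3 types and predictOrientation are illustrative names, not the Oculus SDK; real sensor fusion also blends accelerometer and camera data.

```cpp
// Minimal long-range prediction sketch under a constant-angular-velocity
// model. Quat/Vec3 and predictOrientation are illustrative, not SDK types.
#include <cmath>

struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; };

Quat mul(const Quat& a, const Quat& b)
{
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Rotate 'current' by omega (rad/s, world frame) integrated over dt seconds,
// where dt is the prediction horizon (e.g. 0.010 to 0.030).
Quat predictOrientation(const Quat& current, const Vec3& omega, double dt)
{
    double speed = std::sqrt(omega.x*omega.x + omega.y*omega.y + omega.z*omega.z);
    if (speed < 1e-9) return current;
    double angle = speed * dt;                  // radians rotated over horizon
    double s = std::sin(0.5 * angle) / speed;   // scales the unnormalized axis
    Quat delta = { std::cos(0.5 * angle), omega.x * s, omega.y * s, omega.z * s };
    return mul(delta, current);                 // world-frame delta on the left
}
```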
Note on Sample Distribution
• Conventional planar projection is not great for very wide FOV
– Big angle between samples at the center of view (see the sketch below)
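The sketch below makes the sampling claim concrete: for a planar projection, pixel centers are uniform in tan(θ), so the angle covered by one pixel is largest at the center of view and shrinks toward the edges. The 100° FOV and 960-pixel width are round numbers for illustration.

```cpp
// Sketch of the per-pixel angular footprint for a planar projection: pixel
// centers are uniform in tan(theta), so the center of view gets the coarsest
// angular sampling. 100 deg FOV and 960 pixels are illustrative round numbers.
#include <cmath>
#include <cstdio>

int main()
{
    const double kPi        = 3.14159265358979323846;
    const double halfFovRad = 50.0 * kPi / 180.0;   // 100 deg horizontal FOV
    const int    pixels     = 960;                  // per-eye width
    const double halfTan    = std::tan(halfFovRad);

    auto pixelAngleDeg = [&](int i) {
        double t0 = (2.0 * i / pixels - 1.0) * halfTan;
        double t1 = (2.0 * (i + 1) / pixels - 1.0) * halfTan;
        return (std::atan(t1) - std::atan(t0)) * 180.0 / kPi;
    };

    std::printf("center pixel spans %.3f deg\n", pixelAngleDeg(pixels / 2));  // ~0.14
    std::printf("edge   pixel spans %.3f deg\n", pixelAngleDeg(pixels - 1));  // ~0.06
}
```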
Alternative Sample Distributions
• Direct render to cube map may be appealing
• Tiled renderers could do piecewise linear
– Brute force will do in the interim
– But not much FOV room left at 100°
Optical Distortion
Distortion Correction
Optical Distortion
• HMD optics cause a different sample distribution – and chromatic aberration
• Requires a resampling pass
– Synthesis distribution -> delivery distribution
– Barrel distortion to counteract the lens’s pincushion distortion (see the sketch below)
• Could be built into a “smarter” display engine
– Handled in software today
• Requires either CPU, separate GPU, or shared GPU
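A hedged sketch of what the resampling pass computes per output pixel: push the panel coordinate through a radial polynomial (barrel distortion), with a slightly different scale per color channel to counter chromatic aberration. The polynomial form and every coefficient here are placeholders; a real implementation uses lens-calibrated values and runs as a fragment shader or a pre-distorted mesh.

```cpp
// Hedged sketch of the distortion / chromatic-aberration resample. The radial
// polynomial and all coefficients are placeholders, not calibrated lens data.
struct Vec2 { float x, y; };

// Radial scale as a function of squared distance from the lens center.
static float distortionScale(float r2)
{
    const float k0 = 1.00f, k1 = 0.22f, k2 = 0.24f, k3 = 0.00f;  // placeholders
    return k0 + r2 * (k1 + r2 * (k2 + r2 * k3));
}

// Map an output (panel) coordinate, centered on the lens axis, to the
// coordinates at which the synthesized image should be sampled. Each color
// channel gets a slightly different scale to counter chromatic aberration.
static void warpSampleCoords(Vec2 p, Vec2& red, Vec2& green, Vec2& blue)
{
    float r2 = p.x * p.x + p.y * p.y;
    float s  = distortionScale(r2);
    const float caRed = 0.996f, caBlue = 1.004f;   // placeholder CA factors
    green = { p.x * s,          p.y * s          };
    red   = { p.x * s * caRed,  p.y * s * caRed  };
    blue  = { p.x * s * caBlue, p.y * s * caBlue };
}
```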
Display Engine (detour)
• In modern GPUs, the 3D synthesis engine
builds buffers to be displayed
• A separate engine drives the HDMI / DP / DVI
output signal using that buffer
• This engine just reads rows of the image
• More on this later…
Time Warp
• Optical resampling provides an opportunity
– Synthesized samples have known location
• Global shutter, so constant time
– Actual eye orientation will differ
• Long range prediction had error
• Better prediction just before resampling
• Both predictions are for the same target time
• So resample for optics and prediction error simultaneously! (see the sketch below)
• Note: This just corrects the view of an “old” snapshot
of the world
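Because both predictions target the same display time, the time-warp correction reduces to a single rotation between the render-time prediction and the fresher one, applied to each view ray during the distortion resample. The sketch below assumes quaternions that rotate eye-space vectors into world space; the types and timeWarpRay are illustrative, not SDK code.

```cpp
// Time warp as a single rotation: re-express each view ray through the
// rotation between the render-time prediction and the fresher prediction.
// Types and conventions are illustrative (quaternions rotate eye space to world).
struct Vec3 { double x, y, z; };
struct Quat { double w, x, y, z; };

Quat conjugate(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

Quat mul(const Quat& a, const Quat& b)
{
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// v' = q * v * q^-1 for a unit quaternion q.
Vec3 rotate(const Quat& q, const Vec3& v)
{
    Quat r = mul(mul(q, { 0.0, v.x, v.y, v.z }), conjugate(q));
    return { r.x, r.y, r.z };
}

// Take a view ray expressed in the *latest* predicted eye frame and return the
// matching ray in the eye frame the image was synthesized for; the distortion
// pass then projects that ray into the old image to fetch its color.
Vec3 timeWarpRay(const Vec3& latestEyeRay,
                 const Quat& renderTimePrediction,   // world-from-eye at render
                 const Quat& latestPrediction)       // world-from-eye, fresher
{
    Quat delta = mul(conjugate(renderTimePrediction), latestPrediction);
    return rotate(delta, latestEyeRay);
}
```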
Time Warp + Rolling Shutter
• Rolling shutter adds time variability
– But we know the time derivative of orientation
• Can correct for that as well (see the sketch below)
– Tends to compress sampling when turning right
– And stretch out sampling when turning left
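One way to picture the rolling-shutter correction: give each column its own orientation, interpolated across the scanout between a prediction for the start of the sweep and one for the end. The nlerp shortcut and the right-to-left timing below are illustrative simplifications, not SDK code.

```cpp
// Folding the rolling shutter into the warp: each column in view lights up at
// a different time, so each column gets its own predicted orientation.
#include <cmath>

struct Quat { double w, x, y, z; };

// Normalized lerp between two nearby orientations; adequate for the small
// rotation that occurs within one ~13 ms scanout.
Quat nlerp(const Quat& a, const Quat& b, double t)
{
    double d  = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    double sb = (d < 0.0) ? -1.0 : 1.0;                 // take the shorter arc
    Quat q = { a.w + (sb*b.w - a.w) * t,
               a.x + (sb*b.x - a.x) * t,
               a.y + (sb*b.y - a.y) * t,
               a.z + (sb*b.z - a.z) * t };
    double n = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    return { q.w / n, q.x / n, q.y / n, q.z / n };
}

// Right-to-left shutter: column x of a view 'width' pixels wide is lit at
// fraction (width-1-x)/(width-1) of the scanout, so it uses an orientation
// interpolated between predictions for the start and end of the sweep.
Quat columnOrientation(const Quat& atScanoutStart, const Quat& atScanoutEnd,
                       int x, int width)
{
    double t = double(width - 1 - x) / double(width - 1);
    return nlerp(atScanoutStart, atScanoutEnd, t);
}
```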
Asynchronous Time Warp
• So far, we have been talking about 1 synthesized image per eye per display period
– At 75 Hz, that’s 150 synthesized images per second
– Many apps cannot achieve these rates
• Especially with wide-FOV rendering
• Display needs to be asynchronous to synthesis (see the outline below)
– Just like in the conventional pipeline
– But the warp itself must be isochronous – racing the beam
– Direct hardware support for this would be straightforward
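An illustrative outline (not an actual API) of how an asynchronous time-warp loop can be structured in software: a high-priority thread wakes once per refresh, grabs the most recently completed synthesized frame, re-predicts the pose for the imminent scanout, and runs the warp/distortion pass, independent of whether the application finished a new frame. Every function and type named here is a hypothetical placeholder.

```cpp
// Illustrative outline only: every type and function named here is a
// hypothetical placeholder, not an actual SDK or driver API.
#include <atomic>

struct Frame { /* synthesized eye buffers + the pose they were rendered with */ };
struct Pose  { /* predicted head orientation/position for a display time */ };

Frame* latestCompletedFrame();          // produced by the app's render thread
double waitForVsync();                  // blocks, returns the next display time
Pose   predictPose(double displayTime);
void   warpAndPresent(const Frame& frame, const Pose& pose);

std::atomic<bool> g_running{true};

// High-priority thread: once per refresh, re-warp whatever frame is newest,
// even if it is the same frame as last refresh (the app may be running slower
// than the display). This is the "racing the beam" part.
void asyncTimeWarpThread()
{
    while (g_running.load()) {
        double displayTime = waitForVsync();
        if (Frame* frame = latestCompletedFrame()) {
            Pose pose = predictPose(displayTime);   // freshest prediction
            warpAndPresent(*frame, pose);           // distortion + time warp
        }
    }
}
```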
Asynchronous Time Warp
• Slower synthesis requires rendering a wider FOV
– Will resample the same image multiple times
• Stuttering can be a concern
– When display and synthesis frequencies “beat”
– Ultra-high display frequency may help this
– Tolerable synthesis rate still TBD
• The end effect is that your eyes see the best information we have
– Regardless of synthesis rate
Questions?
• cass.everitt@oculusvr.com
• For vision questions:
– dov.katz@oculusvr.com

Editor’s Notes

  1. You see a movie, watch a tv show, even play a 3D game. You’re not there. You’re looking at a sequence of images captured by a physical or virtual camera. And you’re looking at it displayed on a screen of some sort, somewhere in your environment. When you enter virtual reality, the system provides the environment directly. Total ocular override. It’s a big responsibility. With substantially different constraints and challenges.
  2. Most of you are experiencing reality at this very moment! A variety of sensors are telling you what your environment looks like, sounds like, the temperature, direction of gravity, orientation of your body, eyes, approximate rates of motion… Lots of stuff. And they do it at a pretty incredible rate, pretty much all the time you’re conscious. It’s exhausting. But reality trains you what to expect. Ideally we would control all these inputs and provide believable stimuli. In practice we have to start with the most important ones first, and figure out what kinds of margins we have for error on them. Inputs that violate our hard-wired expectations can often result in unpleasant user experiences. The interesting thing is, obviously-synthetic virtual environments don’t detract from presence. And in many ways, going after total visual realism can be a distraction that doesn’t enhance the user’s experience.
  3. The DK2 makes some substantial and important improvements over DK1. Specifically, resolution, low persistence, higher refresh rate, and full 6DOF tracking.
  4. For each pixel, we can predict which direction vector it corresponds to at the time the pixel lights up. This is dead simple with a global shutter, but not too bad with a rolling shutter.
  5. Start with some IR LEDs on the HMD. Add in a USB camera. Plus a little invisible software, and alakazam! You have position tracking! Dov Katz might kill me for pretending it’s that simple. Suffice it to say, it’s not. But also, you don’t need to worry about it. The mechanisms will change, but mostly developers don’t have to care.
  6. For wide FOV, conventional planar projection has awful sample density in the view direction, and great sample density at the periphery. Notice the angle subtended by each dash in the diagram. Larger angles (like in the center) mean poorer sampling density.
  7. Coming from the other side, the optics tell us the ray direction that each pixel corresponds to. And it’s not the same as the planar projection. And because of chromatic aberration, the ray is different for each component of each pixel. This picture shows a DK2 eyepiece resting on a monitor. You can clearly see both the pincushion effect and the chromatic aberration that the lens produces. The barrel distortion you see when looking at HMD rendering without the optics cancels out this pincushion effect.
  8. Methods of computing and predicting eye position and orientation are important. But we don’t really have to care, as long as the information we get is good.