LECTURE 10: AR
TECHNOLOGY: TRACKING
COMP 4010 – Virtual Reality
Semester 5 – 2016
Bruce Thomas, Mark Billinghurst
University of South Australia
October 18th 2016
Augmented Reality Definition
•  Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
Augmented Reality Technology
•  Combining Real and Virtual Images
•  Display technologies
•  Interactive in Real-Time
•  Input and interactive technologies
•  Registered in 3D
•  Viewpoint tracking technologies
Display
Processing
Input Tracking
AR TRACKING AND
REGISTRATION
AR Requires Tracking and Registration
• Registration
•  Positioning virtual object wrt real world
•  Fixing virtual object on real object when view is fixed
• Tracking
•  Continually locating the user's viewpoint when the view is moving
•  Position (x,y,z), Orientation (r,p,y)
AR TRACKING
Tracking Requirements
• Augmented Reality Information Display
•  World Stabilized
•  Body Stabilized
•  Head Stabilized
Increasing tracking requirements: Head Stabilized → Body Stabilized → World Stabilized
Tracking Technologies
• Active
•  Mechanical, Magnetic, Ultrasonic
•  GPS, WiFi, cell location
• Passive
•  Inertial sensors (compass, accelerometer, gyro)
•  Computer Vision
•  Marker based, Natural feature tracking
• Hybrid Tracking
•  Combined sensors (e.g. Vision + Inertial)
Tracking Types
•  Mechanical Tracker
•  Magnetic Tracker
•  Inertial Tracker
•  Ultrasonic Tracker
•  Optical Tracker
•  Marker-Based Tracking
•  Markerless Tracking (Edge-Based, Template-Based, Interest Point)
•  Specialized Tracking
Mechanical Tracker
• Idea: mechanical arms with joint sensors
• ++: high accuracy, haptic feedback
• -- : cumbersome, expensive
Microscribe
Magnetic Tracker
• Idea: a coil generates current when moved in a magnetic field. Measuring the current gives position and orientation relative to the magnetic source.
• ++: 6DOF, robust
• -- : wired, sensitive to metal, noisy, expensive
Flock of Birds (Ascension)
Inertial Tracker
• Idea: measures linear acceleration and angular rate (accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• -- : drifts over time, hysteresis effect, only 3DOF
IS300 (Intersense)
Wii Remote
Ultrasonic Tracker
• Idea: time-of-flight or phase-coherence of sound waves
• ++: small, cheap
• -- : 3DOF, line of sight, low resolution, affected by environmental conditions (pressure, temperature)
Ultrasonic
Logitech IS600
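The time-of-flight idea can be sketched in a few lines: distance is the speed of sound times the measured travel time. The values below are illustrative, not from any specific tracker; real systems also compensate for temperature, one reason these trackers are sensitive to environmental conditions.

```python
# Time-of-flight ranging, as used by ultrasonic trackers.
SPEED_OF_SOUND = 343.0  # m/s at ~20 C; varies with temperature and pressure

def tof_distance(travel_time_s: float) -> float:
    """Distance covered by a one-way ultrasonic pulse."""
    return SPEED_OF_SOUND * travel_time_s

# A pulse that takes 2.915 ms to arrive corresponds to ~1 m:
d = tof_distance(0.002915)
print(round(d, 2))  # 1.0
```

Three such distances to known emitters are enough to trilaterate a 3DOF position.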
Global Positioning System (GPS)
• Created by US in 1978
• Currently 29 satellites
• Satellites send position + time
• GPS Receiver positioning
• 4 satellites need to be visible
• Differential time of arrival
• Trilateration
• Accuracy
• 5-30m+, blocked by weather, buildings etc.
Mobile Sensors
• Inertial compass
• Earth’s magnetic field
• Measures absolute orientation
• Accelerometers
• Measure acceleration along an axis
• Used for tilt, relative rotation
• Can drift over time
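A common way to bound this drift is to blend the fast-but-drifting integrated gyro rate with a noisy-but-absolute reading such as accelerometer tilt, using a complementary filter. This is a generic sketch, not any particular phone's sensor fusion; the `alpha` weight and sample values are assumptions.

```python
# Minimal complementary filter: high-pass the integrated gyro,
# low-pass the absolute accelerometer tilt estimate.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate with an absolute tilt reading."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# A biased gyro reports 10 deg/s of rotation while the accelerometer
# keeps reading 0 deg tilt: the absolute sensor pulls the drift back.
angle = 0.0
for _ in range(100):  # 1 s of samples at 100 Hz
    angle = complementary_filter(angle, gyro_rate=10.0,
                                 accel_angle=0.0, dt=0.01)
print(round(angle, 2))  # well below the 10 deg pure integration would give
```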
OPTICAL TRACKING
Why Optical Tracking for AR?
•  Many AR devices have cameras
•  Mobile phone/tablet, Video see-through display
•  Provides precise alignment between video and AR overlay
•  Using features in the video to generate pixel-perfect alignment
•  Real world has many visual features that can be tracked
•  Computer Vision is a well-established discipline
•  Over 40 years of research to draw on
•  Older non-real-time algorithms can now run in real time on today's devices
Common AR Optical Tracking Types
• Marker Tracking
•  Tracking known artificial markers/images
•  e.g. ARToolKit square markers
• Markerless Tracking
•  Tracking from known features in real world
•  e.g. Vuforia image tracking
• Unprepared Tracking
•  Tracking in unknown environment
•  e.g. SLAM tracking
Marker tracking
•  Available for more than 10 years
•  Several open source solutions exist
•  ARToolKit, ARTag, ATK+, etc.
•  Fairly simple to implement
•  Standard computer vision methods
•  A rectangle provides 4 corner points
•  Enough for pose estimation!
Demo: ARToolKit
https://www.youtube.com/watch?v=0wo_3u8-rxU
Marker Based Tracking: ARToolKit
http://www.artoolkit.org
Goal: Find Camera Pose
•  Goal is to find the camera pose in the marker coordinate frame
•  Knowing:
•  Position of key points in on-screen video image
•  Camera properties (focal length, image distortion)
Coordinates for Marker Tracking
• Final Goal: Marker → Camera
•  Rotation & Translation
1: Camera → Ideal Screen
•  Perspective model
•  Obtained from camera calibration
2: Ideal Screen → Observed Screen
•  Nonlinear function (barrel shape)
•  Obtained from camera calibration
3: Marker → Observed Screen
•  Correspondence of 4 vertices
•  Real-time image processing
Marker Tracking – Overview
Pipeline: Camera Image → Fiducial Detection → (contours) → Rectangle Fitting → (rectangles) → Pattern Checking → (identified markers) → Lens Undistortion → (undistorted corners) → Pose Estimation → (estimated poses)
Marker Tracking – Fiducial Detection
•  Threshold the whole image to black and white
•  Search scanline by scanline for edges (white to black)
•  Follow edge until either
•  Back to starting pixel
•  Image border
•  Check for size
•  Reject fiducials early that are too small (or too large)
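The first two steps above (thresholding, then the scanline search for white-to-black edges) can be sketched as follows; contour following and the size check are omitted, and the tiny image is a made-up example.

```python
# Sketch of the first fiducial-detection steps.
def threshold(image, t=128):
    """Binarize a row-major grayscale image: 1 = white, 0 = black."""
    return [[1 if px >= t else 0 for px in row] for row in image]

def edge_candidates(binary):
    """(row, col) positions where a scanline goes white -> black."""
    hits = []
    for r, row in enumerate(binary):
        for c in range(1, len(row)):
            if row[c - 1] == 1 and row[c] == 0:
                hits.append((r, c))
    return hits

# A 4x6 image with a dark blob on a white background:
img = [[200, 200, 200, 200, 200, 200],
       [200,  40,  40,  40, 200, 200],
       [200,  40,  40,  40, 200, 200],
       [200, 200, 200, 200, 200, 200]]
print(edge_candidates(threshold(img)))  # [(1, 1), (2, 1)]
```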
Marker Tracking – Rectangle Fitting
•  Start with an arbitrary point “x” on the contour
•  The point with maximum distance must be a corner c0
•  Create a diagonal through the center
•  Find points c1 & c2 with maximum distance left and right of diag.
•  New diagonal from c1 to c2
•  Find point c3 right of diagonal with maximum distance
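The corner search above can be sketched on a toy contour: diagonal and side tests are done with signed triangle areas. This is a simplified reading of the slide's algorithm, not ARToolKit's actual implementation.

```python
# Rectangle fitting: recover 4 corners from a contour by repeated
# maximum-distance searches relative to earlier corners/diagonals.
def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def signed_area(a, b, p):
    """Twice the signed area of (a, b, p): sign = side of line a-b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def fit_rectangle(contour):
    x = contour[0]                                    # arbitrary start point
    c0 = max(contour, key=lambda p: dist2(p, x))      # farthest -> corner c0
    center = (sum(p[0] for p in contour) / len(contour),
              sum(p[1] for p in contour) / len(contour))
    left  = [p for p in contour if signed_area(c0, center, p) > 0]
    right = [p for p in contour if signed_area(c0, center, p) < 0]
    c1 = max(left,  key=lambda p: abs(signed_area(c0, center, p)))
    c2 = max(right, key=lambda p: abs(signed_area(c0, center, p)))
    # c3: farthest from new diagonal c1-c2, on the side away from c0
    side0 = signed_area(c1, c2, c0)
    far = [p for p in contour if signed_area(c1, c2, p) * side0 < 0]
    c3 = max(far, key=lambda p: abs(signed_area(c1, c2, p)))
    return {c0, c1, c2, c3}

# Contour of an axis-aligned square (corners plus edge midpoints):
square = [(0, 0), (2, 0), (4, 0), (4, 2), (4, 4), (2, 4), (0, 4), (0, 2)]
print(sorted(fit_rectangle(square)))  # [(0, 0), (0, 4), (4, 0), (4, 4)]
```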
Marker Tracking – Pattern Checking
•  Calculate homography using the 4 corner points
•  “Direct Linear Transform” algorithm
•  Maps normalized coordinates to marker coordinates
(simple perspective projection, no camera model)
•  Extract pattern by sampling and check
•  Id (implicit encoding)
•  Template (normalized cross correlation)
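The DLT step can be sketched directly: fixing h22 = 1, the four corner correspondences give an 8×8 linear system (simple perspective projection, no camera model, as noted above). Pure-Python sketch; the corner coordinates are made up.

```python
# "Direct Linear Transform" from 4 point pairs, h22 fixed to 1.
def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense system)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """3x3 H (row-major, h22 = 1) such that dst ~ H * src."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def apply(H, p):
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Unit marker square -> an observed (perspective-distorted) quadrilateral:
marker = [(0, 0), (1, 0), (1, 1), (0, 1)]
image  = [(10, 10), (50, 12), (45, 40), (12, 38)]
H = homography(marker, image)
# Mapping the marker corners through H reproduces the observed corners.
print([tuple(round(c, 1) for c in apply(H, p)) for p in marker])
```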
Marker Tracking – Corner Refinement
•  Refine corner coordinates
•  Critical for high quality tracking
•  Remember: 4 points is the bare minimum!
•  So these 4 points should better be accurate…
•  Detect sub-pixel coordinates
•  E.g. Harris corner detector
•  Specialized methods can be faster and more accurate
•  Strongly reduces jitter!
•  Undistort corner coordinates
•  Remove radial distortion from lens
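Undistorting the corners amounts to inverting the radial model. Below is a minimal sketch with a one-parameter barrel model x_d = x_u·(1 + k·r²), inverted by fixed-point iteration; the model and the k value are illustrative, not ARToolKit's actual calibration.

```python
# Invert a one-parameter radial ("barrel") distortion by iteration.
def undistort(xd, yd, k, iters=20):
    """Recover undistorted (xu, yu) from distorted, normalized coords."""
    xu, yu = xd, yd                      # initial guess: no distortion
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        xu = xd / (1 + k * r2)
        yu = yd / (1 + k * r2)
    return xu, yu

def distort(xu, yu, k):
    r2 = xu * xu + yu * yu
    return xu * (1 + k * r2), yu * (1 + k * r2)

# Round-trip check: distort a corner, then recover it.
xd, yd = distort(0.3, 0.2, k=-0.25)
print(tuple(round(v, 4) for v in undistort(xd, yd, k=-0.25)))  # (0.3, 0.2)
```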
Marker tracking – Pose estimation
•  Calculates marker pose relative to the camera
•  Initial estimation directly from homography
•  Very fast, but coarse with error
•  Jitters a lot…
•  Iterative Refinement using Gauss-Newton method
•  6 parameters (3 for position, 3 for rotation) to refine
•  At each iteration we optimize to reduce the error
•  Iterate until the error converges
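Gauss-Newton refinement in miniature: the sketch below refines a single rotation angle so that rotated model points match observations, minimizing squared reprojection error. The real pose refinement does the same over the 6 parameters (3 rotation, 3 translation); all values here are toy data.

```python
import math

def residuals(theta, model, observed):
    c, s = math.cos(theta), math.sin(theta)
    res = []
    for (x, y), (u, v) in zip(model, observed):
        res += [c * x - s * y - u, s * x + c * y - v]
    return res

def jacobian(theta, model):
    c, s = math.cos(theta), math.sin(theta)
    J = []
    for (x, y) in model:
        J += [-s * x - c * y, c * x - s * y]   # d(residual)/d(theta)
    return J

def gauss_newton(theta, model, observed, iters=5):
    for _ in range(iters):
        r = residuals(theta, model, observed)
        J = jacobian(theta, model)
        # 1-parameter normal equation: dtheta = -(J^T r) / (J^T J)
        theta -= sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
    return theta

model = [(1, 0), (0, 1), (-1, 0), (0, -1)]
true = 0.4                               # radians
obs = [(math.cos(true) * x - math.sin(true) * y,
        math.sin(true) * x + math.cos(true) * y) for x, y in model]
print(round(gauss_newton(0.0, model, obs), 6))  # 0.4
```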
From Marker To Camera
•  Rotation & Translation
TCM : 4x4 transformation matrix
from marker coord. to camera coord.
Tracking challenges in ARToolKit
•  False positives and inter-marker confusion (image by M. Fiala)
•  Image noise (e.g. poor lens, block coding / compression, neon tube)
•  Unfocused camera, motion blur
•  Dark/unevenly lit scene, vignetting
•  Jittering (Photoshop illustration)
•  Occlusion (image by M. Fiala)
Other Marker Tracking Libraries
But you can't cover the world with ARToolKit markers!
Markerless Tracking
•  No more markers! → Markerless tracking
•  Within optical tracking: Edge-Based, Template-Based, and Interest Point tracking (vs. marker-based and specialized tracking)
Natural Feature Tracking
• Use Natural Cues of Real Elements
•  Edges
•  Surface Texture
•  Interest Points
• Model or Model-Free
• No visual pollution
Contours • Feature Points • Surfaces
Texture Tracking
Tracking by Keypoint Detection
•  This is what most trackers do…
•  Targets are detected every frame
•  Popular because tracking and detection are solved simultaneously
Pipeline: Camera Image → Keypoint Detection → Descriptor Creation and Matching → Outlier Removal → Pose Estimation and Refinement → Pose (the first two stages perform recognition)
What is a Keypoint?
•  It depends on the detector you use!
•  For high performance use the FAST corner detector
•  Apply FAST to all pixels of your image
•  Obtain a set of keypoints for your image
•  Describe the keypoints
Rosten, E., & Drummond, T. (2006, May). Machine learning for high-speed corner detection.
In European conference on computer vision (pp. 430-443). Springer Berlin Heidelberg.
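The FAST segment test itself is easy to sketch: a pixel is a corner if enough contiguous pixels on a 16-pixel Bresenham circle of radius 3 are all brighter, or all darker, than the centre by a threshold t. The sketch uses the commonly used FAST-9 arc length; the image and threshold are toy values.

```python
# FAST segment test (FAST-9 variant) in pure Python.
OFFSETS = [(-3, 0), (-3, 1), (-2, 2), (-1, 3), (0, 3), (1, 3), (2, 2), (3, 1),
           (3, 0), (3, -1), (2, -2), (1, -3), (0, -3), (-1, -3), (-2, -2), (-3, -1)]

def is_fast_corner(img, r, c, t=20, n=9):
    """n contiguous circle pixels all brighter/darker than center +- t."""
    p = img[r][c]
    vals = [img[r + dr][c + dc] for dr, dc in OFFSETS]
    for flags in ([v > p + t for v in vals], [v < p - t for v in vals]):
        run = 0
        for f in flags + flags:          # doubled to handle wrap-around runs
            run = run + 1 if f else 0
            if run >= n:
                return True
    return False

# 13x13 test image: bright square occupying rows/cols >= 6.
img = [[100 if r >= 6 and c >= 6 else 0 for c in range(13)] for r in range(13)]
print(is_fast_corner(img, 6, 6))   # True  - corner of the square
print(is_fast_corner(img, 9, 9))   # False - interior point
print(is_fast_corner(img, 6, 9))   # False - edge midpoint
```

The full detector adds the machine-learned decision tree and non-maximum suppression described by Rosten & Drummond.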
FAST Corner Keypoint Detection
Example: FAST Corner Detection
https://www.youtube.com/watch?v=pJ2xSrIXy_s
Descriptors
•  Describe the keypoint features
•  Can use SIFT
•  Estimate the dominant keypoint orientation using gradients
•  Compensate for the detected orientation
•  Describe the keypoints in terms of the surrounding gradients
Wagner D., Reitmayr G., Mulloni A., Drummond T., Schmalstieg D., Real-Time Detection and Tracking for Augmented Reality on Mobile Phones. IEEE Transactions on Visualization and Computer Graphics, May/June, 2010.
Database Creation
•  Offline step – create database of known features
•  Searching for corners in a static image
•  For robustness look at corners on multiple scales
•  Some corners are more descriptive at larger or smaller scales
•  We don’t know how far users will be from our image
•  Build a database file with all descriptors and their
position on the original image
Real-time tracking
•  Search for known keypoints in the video image
•  Create the descriptors
•  Match the descriptors from the live video against those in the database
•  Brute force is not an option
•  Need the speed-up of special data structures
•  E.g., we use multiple spill trees
NFT – Outlier removal
•  Removing outlier features
•  Cascade of removal techniques
•  Start with cheapest, finish with most expensive…
•  First simple geometric tests
•  E.g., line tests
•  Select 2 points to form a line
•  Check all other points being on correct side of line
•  Then, homography-based tests
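The cheap line test can be sketched with a cross-product side check: matches whose side of the line differs between the model image and the live image are outliers. Using the first two matches to define the line is an assumption for this toy example; a real system samples many point pairs.

```python
# Geometric line test for outlier removal.
def side(a, b, p):
    """Which side of line a->b point p lies on: -1, 0, or +1."""
    s = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return (s > 0) - (s < 0)

def line_test(matches):
    """Flag matches whose side differs between model and image
    (line defined by the first two matches, assumed correct)."""
    (m1, i1), (m2, i2) = matches[0], matches[1]
    return [(m, i) for m, i in matches[2:] if side(m1, m2, m) != side(i1, i2, i)]

# Three consistent matches plus one that flipped sides (an outlier):
matches = [((0, 0), (10, 10)), ((4, 0), (18, 10)),
           ((2, 3), (14, 16)),           # above the line in both -> inlier
           ((2, 2), (14, 5))]            # above in model, below in image
print(line_test(matches))  # [((2, 2), (14, 5))]
```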
NFT – Pose refinement
•  Pose from homography makes good starting point
•  Based on Gauss-Newton iteration
•  Try to minimize the re-projection error of the keypoints
•  The part of the tracking pipeline that benefits most from floating point usage
•  Can still be implemented effectively in fixed point
•  Typically 2-4 iterations are enough…
NFT – Real-time tracking
•  Search for keypoints in the video image
•  Create the descriptors
•  Match the descriptors from the live video against those in the database
•  Remove the keypoints that are outliers
•  Use the remaining keypoints to calculate the pose of the camera
Demo: Vuforia Texture Tracking
https://www.youtube.com/watch?v=1Qf5Qew5zSU
Edge Based Tracking
•  Example: RAPiD [Drummond et al. 02]
•  Initialization, Control Points, Pose Prediction (Global Method)
Demo: Edge Based Tracking
https://www.youtube.com/watch?v=UJtpBxdDVDU
Line Based Tracking
•  Visual Servoing [Comport et al. 2004]
Model Based Tracking
• Tracking from 3D object shape
• Example: OpenTL - www.opentl.org
•  General purpose library for model based visual tracking
Demo: OpenTL Model Tracking
https://www.youtube.com/watch?v=laiykNbPkgg
Demo: OpenTL Face Tracking
https://www.youtube.com/watch?v=WIoGdhkfNVE
Marker vs. Natural Feature Tracking
•  Marker tracking
•  Usually requires no database to be stored
•  Markers can be an eye-catcher
•  Tracking is less demanding
•  The environment must be instrumented
•  Markers usually work only when fully in view
•  Natural feature tracking
•  A database of keypoints must be stored/downloaded
•  Natural feature targets might catch the attention less
•  Natural feature targets are potentially everywhere
•  Natural feature targets work also if partially in view
Tracking from an Unknown Environment
•  What to do when you don’t know any features?
•  Very important problem in mobile robotics - Where am I?
•  SLAM
•  Simultaneously Localize And Map the environment
•  Goal: to recover both camera pose and map structure
while initially knowing neither.
•  Mapping:
•  Building a map of the environment which the robot is in
•  Localisation:
•  Navigating this environment using the map while keeping
track of the robot’s relative position and orientation
Visual SLAM
•  Early SLAM systems (1986 - )
•  Combined computer vision and sensors (e.g. IMU, laser, etc.)
•  One of the most important algorithms in Robotics
•  Visual SLAM
•  Using cameras only, such as stereo view
•  MonoSLAM (single camera) developed in 2007 (Davison)
Example: Kudan MonoSLAM
https://www.youtube.com/watch?v=HdNtiabm3k0
How SLAM Works
•  Three main steps
1.  Tracking a set of points through successive camera frames
2.  Using these tracks to triangulate their 3D position
3.  Simultaneously using the estimated point locations to calculate the camera pose that could have observed them
•  By observing a sufficient number of points, one can solve for both structure and motion (camera path and scene structure).
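Step 2 (triangulation) in miniature: given two camera poses, a tracked point's 3D position can be recovered as the midpoint of the closest approach between the two viewing rays. Toy geometry with noise-free rays; real systems triangulate from noisy image measurements.

```python
# Triangulate a 3D point from two rays (optical center + direction).
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def triangulate(o1, d1, o2, d2):
    """Midpoint between closest points of rays o1 + s*d1 and o2 + t*d2."""
    w = [o2[i] - o1[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    det = a * c - b * b                  # ~0 means parallel rays: no depth
    s = (c * dot(d1, w) - b * dot(d2, w)) / det
    t = (b * dot(d1, w) - a * dot(d2, w)) / det
    p = [o1[i] + s * d1[i] for i in range(3)]
    q = [o2[i] + t * d2[i] for i in range(3)]
    return [(p[i] + q[i]) / 2 for i in range(3)]

# Two cameras 1 m apart, both seeing a point at (0.5, 0, 2):
point = (0.5, 0.0, 2.0)
o1, o2 = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
d1 = [point[i] - o1[i] for i in range(3)]    # ideal (noise-free) directions
d2 = [point[i] - o2[i] for i in range(3)]
print([round(v, 6) for v in triangulate(o1, d1, o2, d2)])  # [0.5, 0.0, 2.0]
```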
SLAM Optimization
•  SLAM is an optimisation problem
•  compute the best configuration of camera poses and point
positions in order to minimise reprojection error
•  difference between a point's tracked location and where it is expected to be
•  Can be solved using bundle adjustment
•  a nonlinear least squares algorithm that finds minimum error
•  But – time taken grows as size of map increases
•  Multi-core machines can do localization and mapping on different threads
•  Relocalisation
•  Allows tracking to be restarted when it fails
Evolution of SLAM Systems
•  MonoSLAM (Davison, 2007)
•  Real time SLAM from single camera
•  PTAM (Klein, 2009)
•  First SLAM implementation on mobile phone
•  FAB-MAP (Cummins, 2008)
•  Probabilistic Localization and Mapping
•  DTAM (Newcombe, 2011)
•  3D surface reconstruction from every pixel in image
•  KinectFusion (Izadi, 2011)
•  Realtime dense surface mapping and tracking using RGB-D
Demo: MonoSLAM
https://www.youtube.com/watch?v=saUE7JHU3P0
LSD-SLAM (Engel 2014)
•  A novel, direct monocular SLAM technique
•  Uses image intensities both for tracking and mapping.
•  The camera is tracked using direct image alignment, while
•  Geometry is estimated as semi-dense depth maps
•  Supports very large scale tracking
•  Runs in real time on CPU and smartphone
Demo: LSD-SLAM
https://www.youtube.com/watch?v=GnuQzP3gty4
Direct Method vs. Feature Based
•  Direct methods use all the information in the image, compared to feature-based approaches that only use small patches around corners and edges
Applications of SLAM Systems
•  Many possible applications
•  Augmented Reality camera tracking
•  Mobile robot localisation
•  Real world navigation aid
•  3D scene reconstruction
•  3D Object reconstruction
•  Etc..
•  Assumptions
•  Camera moves through an unchanging scene
•  So not suitable for person tracking, gesture recognition
•  Both involve non-rigidly deforming objects and a non-static map
Hybrid Tracking
Combining several tracking modalities together
Sensor Tracking
•  Used by many “AR browsers”
•  GPS, compass, accelerometer, gyroscope
•  Not sufficient alone (drift, interference)
Inertial Compass Drifting Over Time
Combining Sensors and Vision
•  Sensors
•  Produce noisy output (= jittering augmentations)
•  Are not sufficiently accurate (= wrongly placed augmentations)
•  Give us initial information on where we are in the world, and what we are looking at
•  Vision
•  Is more accurate (= stable and correct augmentations)
•  Requires choosing the correct keypoint database to track from
•  Requires registering our local coordinate frame (online-generated model) to the global one (world)
Example: Outdoor Hybrid Tracking
•  Combines
•  computer vision
•  inertial gyroscope sensors
•  Both correct for each other
•  Inertial gyro
•  provides frame to frame prediction of camera
orientation, fast sensing
•  drifts over time
•  Computer vision
•  Natural feature tracking, corrects for gyro drift
•  Slower, less accurate
Outdoor AR Tracking System
You, Neumann, Azuma outdoor AR system (1999)
Robust Outdoor Tracking
• Hybrid Tracking
•  Computer Vision, GPS, inertial
• Going Out
•  Reitmayr & Drummond (Univ. Cambridge)
Reitmayr, G., & Drummond, T. W. (2006). Going out: robust model-based tracking for outdoor augmented reality. In Mixed and Augmented Reality, 2006. ISMAR 2006. IEEE/ACM International Symposium on (pp. 109-118). IEEE.
Handheld Display
REGISTRATION
Spatial Registration
The Registration Problem
• Virtual and Real content must stay properly aligned
• If not:
• Breaks the illusion that the two coexist
• Prevents acceptance of many serious applications
(illustration: alignment at t = 0 seconds vs. t = 1 second)
Sources of Registration Errors
• Static errors
• Optical distortions (in HMD)
• Mechanical misalignments
• Tracker errors
• Incorrect viewing parameters
• Dynamic errors
• System delays (largest source of error)
• 1 ms delay = 1/3 mm registration error
Reducing Static Errors
• Distortion compensation
• For lens or display distortions
• Manual adjustments
• Have user manually align AR and VR content
• View-based or direct measurements
• Have user measure eye position
• Camera calibration (video AR)
• Measuring camera properties
View Based Calibration (Azuma 94)
Dynamic errors
•  Total Delay = 50 + 2 + 33 + 17 = 102 ms
•  1 ms delay = 1/3 mm, so 102 ms ≈ 34 mm registration error
Application Loop: Tracking (20 Hz = 50 ms) → Calculate Viewpoint (x,y,z / r,p,y) → Simulation (500 Hz = 2 ms) → Render Scene (30 Hz = 33 ms) → Draw to Display (60 Hz = 17 ms)
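The latency arithmetic above as code; the stage names are just labels, and the 1/3 mm-per-ms figure is the slide's rule of thumb for typical head motion.

```python
# End-to-end latency -> registration error.
STAGES_MS = {"tracking": 50, "simulation": 2, "render": 33, "display": 17}

total_ms = sum(STAGES_MS.values())
error_mm = total_ms / 3.0                # 1 ms delay ~ 1/3 mm error
print(total_ms, round(error_mm, 1))      # 102 34.0
```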
Reducing dynamic errors (1)
• Reduce system lag
• Faster components/system modules
• Reduce apparent lag
• Image deflection
• Image warping
Reducing System Lag
Application Loop: Tracking → Calculate Viewpoint (x,y,z / r,p,y) → Simulation → Render Scene → Draw to Display
Speed up each stage: faster tracker, faster CPU, faster GPU, faster display
Reducing Apparent Lag
•  Image deflection: render to a virtual display larger than the physical one (e.g. 1280 x 960 virtual vs. 640 x 480 physical) using the last known position, then shift the viewport with the latest tracker position (x,y,z / r,p,y) just before display
Application Loop: Tracking Update → Calculate Viewpoint → Simulation → Render Scene → Draw to Display
Reducing dynamic errors (2)
• Match video + graphics input streams (video AR)
• Delay video of real world to match system lag
• User doesn’t notice
• Predictive Tracking
• Inertial sensors helpful
Azuma / Bishop 1994
Predictive Tracking
•  Use past position measurements to extrapolate position into the near future
•  Can predict up to 80 ms in the future (Holloway)
Predictive Tracking (Azuma 94)
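A minimal stand-in for such a predictor is constant-velocity extrapolation from the last two tracker samples. Azuma's predictors use inertial sensors and Kalman-style filtering and are far more sophisticated; the numbers below are illustrative.

```python
# Constant-velocity prediction of head position (positions in mm).
def predict(pos, prev_pos, dt, lookahead):
    """Extrapolate position 'lookahead' seconds ahead from two samples."""
    velocity = [(p - q) / dt for p, q in zip(pos, prev_pos)]
    return [p + v * lookahead for p, v in zip(pos, velocity)]

# Head moved 5 mm in x over the last 10 ms; predict 80 ms ahead:
prev, now = [0.0, 0.0, 0.0], [5.0, 0.0, 0.0]
print(predict(now, prev, dt=0.010, lookahead=0.080))  # [45.0, 0.0, 0.0]
```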
Wrap-up
• Tracking and Registration are key problems
• Registration error
• Measures against static error
• Measures against dynamic error
• AR typically requires multiple tracking technologies
• Computer vision most popular
• Research Areas:
• SLAM systems, Deformable models, Mobile outdoor
tracking
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Digital image processing 2
 
Motion capture technology
Motion capture technologyMotion capture technology
Motion capture technology
 
Mobile Augmented Reality
Mobile Augmented RealityMobile Augmented Reality
Mobile Augmented Reality
 
SIFT.ppt
SIFT.pptSIFT.ppt
SIFT.ppt
 
SIFT.ppt
SIFT.pptSIFT.ppt
SIFT.ppt
 
Close range Photogrammeetry
Close range PhotogrammeetryClose range Photogrammeetry
Close range Photogrammeetry
 
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
Elevation mapping using stereo vision enabled heterogeneous multi-agent robot...
 
Introduction to TLS Workflow Presentation
Introduction to TLS Workflow PresentationIntroduction to TLS Workflow Presentation
Introduction to TLS Workflow Presentation
 
Oleg Novosad - "Ar kit vs arcore" - Lviv GameDev Mixer (November)
Oleg Novosad - "Ar kit vs arcore" - Lviv GameDev Mixer (November) Oleg Novosad - "Ar kit vs arcore" - Lviv GameDev Mixer (November)
Oleg Novosad - "Ar kit vs arcore" - Lviv GameDev Mixer (November)
 
PPT s06-machine vision-s2
PPT s06-machine vision-s2PPT s06-machine vision-s2
PPT s06-machine vision-s2
 
Mobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - Technology
 
Sensor's inside
Sensor's insideSensor's inside
Sensor's inside
 
Biologically Inspired Turn Control for Autonomous Mobile Robots.pptx
Biologically Inspired Turn Control for Autonomous Mobile Robots.pptxBiologically Inspired Turn Control for Autonomous Mobile Robots.pptx
Biologically Inspired Turn Control for Autonomous Mobile Robots.pptx
 
COMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR TechnologyCOMP 4010 - Lecture 8 AR Technology
COMP 4010 - Lecture 8 AR Technology
 
Europa Presentation 2011
Europa Presentation 2011Europa Presentation 2011
Europa Presentation 2011
 
Mobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research DirectionsMobile AR Lecture 10 - Research Directions
Mobile AR Lecture 10 - Research Directions
 
Programming Augmented Reality - Xamarin Evolve
Programming Augmented Reality - Xamarin EvolveProgramming Augmented Reality - Xamarin Evolve
Programming Augmented Reality - Xamarin Evolve
 

Mais de Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

Mais de Mark Billinghurst (18)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research Directions
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Último

Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Scriptwesley chun
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...Neo4j
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 

Último (20)

Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
Automating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps ScriptAutomating Google Workspace (GWS) & more with Apps Script
Automating Google Workspace (GWS) & more with Apps Script
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organization
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...Workshop - Best of Both Worlds_ Combine  KG and Vector search for  enhanced R...
Workshop - Best of Both Worlds_ Combine KG and Vector search for enhanced R...
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 

COMP 4010 Lecture10: AR Tracking

  • 1. LECTURE 10: AR TECHNOLOGY: TRACKING COMP 4010 – Virtual Reality Semester 5 – 2016 Bruce Thomas, Mark Billinghurst University of South Australia October 18th 2016
  • 2. Augmented Reality Definition •  Defining Characteristics [Azuma 97] • Combines Real and Virtual Images • Both can be seen at the same time • Interactive in real-time • The virtual content can be interacted with • Registered in 3D • Virtual objects appear fixed in space Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
  • 3. Augmented Reality Technology •  Combining Real and Virtual Images •  Display technologies •  Interactive in Real-Time •  Input and interactive technologies •  Registered in 3D •  Viewpoint tracking technologies Display Processing Input Tracking
  • 5. AR Requires Tracking and Registration • Registration •  Positioning virtual object wrt real world •  Fixing virtual object on real object when view is fixed • Tracking •  Continually locating the user's viewpoint when the view is moving •  Position (x,y,z), Orientation (r,p,y)
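A tracked pose — position (x,y,z) plus orientation (r,p,y) — is commonly packed into a single 4×4 rigid transform. A minimal sketch (the Z-Y-X rotation order used here is one common convention, not the only one):

```python
import numpy as np

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 rigid transform from position and roll/pitch/yaw (radians).
    Rotation order Rz(yaw) @ Ry(pitch) @ Rx(roll) is one common convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined orientation
    T[:3, 3] = [x, y, z]       # position
    return T

# Identity orientation leaves only the translation
M = pose_to_matrix(1.0, 2.0, 3.0, 0.0, 0.0, 0.0)
```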
  • 7. Tracking Requirements • Augmented Reality Information Display •  World Stabilized •  Body Stabilized •  Head Stabilized Increasing Tracking Requirements Head Stabilized Body Stabilized World Stabilized
  • 8. Tracking Technologies !  Active •  Mechanical, Magnetic, Ultrasonic •  GPS, WiFi, cell location !  Passive •  Inertial sensors (compass, accelerometer, gyro) •  Computer Vision •  Marker based, Natural feature tracking !  Hybrid Tracking •  Combined sensors (e.g. Vision + Inertial)
  • 10. Mechanical Tracker • Idea: mechanical arms with joint sensors • ++: high accuracy, haptic feedback • -- : cumbersome, expensive Microscribe
  • 11. Magnetic Tracker • Idea: coil generates current when moved in magnetic field. Measuring current gives position and orientation relative to magnetic source. • ++: 6DOF, robust • -- : wired, sensitive to metal, noisy, expensive Flock of Birds (Ascension)
  • 12. Inertial Tracker • Idea: measuring linear acceleration and angular rate (accelerometer/gyroscope) • ++: no transmitter, cheap, small, high frequency, wireless • -- : drifts over time, hysteresis effect, only 3DOF IS300 (Intersense) Wii Remote
  • 13. Ultrasonic Tracker • Idea: time of flight or phase coherence of sound waves • ++: small, cheap • -- : 3DOF, line of sight, low resolution, affected by environmental conditions (pressure, temperature) Ultrasonic Logitech IS600
  • 14. Global Positioning System (GPS) • Created by the US in 1978 • Currently 29 satellites • Satellites send position + time • GPS receiver positioning • 4 satellites need to be visible • Differential time of arrival • Triangulation • Accuracy • 5-30m+, blocked by weather, buildings etc.
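The "time of arrival + triangulation" idea can be sketched as trilateration from known satellite positions and ranges. For simplicity this ignores the receiver clock bias that a real GPS receiver solves as a fourth unknown (hence the 4-satellite minimum):

```python
import numpy as np

def trilaterate(sats, ranges):
    """Solve receiver position from satellite positions and ranges.
    Clock bias is ignored here (real GPS solves it as a 4th unknown).
    Subtracting the first range equation from the others linearizes:
      2 (s_i - s_1) . p = (|s_i|^2 - |s_1|^2) - (r_i^2 - r_1^2)
    """
    sats = np.asarray(sats, float)
    r = np.asarray(ranges, float)
    A = 2.0 * (sats[1:] - sats[0])
    b = (np.sum(sats[1:] ** 2, axis=1) - np.sum(sats[0] ** 2)
         - (r[1:] ** 2 - r[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Synthetic check: receiver at (1, 2, 3), four visible "satellites"
true_p = np.array([1.0, 2.0, 3.0])
sats = np.array([[10.0, 0, 0], [0, 10.0, 0], [0, 0, 10.0], [10.0, 10.0, 10.0]])
ranges = np.linalg.norm(sats - true_p, axis=1)
est = trilaterate(sats, ranges)
```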
  • 15. Mobile Sensors • Inertial compass • Earth's magnetic field • Measures absolute orientation • Accelerometers • Measures acceleration along an axis • Used for tilt, relative rotation • Can drift over time
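The "used for tilt" point can be sketched as recovering pitch and roll from the gravity vector measured by a stationary accelerometer (axis conventions vary by device; the one below is illustrative):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from an accelerometer at rest,
    where the only measured acceleration is gravity. Axis convention
    (x forward, y left, z up) is illustrative and device-dependent."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

# Device lying flat: gravity entirely on z -> zero pitch and roll
pitch, roll = tilt_from_accel(0.0, 0.0, 9.81)
```

Note that yaw cannot be recovered this way — rotation about the gravity axis leaves the measurement unchanged, which is why the magnetometer is needed for absolute heading.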
  • 17. Why Optical Tracking for AR? •  Many AR devices have cameras •  Mobile phone/tablet, video see-through display •  Provides precise alignment between video and AR overlay •  Using features in video to generate pixel-perfect alignment •  The real world has many visual features that can be tracked •  Computer Vision is a well-established discipline •  Over 40 years of research to draw on •  Old non-real-time algorithms can be run in real time on today's devices
  • 18. Common AR Optical Tracking Types • Marker Tracking •  Tracking known artificial markers/images •  e.g. ARToolKit square markers • Markerless Tracking •  Tracking from known features in real world •  e.g. Vuforia image tracking • Unprepared Tracking •  Tracking in unknown environment •  e.g. SLAM tracking
  • 19. Marker tracking •  Available for more than 10 years •  Several open source solutions exist •  ARToolKit, ARTag, ATK+, etc. •  Fairly simple to implement •  Standard computer vision methods •  A rectangle provides 4 corner points •  Enough for pose estimation!
  • 21. Marker Based Tracking: ARToolKit http://www.artoolkit.org
  • 22. Goal: Find Camera Pose •  Goal is to find the camera pose in the marker coordinate frame •  Knowing: •  Position of key points in the on-screen video image •  Camera properties (focal length, image distortion)
  • 24. Coordinates for Marker Tracking Marker → Camera • Final Goal • Rotation & Translation 1: Camera → Ideal Screen • Perspective model • Obtained from Camera Calibration 2: Ideal Screen → Observed Screen • Nonlinear function (barrel shape) • Obtained from Camera Calibration 3: Marker → Observed Screen • Correspondence of 4 vertices • Real-time image processing
  • 26. Marker Tracking – Fiducial Detection •  Threshold the whole image to black and white •  Search scanline by scanline for edges (white to black) •  Follow edge until either •  Back to starting pixel •  Image border •  Check for size •  Reject fiducials early that are too small (or too large)
  • 27. Marker Tracking – Rectangle Fitting •  Start with an arbitrary point “x” on the contour •  The point with maximum distance must be a corner c0 •  Create a diagonal through the center •  Find points c1 & c2 with maximum distance left and right of the diagonal •  New diagonal from c1 to c2 •  Find point c3 right of the diagonal with maximum distance
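The corner search described above can be sketched on a synthetic contour (Euclidean distances; assumes a roughly convex quadrilateral contour, as a marker outline is):

```python
import numpy as np

def rect_corners(contour):
    """Corner search sketched on the slide: pick an arbitrary contour point,
    take the farthest point as c0, split the contour by the diagonal through
    the centroid, and take extremal perpendicular distances as c1, c2, c3."""
    contour = np.asarray(contour, float)
    x = contour[0]
    c0 = contour[np.argmax(np.linalg.norm(contour - x, axis=1))]
    center = contour.mean(axis=0)

    def signed_dist(pts, a, b):
        # perpendicular signed distance of pts from line a->b (2D cross product)
        d = b - a
        return ((pts[:, 0] - a[0]) * d[1] - (pts[:, 1] - a[1]) * d[0]) / np.linalg.norm(d)

    s = signed_dist(contour, c0, center)
    c1 = contour[np.argmax(s)]            # farthest on one side of the diagonal
    c2 = contour[np.argmin(s)]            # farthest on the other side
    s2 = signed_dist(contour, c1, c2)
    # c0 lies on one side of c1->c2; c3 is the extremum on the opposite side
    c3 = contour[np.argmin(s2)] if signed_dist(c0[None], c1, c2)[0] > 0 else contour[np.argmax(s2)]
    return np.array([c0, c1, c2, c3])

# Unit-square contour sampled densely along its edges
t = np.linspace(0, 1, 25)[:-1]
edges = [np.c_[t, np.zeros_like(t)], np.c_[np.ones_like(t), t],
         np.c_[1 - t, np.ones_like(t)], np.c_[np.zeros_like(t), 1 - t]]
square = np.vstack(edges)
corners = rect_corners(square)
```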
  • 28. Marker Tracking – Pattern checking •  Calculate homography using the 4 corner points •  “Direct Linear Transform” algorithm •  Maps normalized coordinates to marker coordinates (simple perspective projection, no camera model) •  Extract pattern by sampling and check •  ID (implicit encoding) •  Template (normalized cross correlation)
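The “Direct Linear Transform” step can be sketched from the 4 corner correspondences (standard DLT via SVD; no camera model, as the slide notes):

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: homography mapping 4+ src points to dst.
    Each correspondence contributes two rows; the solution is the null
    vector of A, taken from the SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, p):
    # apply homography to a 2D point (homogeneous divide)
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Map the unit square onto a translated, scaled quad
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(2, 2), (4, 2), (4, 4), (2, 4)]
H = homography_dlt(src, dst)
```

With H in hand, sampling the marker interior (to read the ID or run the template correlation) is just applying H to a grid of normalized coordinates.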
  • 29. Marker Tracking – Corner refinement •  Refine corner coordinates •  Critical for high quality tracking •  Remember: 4 points is the bare minimum! •  So these 4 points had better be accurate… •  Detect sub-pixel coordinates •  E.g. Harris corner detector •  Specialized methods can be faster and more accurate •  Strongly reduces jitter! •  Undistort corner coordinates •  Remove radial distortion from lens
  • 30. Marker tracking – Pose estimation •  Calculates marker pose relative to the camera •  Initial estimation directly from homography •  Very fast, but coarse with error •  Jitters a lot… •  Iterative refinement using Gauss-Newton method •  6 parameters (3 for position, 3 for rotation) to refine •  At each iteration we optimize on the error •  Iterate…
  • 31. From Marker To Camera •  Rotation & Translation TCM : 4x4 transformation matrix from marker coord. to camera coord.
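In code, applying TCM to a marker-space point yields its camera-space coordinates. The rotation and translation values below are illustrative, not from any real calibration:

```python
import numpy as np

# T_cm: 4x4 transform taking homogeneous marker coordinates to camera
# coordinates. Illustrative values: marker 0.5 m in front of the camera,
# rotated 90 degrees about the camera's z axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.0, 0.0, 0.5])
T_cm = np.eye(4)
T_cm[:3, :3] = R
T_cm[:3, 3] = t

# A marker corner at (0.04, 0, 0) in marker space, expressed in camera space
p_marker = np.array([0.04, 0.0, 0.0, 1.0])
p_camera = T_cm @ p_marker
```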
  • 32. Tracking challenges in ARToolKit False positives and inter-marker confusion (image by M. Fiala) Image noise (e.g. poor lens, block coding / compression, neon tube) Unfocused camera, motion blur Dark/unevenly lit scene, vignetting Jittering (Photoshop illustration) Occlusion (image by M. Fiala)
  • 34. But - You can’t cover the world with ARToolKit Markers!
  • 35. Markerless Tracking Mechanical Tracker Magnetic Tracker Inertial Tracker Ultrasonic Tracker Optical Tracker Marker-Based Tracking Markerless Tracking Specialized Tracking Edge-Based Tracking Template-Based Tracking Interest Point Tracking •  No more Markers! → Markerless Tracking
  • 36. Natural Feature Tracking • Use Natural Cues of Real Elements •  Edges •  Surface Texture •  Interest Points • Model or Model-Free • No visual pollution Contours Feature Points Surfaces
  • 38. Tracking by Keypoint Detection •  This is what most trackers do… •  Targets are detected every frame •  Popular because tracking and detection are solved simultaneously Keypoint detection Descriptor creation and matching Outlier Removal Pose estimation and refinement Camera Image Pose Recognition
  • 39. What is a Keypoint? •  It depends on the detector you use! •  For high performance use the FAST corner detector •  Apply FAST to all pixels of your image •  Obtain a set of keypoints for your image •  Describe the keypoints Rosten, E., & Drummond, T. (2006, May). Machine learning for high-speed corner detection. In European conference on computer vision (pp. 430-443). Springer Berlin Heidelberg.
  • 40. FAST Corner Keypoint Detection
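A runnable sketch of the segment test behind FAST (simplified: the fast 4-pixel pre-test, the machine-learned decision tree, and non-maximum suppression from the real detector are omitted):

```python
import numpy as np

# Bresenham circle of radius 3 used by FAST-16 (offsets as row, col)
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_fast_corner(img, r, c, thresh=20, arc=9):
    """Simplified FAST segment test: pixel (r, c) is a corner if `arc`
    contiguous circle pixels are all brighter or all darker than the
    center by more than `thresh`."""
    center = int(img[r, c])
    ring = [int(img[r + dr, c + dc]) for dr, dc in CIRCLE]
    for sign in (1, -1):                   # test brighter, then darker
        flags = [sign * (p - center) > thresh for p in ring]
        flags = flags + flags              # wrap around the circle
        run = 0
        for f in flags:
            run = run + 1 if f else 0
            if run >= arc:
                return True
    return False

# A bright square on a dark background: its corner fires, a flat region doesn't
img = np.zeros((20, 20), dtype=np.uint8)
img[8:, 8:] = 255
corner = is_fast_corner(img, 8, 8)
flat = is_fast_corner(img, 4, 4)
```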
  • 42. Descriptors •  Describe the Keypoint features •  Can use SIFT •  Estimate the dominant keypoint orientation using gradients •  Compensate for detected orientation •  Describe the keypoints in terms of the gradients surrounding it Wagner D., Reitmayr G., Mulloni A., Drummond T., Schmalstieg D., Real-Time Detection and Tracking for Augmented Reality on Mobile Phones. IEEE Transactions on Visualization and Computer Graphics, May/June, 2010
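The dominant-orientation step can be sketched as a magnitude-weighted histogram of gradient angles; the Gaussian weighting and parabolic peak interpolation of the full SIFT pipeline are omitted here:

```python
import numpy as np

def dominant_orientation(patch, bins=36):
    """Estimate a keypoint's dominant orientation (sketch of the SIFT
    approach): histogram the gradient angles, weighted by gradient
    magnitude, and return the center of the peak bin in radians."""
    gy, gx = np.gradient(patch.astype(float))    # image gradients
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)                     # in [-pi, pi]
    hist, edges = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])

# A horizontal intensity ramp: gradient points along +x, orientation ~ 0
patch = np.tile(np.arange(16, dtype=float), (16, 1))
theta = dominant_orientation(patch)
```

Rotating the patch by -theta before describing it is what makes the descriptor rotation-invariant ("compensate for detected orientation" above).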
  • 43. Database Creation •  Offline step – create database of known features •  Searching for corners in a static image •  For robustness look at corners on multiple scales •  Some corners are more descriptive at larger or smaller scales •  We don’t know how far users will be from our image •  Build a database file with all descriptors and their position on the original image
  • 44. Real-time tracking •  Search for known keypoints in the video image •  Create the descriptors •  Match the descriptors from the live video against those in the database •  Brute force is not an option •  Need the speed-up of special data structures •  E.g., we use multiple spill trees Keypoint detection Descriptor creation and matching Outlier Removal Pose estimation and refinement Camera Image Pose Recognition
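The matching step above can be sketched as brute-force nearest neighbours plus Lowe's ratio test — exactly the O(N·M) search that data structures like spill trees are meant to speed up:

```python
import numpy as np

def match_descriptors(query, database, ratio=0.8):
    """Brute-force nearest-neighbour matching with a ratio test:
    accept a match only if the best distance is clearly smaller than
    the second best (ambiguous matches are dropped)."""
    matches = []
    for qi, q in enumerate(query):
        d = np.linalg.norm(database - q, axis=1)
        best, second = np.argsort(d)[:2]
        if d[best] < ratio * d[second]:
            matches.append((qi, int(best)))
    return matches

db = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
q = np.array([[0.1, 0.0],      # close to db[0], unambiguous -> accepted
              [5.0, 0.0]])     # equidistant from db[0] and db[1] -> rejected
m = match_descriptors(q, db)
```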
  • 45. NFT – Outlier removal •  Removing outlier features •  Cascade of removal techniques •  Start with cheapest, finish with most expensive… •  First simple geometric tests •  E.g., line tests •  Select 2 points to form a line •  Check all other points being on the correct side of the line •  Then, homography-based tests
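The line test described above can be sketched as follows: a correct correspondence set keeps each point on the same side of the chosen line in both images, so matches that swap sides are flagged as outliers:

```python
import numpy as np

def side(a, b, p):
    """Sign of the 2D cross product: which side of line a->b point p is on."""
    return np.sign((b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]))

def line_test_outliers(src, dst, i, j):
    """Flag matches whose point lies on different sides of the line
    through matches i and j in the two images."""
    flags = []
    for k in range(len(src)):
        if k in (i, j):
            flags.append(False)      # the line's own endpoints
            continue
        flags.append(bool(side(src[i], src[j], src[k]) != side(dst[i], dst[j], dst[k])))
    return flags

# Pure translation between images, with one corrupted correspondence
src = np.array([[0, 0], [10, 0], [5, 5], [5, -5]], float)
dst = src + np.array([2.0, 3.0])
dst[3] = [7.0, 8.0]                  # outlier: jumped to the wrong side
flags = line_test_outliers(src, dst, 0, 1)
```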
  • 46. NFT – Pose refinement •  Pose from homography makes a good starting point •  Based on Gauss-Newton iteration •  Try to minimize the re-projection error of the keypoints •  The part of the tracking pipeline that benefits most from floating point usage •  Can still be implemented effectively in fixed point •  Typically 2-4 iterations are enough…
  • 47. NFT – Real-time tracking •  Search for keypoints in the video image •  Create the descriptors •  Match the descriptors from the live video against those in the database •  Remove the keypoints that are outliers •  Use the remaining keypoints to calculate the pose of the camera Keypoint detection Descriptor creation and matching Outlier Removal Pose estimation and refinement Camera Image Pose Recognition
  • 48. Demo: Vuforia Texture Tracking https://www.youtube.com/watch?v=1Qf5Qew5zSU
  • 49. Edge Based Tracking •  Example: RAPiD [Drummond et al. 02] •  Initialization, Control Points, Pose Prediction (Global Method)
  • 50. Demo: Edge Based Tracking https://www.youtube.com/watch?v=UJtpBxdDVDU
  • 51. Line Based Tracking •  Visual Servoing [Comport et al. 2004]
• 52. Model Based Tracking • Tracking from 3D object shape • Example: OpenTL - www.opentl.org •  General purpose library for model based visual tracking
  • 53. Demo: OpenTL Model Tracking https://www.youtube.com/watch?v=laiykNbPkgg
  • 54. Demo: OpenTL Face Tracking https://www.youtube.com/watch?v=WIoGdhkfNVE
• 55. Marker vs. Natural Feature Tracking •  Marker tracking •  Usually requires no database to be stored •  Markers can be an eye-catcher •  Tracking is less demanding •  The environment must be instrumented •  Markers usually work only when fully in view •  Natural feature tracking •  A database of keypoints must be stored/downloaded •  Natural feature targets might catch the attention less •  Natural feature targets are potentially everywhere •  Natural feature targets work also if partially in view
  • 56. Tracking from an Unknown Environment •  What to do when you don’t know any features? •  Very important problem in mobile robotics - Where am I? •  SLAM •  Simultaneously Localize And Map the environment •  Goal: to recover both camera pose and map structure while initially knowing neither. •  Mapping: •  Building a map of the environment which the robot is in •  Localisation: •  Navigating this environment using the map while keeping track of the robot’s relative position and orientation
• 57. Visual SLAM •  Early SLAM systems (1986 - ) •  Computer vision and sensors (e.g. IMU, laser, etc.) •  One of the most important algorithms in Robotics •  Visual SLAM •  Using cameras only, such as stereo view •  MonoSLAM (single camera) developed in 2007 (Davison)
• 59. How SLAM Works •  Three main steps 1.  Tracking a set of points through successive camera frames 2.  Using these tracks to triangulate their 3D position 3.  Simultaneously use the estimated point locations to calculate the camera pose which could have observed them •  By observing a sufficient number of points can solve for both structure and motion (camera path and scene structure).
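Step 2 of the loop above, triangulating tracked points, is commonly done with linear (DLT) triangulation. A minimal two-view sketch, assuming the 3x4 projection matrices are known; real SLAM systems then refine these estimates together with the camera poses:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a single scene point from two views.
    P1, P2 are 3x4 projection matrices, x1, x2 the (x, y) image points."""
    # Each image measurement contributes two linear constraints on X
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: the null vector of A, via SVD
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # de-homogenise
```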
  • 60. SLAM Optimization •  SLAM is an optimisation problem •  compute the best configuration of camera poses and point positions in order to minimise reprojection error •  difference between a point's tracked location and where it is expected to be •  Can be solved using bundle adjustment •  a nonlinear least squares algorithm that finds minimum error •  But – time taken grows as size of map increases •  Multi-core machines can do localization and mapping on different threads •  Relocalisation •  Allows tracking to be restarted when it fails
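The reprojection error that bundle adjustment minimises can be made concrete with a toy pinhole model; unit focal length and a translation-only camera pose are simplifying assumptions here:

```python
import numpy as np

def reprojection_error(point3d, cam_t, observed):
    """Distance between a point's tracked image location and where the
    current map/pose estimates predict it should appear."""
    p = point3d - cam_t              # point in camera coordinates
    predicted = p[:2] / p[2]         # perspective projection
    return np.linalg.norm(observed - predicted)
```

Bundle adjustment sums the squares of these errors over every camera/point pair and adjusts all poses and point positions at once to minimise the total, which is why its cost grows with the size of the map.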
  • 61. Evolution of SLAM Systems •  MonoSLAM (Davidson, 2007) •  Real time SLAM from single camera •  PTAM (Klein, 2009) •  First SLAM implementation on mobile phone •  FAB-MAP (Cummins, 2008) •  Probabilistic Localization and Mapping •  DTAM (Newcombe, 2011) •  3D surface reconstruction from every pixel in image •  KinectFusion (Izadi, 2011) •  Realtime dense surface mapping and tracking using RGB-D
  • 63. LSD-SLAM (Engel 2014) •  A novel, direct monocular SLAM technique •  Uses image intensities both for tracking and mapping. •  The camera is tracked using direct image alignment, while •  Geometry is estimated as semi-dense depth maps •  Supports very large scale tracking •  Runs in real time on CPU and smartphone
• 65. Direct Method vs. Feature Based •  Direct methods use all information in the image, cf. the feature-based approach that only uses small patches around corners and edges
  • 66. Applications of SLAM Systems •  Many possible applications •  Augmented Reality camera tracking •  Mobile robot localisation •  Real world navigation aid •  3D scene reconstruction •  3D Object reconstruction •  Etc.. •  Assumptions •  Camera moves through an unchanging scene •  So not suitable for person tracking, gesture recognition •  Both involve non-rigidly deforming objects and a non-static map
  • 67. Hybrid Tracking Combining several tracking modalities together
• 68. Sensor Tracking •  Used by many “AR browsers” •  GPS, compass, accelerometer, gyroscope •  Not sufficient alone (drift, interference) Inertial Compass Drifting Over Time
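The drift problem is easy to demonstrate: integrating a gyro with even a small constant bias (0.01 rad/s here, a made-up figure) produces a heading error that grows without bound, like the drifting inertial compass pictured on the slide.

```python
import numpy as np

def integrate_gyro(rates, dt, bias=0.0):
    """Dead-reckon heading by summing angular-rate samples over time."""
    return np.cumsum((rates + bias) * dt)

# A motionless gyro (true rate zero) with a 0.01 rad/s bias, sampled at 100 Hz:
heading = integrate_gyro(np.zeros(1000), dt=0.01, bias=0.01)
# After 10 seconds the estimated heading has drifted by about 0.1 rad,
# and it keeps growing for as long as the sensor runs.
```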
• 69. Combining Sensors and Vision •  Sensors •  Produces noisy output (= jittering augmentations) •  Are not sufficiently accurate (= wrongly placed augmentations) •  Gives us first information on where we are in the world, and what we are looking at •  Vision •  Is more accurate (= stable and correct augmentations) •  Requires choosing the correct keypoint database to track from •  Requires registering our local coordinate frame (online-generated model) to the global one (world)
  • 70. Example: Outdoor Hybrid Tracking •  Combines •  computer vision •  inertial gyroscope sensors •  Both correct for each other •  Inertial gyro •  provides frame to frame prediction of camera orientation, fast sensing •  drifts over time •  Computer vision •  Natural feature tracking, corrects for gyro drift •  Slower, less accurate
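A minimal sketch of this fusion is a complementary filter: the gyro term gives fast frame-to-frame prediction, while the vision term slowly corrects the accumulated drift. The blending weight alpha is a made-up value, and real hybrid trackers use Kalman-style filters rather than this fixed blend.

```python
import numpy as np

def complementary_filter(gyro_rates, vision_angles, dt, alpha=0.98):
    """Blend gyro prediction (fast, but drifts) with vision estimates
    (slower, but stable): predict with the integrated rate, then pull
    the result a little towards the vision measurement each frame."""
    angle = vision_angles[0]
    fused = []
    for rate, vis in zip(gyro_rates, vision_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * vis
        fused.append(angle)
    return np.array(fused)
```

With a biased gyro and a steady vision estimate, the fused angle stays bounded near the vision value instead of drifting without limit, which is exactly how the two modalities correct for each other.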
• 72. Robust Outdoor Tracking • Hybrid Tracking •  Computer Vision, GPS, inertial • Going Out •  Reitmayr & Drummond (Univ. Cambridge) Reitmayr, G., & Drummond, T. W. (2006). Going out: robust model-based tracking for outdoor augmented reality. In Mixed and Augmented Reality, 2006. ISMAR 2006. IEEE/ACM International Symposium on (pp. 109-118). IEEE.
  • 76. The Registration Problem • Virtual and Real content must stay properly aligned • If not: • Breaks the illusion that the two coexist • Prevents acceptance of many serious applications t = 0 seconds t = 1 second
  • 77. Sources of Registration Errors • Static errors • Optical distortions (in HMD) • Mechanical misalignments • Tracker errors • Incorrect viewing parameters • Dynamic errors • System delays (largest source of error) • 1 ms delay = 1/3 mm registration error
• 78. Reducing Static Errors • Distortion compensation • For lens or display distortions • Manual adjustments • Have user manually align AR and VR content • View-based or direct measurements • Have user measure eye position • Camera calibration (video AR) • Measuring camera properties
• 80. Dynamic errors •  Total Delay = 50 + 2 + 33 + 17 = 102 ms •  1 ms delay = 1/3 mm, so 102 ms ≈ 34 mm registration error Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop 20 Hz = 50ms 500 Hz = 2ms 30 Hz = 33ms 60 Hz = 17ms
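The delay budget on the slide can be checked directly from the stage update rates:

```python
# Stage update rates from the slide's application loop
stage_hz = {"tracking": 20, "viewpoint": 500, "simulation": 30, "render": 60}

# One full pass through the pipeline waits one period at each stage
delays_ms = {name: 1000.0 / hz for name, hz in stage_hz.items()}
total_ms = sum(delays_ms.values())   # 50 + 2 + 33.3 + 16.7 = 102 ms
error_mm = total_ms / 3.0            # rule of thumb: 1 ms of lag = 1/3 mm
```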
  • 81. Reducing dynamic errors (1) • Reduce system lag • Faster components/system modules • Reduce apparent lag • Image deflection • Image warping
  • 82. Reducing System Lag Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop Faster Tracker Faster CPU Faster GPU Faster Display
• 83. Reducing Apparent Lag Tracking Update x,y,z r,p,y Virtual Display Physical Display (640x480) 1280 x 960 Last known position Virtual Display Physical Display (640x480) 1280 x 960 Latest position Tracking Calculate Viewpoint Simulation Render Scene Draw to Display x,y,z r,p,y Application Loop
  • 84. Reducing dynamic errors (2) • Match video + graphics input streams (video AR) • Delay video of real world to match system lag • User doesn’t notice • Predictive Tracking • Inertial sensors helpful Azuma / Bishop 1994
• 85. Predictive Tracking Time Position Past Future Can predict up to 80 ms in future (Holloway) Now
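The simplest form of the predictor is constant-velocity extrapolation of the last tracker samples to the future display time. This is only a sketch of the idea; the Azuma/Bishop predictor on the previous slide uses inertial sensors and filtering rather than a raw finite difference.

```python
import numpy as np

def predict_pose(t_hist, p_hist, t_display):
    """Extrapolate tracker samples to the (future) time the rendered
    frame will actually reach the display, hiding the system lag."""
    # Finite-difference velocity from the two most recent samples
    v = (p_hist[-1] - p_hist[-2]) / (t_hist[-1] - t_hist[-2])
    return p_hist[-1] + v * (t_display - t_hist[-1])
```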
  • 87. Wrap-up • Tracking and Registration are key problems • Registration error • Measures against static error • Measures against dynamic error • AR typically requires multiple tracking technologies • Computer vision most popular • Research Areas: • SLAM systems, Deformable models, Mobile outdoor tracking