AR with Vuforia
1. Developing with Vuforia
A Vuforia SDK-based AR application uses the display of the mobile device as a "magic lens" or
looking glass into an augmented world where the real and virtual worlds appear to coexist. The
application renders the live camera preview image on the display to represent a view of the
physical world. Virtual 3D objects are then superimposed on the live camera preview so that
they appear tightly coupled to the real world.
An application developed for Vuforia will give your users a more compelling experience:
Faster local detection of targets
Cloud recognition of up to 1 million targets simultaneously
User-defined targets for run-time target generation
Cylinder targets – Detection and tracking of images on a cylindrical surface
Text recognition – Recognition and tracking of printed text (words), including
alphanumeric sequences
Robust tracking – Augmentations stick to the target and are not easily lost as the device
moves
Simultaneous tracking of up to five targets
Better results in real world conditions – Low light, partially covered target
Optimizations that ensure better and more realistic graphics rendered on the target
Extended tracking capabilities, enabling your app to keep tracking targets and maintain a
consistent reference for augmentations even when the targets are no longer visible in
your camera view
This diagram provides an overview of the application development process with the Vuforia
platform. The platform consists of the Vuforia Engine (inside the SDK), the Target Management
System hosted on the developer portal (Target Manager), and optionally, the Cloud Target
Database.
2. Vuforia Components
A developer uploads the input image for the target that they want to track. The target resources
can then be accessed by the mobile app in two ways:
Accessed from a cloud target database using web services
Downloaded in a device target database to be bundled with the mobile app
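The cloud path is served by the Vuforia Web Services API, which authenticates each request with an HMAC-SHA1 signature over the request details. The sketch below shows that signing scheme in plain Python; the exact string-to-sign layout and the `VWS access_key:signature` header format reflect the VWS documentation as best recalled, so treat them as assumptions to verify against the official reference (the keys shown are placeholders):

```python
import base64
import hashlib
import hmac
from email.utils import formatdate

def vws_signature(secret_key: str, method: str, content: bytes,
                  content_type: str, date: str, path: str) -> str:
    """Build an HMAC-SHA1 signature in the style Vuforia Web Services
    expects: the string-to-sign concatenates the HTTP method, the MD5
    of the request body, the content type, an RFC-1123 date, and the
    request path, each separated by a newline."""
    content_md5 = hashlib.md5(content).hexdigest()
    string_to_sign = "\n".join([method, content_md5, content_type, date, path])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Placeholder credentials for illustration only.
access_key = "myAccessKey"
secret_key = "mySecretKey"
date = formatdate(usegmt=True)  # e.g. "Mon, 01 Jan 2024 00:00:00 GMT"
sig = vws_signature(secret_key, "GET", b"", "", date, "/targets")
auth_header = f"VWS {access_key}:{sig}"
```

The resulting `auth_header` would accompany the HTTPS request; a signature mismatch is the most common cause of VWS authentication failures, so the date format and body MD5 are worth checking first.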
For text recognition, the developer can specify a set of words that Vuforia can recognize, using
the following text data sets:
Word lists in the VWL binary format (Vuforia Word List)
Additional word lists, which can be specified via simple text files
Optional word list filters (black or white lists) to explicitly include/exclude the recognition
of specific words
The word lists and filter files are bundled with the mobile app and loaded at runtime using the
Vuforia API.
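The filtering behavior described above can be sketched as follows. This is plain Python, not the Vuforia API; the function and parameter names are illustrative, but the logic mirrors the description: a base vocabulary limits what can be recognized, and an optional white or black list further includes or excludes specific words:

```python
def filter_words(detected, word_list, filter_list=None, mode=None):
    """Apply word-list filtering in the style described above.

    detected    -- words the recognizer found in the frame
    word_list   -- base vocabulary (VWL plus any additional word lists)
    filter_list -- optional explicit include/exclude list
    mode        -- None, "white" (only filter_list words pass),
                   or "black" (filter_list words are rejected)
    """
    words = [w for w in detected if w.lower() in word_list]
    if mode == "white":
        words = [w for w in words if w.lower() in filter_list]
    elif mode == "black":
        words = [w for w in words if w.lower() not in filter_list]
    return words

vocab = {"augmented", "reality", "vuforia"}
print(filter_words(["Augmented", "Reality", "foo"], vocab))
# -> ['Augmented', 'Reality']  ("foo" is not in the vocabulary)
print(filter_words(["Augmented", "Reality"], vocab,
                   filter_list={"reality"}, mode="black"))
# -> ['Augmented']  (black list excludes "reality")
```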
Vuforia supports your development efforts with the following:
Getting started guides (Android, iOS, Unity Extension) to set up development on
different platforms (Windows, MacOS, Linux)
Device-specific SDKs (Android, iOS) or Extensions (Unity Extension)
Tools and services (Target Manager web UI, Developer Guide, Vuforia Web Services)
Sample apps and video tutorials
Support forum (dedicated technical support engineers, thousands of posts, FAQs)
3. Vuforia SDK Architecture
A Vuforia SDK-based AR application is composed of the following core components:
1. Camera
- The camera component ensures that every preview frame is captured and passed
efficiently to the tracker.
2. Image Converter
- The pixel format converter converts from the camera format to a format suitable for
OpenGL ES rendering and for tracking internally.
3. Tracker
- The tracker component contains the computer vision algorithms that detect and
track real-world objects in camera video frames. The results are stored in a state
object that is used by the video background renderer and can be accessed from
application code.
- The tracker can load multiple datasets at the same time and activate them
4. Video Background Renderer
- Renders the camera image stored in the state object.
5. Application Code
- Queries the state object for newly detected targets, markers, or updated states of
these elements.
- Updates the application logic with the new input data.
- Renders the augmented graphics overlay.
6. Device Database
- Device databases are created using the online Target Manager
7. Cloud Database
- Cloud databases can be created using the Target Manager or the Vuforia Web
Services API
8. User-Defined Targets
- This feature allows for creating targets on-the-fly from the current camera image.
9. Word targets
- The Vuforia SDK can recognize words and track them similarly to other types of
targets, with two available recognition modes, “Words” and “Characters”
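The flow of a frame through components 1–5 above can be sketched as a simple loop. This is illustrative Python, not the actual SDK; the class and method names are assumptions, but the shape matches the architecture: the tracker fills a state object per frame, the background renderer draws the camera image from it, and application code queries it for trackable results:

```python
from dataclasses import dataclass, field

@dataclass
class State:
    """Per-frame snapshot shared between the tracker, the video
    background renderer, and application code."""
    frame: list = field(default_factory=list)
    trackable_results: list = field(default_factory=list)

class Tracker:
    def __init__(self, datasets):
        # Several datasets can be loaded and active at once.
        self.active_datasets = set(datasets)

    def process(self, frame):
        # Real computer vision happens here; this stand-in just
        # "detects" any active target name appearing in the frame.
        hits = [name for name in sorted(self.active_datasets) if name in frame]
        return State(frame=frame, trackable_results=hits)

def convert(raw_frame):
    # 2. Image converter: camera format -> render/tracking format
    #    (stand-in: normalize token case).
    return [t.lower() for t in raw_frame]

def render_loop(camera_frames, tracker):
    rendered = []
    for raw in camera_frames:        # 1. camera delivers preview frames
        frame = convert(raw)         # 2. pixel format conversion
        state = tracker.process(frame)   # 3. tracker fills the state object
        # 4. video background renderer would draw state.frame here
        for target in state.trackable_results:
            rendered.append(target)  # 5. app code overlays augmentations
    return rendered
```

A usage example: `render_loop([["Stones", "wall"], ["floor"]], Tracker({"stones"}))` returns `["stones"]`, i.e. the augmentation is rendered only for frames in which the tracker found an active target.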
4. Implement an AR example using the Vuforia SDK for Unity
1. Installing the Extension
- Go to the Unity download page ( https://developer.vuforia.com/resources/sdk/unity )
to get the package for your development platform
- Enter your email and password when prompted on the download page
2. Create a new project
- Create a new Unity project. Right-click in the Project panel and select Import
Package → Custom Package to import the Vuforia Unity package
Next, you need to add a Device Database to your project. You can do this in two ways:
Create a target on the Target Manager
Use existing targets from other projects
Managing Targets in a Device Database Using the Target Manager
1. Creating a Database in the Target Manager
- Create a new account and log in at https://developer.vuforia.com/
- Go to Target Manager from top menu
- Create a new Database
- Go to your new database and add a new target
- Wait while your image is processed
- Select the target and download it as a package
- Import the target Device Database to your project
2. Add AR assets and prefabs to scene
- Delete the “Main Camera” in your current scene hierarchy, and drag an instance
of the ARCamera prefab into your scene
- Drag an instance of the ImageTarget prefab into your scene. This prefab
represents a single instance of an Image Target object.
- Select the ImageTarget object in your scene, and look at the Inspector. There
should be an Image Target Behaviour attached, with a property named Data Set.
This property contains a drop-down list of all available Data Sets for this project.
When a Data Set is selected, the Image Target property drop-down is filled with a
list of the targets available in that Data Set.
3. Add 3D objects to the scene and attach them to trackables
- Add 3D objects as children of the ImageTarget object by selecting them in the
Hierarchy list and dragging them onto the ImageTarget item
- Add a Directional Light to your project
4. Run your project
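Parenting the 3D objects under the ImageTarget is what makes the augmentation "stick" to the target: each frame, the child's world pose is the target's tracked pose composed with the child's fixed local offset. A toy 2D version of that composition (plain Python for illustration; Unity does the equivalent in 3D with transform matrices):

```python
import math

def compose(pose, local):
    """Compose a parent pose (x, y, angle in radians) with a child's
    local offset, returning the child's world pose. This mirrors how
    a child transform follows its parent in a scene graph."""
    px, py, pa = pose
    lx, ly, la = local
    wx = px + lx * math.cos(pa) - ly * math.sin(pa)
    wy = py + lx * math.sin(pa) + ly * math.cos(pa)
    return (wx, wy, pa + la)

# Target tracked at (2, 3), rotated 90 degrees; the child object sits
# one unit along the target's local x axis.
world = compose((2.0, 3.0, math.pi / 2), (1.0, 0.0, 0.0))
# world is approximately (2.0, 4.0, pi/2): as the tracker updates the
# target's pose, the child moves and rotates with it automatically.
```

This is why no extra code is needed in step 3: once the objects are children of the ImageTarget, the tracker's pose updates propagate to them through the scene hierarchy.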