2. • Based on Two Korean Standards
- TTAK.KO-10.0434 (Dec. 2010): Representation for AR: PoI, Content, Link
- TTAK.KO-10.0851 (Dec. 2015): Metadata for AR content visualization & sharing
• Proposal on "Metadata: data about data"
- Metadata representation for context-aware MAR services
  · what::Status (PoI ID, physical/virtual object ID)
  · who::Role (author profile)
  · when::Time (created, modified)
  · where::Location (geographical/relative position)
  · how::Action (link/interaction/response method)
  · why::Goal (reason)
- Meta-tagging: the process of adding 5W1H metadata to various physical/virtual assets/entities (see the sketch below)
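As a concrete illustration of the 5W1H fields listed above, here is a minimal sketch (not the normative encoding of either TTA standard; the field names and values are illustrative) of how one such metadata record could be grouped and serialized as JSON:

```python
# Minimal sketch: grouping the 5W1H fields of a MAR metadata record
# for serialization, e.g. as JSON. Values are illustrative only.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class FiveW1H:
    what: dict = field(default_factory=dict)   # Status: PoI ID, physical/virtual object ID
    who: dict = field(default_factory=dict)    # Role: author profile
    when: dict = field(default_factory=dict)   # Time: created, modified
    where: dict = field(default_factory=dict)  # Location: geographical/relative position
    how: dict = field(default_factory=dict)    # Action: link/interaction/response method
    why: dict = field(default_factory=dict)    # Goal: reason

tag = FiveW1H(
    what={"PoI_ID": "poi:gyeongbokgung"},
    who={"creator": "UVR Lab"},
    when={"created": "2015-12-01T09:00:00Z"},
    where={"gps": [37.5796, 126.9770]},
    how={"action": "overlay-3d-model"},
    why={"goal": "historical storytelling"},
)
print(json.dumps(asdict(tag), indent=2))
```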
3. • Metaverse for MAR Services
[Figure: the metaverse for MAR services spans physical, measured, and virtual spaces, connected by sensing, modeling, interaction, networking, and anchoring (augmentation).]
4. • UVR: Context-aware MAR for the DigiLog World (Woo)
- 3D link with IoE (incl. IoT) between the dual (real & virtual) spaces, with additional information
- Augmentation of Context-of-Interest: not only sight but everything (sound, haptics, smell, taste, etc.)
- Bidirectional interaction/collaboration across the dual spaces
[Figure: real space (private/3rd-skin, social, general/public; IoT/IoE, SNS) and virtual space (CoI, anchor, MAR content), linked by seamless augmentation between the dual spaces. How to link them seamlessly?]
7. • Holistic Layers: in addition to 3D CG
[Figure: holistic layers over space and time: real/physical, virtual/cyber, information/knowledge, and social-wisdom layers, connected by IoT/IoE.]
8. • Increased Interest in and Demand for MAR
- Popularized since the advent of the "smartphone"
- Increasing number of MAR apps
• However, MAR Contents Are Incompatible
- Most AR contents are not reusable in other applications
- Due to different processes, data structures/formats, etc.
- In addition, differences in describing PoIs and in linking approaches
• So, the Key to Context-aware MAR Services Is
- Metadata representation for "Virtual, Real & Anchoring"
- However, there is no standard metadata format for MAR content
9. • Now, what do you think ...
- What should we do for useful MAR services?
[Figure: MAR services drawing on multiple separate databases (DB B, DB C, DB D, DB E, ...).]
10. • Metadata: Structured Data Fields with 5W1H
- (Who, When, Where, What, How) + Why
- Representation of metadata should be
  · Interoperable: platform independent
  · Semantic searching, filtering, and sharing of content
  · ASAP: concise, compact, yet flexible and extensible to use
• Scope: Metadata for referring to
- MAR content (image, text, video, audio, 3D model)
- PoI: real entity (person/object/space/location) or content
- MAR Anchor: link between RC & VC; contains the instructions for MAR visualization (see the sketch below)
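To make the scope concrete, here is a minimal sketch of the three record types the metadata refers to, using illustrative (non-normative) field names:

```python
# Sketch only: the three things the metadata refers to, as plain dataclasses.
# Field names are illustrative, not the standard's normative schema.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MARContent:
    content_id: str
    content_type: str                      # image, text, video, audio, 3D model
    data_file: str                         # local path or URL

@dataclass
class PoI:
    poi_id: str
    name: str
    location: Optional[Tuple[float, float]]  # (lat, lon) of the real entity, if any

@dataclass
class MARAnchor:
    anchor_id: str
    poi_id: str                            # real side of the link
    content_id: str                        # virtual side of the link
    scale: float = 1.0                     # part of the visualization instructions
```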
13. Metadata Type Definition
Field | Type | Description
Who.Creator | String | Person or group that created the object
Who.Owner | String | Person or group that has rights to or custody of the object
When.Created Time | DateTime | The time when the object was created
When.Modified Time | DateTime | The time when the object was modified
Where.PoI Location | GPS | GPS coordinate of the PoI
Where.PoI Direction | Float | Direction of the front of the PoI, measured by compass. If the PoI is view-point invariant (i.e., symmetric), the value is NULL
Where.PoI 3D Coordinate | Float (x, y (or z)) | Position of the PoI in the MAR scene (can be 2D or 3D)
Where.GPS Location of Recognition Data | GPS | GPS coordinate of the Recognition Data when the data were acquired
Where.Contents 3D Coordinate | Float (x, y (or z)) | Position of the MAR Contents in the MAR scene
Where.Contents 3D Direction | Float (x, y, z) | 3D normal vector of the Contents in the MAR scene; the magnitude of the normal vector is 1
Where.User Location | GPS | GPS coordinate of the user
Where.User Direction | Float (x, y, z) | 3D device pose measured by the gyroscope of the user device
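For illustration, a record carrying some of the Where.* fields above might look like the following sketch (key names abbreviated, values invented; the None for the PoI direction follows the NULL convention for view-point-invariant PoIs):

```python
# Illustrative record following the Where.* type definitions above (sketch only).
poi_where = {
    "PoI.Location": (37.5796, 126.9770),    # GPS (lat, lon)
    "PoI.Direction": None,                   # NULL: the PoI is view-point invariant (symmetric)
    "PoI.3D_Coordinate": (1.2, 0.0, -3.5),   # position in the MAR scene (scene frame)
    "User.Location": (37.5801, 126.9765),
    "User.Direction": (0.0, 0.0, 1.0),       # device pose from the gyroscope
}
```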
14. Metadata Type Definition
Field | Type | Description
What.PoI ID | String | Identification of the PoI
What.Recognition Data ID | String | Identification of the Recognition Data
What.Recognition Data File | File | Local data file path or URL for the Recognition Data
What.Contents ID | String | Identification of the Contents
What.Contents Type | String | Type of the Contents
What.Contents Reference | String | References of the Contents
What.Contents Data File | File | Local data file path or URL for the Contents Data
How.Lens Intrinsic Parameter of Acquisition Device | Float (fx, fy, cx, cy, k1, k2, k3, r) | Intrinsic parameters of the embedded lens of the capturing device: fx, fy are the focal lengths, cx, cy are the principal point (skewness of the cell array is assumed to be 0), and k1, k2, k3, r are lens distortion parameters obtained by camera calibration. These parameters can be converted with respect to the camera model (here, the pinhole model)
How.Lens Intrinsic Parameter of User Device | Float (fx, fy, cx, cy, k1, k2, k3, r) | Intrinsic parameters of the embedded lens of the user device
How.Contents Scale | Float (x, y (or z)) | Size of the MAR Contents in the MAR scene (the dimension of the scale depends on the content)
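The intrinsic-parameter fields map directly onto the usual pinhole camera matrix; a short sketch (assuming the common OpenCV-style convention, with skew fixed at 0 as the table notes, and made-up values):

```python
# Sketch: assembling the pinhole intrinsic matrix from the fx, fy, cx, cy
# fields above. k1, k2, k3, r would be passed separately as distortion
# coefficients to an undistortion step.
import numpy as np

def intrinsic_matrix(fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    return np.array([
        [fx, 0.0, cx],
        [0.0, fy, cy],
        [0.0, 0.0, 1.0],
    ])

K = intrinsic_matrix(fx=1450.0, fy=1452.0, cx=960.0, cy=540.0)  # illustrative values
print(K)
```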
15. • MAR Ontology
- Devised to cover the relation-based data model
- Consists of correlated MAR entities, properties that describe them, and Literal Values for primitive values
- Every object in the MAR Ontology has a URI (Uniform Resource Identifier) instead of an ID
- Through a PoI, the user can start exploring the MAR Ontology (see the RDF sketch below)
[Figure: MAR Ontology graph. Starting from a PoI (the user's start point), MAR Anchors link to MAR Contents and MAR Entity Information; MAR Entities are connected by Properties, with Literal Values holding primitive values.]
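A small sketch of the "URI instead of ID" idea expressed as RDF triples with rdflib; the namespace, URIs, and property names here are illustrative placeholders, not the standard's official vocabulary:

```python
# Sketch: PoI -> MAR Anchor -> MAR Contents as RDF triples, each object
# identified by a URI rather than a local ID.
from rdflib import Graph, Literal, Namespace, URIRef

MAR = Namespace("http://example.org/mar#")   # placeholder namespace
g = Graph()

poi = URIRef("http://example.org/poi/gyeongbokgung")     # user's start point
anchor = URIRef("http://example.org/anchor/palace-01")
content = URIRef("http://example.org/content/palace-3d")

g.add((poi, MAR.hasAnchor, anchor))          # PoI -> MAR Anchor
g.add((anchor, MAR.augmentsWith, content))   # MAR Anchor -> MAR Contents
g.add((content, MAR.label, Literal("Gyeongbokgung 3D model")))  # literal value

print(g.serialize(format="turtle"))
```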
18. • Meta-tagging
Virtual Palace (virtual object):
- WHO: +Virtual Palace.{Owner, Creator}
- WHEN: +Virtual Palace.{Created Time, Modified Time, Accessed Time, Duration}, +Virtual Palace Info.{Begin Time, End Time}
- WHERE: +Virtual Palace Information.{Location}
- WHAT: +Virtual Palace Information.{Type, Description}, +Virtual Palace VO.{Type, String, Data File}
- HOW: +Virtual Palace Info.{Condition state}
Palace Anchor:
- WHO: +Palace Anchor.{Owner, Author}
- WHEN: +Palace Anchor.{Authored Time, Modified Time}
- WHERE: +Palace Anchor.{PoI Coord., Contents Coord., Coord. Type}
- WHAT: +Palace Anchor.{ID, Tag, PoI.ID, MAR Contents.ID}
- HOW: +Palace Anchor.{Scale, Interaction, Animation, Accessibility}
Palace PoI & Recognition Data:
- WHO: +Palace Recognition.{Owner, Creator}
- WHEN: +Palace Recognition.{Created Time, Modified Time}
- WHERE: +Palace PoI.{Location, Front Direction}, +Palace Recognition.{Personal Device Location, PoI 3D Position}
- WHAT: +Palace PoI.{ID, Name, Tag, Rec. Data, MAR Anchor.ID}, +Palace Recognition.{ID, Data Set ID, Candidate Sub Data}
- HOW: +Palace Recognition.{Scale, Gyroscope, Compass, Intrinsic Parameter}
http://Geovid.org
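As a worked example, the "Palace Anchor" meta-tag above could be written out as a 5W1H record like this (a sketch; the IDs, times, and values are invented for illustration):

```python
# Sketch: the Palace Anchor meta-tag as a 5W1H record. Field grouping
# follows the listing above; this is not a normative file format.
palace_anchor_tag = {
    "who":   {"Owner": "KCTM project", "Author": "UVR Lab"},
    "when":  {"Authored Time": "2016-03-01T10:00:00Z", "Modified Time": "2016-04-10T18:30:00Z"},
    "where": {"PoI Coord.": (37.5796, 126.9770), "Contents Coord.": (0.0, 1.5, -2.0), "Coord. Type": "relative"},
    "what":  {"ID": "anchor:palace-01", "Tag": "palace", "PoI.ID": "poi:gyeongbokgung", "MAR Contents.ID": "content:palace-3d"},
    "how":   {"Scale": 1.0, "Interaction": "tap-to-open", "Animation": "fade-in", "Accessibility": "public"},
}
```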
19. • Implementation in KCTM
- As an implementation of the MAR reference model with 360° video, we are developing a MAR service in the K-Culture Time Machine project that provides personalized MAR storytelling of historical sites in Korea.
20. • Implementation in KCTM
- As an implementation of the MAR reference model with 3D 360° video, we are developing a MAR service in the K-Culture Time Machine project that provides personalized MAR storytelling of historical sites in Korea.
21. • Implementation in KCTM
- As an implementation of the MAR reference model, we are developing a phone-based MAR service in the K-Culture Time Machine project that provides personalized MAR storytelling of historical sites in Korea.
22. 5W1H Metadata Schema
MAR Contents metadata:
- who: MAR Contents.Essential.Who
- when: MAR Contents.Essential.When
- where: MAR Contents.Essential.Where
- what: MAR Contents.Essential.What, MAR Contents.{Type, Reference, MAR Anchor.ID}, VO.{Type, String, Data File}, Media Asset.{Type, Data File}
- how: MAR Contents.Essential.How
MAR Anchor metadata:
- who: MAR Anchor.Essential.Who
- when: MAR Anchor.Essential.When
- where: MAR Anchor.{Essential.Where, PoI Coordinate, Contents Coordinate, Coordinate Type}
- what: MAR Anchor.{Essential.What, MAR Entity.URI, MAR Contents.ID}
- how: MAR Anchor.{Essential.How, Scale, Interaction, Animation, Accessibility}
MAR Entity metadata (PoI, Recognition Data, MAR Entity Information):
- who: {PoI, Recognition Data, MAR Entity Information}.Essential.Who
- when: {PoI, Recognition Data, MAR Entity Information}.Essential.When, MAR Entity Information.{Begin Time, End Time}
- where: {PoI, Recognition Data, MAR Entity Information}.Essential.Where, PoI.{Location, Front Direction}, Recognition Data.{PoI 3D Position, GPS Location}, MAR Entity Information.{Location}
- what: {PoI, Recognition Data, MAR Entity Information}.Essential.What, PoI.{Recognition Data.ID, MAR Entity.URI}, Recognition Data.{Data Set ID, Sub-Data Set ID, Current Sub Data, Candidate Sub Data}, MAR Entity.{URI, Label, MAR Entity Information.ID, PoI.ID, MAR Anchor.ID, Property.URI}, MAR Entity Information.{Type, Description}, Property.{URI, Label}, Literal Value.{Type, Value}
- how: {PoI, Recognition Data, MAR Entity Information}.Essential.How, Recognition Data.{Scale}, MAR Entity Information.{Condition state}
[Figure: the three metadata blocks in the anchor graph: MAR Entity metadata (PoI, Recognition Data, MAR Entity Information, related MAR Entities, Properties), MAR Anchor metadata, and MAR Contents metadata (Virtual Contents).]
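A sketch of how the shared Essential block nests inside one of these records, here a MAR Contents record following the first schema above (dict form; all values are illustrative):

```python
# Sketch: a MAR Contents record with its Essential 5W1H block nested inside.
# Key names follow the schema listing; values are made up for illustration.
mar_contents_metadata = {
    "Essential": {
        "who":  {"Creator": "UVR Lab", "Owner": "KCTM project"},
        "when": {"Created Time": "2016-03-01T10:00:00Z"},
        "what": {"ID": "content:palace-3d", "Name": "Gyeongbokgung 3D model"},
    },
    "Type": "3D model",
    "Reference": "http://example.org/assets/palace-3d",
    "MAR Anchor.ID": "anchor:palace-01",
    "VO": {"Type": "mesh", "Data File": "palace.obj"},
    "Media Asset": {"Type": "texture", "Data File": "palace_diffuse.png"},
}
```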
23. 5W1H Metadata Schema
User metadata:
- who: (none)
- when: (none)
- where: GPS, Gyroscope, Compass
- what: ID
- how: Intrinsic Parameter
Essential metadata:
- who: Creator, Owner
- when: Created Time, Modified Time
- where: (none)
- what: ID, Name, Tag
- how: (none)
[Figure: the Essential metadata block is shared by the PoI, MAR Contents, MAR Anchor, Recognition Data, and MAR Entity Information records attached to the Anchor.]
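For completeness, a sketch of the runtime User metadata above, i.e. what the client device could report so the service can select and align content (illustrative values only):

```python
# Sketch: a User metadata record following the table above.
user_metadata = {
    "what":  {"ID": "user-device-001"},
    "where": {
        "GPS": (37.5801, 126.9765),
        "Gyroscope": (0.01, -0.02, 0.98),   # device orientation
        "Compass": 182.5,                    # heading in degrees
    },
    "how":   {"Intrinsic Parameter": {"fx": 1450.0, "fy": 1452.0, "cx": 960.0, "cy": 540.0}},
}
```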
28. • Now, Metadata for MAR Services ...
- Is it useful, and is it worthy of a standard?
- Finally, I have prepared an NWIP (New Work Item Proposal) ...
[Figure: MAR services drawing on multiple separate databases (DB B, DB C, DB D, DB E, ...).]
29. "The future is already here. It is just not uniformly distributed." - William Gibson (SF writer)
• More Information
Woontack Woo, Ph.D.
Mail: wwoo@kaist.ac.kr
Web: http://uvr.kaist.ac.kr