Updates on Benchmarking of Vision-based Geometric Registration and Tracking Methods for MAR
1. Updates on Benchmarking of Vision-based Geometric Registration and Tracking Methods for MAR (ISO/IEC NP 18520)
Takeshi Kurata (AIST, Japan)
TrakMark WG
SC 24/WG 9 subcommittee in Japan
ISO/IEC JTC 1/SC 24/WG 9 meeting (2016/1/13)
2. Outline
– Intermediate report on WD (ISO/IEC NP 18520)
– Tracking competition in ISMAR 2015 in Fukuoka, Japan
– Grassroots activity for standardization on PDR (Pedestrian Dead Reckoning) benchmarking (separate slides)
3. Obtained IS documents
• Benchmarking, measurement process, quality evaluation
– ISO/IEC 29155-1:2011, ISO/IEC 29155-2:2013, ISO/IEC FDIS 29155-3, Systems and software engineering -- Information technology project performance benchmarking framework
– ISO/IEC 15939:2007, Systems and software engineering -- Measurement process
– ISO/IEC 25012:2008, Software engineering -- Software product Quality Requirements and Evaluation (SQuaRE) -- Data quality model
• View model
– ISO/IEC/IEEE 42010:2011, Systems and software engineering -- Architecture description
– ISO/IEC 10746-1:1998, Information technology -- Open Distributed Processing -- Reference model: Overview
4. WG 9 Japanese subcommittee meeting
• Held on Nov 9, 2015.
• Revising and brushing up:
– Introduction: once deleted, but now reinstated.
– Scope: stakeholders added, etc.
– Terms: added.
– Online benchmarking should be included.
– Dataset, Indicator, Process -> Process, Indicator, Dataset
– Reference structure -> Reference framework
– Benchmark standard -> Benchmarking standard
– Benchmark process -> Benchmarking process
– Benchmark result -> Benchmarking outcome
– Temporal indicator -> Temporality indicator
5. Outline
– Intermediate report on WD (ISO/IEC NP 18520)
– Tracking competition in ISMAR 2015 in Fukuoka, Japan
– Grassroots activity for standardization on PDR (Pedestrian Dead Reckoning) benchmarking (separate slides)
6. Tracking competition in ISMAR 2015
• ISMAR 2015 was held from Sep 29 to Oct 3 in Fukuoka, Japan.
• The PRMU (SIG on Pattern Recognition and Media Understanding) algorithm contest committee and the ISMAR 2015 Tracking Competition committee, including TrakMark WG, collaborated to organize the tracking competition in ISMAR 2015.
• Off-site and on-site competitions were prepared (the off-site competition was the first attempt of its kind for ISMARs).
• TrakMark WG and ISO/IEC NP 18520 have mutually received feedback through this kind of event.
Previous tracking competitions: [links/images on the original slide]
7. Findings on On-site (Online) Competition in ISMAR 2015
Rules and Regulations (Benchmarking process, Conformance)
• Although the rules and regulations were documented in detail, they were not well understood by the contestants, which caused problems during the tracking competition.
• We should have had the contestants reconfirm the rules and regulations with visual aids such as presentation slides.
Special thanks to Prof. H. Uchiyama, Prof. S. Ikeda, and Prof. F. Shibata!
8. Findings on On-site (Online) Competition in ISMAR 2015
Simplification
• Simplification of the benchmarking process is often necessary for the practical operation of a tracking competition, but its pros and cons should be considered.
9. Findings on On-site (Online) Competition in ISMAR 2015
Simplification: Equality (Benchmarking process, Conformance)
• In this competition, strictly speaking, the conditions were not equal for all contestants.
• Each contestant was supposed to mark a textured paper attached to a wall to indicate the challenge point estimated by their MAR system. To simplify and speed up the operation of the competition, we did not replace the paper after a contestant had marked it, so the marked paper became part of the environment for the following contestants.
• This might not be fair because the appearance of the paper changed due to the marks, although they were small.
10. Findings on On-site (Online) Competition in ISMAR 2015
Simplification: How to evaluate (Benchmarking process, Benchmark indicator)
• Ideally, the measurement for accuracy/precision evaluation should be based on the distance in 3D space between the 3D coordinates of a challenge point given by us (the organizers) and the 3D coordinates of the challenge point estimated by the contestant's MAR system.
– However, what we actually measured were the 2D coordinates on a partition wall, which are the projection of the 3D coordinates of the challenge point onto the wall (see the sketch after this slide).
– Practically speaking, it is quite difficult to measure the distance between arbitrary points in the air in 3D space. Measuring a distance on a plane is much easier, whether manually or automatically.
– More than three challenge points on a plane are necessary if we want to strictly evaluate the 3D position and orientation of the camera from such 2D coordinates.
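To make the in-plane measurement concrete, here is a minimal numpy sketch (not the competition's actual tooling): it projects the ground-truth and estimated challenge points onto the wall plane and reports the 2D error that a ruler on the wall would measure. All point and plane values are hypothetical.

```python
import numpy as np

def project_onto_plane(p, plane_point, plane_normal):
    """Orthogonally project a 3D point p onto the wall plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return p - np.dot(p - plane_point, n) * n

# Hypothetical values (metres): ground-truth vs estimated challenge point.
p_gt  = np.array([1.20, 0.85, 3.00])
p_est = np.array([1.23, 0.80, 3.02])

plane_point  = np.array([0.0, 0.0, 3.0])  # any point on the wall plane
plane_normal = np.array([0.0, 0.0, 1.0])  # wall normal

q_gt  = project_onto_plane(p_gt,  plane_point, plane_normal)
q_est = project_onto_plane(p_est, plane_point, plane_normal)

# In-plane error: the distance a ruler on the wall actually measures.
error_on_plane = np.linalg.norm(q_est - q_gt)
print(f"error on the wall plane: {error_on_plane * 1000:.1f} mm")
```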
11. Findings on On-site (Online) Competition in ISMAR 2015
How to evaluate (Benchmarking indicator, Benchmarking process)
• In this competition, we evaluated with the following criteria (see the sketch after this slide):
– [the number of challenge points each contestant finds] & [the mean distance] & [the time for trial completion]
• In practice, only the number of points mattered, since just one contestant completed the trial.
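The slide lists three criteria but does not state how they were combined; the sketch below assumes a simple lexicographic ranking (more points found, then smaller mean distance, then shorter time) purely for illustration, with hypothetical contestant data.

```python
# Hypothetical ranking following the slide's three criteria, assuming a
# lexicographic order (the slide does not specify how they are combined):
# more challenge points found, then smaller mean distance, then less time.
results = {
    "contestant_a": (5, 12.0, 410.0),  # (points found, mean dist [mm], time [s])
    "contestant_b": (3,  9.5, 600.0),  # fewer points despite a smaller error
}

def ranking_key(r):
    points, mean_dist, time_s = r
    return (-points, mean_dist, time_s)

for name in sorted(results, key=lambda n: ranking_key(results[n])):
    print(name, results[name])
```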
12. Findings on On-site (Online) Competition in ISMAR 2015
How to measure (Benchmarking process, Conformance)
• Measuring the ground truth in the preparation phase: on-site competitions are supposed to be held at various places, so it would be better to be able to use the same, standardized tool for measuring the environment and the ground truth, and for establishing the correspondence between the real-world coordinate system and the local one (see the sketch after this slide).
• Measuring the error in the trial phase: in this competition, the error in distance was measured manually with a ruler. This was possible because each challenge point was supposed to be located on a plane; if not, how should it be measured?
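As one possible ingredient of such a standardized tool, the sketch below shows a textbook least-squares (Kabsch) alignment between a local measurement frame and the real-world frame from corresponding control points; the slides do not specify the actual method or tool used, and the numerical values are hypothetical.

```python
import numpy as np

def align_local_to_world(local_pts, world_pts):
    """Least-squares rigid alignment (Kabsch) from local to world coordinates.

    local_pts, world_pts: (N, 3) arrays of corresponding control points
    (N >= 3, not collinear). Returns R, t with world ~= local @ R.T + t.
    """
    mu_l = local_pts.mean(axis=0)
    mu_w = world_pts.mean(axis=0)
    H = (local_pts - mu_l).T @ (world_pts - mu_w)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    t = mu_w - R @ mu_l
    return R, t

# Example with synthetic control points: a 30-degree rotation plus offset.
local = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
a = np.deg2rad(30.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
world = local @ R_true.T + np.array([2.0, 1.0, 0.0])
R, t = align_local_to_world(local, world)  # recovers R_true and the offset
```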
13. Findings on On-site (Online) Competition in ISMAR 2015
Contestants' behavior (Benchmarking process)
• Since the user of the MAR system was supposed to mark the wall paper while using the system, image registration and tracking often became unstable as the camera got close to the paper.
• Somewhat as expected, one of the key techniques for obtaining a high score in the competition was to devise and master how to move the camera.
14. Findings on On-site (Online) Competition in ISMAR 2015
Difficulty level design (Benchmarking process, Benchmark indicator)
• The difficulty of each trial of online registration and tracking strongly depends on the combination of objects arranged in the competition environment and on their positions.
• Adjusting the difficulty to a moderate level was therefore hard; however, having multiple challenge points alleviated the problem to some extent.
• Millimeter accuracy was meaningful only for the first challenge point. For the following challenge points, which involved 10 to 20 m of movement, millimeter-level evaluation does not matter; what matters is how stably tracking can be maintained.
15. Findings on On-site (Online) Competition in ISMAR 2015
Modeling (Benchmark indicator, Benchmarking process)
• There are two choices for how each contestant's MAR system acquires/constructs environmental 3D models: beforehand, or online (during the trial, i.e., SLAM).
• For both choices, the same measurement for evaluation, i.e., the distance on a plane, can be used. In this competition, we chose online acquisition.
• However, how about other measurements/benchmark indicators such as temporality indicators?
16. Findings on On-site (Online) Competition in ISMAR 2015
As a public/open event (Benchmarking process)
• By showing screen-capture video of each MAR system on a large screen during the competition, we successfully let the audience enjoy the competition and made it an open competition.
17. Findings on Off-site (Offline) Competition in ISMAR 2015
Projection error of virtual objects (PEVO for short) (Benchmark indicator)
• PEVO is the most direct and intuitive indicator for vision-based geometric registration and tracking methods for MAR. If other indicators or criteria were used, we would need to introduce additional assumptions, constraints, etc. to narrow down the ambiguity of the evaluation.
• PEVO worked well since it is sensitive to errors in the position and orientation of the camera. There were some cases in which PEVO was large even though the re-projection error of the image features was small (see the sketch after this slide).
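The slides do not define PEVO formally; the following is a minimal sketch of one plausible formulation, assuming PEVO is the mean pixel distance between a virtual object's 3D points projected with the ground-truth camera pose and with the estimated pose, under a distortion-free pinhole model. The function names are our own, not from the competition.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N, 3) to pixel coordinates (N, 2)."""
    Xc = X @ R.T + t            # world -> camera coordinates
    x = Xc @ K.T                # camera -> homogeneous image coordinates
    return x[:, :2] / x[:, 2:3]

def pevo(K, R_gt, t_gt, R_est, t_est, virtual_pts):
    """Mean pixel distance between virtual-object points rendered with the
    ground-truth camera pose and with the estimated pose."""
    u_gt  = project(K, R_gt,  t_gt,  virtual_pts)
    u_est = project(K, R_est, t_est, virtual_pts)
    return np.linalg.norm(u_est - u_gt, axis=1).mean()
```

Because the virtual-object points may lie away from the tracked image features, a pose error that barely changes the feature re-projections can still move the virtual object by many pixels, which is consistent with the observation above.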
18. Findings on Off-site (Offline) Competition in ISMAR 2015
Ambiguous regulation without PEVO (Benchmarking process, Benchmark indicator)
• We had asked each contestant to submit projective camera matrices for data with lens distortion. As a result, some contestants submitted normalized camera matrices, which consist of position and orientation and do not contain the intrinsic parameters of the camera, while other contestants submitted projective camera matrices (see the sketch after this slide).
• This made it hard for us to evaluate each contestant fairly/equally. We should have asked for the normalized camera matrices.
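For reference, a short sketch of the relationship between the two kinds of matrices as used above: a normalized camera matrix holds only the extrinsics [R | t], whereas a projective camera matrix folds in the intrinsics, P = K [R | t]. The numerical values are hypothetical.

```python
import numpy as np

# Hypothetical intrinsic parameters K (focal lengths, principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                        # camera orientation (world -> camera)
t = np.array([[0.1], [0.0], [2.0]])  # camera translation

Rt = np.hstack([R, t])  # "normalized" camera matrix: extrinsics only, 3x4
P  = K @ Rt             # "projective" camera matrix: intrinsics folded in

# With a known K, the extrinsics can be recovered from P:
Rt_recovered = np.linalg.inv(K) @ P
assert np.allclose(Rt_recovered, Rt)
```

Mixing the two submission formats is what breaks comparability: without knowing each contestant's K, poses extracted from a projective matrix are not directly comparable to submitted normalized matrices.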
19. Findings on Off-site (Offline) Competition in ISMAR 2015
Scalability (Benchmarking process)
• It becomes really hard to deal with the variety of submissions as the number of contestants increases. It is also hard to make every contestant thoroughly understand the rules (the same as on-site).
• Procedures in which the evaluation software and the datasets are jointly released to the public, and each contestant submits only specific parts, seem appropriate.
20. Findings on Off-site (Offline) Competition in ISMAR 2015
Fine tuning/Cheating (Benchmarking process, Conformance)
• It might be better for the judges to evaluate each method by themselves, for instance, by having each contestant submit a binary (executable) program.
• Otherwise, it would be difficult to check that a method does not use future data, e.g., through global optimization over the entire sequence (see the sketch after this slide).
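One conceivable automated check, sketched under the assumption that the submitted executable can be rerun on truncated input; run_tracker is a hypothetical interface, not part of the actual competition setup.

```python
import numpy as np

def is_causal(run_tracker, frames, k, tol=1e-6):
    """Rerun the tracker on a truncated sequence and compare the poses over
    the shared prefix. An online (causal) method should produce identical
    poses; one that optimizes globally over the whole sequence generally
    will not.

    run_tracker: hypothetical callable mapping a frame sequence to an
    array of per-frame poses.
    """
    poses_full   = np.asarray(run_tracker(frames))[:k]
    poses_prefix = np.asarray(run_tracker(frames[:k]))
    return np.allclose(poses_full, poses_prefix, atol=tol)
```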
21. Findings on Off-site (Offline) Competition in ISMAR 2015
Difficulty level design (Benchmarking process, Benchmark indicator)
• The difficulty gap between Levels 2 and 3 was large. It is hard to design each difficulty level (the same as on-site).
22. Outline
– Intermediate report on WD (ISO/IEC NP 18520)
– Tracking competition in ISMAR 2014 and ISMAR 2015 in Fukuoka, Japan
– Grassroots activity for standardization on PDR (Pedestrian Dead Reckoning) benchmarking (separate slides)