Robot Localisation: An Introduction
Speaker: Luis Contreras | Tamagawa University
Time: June 09, 2020 (Tue) 09:00~11:00 (GMT+8)
https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series
RoboCup@Home Education
ONLINE CLASSROOM
Invited Lecture Series
Highlights
● Probabilistic robot localisation
● Probabilistic models for robot motion and particle filters
Luis Contreras received his Ph.D. in Computer Science at the Visual
Information Laboratory, in the Department of Computer Vision, University of
Bristol, UK. Currently, he is a research fellow at the Advanced Intelligence &
Robotics Research Center, Tamagawa University, Japan. He has also been
an active member of the Bio-robotics Laboratory at the Faculty of
Engineering, National Autonomous University of Mexico, Mexico. He has
been working on service robots and has tested his latest results at the
RoboCup and similar robot competitions for the last ten years.
RoboCup@Home Education | www.RoboCupatHomeEDU.org
Robot Localisation: An Introduction
● Speaker: Luis Contreras | Tamagawa University
● Host: Jeffrey Tan | @HomeEDU
● Date and Time:
○ June 09, 2020 (Tue) 09:00~11:00 (GMT+8 China/Malaysia)
○ June 08, 2020 (Mon) 21:00~23:00 (EDT New York)
○ June 09, 2020 (Tue) 03:00~05:00 (CEST Italy/France)
○ Web: https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series
** Privacy reminder: Video will be recorded and published online **
RoboCup@Home Education Online Classroom
RoboCup@Home Education is an educational initiative in RoboCup@Home that promotes educational
efforts to boost RoboCup@Home participation and artificial intelligence (AI)-focused service robot
development.
Under this initiative, currently there are 4 efforts in active operation:
1. RoboCup@Home Education Challenge events (national, regional, international)
2. Open Source Educational Robot Platforms for RoboCup@Home (service robotics)
3. OpenCourseWare for the learning of AI-focused service robot development
4. Outreach Programs (local workshops, international academic exchanges, etc.)
Web: https://www.robocupathomeedu.org/
FB: https://www.facebook.com/robocupathomeedu/
RoboCup@Home Education
Special Online Challenge Tracks
● Open Platform Online Classroom [EN]
● Open Platform Online Classroom [CN]
● Standard Platform Pepper 2.9 Online Classroom [EN]
● Standard Platform Pepper 2.5 Online Classroom [CN]
More details:
https://www.robocupathomeedu.org/learn/online-classroom
Invited Lecture Series
● Robotics Development with MATLAB [EN]
● Robot Localisation: An Introduction [EN]
● World Representation Through Artificial Neural Networks: An Introduction [EN]
● ROS with AI [TH]
Regular Online Classroom Tracks
● Introduction to Service Robotics [EN]
○ 6 weeks
○ ROS, Python
○ Speech, Vision, Navigation, Arm
RoboCup@Home Education Online Classroom
Luis Contreras | Tamagawa University
tamagawa.jp
Robot Localisation: An Introduction
Luis Angel Contreras-Toledo, PhD
Advanced Intelligence and Robotics Research Center
Tamagawa University
https://aibot.jp/
2020
Content
• Robot Localisation: An Introduction
• An introduction to Robot Vision

Motivation
[Figure: ways to represent the environment — a metric map, a topologic map, a probabilistic map, and a symbolic map (labels: wall, floor)]

Motivation
[Figure: a global path planned over the map, and a local path]
Localisation
[Figure sequence: a robot at state s_0 in a map m receives a motion command u_1 and moves to s_1 = f(s_0, u_1); since the command is noisy, u_1 ~ N(μ, σ), the new state is only known as the distribution p(s_1 | s_0, u_1); a sensor reading z_1 ~ N(μ, σ) refines this to p(s_1 | u_1, z_1); after a second command u_2 and reading z_2 the belief is p(s_2 | u_1, u_2, z_1, z_2)]

Localisation
Given a map m, with u_i ~ N(μ, σ) and z_i ~ N(μ, σ), at time T we have
  S_T = {s_0, s_1, s_2, …, s_T}
  U_T = {u_1, u_2, u_3, …, u_T}
  Z_T = {z_1, z_2, z_3, …, z_T}
The localisation problem is then defined as
  p(S_T | U_T, Z_T, m)
Error model
[Figure: robot pose in a world frame with origin o and axes x, y]
The robot state is
  s_t = (x_t, y_t, θ_t)ᵀ
Error model
Given a motion command u_{t+1} = (d, α), where d is the travelled distance and α the rotation, the ideal motion update is
  θ_{t+1} = θ_t + α
  x_{t+1} = x_t + d cos θ_{t+1}
  y_{t+1} = y_t + d sin θ_{t+1}

Error model
With actuator noise the executed command is u_{t+1} = (d + ε, α + φ), so
  θ_{t+1} ≈ θ_t + α + φ
  x_{t+1} ≈ x_t + (d + ε) cos θ_{t+1}
  y_{t+1} ≈ y_t + (d + ε) sin θ_{t+1}
Error model
[Figure: the original position of the robot, and the distribution of positions after several trials]

Error model
[Figure: the spread of positions after 10 steps]

Error model
[Figure: position error and orientation error over a START→GOAL run; each error is distributed around 0, with standard deviations σ_ε and σ_φ respectively]
Error model
Pose (i.e. position and orientation) error:
  u_{t+1} = (d + ε, α + φ)
where
  ε = 0 + σ_ε · randn(1,1), a random Gaussian number with μ = 0 and σ = σ_ε
  φ = 0 + σ_φ · randn(1,1), a random Gaussian number with μ = 0 and σ = σ_φ
so that
  θ_{t+1} = θ_t + α + φ
  x_{t+1} = x_t + (d + ε) cos θ_{t+1}
  y_{t+1} = y_t + (d + ε) sin θ_{t+1}
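The noisy pose update above translates almost line-for-line into Python; NumPy's `standard_normal` plays the role of MATLAB's `randn`, and the function name and noise values are our own illustration:

```python
import numpy as np

def motion_update(s, u, sigma_eps=0.05, sigma_phi=0.02, rng=None):
    """Propagate a pose s = (x, y, theta) through a noisy command u = (d, alpha)."""
    if rng is None:
        rng = np.random.default_rng()
    x, y, theta = s
    d, alpha = u
    eps = sigma_eps * rng.standard_normal()  # distance noise ~ N(0, sigma_eps)
    phi = sigma_phi * rng.standard_normal()  # rotation noise ~ N(0, sigma_phi)
    theta_new = theta + alpha + phi
    return np.array([x + (d + eps) * np.cos(theta_new),
                     y + (d + eps) * np.sin(theta_new),
                     theta_new])
```

With σ_ε = σ_φ = 0 this reduces to the ideal model; calling it repeatedly from the same start pose reproduces the growing spread of positions illustrated in the trials above.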
Error model
Sensor error: the error of the reported distance can be modelled as a probability function, e.g. a Gaussian distribution. Given a reading z = r and a distance d to the obstacle, let x = |r − d|; then
  P(x) = (1 / √(2π σ_z²)) · exp(−x² / (2 σ_z²))
[Figure: a range sensor reading z against an obstacle at distance d; the error is centred at 0 with standard deviation σ_z]
Error model
Sensor error: for a number of readings z = {r_1, r_2, …, r_n},
  σ_z = √( Σ_{i=1..n} (r_i − r̄)² / (n − 1) )
where r̄ is the mean reading.
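This is simply the sample standard deviation of the readings; a quick check with NumPy (the readings below are made-up values):

```python
import numpy as np

readings = np.array([2.02, 1.98, 2.05, 1.95, 2.00])  # hypothetical range readings (m)
r_bar = readings.mean()
sigma_z = np.sqrt(np.sum((readings - r_bar) ** 2) / (len(readings) - 1))
# The formula matches NumPy's sample standard deviation (ddof=1):
assert np.isclose(sigma_z, readings.std(ddof=1))
```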
A probabilistic robot
[Figures from S. Thrun et al., "Probabilistic Robotics", MIT Press, 2005]
The belief starts as a uniform distribution. After one measurement, uncertainty is centred around the possible locations. After moving to the right, the uncertainty is propagated along the motion; after a further measurement it reduces again, and so on.
The weighted particle representation
Each particle in map m is a pair (s_t, w_t), where s_t = (x_t, y_t, θ_t)ᵀ.
[Figure: a particle at pose (x_t, y_t, θ_t) in map m; a simulated sensor ray z_t hits the wall segment between corners c_1 and c_2 at distance d_1]
Key concepts
Bayes Formula
  P(s_i | z) = P(z | s_i) P(s_i) / P(z) = (likelihood · prior) / evidence
Key concepts
Probability P(S = si) = P(si) that random variable S takes on value si .
Prior (probability distribution) P(si) models uncertainty before new
data is collected.
Likelihood P(z | si) that sensor measurement takes on value z given
that the robot is at pose si .
Posterior (probability distribution) P(si | z) expresses uncertainty
after measurement.
Key concepts
Bayes Formula
Suppose a robot can detect whether a door is open. If it gets a measurement z = d ± σ_z, what is P(open | z)?
[Figure: robot measuring a distance z ≈ d to the door, with sensor error model σ_z]
Key concepts
Bayes Formula
P(open | z) is diagnostic; P(z | open) is causal (it can be obtained by counting frequencies). Often, causal knowledge is easier to obtain.
  P(open | z) = P(z | open) P(open) / P(z)
Key concepts
Example
  P(z | open) = 0.6    P(z | ¬open) = 0.3
  P(open) = P(¬open) = 0.5
By Bayes' rule, expanding the evidence with the law of total probability:
  P(open | z) = P(z | open) P(open) / [P(z | open) P(open) + P(z | ¬open) P(¬open)]
              = (0.6 · 0.5) / (0.6 · 0.5 + 0.3 · 0.5)
              ≈ 0.67
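The door example can be checked in a couple of lines of Python (the function name is ours):

```python
def posterior_open(p_z_open, p_z_not_open, p_open=0.5):
    """P(open | z) by Bayes' rule, with the evidence expanded by total probability."""
    evidence = p_z_open * p_open + p_z_not_open * (1.0 - p_open)
    return p_z_open * p_open / evidence

print(round(posterior_open(0.6, 0.3), 2))  # → 0.67
```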
The weighted particle representation
[Figure: a particle at pose (x_t, y_t, θ_t) in map m; the simulated ray at heading θ_t hits the wall segment between corners c_1 and c_2 at distance d_1, giving the expected range r_t]
Each particle is a pair (s_t, w_t), where s_t = (x_t, y_t, θ_t)ᵀ. For a wall segment with endpoints c_1 and c_2, the expected range is
  r_t = [ (c_{1,y} − c_{2,y})(c_{2,x} − x_t) − (c_{1,x} − c_{2,x})(c_{2,y} − y_t) ] / [ (c_{1,y} − c_{2,y}) cos θ_t − (c_{1,x} − c_{2,x}) sin θ_t ]
The weighted particle representation
Get the likelihood of z given the ground-truth range r_t at state s_t:
  P(z | s_t) ∝ (1 / √(2π σ_z²)) · exp(−(z − r_t)² / (2 σ_z²))
The weight of a particle might be calculated as
  w ∝ P(z | r)
To avoid some particles disappearing too quickly, we can add a damping factor k:
  w ∝ P(z | r) + k
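A minimal weight function following this model; σ_z and the damping constant k are tuning parameters, and the values below are illustrative:

```python
import numpy as np

def particle_weight(z, r_t, sigma_z=0.1, k=1e-3):
    """Unnormalised weight: Gaussian likelihood of the real reading z given the
    particle's expected reading r_t, plus a damping term k so the weight never
    collapses to exactly zero."""
    likelihood = (np.exp(-(z - r_t) ** 2 / (2 * sigma_z ** 2))
                  / np.sqrt(2 * np.pi * sigma_z ** 2))
    return likelihood + k
```

A perfect match gets the highest weight, while a hopeless particle still keeps a weight of roughly k, so it survives a few more re-sampling rounds.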
Particle filter localisation
Use a particle distribution to represent the uncertainty of the robot's position and orientation (its state). Each particle is a hypothesis of the state of the robot, and its weight indicates the credibility of that hypothesis. Particle propagation after robot motion accounts for uncertainty in the actuators, while the particles' weights account for the sensor's uncertainty.

Particle filter localisation
Also known as the Monte Carlo filter, Condensation, or factored sampling, this method probabilistically estimates where the robot is. It is a Bayesian estimator, and it can also be viewed as an evolutionary algorithm, since the fittest individuals (particles) survive.
Particle filter localisation
[Figure sequence: particles spread over map m gradually concentrate around the robot's pose]
Particle filter localisation
Remember Bayes' formula:
  P(s_i | z) = P(z | s_i) P(s_i) / P(z)
Considering P(s_i) and P(z) constant for every particle,
  P(s_i | z) ∝ w_i
Normalising over all particles:
  w_i ← w_i / Σ_{j=1..N} w_j
Particle filter localisation
[Figure sequence: after repeated motion, measurement, and re-sampling steps, the particle cloud converges on the robot's pose in map m]
Particle filter localisation
Given particles (s_i, w_i), where s_i = (x_i, y_i, θ_i)ᵀ, the state estimate ŝ_t can in general be given by the weighted mean
  ŝ_t = Σ_{i=1..N} s_i w_i
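For instance, with three particles and normalised weights (all values invented for illustration):

```python
import numpy as np

particles = np.array([[1.0, 2.0, 0.1],
                      [1.2, 2.1, 0.0],
                      [0.9, 1.9, 0.2]])  # rows are (x, y, theta) hypotheses
weights = np.array([0.5, 0.3, 0.2])      # normalised particle weights

s_hat = weights @ particles              # weighted mean, s_hat = sum_i s_i * w_i
```

Note that averaging θ directly is only safe away from the ±π wrap-around; a common fix is to average (cos θ, sin θ) and recover the angle with atan2.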
Particle filter localisation
0. Spread particles uniformly in the virtual map.
1. Motion prediction: move the real robot and each particle inside the map.
2. Particle update: take a measurement with the real robot and weight the particles according to the virtual readings from each particle inside the virtual world; particles with a better match between real and virtual measurements get a higher weight.
3. Re-sampling: draw a new particle set with probability proportional to the weights, so the strongest hypotheses survive.
4. Go to Step 1; if the robot gets lost, go back to Step 0.
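Steps 0-4 can be sketched as a single update function plus a toy 1D corridor demo; all names, noise values, and the corridor geometry are our own illustration, not code from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, u, z, move, expected_reading, sigma_z=0.1, k=1e-3):
    """One cycle: motion prediction, particle update (weighting), re-sampling."""
    n = len(particles)
    # 1. Motion prediction: move each particle with the (noisy) command.
    particles = np.array([move(p, u) for p in particles])
    # 2. Particle update: weight by the match between real and virtual readings.
    virtual = np.array([expected_reading(p) for p in particles])
    w = np.exp(-(z - virtual) ** 2 / (2 * sigma_z ** 2)) + k  # damped weights
    w /= w.sum()                                              # normalise
    # 3. Re-sampling: particles survive in proportion to their weight.
    return particles[rng.choice(n, size=n, p=w)]

# Toy world: the state is a scalar x in a corridor [0, 10]; a range sensor
# reports the distance to the wall at x = 10.
move = lambda x, u: x + u + 0.05 * rng.standard_normal()
reading = lambda x: 10.0 - x

particles = rng.uniform(0.0, 10.0, size=200)  # Step 0: spread uniformly
true_x = 2.0
for _ in range(10):
    true_x += 0.5                             # the real robot moves forward 0.5
    particles = pf_step(particles, 0.5, reading(true_x), move, reading)

estimate = particles.mean()
print(f"true x = {true_x:.2f}, estimate = {estimate:.2f}")
```

After a few cycles the particle cloud collapses around the true position, mirroring the figure sequence above.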
Content
• Robot Localisation: An Introduction
• An introduction to Robot Vision
An introduction to Robot Vision
We consider robot vision a crucial skill for a service robot to meet its expectations, and therefore this tutorial introduces computer vision for robotic applications, so that new students have a clear idea of where and how to start. We first present the basic concepts of image publishers and subscribers in ROS, then apply some basic commands to introduce the students to digital image processing theory; finally, we present some RGBD and point cloud notions and applications.
Install
Go to https://gitlab.com/trcp/introvision and follow the instructions there. Basically, create a ROS workspace:
  $ cd ~
  $ mkdir -p erasers_ws/src
  $ cd erasers_ws
  $ catkin_make
and clone the repository:
  $ cd ~/erasers_ws/src
  $ git clone https://gitlab.com/trcp/introvision.git
  $ cd ..
  $ catkin_make
Installation
[Screenshots: step-by-step installation walkthrough at https://gitlab.com/trcp/introvision]
Image Publishers and Subscribers in ROS
We present a series of steps so that learners can start programming in the ROS environment while they learn the ROS concepts. The templates provided here can serve as a basic platform for the more complex lessons or projects they develop after finishing all the lessons.

A minimal image publisher:

  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Publisher pub = it.advertise("camera/image", 1);

and the corresponding subscriber with its callback:

  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Subscriber sub = it.subscribe("camera/image", 1, callback_image);

  void callback_image(const sensor_msgs::ImageConstPtr& msg){
  …
  }
RGB Image Processing with OpenCV and ROS
We treat an image as a 2D array, or matrix, where each element (also known as a pixel) has a color value. We use three color channels per element: Red, Green, and Blue. The origin of the image matrix is at the top-left corner; column values increase from left to right and row values increase from top to bottom.
RGB Image Processing with OpenCV and ROS
We introduce the students to the basic elements in an image and how to use some built-in OpenCV functions. Finally, we show them how to perform their own operations by accessing the pixel elements of the image.
Point Cloud Processing with ROS
We present an introduction to point cloud data in ROS and propose a simple task where the students track a person moving in front of an RGBD camera mounted on a mobile robot. We start by introducing what a depth image is and how to interpret it.
Point Cloud Processing with ROS
Then, we introduce some concepts on point clouds of 3D points and how to use them to perform the target task: we divide the 3D space into a series of 2D planes so that the student can select and interpret the appropriate information for the task at hand.
Summary
In this work we have provided newcomers to computer vision and robotics a short guide with a number of examples and exercises that they can use to solve the proposed tasks and extend to their own applications. Moreover, by providing a series of rosbags, they do not need a real robot to start working on robot vision. We hope this work motivates them to continue in this field.
Robot Localisation: An Introduction
Luis Angel Contreras-Toledo, PhD
Advanced Intelligence and Robotics Research Center
Tamagawa University
https://aibot.jp/
2020
Web: https://www.robocupathomeedu.org/
FB: https://www.facebook.com/robocupathomeedu/
GitHub: https://github.com/robocupathomeedu/
Online Classroom: https://www.robocupathomeedu.org/learn/online-classroom
Contact: oc@robocupathomeedu.org
RoboCup@Home Education
ONLINE CLASSROOM
Invited Lecture Series
World Representation Through Artificial Neural Networks
Speaker: Luis Contreras | Tamagawa University
Time: June 16, 2020 (Tue) 09:00~11:00 (GMT+8)
https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series
Highlights
● Artificial Neural Networks and their application to Object Recognition
● Convolutional Neural Networks
Mais conteúdo relacionado

Semelhante a Robot Localisation: An Introduction - Luis Contreras 2020.06.09 | RoboCup@Home Education

FCN-Based 6D Robotic Grasping for Arbitrary Placed Objects
FCN-Based 6D Robotic Grasping for Arbitrary Placed ObjectsFCN-Based 6D Robotic Grasping for Arbitrary Placed Objects
FCN-Based 6D Robotic Grasping for Arbitrary Placed ObjectsKusano Hitoshi
 
Introduction to computing Processing and performance.pdf
Introduction to computing Processing and performance.pdfIntroduction to computing Processing and performance.pdf
Introduction to computing Processing and performance.pdfTulasiramKandula1
 
A multi-sensor based uncut crop edge detection method for head-feeding combin...
A multi-sensor based uncut crop edge detection method for head-feeding combin...A multi-sensor based uncut crop edge detection method for head-feeding combin...
A multi-sensor based uncut crop edge detection method for head-feeding combin...Institute of Agricultural Machinery, NARO
 
Matlab and Python: Basic Operations
Matlab and Python: Basic OperationsMatlab and Python: Basic Operations
Matlab and Python: Basic OperationsWai Nwe Tun
 
Automatic selection of object recognition methods using reinforcement learning
Automatic selection of object recognition methods using reinforcement learningAutomatic selection of object recognition methods using reinforcement learning
Automatic selection of object recognition methods using reinforcement learningShunta Saito
 
Robot navigation in unknown environment with obstacle recognition using laser...
Robot navigation in unknown environment with obstacle recognition using laser...Robot navigation in unknown environment with obstacle recognition using laser...
Robot navigation in unknown environment with obstacle recognition using laser...IJECEIAES
 
Introduction to PyTorch
Introduction to PyTorchIntroduction to PyTorch
Introduction to PyTorchJun Young Park
 
EIPOMDP Poster (PDF)
EIPOMDP Poster (PDF)EIPOMDP Poster (PDF)
EIPOMDP Poster (PDF)Teddy Ni
 
Python for Ocean Science
Python for Ocean SciencePython for Ocean Science
Python for Ocean ScienceMEOPAR
 
Trajectory Planning Through Polynomial Equation
Trajectory Planning Through Polynomial EquationTrajectory Planning Through Polynomial Equation
Trajectory Planning Through Polynomial Equationgummaavinash7
 
Path Planning And Navigation
Path Planning And NavigationPath Planning And Navigation
Path Planning And Navigationguest90654fd
 
Path Planning And Navigation
Path Planning And NavigationPath Planning And Navigation
Path Planning And Navigationguest90654fd
 
One Algorithm to Rule Them All: How to Automate Statistical Computation
One Algorithm to Rule Them All: How to Automate Statistical ComputationOne Algorithm to Rule Them All: How to Automate Statistical Computation
One Algorithm to Rule Them All: How to Automate Statistical ComputationWork-Bench
 
Unit 5 Introduction to Planning and ANN.pptx
Unit 5 Introduction to Planning and ANN.pptxUnit 5 Introduction to Planning and ANN.pptx
Unit 5 Introduction to Planning and ANN.pptxDrYogeshDeshmukh1
 
The cubic root unscented kalman filter to estimate the position and orientat...
The cubic root unscented kalman filter to estimate  the position and orientat...The cubic root unscented kalman filter to estimate  the position and orientat...
The cubic root unscented kalman filter to estimate the position and orientat...IJECEIAES
 
Formato guía10
Formato guía10Formato guía10
Formato guía10litecom
 
RSC: Mining and Modeling Temporal Activity in Social Media
RSC: Mining and Modeling Temporal Activity in Social MediaRSC: Mining and Modeling Temporal Activity in Social Media
RSC: Mining and Modeling Temporal Activity in Social MediaAlceu Ferraz Costa
 

Semelhante a Robot Localisation: An Introduction - Luis Contreras 2020.06.09 | RoboCup@Home Education (20)

FCN-Based 6D Robotic Grasping for Arbitrary Placed Objects
FCN-Based 6D Robotic Grasping for Arbitrary Placed ObjectsFCN-Based 6D Robotic Grasping for Arbitrary Placed Objects
FCN-Based 6D Robotic Grasping for Arbitrary Placed Objects
 
Introduction to computing Processing and performance.pdf
Introduction to computing Processing and performance.pdfIntroduction to computing Processing and performance.pdf
Introduction to computing Processing and performance.pdf
 
Eo4301852855
Eo4301852855Eo4301852855
Eo4301852855
 
A multi-sensor based uncut crop edge detection method for head-feeding combin...
A multi-sensor based uncut crop edge detection method for head-feeding combin...A multi-sensor based uncut crop edge detection method for head-feeding combin...
A multi-sensor based uncut crop edge detection method for head-feeding combin...
 
Matlab and Python: Basic Operations
Matlab and Python: Basic OperationsMatlab and Python: Basic Operations
Matlab and Python: Basic Operations
 
Automatic selection of object recognition methods using reinforcement learning
Automatic selection of object recognition methods using reinforcement learningAutomatic selection of object recognition methods using reinforcement learning
Automatic selection of object recognition methods using reinforcement learning
 
Robot navigation in unknown environment with obstacle recognition using laser...
Robot navigation in unknown environment with obstacle recognition using laser...Robot navigation in unknown environment with obstacle recognition using laser...
Robot navigation in unknown environment with obstacle recognition using laser...
 
Introduction to PyTorch
Introduction to PyTorchIntroduction to PyTorch
Introduction to PyTorch
 
EIPOMDP Poster (PDF)
EIPOMDP Poster (PDF)EIPOMDP Poster (PDF)
EIPOMDP Poster (PDF)
 
Python for Ocean Science
Python for Ocean SciencePython for Ocean Science
Python for Ocean Science
 
MPCR_R_O_V_E_R_Final
MPCR_R_O_V_E_R_FinalMPCR_R_O_V_E_R_Final
MPCR_R_O_V_E_R_Final
 
Trajectory Planning Through Polynomial Equation
Trajectory Planning Through Polynomial EquationTrajectory Planning Through Polynomial Equation
Trajectory Planning Through Polynomial Equation
 
Path Planning And Navigation
Path Planning And NavigationPath Planning And Navigation
Path Planning And Navigation
 
Path Planning And Navigation
Path Planning And NavigationPath Planning And Navigation
Path Planning And Navigation
 
One Algorithm to Rule Them All: How to Automate Statistical Computation
One Algorithm to Rule Them All: How to Automate Statistical ComputationOne Algorithm to Rule Them All: How to Automate Statistical Computation
One Algorithm to Rule Them All: How to Automate Statistical Computation
 
Unit 5 Introduction to Planning and ANN.pptx
Unit 5 Introduction to Planning and ANN.pptxUnit 5 Introduction to Planning and ANN.pptx
Unit 5 Introduction to Planning and ANN.pptx
 
The cubic root unscented kalman filter to estimate the position and orientat...
The cubic root unscented kalman filter to estimate  the position and orientat...The cubic root unscented kalman filter to estimate  the position and orientat...
The cubic root unscented kalman filter to estimate the position and orientat...
 
4260 9235-1-pb
4260 9235-1-pb4260 9235-1-pb
4260 9235-1-pb
 
Formato guía10
Formato guía10Formato guía10
Formato guía10
 
RSC: Mining and Modeling Temporal Activity in Social Media
RSC: Mining and Modeling Temporal Activity in Social MediaRSC: Mining and Modeling Temporal Activity in Social Media
RSC: Mining and Modeling Temporal Activity in Social Media
 

Último

Work-Permit-Receiver-in-Saudi-Aramco.pptx
Work-Permit-Receiver-in-Saudi-Aramco.pptxWork-Permit-Receiver-in-Saudi-Aramco.pptx
Work-Permit-Receiver-in-Saudi-Aramco.pptxJuliansyahHarahap1
 
kiln thermal load.pptx kiln tgermal load
kiln thermal load.pptx kiln tgermal loadkiln thermal load.pptx kiln tgermal load
kiln thermal load.pptx kiln tgermal loadhamedmustafa094
 
DC MACHINE-Motoring and generation, Armature circuit equation
DC MACHINE-Motoring and generation, Armature circuit equationDC MACHINE-Motoring and generation, Armature circuit equation
DC MACHINE-Motoring and generation, Armature circuit equationBhangaleSonal
 
Online food ordering system project report.pdf
Online food ordering system project report.pdfOnline food ordering system project report.pdf
Online food ordering system project report.pdfKamal Acharya
 
notes on Evolution Of Analytic Scalability.ppt
notes on Evolution Of Analytic Scalability.pptnotes on Evolution Of Analytic Scalability.ppt
notes on Evolution Of Analytic Scalability.pptMsecMca
 
Minimum and Maximum Modes of microprocessor 8086
Minimum and Maximum Modes of microprocessor 8086Minimum and Maximum Modes of microprocessor 8086
Minimum and Maximum Modes of microprocessor 8086anil_gaur
 
Engineering Drawing focus on projection of planes
Engineering Drawing focus on projection of planesEngineering Drawing focus on projection of planes
Engineering Drawing focus on projection of planesRAJNEESHKUMAR341697
 
data_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdfdata_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdfJiananWang21
 
Thermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - VThermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - VDineshKumar4165
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxSCMS School of Architecture
 
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best ServiceTamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Servicemeghakumariji156
 
Design For Accessibility: Getting it right from the start
Design For Accessibility: Getting it right from the startDesign For Accessibility: Getting it right from the start
Design For Accessibility: Getting it right from the startQuintin Balsdon
 
2016EF22_0 solar project report rooftop projects
2016EF22_0 solar project report rooftop projects2016EF22_0 solar project report rooftop projects
2016EF22_0 solar project report rooftop projectssmsksolar
 
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...Arindam Chakraborty, Ph.D., P.E. (CA, TX)
 
HAND TOOLS USED AT ELECTRONICS WORK PRESENTED BY KOUSTAV SARKAR
HAND TOOLS USED AT ELECTRONICS WORK PRESENTED BY KOUSTAV SARKARHAND TOOLS USED AT ELECTRONICS WORK PRESENTED BY KOUSTAV SARKAR
HAND TOOLS USED AT ELECTRONICS WORK PRESENTED BY KOUSTAV SARKARKOUSTAV SARKAR
 
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...
scipt v1.pptxcxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx...HenryBriggs2
 
Online electricity billing project report..pdf
Online electricity billing project report..pdfOnline electricity billing project report..pdf
Online electricity billing project report..pdfKamal Acharya
 
"Lesotho Leaps Forward: A Chronicle of Transformative Developments"
"Lesotho Leaps Forward: A Chronicle of Transformative Developments""Lesotho Leaps Forward: A Chronicle of Transformative Developments"
"Lesotho Leaps Forward: A Chronicle of Transformative Developments"mphochane1998
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayEpec Engineered Technologies
 

Último (20)

Work-Permit-Receiver-in-Saudi-Aramco.pptx
Work-Permit-Receiver-in-Saudi-Aramco.pptxWork-Permit-Receiver-in-Saudi-Aramco.pptx
Work-Permit-Receiver-in-Saudi-Aramco.pptx
 
kiln thermal load.pptx kiln tgermal load
kiln thermal load.pptx kiln tgermal loadkiln thermal load.pptx kiln tgermal load
kiln thermal load.pptx kiln tgermal load
 
DC MACHINE-Motoring and generation, Armature circuit equation
DC MACHINE-Motoring and generation, Armature circuit equationDC MACHINE-Motoring and generation, Armature circuit equation
DC MACHINE-Motoring and generation, Armature circuit equation
 
Online food ordering system project report.pdf
Online food ordering system project report.pdfOnline food ordering system project report.pdf
Online food ordering system project report.pdf
 
notes on Evolution Of Analytic Scalability.ppt
notes on Evolution Of Analytic Scalability.pptnotes on Evolution Of Analytic Scalability.ppt
notes on Evolution Of Analytic Scalability.ppt
 
Minimum and Maximum Modes of microprocessor 8086
Minimum and Maximum Modes of microprocessor 8086Minimum and Maximum Modes of microprocessor 8086
Minimum and Maximum Modes of microprocessor 8086
 
Engineering Drawing focus on projection of planes
Engineering Drawing focus on projection of planesEngineering Drawing focus on projection of planes
Engineering Drawing focus on projection of planes
 
FEA Based Level 3 Assessment of Deformed Tanks with Fluid Induced Loads
FEA Based Level 3 Assessment of Deformed Tanks with Fluid Induced LoadsFEA Based Level 3 Assessment of Deformed Tanks with Fluid Induced Loads
FEA Based Level 3 Assessment of Deformed Tanks with Fluid Induced Loads
 
data_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdfdata_management_and _data_science_cheat_sheet.pdf
data_management_and _data_science_cheat_sheet.pdf
 
Thermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - VThermal Engineering-R & A / C - unit - V
Thermal Engineering-R & A / C - unit - V
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
 
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best ServiceTamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service

Robot Localisation: An Introduction - Luis Contreras 2020.06.09 | RoboCup@Home Education

  • 1. Robot Localisation: An Introduction Speaker: Luis Contreras | Tamagawa University Time: June 09, 2020 (Tue) 09:00~11:00 (GMT+8) https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series RoboCup@Home Education ONLINE CLASSROOM Invited Lecture Series Highlights ● Probabilistic robot localisation ● Probabilistic models for robot motion and particle filters Luis Contreras received his Ph.D. in Computer Science at the Visual Information Laboratory, in the Department of Computer Vision, University of Bristol, UK. Currently, he is a research fellow at the Advanced Intelligence & Robotics Research Center, Tamagawa University, Japan. He has also been an active member of the Bio-robotics Laboratory at the Faculty of Engineering, National Autonomous University of Mexico, Mexico. He has been working on service robots and has tested his latest results at the RoboCup and similar robot competitions for the last ten years.
  • 2. RoboCup@Home Education | www.RoboCupatHomeEDU.org Robot Localisation: An Introduction ● Speaker: Luis Contreras | Tamagawa University ● Host: Jeffrey Tan | @HomeEDU ● Date and Time: ○ June 09, 2020 (Tue) 09:00~11:00 (GMT+8 China/Malaysia) ○ June 08, 2020 (Mon) 21:00~23:00 (EDT New York) ○ June 08, 2020 (Mon) 03:00~05:00 (CEST Italy/France) ○ Web: https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series ** Privacy reminder: Video will be recorded and published online ** RoboCup@Home Education Online Classroom 2
  • 3. RoboCup@Home Education | www.RoboCupatHomeEDU.org RoboCup@Home Education is an educational initiative in RoboCup@Home that promotes educational efforts to boost RoboCup@Home participation and artificial intelligence (AI)-focused service robot development. Under this initiative, currently there are 4 efforts in active operation: 1. RoboCup@Home Education Challenge events (national, regional, international) 2. Open Source Educational Robot Platforms for RoboCup@Home (service robotics) 3. OpenCourseWare for the learning of AI-focused service robot development 4. Outreach Programs (local workshops, international academic exchanges, etc.) Web: https://www.robocupathomeedu.org/ FB: https://www.facebook.com/robocupathomeedu/ RoboCup@Home Education 3
  • 4. RoboCup@Home Education | www.RoboCupatHomeEDU.org Special Online Challenge Tracks ● Open Platform Online Classroom [EN] ● Open Platform Online Classroom [CN] ● Standard Platform Pepper 2.9 Online Classroom [EN] ● Standard Platform Pepper 2.5 Online Classroom [CN] More details: https://www.robocupathomeedu.org/learn/online -classroom Invited Lecture Series ● Robotics Development with MATLAB [EN] ● Robot Localisation: An Introduction [EN] ● World Representation Through Artificial Neural Networks: An Introduction [EN] ● ROS with AI [TH] Regular Online Classroom Tracks ● Introduction to Service Robotics [EN] ○ 6 weeks ○ ROS, Python ○ Speech, Vision, Navigation, Arm RoboCup@Home Education Online Classroom 4
  • 5. RoboCup@Home Education | www.RoboCupatHomeEDU.org Luis Contreras | Tamagawa University 5 Luis Contreras received his Ph.D. in Computer Science at the Visual Information Laboratory, in the Department of Computer Vision, University of Bristol, UK. Currently, he is a research fellow at the Advanced Intelligence & Robotics Research Center, Tamagawa University, Japan. He has also been an active member of the Bio-robotics Laboratory at the Faculty of Engineering, National Autonomous University of Mexico, Mexico. He has been working on service robots and has tested his latest results at the RoboCup and similar robot competitions for the last ten years.
  • 6. tamagawa.jp Robot Localisation: An Introduction Luis Angel Contreras-Toledo, PhD Advanced Intelligence and Robotics Research Center Tamagawa University https://aibot.jp/ 2020
  • 7. tamagawa.jp Content • Robot Localisation: An Introduction • An introduction to Robot Vision
  • 8. tamagawa.jp Motivation Metric map Topological map Probabilistic map Symbolic map Wall Floor
  • 18. tamagawa.jp Localisation Given a map m, with u_i ~ N(μ, σ) and z_i ~ N(μ, σ), at time T we have S_T = {s_0, s_1, s_2, …, s_T}, U_T = {u_1, u_2, u_3, …, u_T}, Z_T = {z_1, z_2, z_3, …, z_T}. The localisation problem is then defined as p(S_T | U_T, Z_T, m)
  • 19. tamagawa.jp Error model o x y s_t = (x_t, y_t, θ_t)ᵀ
  • 20. tamagawa.jp Error model o x y u_{t+1} = (d, α): s_{t+1} = (x_{t+1}, y_{t+1}, θ_{t+1})ᵀ = (x_t + d cos θ_{t+1}, y_t + d sin θ_{t+1}, θ_t + α)ᵀ
  • 21. tamagawa.jp Error model o x y u_{t+1} = (d, α): the rotation α sets the new heading θ_{t+1} = θ_t + α, and the translation d is then applied along that heading: s_{t+1} = (x_t + d cos θ_{t+1}, y_t + d sin θ_{t+1}, θ_t + α)ᵀ
  • 22. tamagawa.jp Error model o x y u_{t+1} = (d + ε, α + φ): s_{t+1} = (x_{t+1}, y_{t+1}, θ_{t+1})ᵀ ≈ (x_t + (d + ε) cos θ_{t+1}, y_t + (d + ε) sin θ_{t+1}, θ_t + α + φ)ᵀ
  • 23. tamagawa.jp Error model (figure): original position of the robot, and the distribution of positions after several trials
  • 24. tamagawa.jp Error model (figure): the spread of the position distribution after 10 steps
  • 25. tamagawa.jp Error model (figures): position error, a Gaussian with mean 0 and standard deviation σ_ε, and orientation error, a Gaussian with mean 0 and standard deviation σ_φ, along a START→GOAL path
  • 26. tamagawa.jp Error model Pose (i.e. position and orientation) error: u_{t+1} = (d + ε, α + φ), where ε = 0 + σ_ε · randn(1,1) is a random Gaussian number with μ = 0 and σ = σ_ε, and φ = 0 + σ_φ · randn(1,1) is a random Gaussian number with μ = 0 and σ = σ_φ; then s_{t+1} = (x_{t+1}, y_{t+1}, θ_{t+1})ᵀ = (x_t + (d + ε) cos θ_{t+1}, y_t + (d + ε) sin θ_{t+1}, θ_t + α + φ)ᵀ
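The motion update above translates directly into code. Below is a minimal sketch in Python (rather than the slide's MATLAB-style `randn` notation); the σ_ε and σ_φ values are illustrative assumptions, not from the slides:

```python
import math
import random

# Noisy motion model: u = (d, alpha) with errors eps ~ N(0, sigma_eps) on the
# distance and phi ~ N(0, sigma_phi) on the turn. Sigma values are assumptions.
SIGMA_EPS = 0.05   # distance noise, metres
SIGMA_PHI = 0.02   # heading noise, radians

def motion_update(state, u):
    """Apply one noisy motion command u = (d, alpha) to state s = (x, y, theta)."""
    x, y, theta = state
    d, alpha = u
    eps = random.gauss(0.0, SIGMA_EPS)
    phi = random.gauss(0.0, SIGMA_PHI)
    theta_new = theta + alpha + phi              # turn first (with noise)...
    x_new = x + (d + eps) * math.cos(theta_new)  # ...then translate along the
    y_new = y + (d + eps) * math.sin(theta_new)  # new heading (with noise)
    return (x_new, y_new, theta_new)
```

Note that θ_{t+1} appears inside the cosine and sine, matching the slides: the heading is updated before the translation is applied.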
  • 27. tamagawa.jp Error model Sensor error z: the error in the reported distance can be modelled as a probability function, e.g. a Gaussian distribution. Given a reading z = r and a distance to the obstacle d, with x = |r − d| we have P(x) = (1 / √(2π σ_z²)) e^(−x² / (2 σ_z²))
  • 28. tamagawa.jp Error model Sensor error z: where, for a number of readings z = {r_1, r_2, …, r_n}, σ_z = √( Σ_{i=1}^{n} (r_i − r̄)² / (n − 1) )
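As a sketch, the sample standard deviation σ_z and the Gaussian error model from the last two slides might be computed as follows (the function names are our own):

```python
import math

def sample_std(readings):
    """Unbiased sample standard deviation sigma_z of a set of range readings."""
    n = len(readings)
    mean = sum(readings) / n
    return math.sqrt(sum((r - mean) ** 2 for r in readings) / (n - 1))

def gaussian_likelihood(z, d, sigma_z):
    """P(x) for x = z - d: Gaussian density of the reading error."""
    x = z - d
    return math.exp(-x * x / (2.0 * sigma_z ** 2)) / math.sqrt(2.0 * math.pi * sigma_z ** 2)
```

In practice σ_z would be calibrated once, from repeated readings of a known distance, and then reused in the likelihood.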
  • 29. tamagawa.jp A probabilistic robot Uniform distribution After one measurement, uncertainty is centred around possible locations Images from S. Thrun et al. “Probabilistic Robotics”. MIT Press, 2005.
  • 30. tamagawa.jp After moving to the right, uncertainty is propagated in the direction of motion. After a further measurement, uncertainty reduces. And so on...
  • 31. tamagawa.jp The weighted particle representation map m: each particle is a pair (s_t, w_t), where s_t = (x_t, y_t, θ_t)ᵀ; from pose s_t a virtual reading z_t is cast along heading θ_t towards the wall with corners c_1 and c_2, at distance d_1 (figure)
  • 32. tamagawa.jp Key concepts Bayes Formula P(s_i | z) = P(z | s_i) P(s_i) / P(z) = likelihood · prior / evidence
  • 33. tamagawa.jp Key concepts Probability P(S = si) = P(si) that random variable S takes on value si . Prior (probability distribution) P(si) models uncertainty before new data is collected. Likelihood P(z | si) that sensor measurement takes on value z given that the robot is at pose si . Posterior (probability distribution) P(si | z) expresses uncertainty after measurement.
  • 34. tamagawa.jp Key concepts Bayes Formula Suppose a robot can detect whether a door is open. If it gets measurement z = d ± σ_z, what is P(open | z)? (Error model: reading z around true distance d with spread σ_z)
  • 35. tamagawa.jp Key concepts Bayes Formula P(open|z) is diagnostic. P(z|open) is causal (it counts frequency). *Often, causal knowledge is easier to obtain. 𝑃 open 𝑧 = 𝑃 𝑧 open 𝑃(open) 𝑃(𝑧)
  • 36. tamagawa.jp Key concepts Example 𝑃 𝑧 open = 0.6 𝑃 𝑧 ¬open = 0.3 𝑃 open = P ¬open = 0.5 𝑃 open 𝑧 = 𝑃 𝑧 open 𝑃(open) 𝑃(𝑧) = 𝑃 𝑧 open 𝑃(open) 𝑃(𝑧|open)𝑃(open) + 𝑃(𝑧|¬open)𝑃(¬open)
  • 37. tamagawa.jp Key concepts Example 𝑃 𝑧 open = 0.6 𝑃 𝑧 ¬open = 0.3 𝑃 open = P ¬open = 0.5 𝑃 open 𝑧 = 0.6 ∙ 0.5 0.6 ∙ 0.5 + 0.3 ∙ 0.5 𝑃 open 𝑧 = 0.67
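The arithmetic of the door example can be checked in a few lines (the helper name is hypothetical):

```python
def posterior_open(p_z_open, p_z_not_open, p_open):
    """Bayes' rule with the evidence expanded by total probability."""
    p_not_open = 1.0 - p_open
    evidence = p_z_open * p_open + p_z_not_open * p_not_open
    return p_z_open * p_open / evidence

p = posterior_open(0.6, 0.3, 0.5)   # (0.6 * 0.5) / (0.6 * 0.5 + 0.3 * 0.5) ≈ 0.67
```

The same expansion of the evidence P(z) is what makes the particle weights comparable across hypotheses later on.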
  • 38. tamagawa.jp The weighted particle representation map m: given (s_t, w_t) with s_t = (x_t, y_t, θ_t)ᵀ, the expected range r_t along heading θ_t to the wall line through corners c_1 and c_2 is r_t = [ (c_{1,y} − c_{2,y})(c_{2,x} − x_t) − (c_{1,x} − c_{2,x})(c_{2,y} − y_t) ] / [ (c_{1,y} − c_{2,y}) cos θ − (c_{1,x} − c_{2,x}) sin θ ]
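The expected-range expression can be sketched as a function. Note it intersects the ray with the infinite line through c_1 and c_2, as on the slide; segment bounds and parallel rays (where the denominator vanishes) are not handled here:

```python
import math

def expected_range(state, c1, c2):
    """Range r_t from pose state = (x, y, theta), along heading theta, to the
    infinite line through wall corners c1 and c2 (the slide's expression)."""
    x, y, theta = state
    num = (c1[1] - c2[1]) * (c2[0] - x) - (c1[0] - c2[0]) * (c2[1] - y)
    den = (c1[1] - c2[1]) * math.cos(theta) - (c1[0] - c2[0]) * math.sin(theta)
    return num / den

# A robot at the origin facing +x, 5 m from a vertical wall:
r = expected_range((0.0, 0.0, 0.0), (5.0, 0.0), (5.0, 1.0))   # 5.0
```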
  • 39. tamagawa.jp The weighted particle representation Get the likelihood of z given ground truth r_t at state s_t: P(z | s_t) ∝ (1 / √(2π σ_z²)) e^(−(z − r_t)² / (2 σ_z²)). The weight of a particle might be calculated as w ∝ P(z | r). To avoid some particles disappearing too quickly, we can add a damping factor: w ∝ P(z | r) + k
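A sketch of the resulting weight, with the damping term k (the default value is an illustrative assumption):

```python
import math

def particle_weight(z, r_t, sigma_z, k=1e-3):
    """w ∝ P(z | r_t) + k: Gaussian likelihood of the real reading z given the
    particle's expected range r_t, plus a damping term k so no particle's
    weight collapses to zero (the default k is an illustrative assumption)."""
    likelihood = math.exp(-(z - r_t) ** 2 / (2.0 * sigma_z ** 2)) \
        / math.sqrt(2.0 * math.pi * sigma_z ** 2)
    return likelihood + k
```

Without k, a particle whose expected range is many σ_z away from the reading would get a numerically zero weight and could never be resampled, even if the reading itself was an outlier.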
  • 40. tamagawa.jp Particle filter localisation Use a particle distribution to represent the uncertainty of the robot's position and orientation (state). Each particle is a hypothesis of the state of the robot, and its weight indicates the credibility of that hypothesis. Particle propagation after robot motion accounts for uncertainty in the actuators, while the particles' weights account for the sensor's uncertainty.
  • 41. tamagawa.jp Particle filter localisation Also known as Monte Carlo localisation, Condensation, or factored sampling, this method probabilistically estimates where the robot is. It is a Bayesian estimator. It can also be considered an evolutionary algorithm, since the fittest individuals (particles) survive.
  • 46. tamagawa.jp Particle filter localisation Remember Bayes' formula: P(s_i | z) = P(z | s_i) P(s_i) / P(z). Considering P(s_i) and P(z) constant for every particle, P(s_i | z) ∝ w_i. Normalising over all particles: w_i := w_i / Σ_{j=1}^{N} w_j
  • 53. tamagawa.jp Particle filter localisation map m Given (s_i, w_i), where s_i = (x_i, y_i, θ_i)ᵀ, the state estimate ŝ_t can in general be given by ŝ_t = Σ_{i=1}^{N} s_i w_i
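Weight normalisation and the weighted-mean pose estimate might be sketched as follows (averaging θ this way is only safe away from the ±π wrap-around):

```python
def normalise(weights):
    """w_i := w_i / sum_j w_j, so the weights form a probability distribution."""
    total = sum(weights)
    return [w / total for w in weights]

def pose_estimate(states, weights):
    """Weighted mean of particle states s_i = (x, y, theta), per the slide's sum."""
    x = sum(w * s[0] for s, w in zip(states, weights))
    y = sum(w * s[1] for s, w in zip(states, weights))
    theta = sum(w * s[2] for s, w in zip(states, weights))
    return (x, y, theta)
```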
  • 54. tamagawa.jp Particle filter localisation 0. Spread particles uniformly in the virtual map. 1. Motion prediction: move the real robot and each particle inside the map. 2. Particle update: take a measurement with the real robot and weight each particle according to how well the virtual reading from its pose in the virtual world matches it. 3. Re-sampling: draw a new particle set with probability proportional to the weights, so particles with a better match between real and virtual measurements are more likely to survive. 4. Go to Step 1, unless the robot is lost; in that case go to Step 0.
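The loop above can be sketched end-to-end in a toy 1-D corridor: one wall, a forward-facing range sensor, and particles over the robot's x position. The map, noise values, and particle count are illustrative assumptions; the slides describe the general 2-D case:

```python
import math
import random

# Toy 1-D corridor: one wall at x = WALL, range sensor looking forward.
# All numeric values below are illustrative assumptions.
WALL = 10.0
SIGMA_D = 0.1     # motion noise
SIGMA_Z = 0.2     # sensor noise
K = 1e-6          # damping factor
N = 500           # number of particles

def pf_step(particles, d, z):
    """One predict / update / re-sample cycle for a move d and a reading z."""
    # 1. Motion prediction: move every particle with actuator noise.
    particles = [x + d + random.gauss(0.0, SIGMA_D) for x in particles]
    # 2. Particle update: weight by the likelihood of the real reading z.
    weights = [math.exp(-(z - (WALL - x)) ** 2 / (2.0 * SIGMA_Z ** 2)) + K
               for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Re-sampling: draw particles in proportion to their weights.
    return random.choices(particles, weights=weights, k=N)

# 0. Spread particles uniformly; the real robot starts at x = 2 and moves 1 m per step.
random.seed(0)
particles = [random.uniform(0.0, WALL) for _ in range(N)]
true_x = 2.0
for _ in range(3):
    true_x += 1.0
    z = WALL - true_x + random.gauss(0.0, SIGMA_Z)
    particles = pf_step(particles, 1.0, z)
estimate = sum(particles) / N   # equal weights after re-sampling
```

Re-sampling with `random.choices` implements step 3; after it, all particles carry equal weight, so the pose estimate reduces to the plain mean.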
  • 56. tamagawa.jp Content • Robot Localisation: An Introduction • An introduction to Robot Vision
  • 57. tamagawa.jp An introduction to Robot Vision We consider robot vision a crucial skill for a service robot to meet its expectations, and therefore in this tutorial we present computer vision for robotic applications, so new students have a clear idea of where and how to start. We first present the basic concepts of image publishers and subscribers in ROS, then apply some basic commands to introduce the students to digital image processing theory; finally, we present some RGBD and point cloud notions and applications.
  • 58. tamagawa.jp Install You should access https://gitlab.com/trcp/introvision and follow the instructions there. Basically, create a ROS workspace: $ cd ~ $ mkdir -p erasers_ws/src $ cd erasers_ws $ catkin_make and clone the repository: $ cd ~/erasers_ws/src $ git clone https://gitlab.com/trcp/introvision.git $ cd .. $ catkin_make
  • 59–65. tamagawa.jp Installation (screenshots): https://gitlab.com/trcp/introvision
  • 66. tamagawa.jp Image Publishers and Subscribers in ROS We present a series of steps so learners can start programming in the ROS environment while they learn the ROS concepts. The templates provided here can serve as a basic platform for the more complex lessons or projects they develop after finishing all the lessons. ros::NodeHandle nh; image_transport::ImageTransport it(nh); image_transport::Publisher pub = it.advertise("camera/image", 1); ros::NodeHandle nh; image_transport::ImageTransport it(nh); image_transport::Subscriber sub = it.subscribe("camera/image", 1, callback_image); void callback_image(const sensor_msgs::ImageConstPtr& msg){ … }
  • 67. tamagawa.jp RGB Image Processing with OpenCV and ROS We understand an image as a 2D array, or matrix, where each element (also known as a pixel) has a color value. We use three color channels per element in the array: Red, Green, and Blue. The origin of this image matrix is at the top-left corner; column values increase positively from left to right, while row values increase positively from top to bottom.
  • 68. tamagawa.jp RGB Image Processing with OpenCV and ROS We introduce the students to the basic elements in an image and how to apply some built-in OpenCV functions. Finally, we show them how to perform their own operations by accessing the pixel elements of the image.
  • 69. tamagawa.jp Point Cloud processing with ROS We present an introduction to Point Cloud data in ROS and propose a simple task where the students track a person moving in front of an RGBD camera mounted on a mobile robot. We start by introducing what a depth image is and how to interpret it.
  • 70. tamagawa.jp Point Cloud processing with ROS Then, we introduce some concepts of point clouds (sets of 3D points) and how to use them to perform the target task, where we divide the 3D space into a series of 2D planes so the student can interpret and select the appropriate information for the task at hand.
  • 71. tamagawa.jp Summary In this work we have provided newcomers to computer vision and robotics with a short guide and a number of examples and exercises that they can use to solve the proposed tasks and extend to their own applications. Moreover, by providing a series of rosbags, they do not need a real robot to start thinking about robot vision. We hope this work motivates them to continue in this field.
  • 72. tamagawa.jp Robot Localisation: An Introduction Luis Angel Contreras-Toledo, PhD Advanced Intelligence and Robotics Research Center Tamagawa University https://aibot.jp/ 2020
  • 73. Web: https://www.robocupathomeedu.org/ FB: https://www.facebook.com/robocupathomeedu/ GitHub: https://github.com/robocupathomeedu/ Online Classroom: https://www.robocupathomeedu.org/learn/online-classroom Contact: oc@robocupathomeedu.org RoboCup@Home Education ONLINE CLASSROOM Invited Lecture Series
  • 74. RoboCup@Home Education ONLINE CLASSROOM Invited Lecture Series Luis Contreras received his Ph.D. in Computer Science at the Visual Information Laboratory, in the Department of Computer Vision, University of Bristol, UK. Currently, he is a research fellow at the Advanced Intelligence & Robotics Research Center, Tamagawa University, Japan. He has also been an active member of the Bio-robotics Laboratory at the Faculty of Engineering, National Autonomous University of Mexico, Mexico. He has been working on service robots and has tested his latest results at the RoboCup and similar robot competitions for the last ten years. World Representation Through Artificial Neural Networks Speaker: Luis Contreras | Tamagawa University Time: June 16, 2020 (Tue) 09:00~11:00 (GMT+8) https://www.robocupathomeedu.org/learn/online-classroom/invited-lecture-series Highlights ● Artificial Neural Networks and their application to Object Recognition ● Convolutional Neural Networks