CHAPTER 2:
INTRODUCTION TO
PROBABILITY
COMPUTER VISION: MODELS, LEARNING AND
INFERENCE


Lukas Tencer
Random variables
2


        A random variable x denotes a quantity that is
         uncertain
        It may be the result of an experiment (flipping a
         coin) or of a real-world measurement (measuring
         temperature)
        If we observe several instances of x, we get
         different values
        Some values occur more often than others; this
         information is captured by a probability
         distribution
                 Computer vision: models, learning and inference. ©2011
                                                      Simon J.D. Prince
Discrete Random Variables
3




                                                                   Ordered values
                                                                   - visualized as a histogram

                                                                   Unordered values
                                                                   - visualized as a Hinton diagram




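The link between observed instances and the histogram can be sketched in a few lines of Python; the outcome values and probabilities below are made up for illustration:

```python
import random
from collections import Counter

# Hypothetical discrete distribution over five outcomes (numbers are made up).
values = [1, 2, 3, 4, 5]
probs = [0.10, 0.30, 0.20, 0.25, 0.15]

# Observe many instances of x: different values come up on different draws.
random.seed(0)
samples = random.choices(values, weights=probs, k=100_000)

# Empirical frequencies approximate Pr(x); drawn as bars they form the histogram.
counts = Counter(samples)
empirical = {v: counts[v] / len(samples) for v in values}
for v in values:
    print(v, round(empirical[v], 3))
```

With enough samples the empirical frequencies settle close to the underlying probabilities, which is exactly the information the histogram displays.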
Continuous Random Variable
4




    The probability density function (PDF) Pr(x) of a continuous distribution
     integrates to 1:   ∫ Pr(x) dx = 1
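A quick numerical check of the "integrates to 1" property, using a standard normal density as an arbitrary example PDF (the choice of distribution is an assumption, not something fixed by the slide):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Standard normal density, used here only as a concrete example of a PDF.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Riemann-sum approximation of the integral of Pr(x) over a wide interval.
dx = 0.001
total = sum(normal_pdf(-8.0 + i * dx) * dx for i in range(int(16.0 / dx)))
print(round(total, 4))  # 1.0
```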
Joint Probability
5


       Consider two random variables x and y
       If we observe multiple paired instances, then
        some combinations of outcomes are more likely
        than others
       This is captured in the joint probability
        distribution
       Written as Pr(x,y)
       Can read Pr(x,y) as “probability of x and y”


Joint Probability
6




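A joint distribution over two discrete variables can be sketched as a simple 2D table; the probabilities below are hypothetical:

```python
# Hypothetical joint distribution Pr(x, y) for x in {0,1,2}, y in {0,1,2};
# the entries are made up, but a valid joint must sum to 1.
joint = [
    [0.10, 0.05, 0.05],  # Pr(x=0, y=0), Pr(x=0, y=1), Pr(x=0, y=2)
    [0.20, 0.10, 0.10],
    [0.05, 0.15, 0.20],
]

total = sum(sum(row) for row in joint)
print(round(total, 10))  # 1.0

# "Probability of x=1 and y=2":
print(joint[1][2])  # 0.1
```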
Marginalization
7
    We can recover the probability distribution of any variable in a joint
      distribution by integrating (or summing) over the other variables:

           Pr(x) = ∫ Pr(x,y) dy            Pr(y) = ∫ Pr(x,y) dx

           Pr(x) = Σy Pr(x,y)              Pr(y) = Σx Pr(x,y)

Marginalization
10
     We can recover the probability distribution of any variable in a joint
       distribution by integrating (or summing) over the other variables

     This works in higher dimensions as well – marginalizing leaves the
     joint distribution of whatever variables are left, e.g.

           Pr(x,y) = ∫∫ Pr(w,x,y,z) dw dz




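Marginalization over a discrete joint table is just summing rows or columns; a minimal sketch using made-up numbers:

```python
# Hypothetical joint table Pr(x, y); the numbers are made up but sum to 1.
joint = [
    [0.10, 0.05, 0.05],
    [0.20, 0.10, 0.10],
    [0.05, 0.15, 0.20],
]

# Pr(x) sums each row over y; Pr(y) sums each column over x.
pr_x = [sum(row) for row in joint]
pr_y = [sum(col) for col in zip(*joint)]

print([round(p, 2) for p in pr_x])  # [0.2, 0.4, 0.4]
print([round(p, 2) for p in pr_y])  # [0.35, 0.3, 0.35]
```

Both marginals sum to 1, as any valid distribution must.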
Conditional Probability
11


        The conditional probability of x given that y=y1 is the relative
         propensity of variable x to take different outcomes given that
         y is fixed to the value y1

        Written as Pr(x|y=y1)




Conditional Probability
12

        Conditional probability can be extracted from joint probability
        Extract the appropriate slice and normalize it (because the slice
         does not sum to 1)




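The slice-and-normalize recipe, sketched on a hypothetical joint table:

```python
# Hypothetical joint distribution Pr(x, y); numbers are made up but sum to 1.
joint = [
    [0.10, 0.05, 0.05],
    [0.20, 0.10, 0.10],
    [0.05, 0.15, 0.20],
]

# Condition on y = 1: take the y = 1 column of the table (the "slice") ...
y1 = 1
slice_xy1 = [row[y1] for row in joint]  # Pr(x, y=1); does not sum to 1
pr_y1 = sum(slice_xy1)                  # Pr(y=1), the normalizing constant

# ... and divide by Pr(y=1) to get Pr(x | y=1), which does sum to 1.
cond = [p / pr_y1 for p in slice_xy1]
print([round(p, 3) for p in cond])  # [0.167, 0.333, 0.5]
```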
Conditional Probability
13




        More usually written in compact form:

             Pr(x|y) = Pr(x,y) / Pr(y)

     • Can be re-arranged to give:

             Pr(x,y) = Pr(x|y) Pr(y)

Conditional Probability
14




        This idea can be extended to more than two
         variables, e.g.

             Pr(w,x,y,z) = Pr(w|x,y,z) Pr(x|y,z) Pr(y|z) Pr(z)




Bayes’ Rule
15
     From before:     Pr(x,y) = Pr(y|x) Pr(x)
                      Pr(x,y) = Pr(x|y) Pr(y)

     Combining:       Pr(y|x) Pr(x) = Pr(x|y) Pr(y)

     Re-arranging:    Pr(y|x) = Pr(x|y) Pr(y) / Pr(x)


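Bayes' rule on a toy two-class example (the prior and likelihood values are invented for illustration):

```python
# Toy two-class example; the prior and likelihood numbers are invented.
prior = {'y0': 0.7, 'y1': 0.3}       # Pr(y): what we know about y before seeing x
likelihood = {'y0': 0.2, 'y1': 0.9}  # Pr(x=1 | y): propensity for observing x=1

# Evidence Pr(x=1): marginalize the numerator over y, making the posterior valid.
evidence = sum(likelihood[y] * prior[y] for y in prior)

# Posterior Pr(y | x=1) = Pr(x=1 | y) Pr(y) / Pr(x=1)
posterior = {y: likelihood[y] * prior[y] / evidence for y in prior}
print(round(posterior['y1'], 4))  # 0.6585: seeing x=1 raises Pr(y1) from 0.3
```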
Bayes’ Rule Terminology
16

     Likelihood – propensity for observing a certain value of x
     given a certain value of y

     Prior – what we know about y before seeing x

     Posterior – what we know about y after seeing x

     Evidence – a constant to ensure that the left-hand side is a
     valid distribution
Independence
17
        If two variables x and y are independent, then variable x
         tells us nothing about variable y (and vice versa):

             Pr(x|y) = Pr(x)          Pr(y|x) = Pr(y)
Independence
19
        When variables are independent, the joint distribution factorizes
         into a product of the marginals:

             Pr(x,y) = Pr(x|y) Pr(y) = Pr(x) Pr(y)
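A sketch of the factorization: build a joint as the outer product of two hypothetical marginals and confirm that conditioning returns the original marginal:

```python
# Two hypothetical marginals; under independence the joint is their outer product.
pr_x = [0.2, 0.5, 0.3]
pr_y = [0.4, 0.6]

joint = [[px * py for py in pr_y] for px in pr_x]  # Pr(x,y) = Pr(x) Pr(y)

# The factorized joint is still a valid distribution ...
total = sum(sum(row) for row in joint)
print(round(total, 10))  # 1.0

# ... and conditioning on y = 0 returns Pr(x) unchanged: y tells us nothing about x.
col0 = [row[0] for row in joint]
cond_x_given_y0 = [p / sum(col0) for p in col0]
print([round(p, 2) for p in cond_x_given_y0])  # [0.2, 0.5, 0.3]
```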
Expectation
20

     Expectation tells us the expected or average value of
     some function f [x], taking into account the distribution of x

     Definition:

          E[f [x]] = Σx f [x] Pr(x)          (discrete case)

          E[f [x]] = ∫ f [x] Pr(x) dx        (continuous case)
Expectation
21

     Expectation tells us the expected or average value of
     some function f [x,y], taking into account the joint distribution
     of x and y

     Definition in two dimensions:

          E[f [x,y]] = ∫∫ f [x,y] Pr(x,y) dx dy
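The discrete definition can be sketched directly; the distribution below is hypothetical:

```python
# Discrete expectation E[f[x]] = sum over x of f(x) Pr(x); distribution is made up.
values = [0, 1, 2, 3]
probs = [0.1, 0.2, 0.3, 0.4]

def expect(f):
    return sum(f(x) * p for x, p in zip(values, probs))

mean = expect(lambda x: x)        # E[x]
second = expect(lambda x: x * x)  # E[x^2]
print(round(mean, 6), round(second, 6))  # 2.0 5.0
```

The variance follows as `second - mean ** 2`, i.e. E[x^2] - E[x]^2.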
Expectation: Common Cases
22

      Mean:                        μx = E[x]
      k-th moment about zero:      E[x^k]
      Variance:                    E[(x - μx)^2]
      Covariance of x and y:       E[(x - μx)(y - μy)]
Expectation: Rules
23




     Rule 1:     E[k] = k

     The expected value of a constant is the constant




Expectation: Rules
24




     Rule 2:     E[k f [x]] = k E[f [x]]

     The expected value of a constant times a function is the constant
     times the expected value of the function




Expectation: Rules
25




     Rule 3:     E[f [x] + g [x]] = E[f [x]] + E[g [x]]

     The expectation of a sum of functions is the sum of the expectations
     of the functions




Expectation: Rules
26




     Rule 4:     E[f [x] g [y]] = E[f [x]] E[g [y]]    (x and y independent)

     The expectation of a product of functions in variables x and y
     is the product of the expectations of the functions if x and y
     are independent




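All four rules can be checked numerically on a small discrete example; the marginals are made up, and independence holds by construction because the joint is built as Pr(x) Pr(y):

```python
# Numerical check of the four expectation rules on a small discrete example.
# The marginals are made up; x and y are independent by construction because
# the joint is built as Pr(x,y) = Pr(x) Pr(y).
xs, px = [0, 1, 2], [0.2, 0.5, 0.3]
ys, py = [0, 1], [0.4, 0.6]

def E(f):
    # Expectation of f(x, y) under the independent joint distribution.
    return sum(f(x, y) * pxi * pyj
               for x, pxi in zip(xs, px)
               for y, pyj in zip(ys, py))

k = 5.0
Ex = E(lambda x, y: x)
Ey = E(lambda x, y: y)

print(abs(E(lambda x, y: k) - k) < 1e-12)              # Rule 1: E[k] = k
print(abs(E(lambda x, y: k * x) - k * Ex) < 1e-12)     # Rule 2: E[k f] = k E[f]
print(abs(E(lambda x, y: x + y) - (Ex + Ey)) < 1e-12)  # Rule 3: E[f+g] = E[f]+E[g]
print(abs(E(lambda x, y: x * y) - Ex * Ey) < 1e-12)    # Rule 4 (needs independence)
```

Rule 4 would fail in general if the joint did not factorize; the other three hold for any distribution.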
Conclusions
27



     • The rules of probability are compact and simple

     • The concepts of marginalization, joint and conditional
       probability, Bayes' rule and expectation underpin all of
       the models in this book

     • One remaining concept – conditional expectation – is
       discussed later




28
            Thank You
            for your attention


Based on:
Computer vision: models, learning and inference. ©2011 Simon J.D. Prince
http://www.computervisionmodels.com/
Additional
29


        Calculating the k-th moment:   E[x^k] = Σx x^k Pr(x)
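A minimal sketch of the k-th moment about zero, computed from the definition of expectation with f [x] = x^k (the distribution is hypothetical):

```python
# k-th moment about zero: E[x^k] = sum over x of x^k Pr(x); distribution is made up.
values = [0, 1, 2, 3]
probs = [0.1, 0.2, 0.3, 0.4]

def moment(k):
    return sum((x ** k) * p for x, p in zip(values, probs))

print(round(moment(1), 6))  # 2.0, the mean E[x]
print(round(moment(2), 6))  # 5.0; variance = moment(2) - moment(1) ** 2
```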
