A Robotic Shopping Assistant for the Blind
                      Vladimir Kulyukin and Chaitanya Gharpure
                   Computer Science Assistive Technology Laboratory
                          Department of Computer Science
                                 Utah State University
                                 Logan, UT 84322-4205

ABSTRACT

The Computer Science Assistive Technology Laboratory (CSATL) of Utah State
University (USU) is currently developing RoboCart, a robotic shopping assistant for the
blind. This paper describes a small set of initial experiments with RoboCart at Lee’s
MarketPlace, a supermarket in Logan, Utah.

KEYWORDS

Visual impairment, robot-assisted navigation, robot-assisted grocery shopping

BACKGROUND

There are 11.4 million visually impaired individuals living in the U.S. [1]. Grocery
shopping is an activity that presents a barrier to independence for many visually impaired
people who either do not go grocery shopping at all or rely on sighted guides, e.g.,
friends, spouses, and partners. Traditional navigation aids, such as guide dogs and white
canes, are not adequate in such dynamic and complex environments as modern
supermarkets. These aids cannot help their users with macro-navigation, which requires
topological knowledge of the environment. Nor can they assist with carrying useful
payloads.

In summer 2004, the Computer Science Assistive Technology Laboratory (CSATL) of
the Department of Computer Science (CS) of Utah State University (USU) launched a
project whose objective is to build a robotic shopping assistant for the visually impaired.
In our previous publications, we examined several technical aspects of robot-assisted
navigation for the blind, such as RFID-based localization, greedy free space selection,
and topological knowledge representation [2, 3, 4]. In this paper, we briefly describe our
robotic shopping assistant for the blind, called RoboCart, and present a small set of initial
experiments with RoboCart in Lee’s MarketPlace, a supermarket in Logan, Utah.

HYPOTHESIS

The investigators hypothesized that repeated use of RoboCart by a visually impaired
shopper leads to a reduction in overall shopping time that eventually reaches an
asymptote.

METHOD
----------------------------------------
Figures 1 & 2 Go Here
----------------------------------------

RoboCart is built on top of a Pioneer 2DX robotic platform from ActivMedia
Corporation. RoboCart's navigation system resides in a PVC pipe structure mounted on
top of the platform (see Figure 1). The navigation system consists of a Dell Latitude
X300 ultralight laptop connected to the platform's microcontroller, a SICK laser range
finder, a TI Series 2000 RFID reader from Texas Instruments, and a Logitech camera
facing vertically down. The RFID reader is attached to a 200 mm x 200 mm antenna,
which is mounted close to the floor in front of the robot, as seen in Figure 1. The antenna
reads the small RFID tags embedded under carpets placed at the beginning and end of
grocery aisles. One such carpet is shown in Figure 2. The antenna is mounted in front,
because the robot's metallic body and the magnets in its motors disabled the antenna
when it was placed under the robot's body.
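
To make the role of the carpet tags concrete, the following minimal sketch shows how a
stream of raw RFID reads could be turned into aisle-boundary events. The reader
interface, tag IDs, and tag-to-landmark table are illustrative assumptions, not the actual
TI Series 2000 driver or RoboCart's tag map.

```python
# Hypothetical sketch: turning RFID carpet reads into aisle-boundary events.
# The tag IDs and landmark table below are invented for illustration.

# Each carpet tag ID is mapped to the landmark it marks in the store.
TAG_TO_LANDMARK = {
    "tag-0017": ("aisle-3", "north-end"),
    "tag-0018": ("aisle-3", "south-end"),
    # ... one entry per tag embedded in the carpets
}

def landmark_events(tag_stream):
    """Yield (aisle, end) events, collapsing repeated reads of the same
    landmark as the antenna passes over a carpet."""
    last = None
    for tag_id in tag_stream:
        landmark = TAG_TO_LANDMARK.get(tag_id)
        if landmark is not None and landmark != last:
            yield landmark
            last = landmark

if __name__ == "__main__":
    # Simulated sequence of raw reads while the robot crosses one carpet.
    raw_reads = ["tag-0017", "tag-0017", "tag-0017", "tag-0018"]
    for aisle, end in landmark_events(raw_reads):
        print(f"Crossed {end} of {aisle}")
```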

Navigation in RoboCart is based on Kuipers' Spatial Semantic Hierarchy (SSH) [5]. The
SSH is a model for representing spatial knowledge at five levels: sensory, control,
causal, topological, and metric. The sensory level is the interface to the robot's sensory
system. RoboCart's localization combines Markov localization, which uses the laser
range finder, with RFID-based localization, which uses the RFID carpets. RoboCart has
a topological map of the store that records which product items are found in which
aisles. The shopper interacts with the cart by browsing a voice-based product directory
with a 10-key keypad attached to the right of the handle. When a product item is
selected, RoboCart takes the shopper to the appropriate shelf.
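
The sketch below illustrates, under stated assumptions, the kind of topological product
directory described above: a small store graph, a product-to-shelf table, and a
breadth-first route query. The aisle names, product entries, and functions are
hypothetical; they are not RoboCart's internal data structures.

```python
# Hypothetical sketch of a topological store map linking products to shelves.
from collections import deque

# Topological map: nodes are store locations, edges are traversable connections.
STORE_GRAPH = {
    "dock":          ["aisle-1-south", "register"],
    "aisle-1-south": ["dock", "aisle-1-north", "aisle-2-south"],
    "aisle-1-north": ["aisle-1-south", "aisle-2-north"],
    "aisle-2-south": ["aisle-1-south", "aisle-2-north"],
    "aisle-2-north": ["aisle-1-north", "aisle-2-south"],
    "register":      ["dock"],
}

# Product directory: product name -> (location node, shelf) placement.
PRODUCT_DIRECTORY = {
    "peanut butter": ("aisle-1-north", "top shelf"),
    "rice":          ("aisle-2-south", "third shelf"),
}

def route_to_product(start, product):
    """Breadth-first search over the topological map to the product's node."""
    goal, shelf = PRODUCT_DIRECTORY[product]
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path, shelf
        for nxt in STORE_GRAPH[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None, shelf

if __name__ == "__main__":
    path, shelf = route_to_product("dock", "peanut butter")
    print(" -> ".join(path), "| look on the", shelf)
```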

------------------------------
Figure 3 Goes Here
------------------------------

An IT2020 barcode reader from Hand Held Products Inc. is wirelessly coupled to the
onboard laptop. When the shopper reaches the desired product in the aisle, he or she
picks up the barcode reader and scans the barcode stickers on the edge of the shelf. The
reader beeps every time a barcode is scanned. If the scanned barcode is that of the search
item, the shopper hears a synthesized message through a Bluetooth headphone. Figure 3
shows a visually impaired user scanning a shelf barcode with the wireless barcode reader.
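
As a rough illustration of the scan-and-announce step, the following sketch compares
each scanned barcode with the barcode of the item being searched for and speaks a
message on a match. The scanner and speech interfaces are placeholders, not the IT2020
API or RoboCart's actual speech stack.

```python
# Hypothetical sketch of the scan-and-announce loop.

def scan_and_announce(scanner, speak, target_barcode, target_name):
    """Loop until the shopper scans the barcode of the item being searched for."""
    while True:
        barcode = scanner.next_scan()        # blocks until the reader scans a sticker
        if barcode == target_barcode:
            speak(f"{target_name} found.")   # heard through the Bluetooth headphone
            return barcode
        # A non-matching scan is ignored; the shopper keeps scanning shelf
        # stickers until the target item's barcode is read.

class FakeScanner:
    """Stand-in for the wireless barcode reader, for illustration only."""
    def __init__(self, scans):
        self._scans = iter(scans)
    def next_scan(self):
        return next(self._scans)

if __name__ == "__main__":
    scanner = FakeScanner(["0123", "9999", "4567"])
    scan_and_announce(scanner, print, target_barcode="4567",
                      target_name="Peanut butter")
```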

RESULTS

------------------------------
Figures 4 and 5 Go Here
------------------------------

Preliminary experiments were run with one visually impaired shopper over a period of
three days. A single shopping iteration consisted of the shopper picking up RoboCart
from the docking area near the entrance, navigating to three pre-selected products, and
navigating back to the docking area through the cash register. Each iteration was divided
into 10 tasks: navigating from the docking area to product 1 (N1), finding product 1 (P1),
navigating from product 1 to product 2 (N2), finding product 2 (P2), navigating from
product 2 to product 3 (N3), finding product 3 (P3), navigating from product 3 to the
cash register entry (NC1), unloading the products (UL), navigating from the cash register
entry to the cash register exit (NC2), and navigating from the cash register to the docking
area (NLast). Before the experiments, the shopper was given 15 minutes of training on
using the barcode reader to scan barcodes. Seven shopping runs were completed for three
different sets of products. Within each set, one product was chosen from the top shelf,
one from the third shelf, and one from the bottom shelf. Times to completion for each of
the ten tasks were recorded by a human observer. The graph in Figure 4 shows that the
time taken by the different navigation tasks remained fairly constant over all runs. The
graph in Figure 5 shows that the time to find a product decreases after a few iterations.
The initially longer search time is due to the fact that the shopper does not yet know the
exact location of the product. Over time, the shopper learns where to look for the barcode
of a specific product item, and the product search time decreases. For the shopper in
these experiments, the product search time reached an asymptote of 20 to 30 seconds on
average.
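
A small sketch, using the per-run product search times reported in the Figure 5
description below, illustrates this trend by averaging the three products' search times for
each run; the mean falls from roughly 43 seconds on the first run to under 20 seconds in
the later runs. The code is only an illustration of the averaging, not the software used in
the study.

```python
# Per-run product search times (seconds) as reported in the Figure 5
# description; averaging across the three products shows the downward trend.
SEARCH_TIMES = {
    "Product 1": [44, 28, 21, 19, 19, 18, 13],
    "Product 2": [55, 36, 31, 25, 21, 23, 25],
    "Product 3": [30, 22, 16, 13, 18, 16, 15],
}

for run in range(7):
    mean = sum(times[run] for times in SEARCH_TIMES.values()) / len(SEARCH_TIMES)
    print(f"run {run + 1}: mean search time {mean:.1f} s")
```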

DISCUSSION

This single-subject study gives the investigators hope that visually impaired shoppers
can be trained to use a barcode reader in a relatively short period of time. The
experiments conducted with one visually impaired shopper indicate that the overall
shopping time decreases with the number of shopping iterations and eventually reaches
an asymptote.

REFERENCES

1. LaPlante, M. P., & Carlson, D. (2000). Disability in the United States: Prevalence and
Causes. Washington, DC: U.S. Department of Education.
2. Kulyukin, V., Gharpure, C., De Graw, N., Nicholson, J., & Pavithran, S. (2004). A
Robotic Wayfinding System for the Visually Impaired. In Proceedings of the Innovative
Applications of Artificial Intelligence Conference (IAAI), pp. 864-869. AAAI, July 2004.
3. Kulyukin, V., Gharpure, C., Nicholson, J., & Pavithran, S. (2004). RFID in Robot-
Assisted Indoor Navigation for the Visually Impaired. In Proceedings of the IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS). IEEE/RSJ, October
2004.
4. Kulyukin, V., Gharpure, C., & Nicholson, J. (2005). RoboCart: Toward Robot-
Assisted Navigation of Grocery Stores by the Visually Impaired. In Proceedings of the
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).
IEEE/RSJ, July 2005.
5. Kuipers, B. (2000). The Spatial Semantic Hierarchy. Artificial Intelligence, 119,
191-233.

ACKNOWLEDGMENTS

The study was funded, in part, by two Community University Research Initiative (CURI)
grants from the State of Utah (2004-05 and 2005-06) and NSF Grant IIS-0346880. The
authors would like to thank Sachin Pavithran, a visually impaired training and
development specialist at the USU Center for Persons with Disabilities, for his feedback
on the localization experiments.

Author Contact Information:

Vladimir Kulyukin, Ph.D., Computer Science Assistive Technology Laboratory,
Department of Computer Science, Utah State University, 4205 Old Main Hill, Logan, UT
84322-4205, Office Phone (435) 797-8163. EMAIL: vladimir.kulyukin@usu.edu.
Chaitanya Gharpure, Computer Science Assistive Technology Laboratory, Department of
Computer Science, Utah State University, 4205 Old Main Hill, Logan, UT 84322-4205,
Office Phone (435) 512-4560. EMAIL: cpg@cc.usu.edu.
GRAPHICS AND EQUATIONS
---------------------
Figure 1: RoboCart
---------------------




Alternative Text Description for Figure 1.
The figure shows the structure of RoboCart. A PVC pipe structure, which holds the
wayfinding toolkit, is mounted on the Pioneer 2DX robotic platform. The wayfinding
toolkit consists of the laser range finder, the RFID reader and antenna, a Dell Latitude
X300 laptop, a Logitech camera, speakers, and a 10-key keypad. The RFID antenna is
placed at the front of the robotic base, close to the floor. It is used to read RFID tags
embedded in carpets placed at strategic locations in the store.

-----------------------
Figure 2: RFID carpet
-----------------------




Alternative Text Description for Figure 2.
The figure shows a carpet instrumented with RFID tags. This RFID carpet is placed at
strategic locations in the store and is used by RoboCart to localize. The RFID tags are
placed in the carpet in a hexagonal pattern. The distance between any two adjacent tags
is 15 cm.
-------------------------------------
Figure 3: User scanning a barcode
-------------------------------------




Alternative Text Description for Figure 3.
The figure shows a visually impaired user reading a barcode on a shelf with a wireless
barcode reader.

--------------------------------------------
Figure 4: Navigation Timings for RoboCart
--------------------------------------------




Alternative Text Description for Figure 4.
The figure shows a graph of navigation timings. The X axis denotes the run number and
the Y axis denotes the time in seconds. Navigation timings for six navigation slots are
graphed. The navigation timings in seconds for N1 for 7 runs are 124, 124, 127, 124, 125,
124, 124 respectively. The navigation timings in seconds for N2 for 7 runs are 61, 61, 62,
60, 60, 61, 60 respectively. The navigation timings in seconds for N3 for 7 runs are 57,
57, 61, 57, 56, 56, 56 respectively. The navigation timings in seconds for NC1 for 7 runs
are 55, 53, 53, 50, 50, 50, 50 respectively. The navigation timings in seconds for NC2 for
7 runs are 15, 16, 15, 15, 16, 16, 16 respectively. The navigation timings in seconds for
NLast for 7 runs are 20, 20, 18, 20, 19, 19, 20 respectively.
-----------------------------------
Figure 5: Product search timings
-----------------------------------




Alternative Text Description for Figure 5.
The figure shows a graph of product search timings. The X axis denotes the run number
and the Y axis denotes the time in seconds. Product search timings for three products are
graphed. The search timings in seconds for Product 1 for 7 runs are 44, 28, 21, 19, 19,
18, 13 respectively. The search timings in seconds for Product 2 for 7 runs are 55, 36,
31, 25, 21, 23, 25 respectively. The search timings in seconds for Product 3 for 7 runs
are 30, 22, 16, 13, 18, 16, 15 respectively.
