College of William and Mary
Final Project: PHYS 351
ARDUINO BASED IR LED TRACKER
Authors: Dara Kharabi and Jacob McCormick
November 3, 2016
1 Abstract
One of the applications of robotics most present in the public consciousness is that of the
“personal companion,” a robot that acts to serve the needs of its human owners. One of
the fundamental abilities of such a companion robot is that of following its owner around.
Navigating the real world presents a difficult, often non-ideal problem space. Following a
target is a desirable behavior goal because it allows a robot to circumvent many of these dif-
ficulties by following another entity (like a human owner) that already possesses the facilities
necessary for navigation.
To this end, we have designed and implemented a robot capable of detecting, tracking,
and following a pulse-width-modulated infrared signal. Our design uses an array of eight
inexpensive IR receiver sensors to collect data. This data is then processed with an Arduino
Uno microcontroller and used to govern its motion (mediated by two wheels on continuous
rotation servomotors). Our design is highly tolerant to EM noise/interference and can track a
low-power IR LED from up to 14 meters away. With a more powerful IR beacon and suitable
directional lensing apparatus, the system’s tracking range could be increased tenfold or more.
Originally, we planned for the robot to be able to hold and transport loads upwards of
50 kg, allowing it to assist in household chores and even manned transportation. However,
time and budget constraints led us to design our robot around easily available low-torque
servomotors, constraining these use cases. We do not predict that any additional software-
side challenges will arise from increasing the scale of the robot. In other words, our robot’s
IR tracking system may be used as-is to allow robots that transport supplies or people to
follow human guides in both indoor and outdoor settings.
2 Executive Summary
For this project, the robot was built around an Arduino Uno microcontroller. The Arduino
Uno board has inputs from an array of 8 sensors located on the upper platform of the 3D
printed chassis. We used a Makerbot Replicator 3D printer to print our chassis to ensure
proper fit of the components and overall durability of the robot. The Arduino is powered
by a battery pack and sends signals to the servos, which drive the wheels. The goal of this
project was to create a robot capable of following an infrared LED light over a relatively
large range with reasonable precision.
Much of the challenge associated with this project was incurred in engineering the physical
aspects of the system. The path of information for the robot remained quite linear. First,
a signal is received by one or more of the sensors on the sensor array. The data collected
by the sensors is then sent to the Arduino Uno microcontroller to be processed. Once the
Arduino Uno has decided the proper course of action, it sends signals to the servo motors
to drive the wheels.
The other portion of the project consists of an Arduino that pulses an IR LED at a
frequency that is detectable by the 8 sensor array on the robot. This is the IR LED that the
robot tracks. By moving the LED to a specific place, one can direct the robot to go to
that place. For now, there is just one pulsing LED; in the future, however, we could add
more LEDs, along with code to control which LED is flashing.
Overall, this project was a success because the robot was capable of tracking and moving
toward a pulsing LED. However, there is room for improvement in a variety of ways which
will be discussed later in the report.
3 Table of Contents
1. Introduction
2. System Design
(a) Hardware
(b) Software
3. Results
4. Conclusions
5. Appendix
(a) Arduino Code
(b) Component Pictures
4 Introduction
In the modern age, many tasks that were once the responsibility of humans have been
offloaded to machinery. Whether it be by means of a washing machine, dishwasher, or vacuum
cleaner, we can see the value of implementing robotics in everyday life. By using robots
to assist people in carrying out their everyday tasks, we can improve our quality of life as
well as improve the efficiency with which we perform these tasks.
One behavior that shows promise as a building block for these helpful machines is fol-
lowing. A robot that follows a target has no need for complicated navigational intelligence
of its own, and can be used in a wide range of beneficial applications.
A robot could follow an individual around their house, carrying useful items such as
laundry or tools in an unobtrusive and hands-free manner. A similar supplies-carrying robot,
fitted with a heavy-duty chassis, could assist hikers and infantrymen. Robots that follow a
beam or other hand-held beacon can be used as an alternative to RC robots for situations
where a remote-controlled rover is necessary, such as clearing mines and exploring hazardous
environments.
In this report, we outline the design, construction, and performance of a simple following
robot that identifies an infrared signal and drives toward its source. A guidance system like
the one we built can be used to enable the above range of following robot applications.
5 System Design
5.1 Hardware
The hardware component of this project is where the majority of our time was devoted.
Though the software aspect certainly had its challenges, most of the improvement seen between
the early prototype and finished product came as a result of changes regarding the hardware
used for the robot. There are six main components of interest for our robot. These include
the Arduino Uno board, servos, battery pack, sensor array, 3D printed chassis, and IR
transmitter; each of which will be discussed in some level of detail.
The Arduino Uno board can be considered the brain of the robot. It looks at the data
collected by the sensing components on the robot, i.e., the sensor array, and makes decisions based
on the data. The Arduino Uno microcontroller also receives power from the battery pack.
The Arduino then regulates this input voltage and supplies a constant 5 Volts to the servo
motors, as well as all of the sensors on the sensor array. The Arduino receives input from 8
input pins: one pin for each IR sensor. The Arduino outputs only two signals: one to each
of the servos to govern their speeds and directions of motion.
The servos that we use for the robot are continuous rotation servo motors. Continuous
rotation simply means that they are not constrained to a 180 or 360 degree range of motion.
This allows the servos to spin the wheels continuously in one direction or the other. The
servos use a 5 Volt supply for their power and accept a varying control signal that each
servo interprets as a corresponding rotation velocity and direction.
The battery pack supplies a constant 9 Volts to the Arduino Uno microcontroller and
acts as the primary power supply for the robot. The battery pack creates this 9 Volts using
six 1.5 Volt batteries and allows the robot to function without being tethered to a wall outlet,
or other static power source.
The sensor array is composed of eight TSOP38238 infrared sensors that send data to the
Arduino Uno board. The sensor array is placed slightly above the other components on the
robot, so that each sensor has a clear line of sight. This ensures that every sensor is able to
see whatever infrared light is present, and helps narrow down the potential causes of a
failure to detect the infrared signal.
The 3D printed chassis acts to provide structural support for the robot, as well as a
secure fit for all of the components of the robot. The sensor array sits on a raised platform
that is integrated into the chassis. There is a hole through which the sensor wires exit,
and a depression for each component that ensures nothing accidentally falls off of the robot
and compromises its function.
Finally, the IR transmitter is actually independent of the robot itself. The IR transmitter
is composed of another Arduino Uno board powered by a small 9V battery pack. There is
a small circuit that powers an infrared LED, which is shown in Figure 2 in the appendix.
The Arduino Uno microcontroller is used to pulse the infrared LED at around 38 kHz, a
frequency that can be detected by the infrared sensors on the robot.
5.2 Software
The software component of this project comprises two programs written in the Arduino
IDE: the code that runs the remote, and the control script for the robot.
The remote’s code is relatively simple: we use the external TimerOne library to generate
a 5V, 38kHz square wave that we then feed into our IR LED assembly. We have to use
TimerOne to generate the square wave, as Arduino’s standard loop function doesn’t iterate
quickly enough to generate the kind of wave we need.
The code that controls the robot works by doing some basic trigonometry to the input
from the 8 IR sensors mounted on the robot to compute an X and a Y coordinate. These
coordinates represent the components of a vector that, ideally, points toward the IR transmitter.
The program then runs this information through a series of conditional statements
to determine how to direct the robot:
• If the vector points behind the robot, it will turn in place in order to face the trans-
mitter.
• If the vector points in front of the robot, but at an angle, it will move forward while
also turning to face the transmitter head-on.
• If the vector points directly ahead of the robot, it will simply move forward.
Moreover, a number of fail-safes are built in:
• If no IR transmitter is detected, the robot will turn in place, scanning for an IR signal
source.
• If a transmitter is detected, but the signal’s direction is unclear, the robot will move
in a circle in order to establish a clearer signal.
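The vector computation described above can be written out explicitly. With the eight sensors placed at 45° increments and each sensor reading s_i equal to 0 or 1, the control script computes (scaled by 10, with 7 ≈ 10/√2):

```latex
\begin{aligned}
x &= 10\,(E - W) + 7\,(NE + SE - NW - SW) \;\approx\; 10\sum_i s_i \cos\theta_i,\\
y &= 10\,(N - S) + 7\,(NE + NW - SE - SW) \;\approx\; 10\sum_i s_i \sin\theta_i.
\end{aligned}
```

Integer weights keep the arithmetic fast on the Arduino while approximating the true unit-vector sum closely enough for steering decisions.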
Finally, the code contains functions to linearly interpolate the speed of the two servos.
This is important for three reasons. First, interpolating the control signal sent to the servo
protects it from sudden changes in speed or direction, lengthening the life of the servo. Sec-
ond, interpolation prevents the robot from jerking uncontrollably if it has to turn suddenly,
increasing its turning accuracy. Finally, interpolation has a smoothing effect on the servo
motion: noise in the IR sensor data that would otherwise cause the robot to turn back and
forth in subsequent iterations and reduce accuracy instead cancels out, having a minimal
effect on the robot’s motion.
Figure 1: The control script sums the directional information obtained from each sensor,
yielding a vector that ought to point toward the remote.
6 Results
The completed robot was tested in a number of different environments. When placed in a long
hallway with the transmitter pointed directly at it, the robot identified and tracked the signal
from up to 14 meters away. The IR sensors we used for this project have a very large field
of view. While this is undesirable when navigating very close to the signal source, it does
allow the robot to track a signal source inclined above or below it. This means our robot
can handle tracking signals through modest hills and valleys. Using a 38kHz square wave
as our signal, we found no noticeable interference arising from incandescent or fluorescent
light fixtures, ambient radio noise, or even an LCD projector remote control pointed at the
robot. In the event that we had experienced interference issues, these issues could have
been resolved by transmitting and searching for a message conveyed via PWM with a 38kHz
carrier wave.
One problem we experienced was that the sensors’ wide field of view resulted in a loss of
directionality information when the robot was close to the transmitter. We initially tried to
solve this by designing and 3D printing a cap for the sensor array with a blinder sector for each
sensor. While the cap exhibited moderate success at improving the robot’s accuracy at close
ranges, it also limited the robot’s maximum range. Later, when servo signal interpolation
was added to the robot’s software, accuracy at close ranges improved somewhat. Combining
the blinder cap with software interpolation would provide the best accuracy at close ranges,
and would be ideal for applications where long-range tracking isn’t necessary.
Another problem we experienced had to do with our IR transmitter’s light source re-
flecting off of objects in the environment and triggering sensors not directly in line with the
transmitter. Three measures that lessened the effect of this issue were using the blinder cap,
interpolating the servomotor signal, and wrapping electrical tape around the transmitter
LED to collimate the light. We predict that using a stronger and more tightly directed
signal source would go a long way toward mitigating our reflection problems. As it is, the
robot experiences intermittent difficulty tracking in environments with lots of IR-reflective
objects, but seems to eventually navigate to the signal source.
Navigating spaces was unproblematic, except for infrequent occasions where the robot
would drive itself into a corner or other tight spot. In these cases, the robot was often
incapable of extricating itself. A distance sensor and some extra code could help remedy
this scenario.
7 Conclusion
After testing the robot, we found that the initial goal of having the robot track and follow
an infrared signal was accomplished. Some issues arose when the robot was placed in a small
area with reflective surfaces. However, in an area that does not suffer from
these issues, the infrared sensors on the robot were sensitive to infrared light from up to
14 meters away. Moreover, the robot was able to track toward the infrared light in a nearly
straight line, very seldom deviating to the point where it could no longer see the infrared
light. Because of this, whenever the robot detected a light, it was most often the case that
the robot would successfully reach the light. In the cases where the robot did not successfully
reach the infrared light source, it had usually ended up in a corner that the light could not
reach. One issue that occurred frequently was a loss of directionality as the robot got very
close to the infrared light source: because almost all of the sensors saw the light at the
same time, the robot had a hard time determining which direction it needed to go.
For the future of the project, we plan to add distance sensors to the robot so that it can
track along the wall if it loses the light. This ensures that if the robot gets stuck in a corner,
it can find its way back into the open. We also plan on using a stronger light source as well
as multiple light sources and a way to dictate which light is on. This would allow us to give
the robot new information on the fly. We would be able to place the lights at different places
of interest inside of a space so that the robot is able to travel to these different points of
interest. Finally, we plan on scaling up this project to be able to carry heavier loads, perhaps
even its human owners. Overall, the project was a relative success and we have learned a lot
of essential information for future development of the project.
8 Appendix
8.1 Arduino Code
Code for the robot:
#include <Servo.h>
// Assign pin numbers for servos
const int pinServoL=13;
const int pinServoR=12;
// Assign pin numbers 8 IR sensors
const int pinN=7;
const int pinNE=10;
const int pinE=9;
const int pinSE=6;
const int pinS=2;
const int pinSW=5;
const int pinW=8;
const int pinNW=3;
// Constants that govern servo speed
const int stoppedSpeed = 92;
const int FSpeed = 180;
const int RSpeed = 1;
// Default step size for servo speed smoothing
const int defaultStepSize = 10;
// These are variables that govern smooth servo motion
// Here, we’re declaring them and setting initial values
int speedL = stoppedSpeed;
int speedR = stoppedSpeed;
int goalSpeedL = stoppedSpeed;
int goalSpeedR = stoppedSpeed;
int stepSize = defaultStepSize;
// Declare the two servo motors as Servo objects
Servo servoL;
Servo servoR;
// Convenience functions:
// - park() will stop the robot
// - scanL() will cause the robot to turn in place anticlockwise,
// and is used to scan for an IR signal
// - scanR() behaves similarly to scanL, but has the robot spinning
// clockwise
void park()
{
setServos(stoppedSpeed, stoppedSpeed);
}
void scanL()
{
setServos(RSpeed, FSpeed);
}
void scanR()
{
setServos(FSpeed, RSpeed);
}
// All calls to change servo speed are routed through this function,
// providing a convenience point to modify servo motion.
void setServos(int l, int r)
{
goalSpeedL = l;
goalSpeedR = r;
}
void updateServos()
{
// Linearly interpolates between actual and goal speed value,
// leading to smoother movement. Set stepSize to 180 to disable.
int diffL = goalSpeedL-speedL;
int diffR = goalSpeedR-speedR;
// Step each speed by at most stepSize toward its goal, without overshooting.
if (diffL >= 0) speedL += min(diffL, stepSize); else speedL -= min(-diffL, stepSize);
if (diffR >= 0) speedR += min(diffR, stepSize); else speedR -= min(-diffR, stepSize);
servoL.write(min(max(abs(180-speedL), 1), 180));
servoR.write(min(max(speedR, 1), 180));
// Below is code to print diagnostic information through serial. It can
// be commented out without negatively affecting the robot’s performance.
Serial.print("updated servos: L: ");
Serial.print(goalSpeedL);
Serial.print(" R: ");
Serial.println(goalSpeedR);
}
void setup()
{
// Initialize serial (used only for diagnostic information)
Serial.begin(9600);
// Tell Servo.h which pins the servos are connected to
servoL.attach(pinServoL);
servoR.attach(pinServoR);
// Make double-sure the servos start parked.
setServos(stoppedSpeed, stoppedSpeed);
}
int N = 0;
int NE = 0;
int NW = 0;
int Se = 0;
int SW = 0;
int E = 0;
int W = 0;
int S = 0;
int xcoord = 0;
int ycoord = 0;
void loop()
{
// Read (and invert) IR sensor values (the sensor outputs 0
// upon detecting the remote)
N = abs(digitalRead(pinN) - 1);
NE = abs(digitalRead(pinNE) - 1);
E = abs(digitalRead(pinE) - 1);
NW = abs(digitalRead(pinNW) - 1);
W = abs(digitalRead(pinW) - 1);
Se = abs(digitalRead(pinSE) - 1);
S = abs(digitalRead(pinS) - 1);
SW = abs(digitalRead(pinSW) - 1);
// Compute X and Y components of a vector representing the
// directional information we have this iteration.
// 10 and 7 correspond to 1 and 1/sqrt(2), respectively.
xcoord = 10*(E-W) + 7*(NE+Se-NW-SW);
ycoord = 10*(N-S) + 7*(NE+NW-Se-SW);
// More diagnostic code
Serial.print(N);
Serial.print(NE);
Serial.print(E);
Serial.print(Se);
Serial.print(S);
Serial.print(SW);
Serial.print(W);
Serial.println(NW);
Serial.print(xcoord);
Serial.print(", ");
Serial.println(ycoord);
// If no signal is detected, keep scanning.
if (N+NE+E+Se+S+SW+W+NW == 0) {
scanL();
} else {
// Do Spyyyder loops if signal is unclear
if (xcoord == 0 && ycoord == 0)
{
setServos(FSpeed, 110);
} else if (ycoord <= 0) {
// If the signal is coming from behind the robot, turn in place
// in the appropriate direction so that it faces the remote
if (xcoord < 0) {
scanL();
} else {
scanR();
}
} else {
// If the signal is coming from in front of the robot, move forward
if (xcoord == 0) {
setServos(FSpeed, FSpeed);
// If necessary, turn slightly while moving to correct direction
// turnFactor is determined via a modified version of theta = atan(y/x)
// the constants 50 and 60 were determined empirically based on the
// characteristics of the S4303R servomotors we used.
} else if (xcoord > 0) {
int turnFactor = int(abs(60*atan(float(xcoord)/(float(ycoord)+.01))));
setServos(FSpeed, max(FSpeed - 50 - turnFactor, 10));
} else {
int turnFactor = int(abs(60*atan(float(xcoord)/(float(ycoord)+.01))));
setServos(max(FSpeed - 50 - turnFactor, 10), FSpeed);
}
}
}
// Step the servo speeds closer to their goal speeds
updateServos();
// Wait a while.
delay(100);
}
Code for the IR transmitter remote:
#include <TimerOne.h>
void setup() {
// Ready pin 9 for output.
pinMode(9, OUTPUT);
// Initialize Timer1, and set a 26.32 us period.
// This translates to a frequency of around 38 kHz.
Timer1.initialize(26.32);
// setup pwm on pin 9 with a 50% duty cycle.
Timer1.pwm(9, 512);
}
void loop() {
// Nothing needs to happen here.
delay(100);
}
8.2 Component Pictures
Figure 2: IR Transmitter
Figure 3: Robot, top view
Figure 4: Robot, battery compartment
Figure 5: Robot, angle view
Figure 6: Robot, profile view Figure 7: Robot, rear view
Figure 8: Breadboard and Arduino Figure 9: Robot, sensor tower close-up

Mais conteúdo relacionado

Mais procurados

Mais procurados (20)

Raspberry Pi Technology
Raspberry Pi TechnologyRaspberry Pi Technology
Raspberry Pi Technology
 
Biochips seminar report
Biochips seminar reportBiochips seminar report
Biochips seminar report
 
Introduction to raspberry pi
Introduction to raspberry piIntroduction to raspberry pi
Introduction to raspberry pi
 
Raspberry pi technical documentation
Raspberry pi technical documentationRaspberry pi technical documentation
Raspberry pi technical documentation
 
Chap1 chipset
Chap1 chipsetChap1 chipset
Chap1 chipset
 
Introduction to raspberry pi
Introduction to raspberry piIntroduction to raspberry pi
Introduction to raspberry pi
 
Raspberry Pi (Introduction)
Raspberry Pi (Introduction)Raspberry Pi (Introduction)
Raspberry Pi (Introduction)
 
Raspberry PI
Raspberry PIRaspberry PI
Raspberry PI
 
HARD DISK DRIVE
HARD DISK DRIVE HARD DISK DRIVE
HARD DISK DRIVE
 
Raspberry pi
Raspberry piRaspberry pi
Raspberry pi
 
PIR AND ULTRASONIC PPT.pptx
PIR AND ULTRASONIC PPT.pptxPIR AND ULTRASONIC PPT.pptx
PIR AND ULTRASONIC PPT.pptx
 
Audio noise reduction techniques
Audio  noise  reduction  techniquesAudio  noise  reduction  techniques
Audio noise reduction techniques
 
Motherboard + ports & connector
Motherboard + ports & connectorMotherboard + ports & connector
Motherboard + ports & connector
 
Arduino
ArduinoArduino
Arduino
 
10 color image processing
10 color image processing10 color image processing
10 color image processing
 
Raspberry Pi
Raspberry PiRaspberry Pi
Raspberry Pi
 
Introduction to Raspberry PI
Introduction to Raspberry PIIntroduction to Raspberry PI
Introduction to Raspberry PI
 
Gesture control car
Gesture control carGesture control car
Gesture control car
 
E ball technology ppt
E ball technology pptE ball technology ppt
E ball technology ppt
 
Compare CD vs DVD
Compare CD vs DVDCompare CD vs DVD
Compare CD vs DVD
 

Destaque

Presentation door knock
Presentation door knockPresentation door knock
Presentation door knockVamsi Krishna
 
Arduino e-Course Syllabus for Beginners
Arduino e-Course Syllabus for BeginnersArduino e-Course Syllabus for Beginners
Arduino e-Course Syllabus for BeginnersMadhu Honey
 
Knock detecting door lock research paper
Knock detecting door lock research paperKnock detecting door lock research paper
Knock detecting door lock research paperChamila Wijayarathna
 
Raspberry Pi + ROS
Raspberry Pi + ROSRaspberry Pi + ROS
Raspberry Pi + ROSArnoldBail
 
robotics and embedded system ppt
robotics and embedded system pptrobotics and embedded system ppt
robotics and embedded system pptNilgiri publicity
 
Project Report On Micro-controller Embedded System
Project Report On Micro-controller Embedded SystemProject Report On Micro-controller Embedded System
Project Report On Micro-controller Embedded SystemRkrishna Mishra
 
Slideshare Powerpoint presentation
Slideshare Powerpoint presentationSlideshare Powerpoint presentation
Slideshare Powerpoint presentationelliehood
 

Destaque (10)

Presentation door knock
Presentation door knockPresentation door knock
Presentation door knock
 
Arduino e-Course Syllabus for Beginners
Arduino e-Course Syllabus for BeginnersArduino e-Course Syllabus for Beginners
Arduino e-Course Syllabus for Beginners
 
Knock detecting door lock research paper
Knock detecting door lock research paperKnock detecting door lock research paper
Knock detecting door lock research paper
 
Raspberry Pi + ROS
Raspberry Pi + ROSRaspberry Pi + ROS
Raspberry Pi + ROS
 
smart door knocker
smart door knockersmart door knocker
smart door knocker
 
robotics and embedded system ppt
robotics and embedded system pptrobotics and embedded system ppt
robotics and embedded system ppt
 
Password based door locksystem
Password  based door locksystemPassword  based door locksystem
Password based door locksystem
 
Project Report On Micro-controller Embedded System
Project Report On Micro-controller Embedded SystemProject Report On Micro-controller Embedded System
Project Report On Micro-controller Embedded System
 
Project Report Format
Project Report FormatProject Report Format
Project Report Format
 
Slideshare Powerpoint presentation
Slideshare Powerpoint presentationSlideshare Powerpoint presentation
Slideshare Powerpoint presentation
 

Semelhante a Arduino_Project_Report

ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINOACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINOSnehasis Mondal
 
Ball following Robot using ESP32-cam & Arduino UNO
Ball following Robot using ESP32-cam & Arduino UNOBall following Robot using ESP32-cam & Arduino UNO
Ball following Robot using ESP32-cam & Arduino UNOIRJET Journal
 
Wireless AI based industrial security robot
Wireless AI based industrial security robotWireless AI based industrial security robot
Wireless AI based industrial security robotVarun B P
 
Autonomous navigation robot
Autonomous navigation robotAutonomous navigation robot
Autonomous navigation robotIRJET Journal
 
Floor cleaning robot(autonomus mannual) vatsal shah-ec_4th year
Floor cleaning robot(autonomus mannual)  vatsal shah-ec_4th yearFloor cleaning robot(autonomus mannual)  vatsal shah-ec_4th year
Floor cleaning robot(autonomus mannual) vatsal shah-ec_4th yearVatsal N Shah
 
PC-based mobile robot navigation sytem
PC-based mobile robot navigation sytemPC-based mobile robot navigation sytem
PC-based mobile robot navigation sytemANKIT SURATI
 
obstacle avoiding robot project
obstacle avoiding robot projectobstacle avoiding robot project
obstacle avoiding robot projectAisha Naeem
 
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdf
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdfA SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdf
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdfAakash Sheelvant
 
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT
A SMART VOICE CONTROLLED PICK AND PLACE ROBOTA SMART VOICE CONTROLLED PICK AND PLACE ROBOT
A SMART VOICE CONTROLLED PICK AND PLACE ROBOTIRJET Journal
 
Jun 13 ijbi_002
Jun 13 ijbi_002Jun 13 ijbi_002
Jun 13 ijbi_002Anuj Raj
 
Jun 13 ijbi_002
Jun 13 ijbi_002Jun 13 ijbi_002
Jun 13 ijbi_002anuj87420
 
Sonar Project Report
Sonar Project ReportSonar Project Report
Sonar Project ReportSumit Sapra
 
Final Report 9505482 5845742
Final Report 9505482 5845742Final Report 9505482 5845742
Final Report 9505482 5845742Bawantha Liyanage
 
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...IRJET Journal
 

Semelhante a Arduino_Project_Report (20)

ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINOACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO
ACCELEROMETER BASED HAND GESTURE CONTROLLED ROBOT USING ARDUINO
 
Ball following Robot using ESP32-cam & Arduino UNO
Ball following Robot using ESP32-cam & Arduino UNOBall following Robot using ESP32-cam & Arduino UNO
Ball following Robot using ESP32-cam & Arduino UNO
 
Wireless AI based industrial security robot
Wireless AI based industrial security robotWireless AI based industrial security robot
Wireless AI based industrial security robot
 
Autonomous navigation robot
Autonomous navigation robotAutonomous navigation robot
Autonomous navigation robot
 
Floor cleaning robot(autonomus mannual) vatsal shah-ec_4th year
Floor cleaning robot(autonomus mannual)  vatsal shah-ec_4th yearFloor cleaning robot(autonomus mannual)  vatsal shah-ec_4th year
Floor cleaning robot(autonomus mannual) vatsal shah-ec_4th year
 
PC-based mobile robot navigation sytem
PC-based mobile robot navigation sytemPC-based mobile robot navigation sytem
PC-based mobile robot navigation sytem
 
obstacle avoiding robot project
obstacle avoiding robot projectobstacle avoiding robot project
obstacle avoiding robot project
 
117_SIRJ_HMS
117_SIRJ_HMS117_SIRJ_HMS
117_SIRJ_HMS
 
Control-Project
Control-ProjectControl-Project
Control-Project
 
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdf
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdfA SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdf
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT.pdf
 
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT
A SMART VOICE CONTROLLED PICK AND PLACE ROBOTA SMART VOICE CONTROLLED PICK AND PLACE ROBOT
A SMART VOICE CONTROLLED PICK AND PLACE ROBOT
 
Colour tracking robot.pdf
Colour tracking robot.pdfColour tracking robot.pdf
Colour tracking robot.pdf
 
Jun 13 ijbi_002
Jun 13 ijbi_002Jun 13 ijbi_002
Jun 13 ijbi_002
 
PERSON ALIVE DETECTION
PERSON ALIVE DETECTIONPERSON ALIVE DETECTION
PERSON ALIVE DETECTION
 
Jun 13 ijbi_002
Jun 13 ijbi_002Jun 13 ijbi_002
Jun 13 ijbi_002
 
Robot arm ppt
Robot arm pptRobot arm ppt
Robot arm ppt
 
Sonar Project Report
Sonar Project ReportSonar Project Report
Sonar Project Report
 
Final Report 9505482 5845742
Final Report 9505482 5845742Final Report 9505482 5845742
Final Report 9505482 5845742
 
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...
IRJET- Gesture Controlled Robot with Obstacle Avoidance using Arduino and And...
 
Spook Robi
Spook RobiSpook Robi
Spook Robi
 

Arduino_Project_Report

  • 1. College of William and Mary Final Project: PHYS 351 ARDUINO BASED IR LED TRACKER Authors: Dara Kharabi and Jacob McCormick November 3, 2016
  • 2. 1 Abstract One of the applications of robotics most present in the public consciousness is that of the “personal companion,” a robot that acts to serve the needs of its human owners. One of the fundamental abilities of such a companion robot is that of following its owner around. Navigating the real world presents a difficult, often non-ideal problem space. Following a target is a desirable behavior goal because it allows a robot to circumvent many of these dif- ficulties by following another entity (like a human owner) that already possesses the facilities necessary for navigation. To this end, we have designed and implemented a robot capable of detecting, tracking, and following a pulse-width-modulated infrared signal. Our design uses an array of eight inexpensive IR receiver sensors to collect data. This data is then processed with an Arduino Uno microcontroller and used to govern its motion (mediated by two wheels on continuous rotation servomotors). Our design is highly tolerant to EM noise/interference and can track a low-power IR LED from up to 14 meters away. With a more powerful IR beacon and suitable directional lensing apparatus, the system’s tracking range could be increased tenfold or more. Originally, we planned for the robot to be able to hold and transport loads upward of 50kg, allowing it to assist in household chores and even manned transportation. However, time and budget constraints led us to design our robot around easily available low-torque servomotors, constraining these use cases. We do not predict that any additional software- side challenges will arise from increasing the scale of the robot. In other words, our robot’s IR tracking system may be used as-is to allow robots that transport supplies or people to follow human guides in both indoor and outdoor settings. 1
2 Executive Summary

For this project, the robot was built around an Arduino Uno microcontroller. The Arduino Uno board takes inputs from an array of eight sensors located on the upper platform of the 3D-printed chassis. We used a MakerBot Replicator 3D printer to print the chassis, ensuring a proper fit of the components and the overall durability of the robot. The Arduino is powered by a battery pack and sends signals to the servos, which drive the wheels.

The goal of this project was to create a robot capable of following an infrared LED over a relatively large range with reasonable precision. Much of the challenge in this project lay in engineering the physical aspects of the system; the flow of information through the robot remained quite linear. First, a signal is received by one or more of the sensors on the sensor array. The data collected by the sensors is then sent to the Arduino Uno microcontroller to be processed. Once the Arduino Uno has decided on the proper course of action, it sends signals to the servo motors to drive the wheels.

The other portion of the project consists of a second Arduino that pulses an IR LED at a frequency detectable by the robot's eight-sensor array. This is the LED that the robot tracks: by moving the LED to a specific place, you can cause the robot to go to that place. For now, there is just one pulsing LED, but more LEDs, along with code to control which LED is flashing, could be added in the future.

Overall, this project was a success because the robot was capable of tracking and moving toward a pulsing LED. However, there is room for improvement in a variety of ways, which will be discussed later in the report.
3 Table of Contents

4. Introduction
5. System Design
   (a) Hardware
   (b) Software
6. Results
7. Conclusions
8. Appendix
   (a) Arduino Code
   (b) Component Pictures
4 Introduction

In the modern age, many tasks that were once the responsibility of humans have been offloaded to machinery. Whether by means of a washing machine, dishwasher, or vacuum cleaner, we can see the value of implementing robotics in everyday life. By using robots to assist people in carrying out their everyday tasks, we can improve our quality of life as well as the efficiency with which we perform those tasks.

One behavior that shows promise as a building block for these helpful machines is following. A robot that follows a target has no need for complicated navigational intelligence of its own, and can be used in a wide range of beneficial applications. A robot could follow an individual around their house, carrying useful items such as laundry or tools in an unobtrusive and hands-free manner. A similar supplies-carrier robot, fitted with a heavy-duty chassis, could assist hikers and infantrymen. Robots that follow a beacon or other hand-held signal source can serve as an alternative to RC robots in situations where a remote-controlled rover is necessary, such as clearing mines and exploring hazardous environments.

In this report, we outline the design, construction, and performance of a simple following robot that identifies an infrared signal and drives toward its source. A guidance system like the one we built can enable the range of following-robot applications described above.
5 System Design

5.1 Hardware

The hardware component of this project is where the majority of our time was devoted. Though the software side certainly had its challenges, most of the improvement seen between the early prototype and the finished product came from changes to the robot's hardware. There are six main components of interest: the Arduino Uno board, the servos, the battery pack, the sensor array, the 3D-printed chassis, and the IR transmitter, each of which is discussed in some detail below.

The Arduino Uno board can be considered the brain of the robot. It examines the data collected by the robot's sensory components, i.e., the sensor array, and makes decisions based on that data. The Arduino Uno also receives power from the battery pack; it regulates this input voltage and supplies a constant 5 V to the servo motors as well as to all of the sensors on the sensor array. The Arduino reads eight input pins, one per IR sensor, and outputs only two signals, one to each servo, to govern their speeds and directions of motion.

The servos we use for the robot are continuous-rotation servo motors. Continuous rotation simply means that they are not constrained to a 180- or 360-degree range of motion, which allows them to spin the wheels continuously in either direction. Each servo runs from the 5 V supply and accepts a control signal that it interprets as a corresponding rotation speed and direction.

The battery pack supplies a constant 9 V to the Arduino Uno and acts as the primary power supply for the robot. It produces this 9 V from six 1.5 V batteries and allows the robot to function without being tethered to a wall outlet or other fixed power source.

The sensor array is comprised of eight TSOP38238 infrared receiver modules that send data to the
Arduino Uno board. The sensor array is placed slightly above the other components on the robot so that each sensor has a clear line of sight. This ensures that the sensors can see whatever infrared light is present, and it helps narrow down the potential causes of any malfunction in detecting the infrared signal.

The 3D-printed chassis provides structural support for the robot as well as a secure fit for all of its components. The sensor array sits on a raised platform integrated into the chassis, with a hole through which the sensor wires exit. There is a depression for each component, ensuring that no component accidentally falls off the robot and compromises its function.

Finally, the IR transmitter is independent of the robot itself. It is composed of another Arduino Uno board powered by a small 9 V battery pack, along with a small circuit that drives an infrared LED, shown in figure () in the appendix. The Arduino Uno is used to pulse the infrared LED at around 38 kHz, a frequency that the infrared sensors on the robot can detect.

5.2 Software

The software component of this project comprises two programs written in the Arduino IDE: the code that runs the remote and the control script for the robot.

The remote's code is relatively simple: we use the external TimerOne library to generate a 5 V, 38 kHz square wave that we feed into our IR LED assembly. We have to use TimerOne to generate the square wave because Arduino's standard loop function does not iterate quickly enough to produce the kind of wave we need.

The code that controls the robot applies basic trigonometry to the input from the eight IR sensors mounted on the robot to compute an X and a Y coordinate. These coordinates represent the components of a vector that, ideally, points toward the IR transmitter. The program then runs this information through a series of conditional statements
to determine how to direct the robot:

• If the vector points behind the robot, it will turn in place in order to face the transmitter.

• If the vector points in front of the robot, but at an angle, it will move forward while also turning to face the transmitter head-on.

• If the vector points directly ahead of the robot, it will simply move forward.

Moreover, a number of fail-safes are built in:

• If no IR transmitter is detected, the robot will turn in place, scanning for an IR signal source.

• If a transmitter is detected, but the signal's direction is unclear, the robot will move in a circle in order to establish a clearer signal.

Finally, the code contains functions to linearly interpolate the speed of the two servos. This is important for three reasons. First, interpolating the control signal sent to each servo protects it from sudden changes in speed or direction, lengthening the servo's life. Second, interpolation prevents the robot from jerking uncontrollably if it has to turn suddenly, increasing its turning accuracy. Third, interpolation has a smoothing effect on the servo motion: noise in the IR sensor data that would otherwise cause the robot to turn back and forth on subsequent iterations, reducing accuracy, instead cancels out and has a minimal effect on the robot's motion.
Figure 1: The control script sums the directional information obtained from each sensor, yielding a vector that ought to point toward the remote.
6 Results

The completed robot was tested in a number of different environments. When placed in a long hallway with the transmitter pointed directly at it, the robot identified and tracked the signal from up to 14 meters away.

The IR sensors we used for this project have a very wide field of view. While this is undesirable when navigating very close to the signal source, it does allow the robot to track a signal source inclined above or below it. This means our robot can track signals over modest hills and valleys.

Using a 38 kHz square wave as our signal, we found no noticeable interference arising from incandescent or fluorescent light fixtures, ambient radio noise, or even an LCD projector remote control pointed at the robot. Had we experienced interference issues, they could have been resolved by transmitting and searching for a message conveyed via PWM on a 38 kHz carrier wave.

One problem we experienced was that the sensors' wide field of view resulted in a loss of directional information when the robot was close to the transmitter. We initially tried to solve this by designing and 3D printing a cap for the sensor array with a blinded sector for each sensor. While the cap moderately improved the robot's accuracy at close range, it also limited the robot's maximum range. Later, when servo signal interpolation was added to the robot's software, close-range accuracy improved somewhat. Combining the blinder cap with software interpolation would provide the best accuracy at close range and would be ideal for applications where long-range tracking isn't necessary.

Another problem we experienced had to do with our IR transmitter's light reflecting off objects in the environment and triggering sensors not directly in line with the transmitter.
Three measures that lessened this issue were using the blinder cap, interpolating the servomotor signal, and wrapping electrical tape around the transmitter LED to collimate its light. We predict that using a stronger and more tightly directed signal source would go a long way toward mitigating the reflection problems. As it is, the robot experiences intermittent difficulty tracking in environments with many IR-reflective objects, but it does seem to eventually navigate to the signal source.

Navigating open spaces was unproblematic, except for infrequent occasions when the robot would drive itself into a corner or other tight spot. In these cases, the robot was often incapable of extricating itself. A distance sensor and some extra code could help remedy this scenario.
7 Conclusion

After testing the robot, we found that our initial goal of having the robot track and follow an infrared signal was accomplished. Some issues arose when the robot was placed in a small area with reflective surfaces; however, in areas free of such reflections, the infrared sensors on the robot detected infrared light from up to 14 meters away. Moreover, the robot tracked toward the infrared light in a nearly straight line, very seldom deviating to the point where it could no longer see the light. Because of this, whenever the robot detected a light, it would most often successfully reach it. On the occasions when the robot did not successfully reach the infrared light source, it usually ended up in a corner that the light could not reach.

One issue that occurred frequently was a loss of directionality as the robot got very close to the infrared light source. Because of the sensors' wide fields of view, almost all of the sensors saw the light at the same time, and when this occurred, the robot had a hard time determining which direction it needed to go.

For the future of the project, we plan to add distance sensors to the robot so that it can track along a wall if it loses the light. This would ensure that, if the robot gets stuck in a corner, it can find its way back into the open. We also plan on using a stronger light source, as well as multiple light sources and a way to dictate which light is on. This would allow us to give the robot new information on the fly: we could place lights at different points of interest inside a space so that the robot is able to travel among those points. Finally, we plan on scaling up this project to carry heavier loads, perhaps even the robot's human owners. Overall, the project was a success, and we have learned a great deal of essential information for future development of the project.
8 Appendix

8.1 Arduino Code

Code for the robot:

#include <Servo.h>

// Assign pin numbers for servos
const int pinServoL = 13;
const int pinServoR = 12;

// Assign pin numbers for the 8 IR sensors
const int pinN = 7;
const int pinNE = 10;
const int pinE = 9;
const int pinSE = 6;
const int pinS = 2;
const int pinSW = 5;
const int pinW = 8;
const int pinNW = 3;

// Constants that govern servo speed
const int stoppedSpeed = 92;
const int FSpeed = 180;
const int RSpeed = 1;

// Default step size for servo speed smoothing
const int defaultStepSize = 10;

// These are variables that govern smooth servo motion.
// Here, we're declaring them and setting initial values.
int speedL = stoppedSpeed;
int speedR = stoppedSpeed;
int goalSpeedL = stoppedSpeed;
int goalSpeedR = stoppedSpeed;
int stepSize = defaultStepSize;

// Declare the two servo motors as Servo objects
Servo servoL;
Servo servoR;

// Convenience functions:
// - park() will stop the robot
// - scanL() will cause the robot to turn in place anticlockwise,
//   and is used to scan for an IR signal
// - scanR() behaves similarly to scanL, but has the robot spinning
//   clockwise
void park() {
  setServos(stoppedSpeed, stoppedSpeed);
}

void scanL() {
  setServos(RSpeed, FSpeed);
}
void scanR() {
  setServos(FSpeed, RSpeed);
}

// All calls to change servo speed are routed through this function,
// providing a convenient point to modify servo motion.
void setServos(int l, int r) {
  goalSpeedL = l;
  goalSpeedR = r;
}

void updateServos() {
  // Linearly interpolates between actual and goal speed value,
  // leading to smoother movement. Set stepSize to 180 to disable.
  int diffL = goalSpeedL - speedL;
  int diffR = goalSpeedR - speedR;
  if (diffL >= 0)
    speedL += min(diffL, stepSize);
  else
    speedL -= min(-diffL, stepSize);
  if (diffR >= 0)
    speedR += min(diffR, stepSize);
  else
    speedR -= min(-diffR, stepSize);

  servoL.write(min(max(abs(180 - speedL), 1), 180));
  servoR.write(min(max(speedR, 1), 180));

  // Below is code to print diagnostic information through serial. It can
  // be commented out without negatively affecting the robot's performance.
  Serial.print("updated servos: L: ");
  Serial.print(goalSpeedL);
  Serial.print(" R: ");
  Serial.println(goalSpeedR);
}

void setup() {
  // Initialize serial (used only for diagnostic information)
  Serial.begin(9600);

  // Tell Servo.h which pins the servos are connected to
  servoL.attach(pinServoL);
  servoR.attach(pinServoR);

  // Make double-sure the servos start parked.
  setServos(stoppedSpeed, stoppedSpeed);
}

int N = 0;
int NE = 0;
int NW = 0;
int Se = 0;
int SW = 0;
int E = 0;
int W = 0;
int S = 0;
int xcoord = 0;
int ycoord = 0;

void loop() {
  // Read (and invert) IR sensor values (the sensor outputs 0
  // upon detecting the remote)
  N = abs(digitalRead(pinN) - 1);
  NE = abs(digitalRead(pinNE) - 1);
  E = abs(digitalRead(pinE) - 1);
  NW = abs(digitalRead(pinNW) - 1);
  W = abs(digitalRead(pinW) - 1);
  Se = abs(digitalRead(pinSE) - 1);
  S = abs(digitalRead(pinS) - 1);
  SW = abs(digitalRead(pinSW) - 1);

  // Compute X and Y components of a vector representing the
  // directional information we have this iteration.
  // 10 and 7 correspond to 1 and 1/sqrt(2), respectively.
  xcoord = 10 * (E - W) + 7 * (NE + Se - NW - SW);
  ycoord = 10 * (N - S) + 7 * (NE + NW - Se - SW);

  // More diagnostic code
  Serial.print(N); Serial.print(NE); Serial.print(E); Serial.print(Se);
  Serial.print(S); Serial.print(SW); Serial.print(W); Serial.println(NW);
  Serial.print(xcoord); Serial.print(", "); Serial.println(ycoord);

  // If no signal is detected, keep scanning.
  if (N + NE + E + Se + S + SW + W + NW == 0) {
    scanL();
  } else {
    // Do spider loops if the signal is unclear
    if (xcoord == 0 && ycoord == 0) {
      setServos(FSpeed, 110);
    } else if (ycoord <= 0) {
      // If the signal is coming from behind the robot, turn in place
      // in the appropriate direction so that it faces the remote
      if (xcoord < 0) {
        scanL();
      } else {
        scanR();
      }
    } else {
      // If the signal is coming from in front of the robot, move forward
      if (xcoord == 0) {
        setServos(FSpeed, FSpeed);
      // If necessary, turn slightly while moving to correct direction.
      // turnFactor is determined via a modified version of theta = atan(y/x);
      // the constants 50 and 60 were determined empirically based on the
      // characteristics of the S4303R servomotors we used.
      } else if (xcoord > 0) {
        int turnFactor = int(abs(60 * atan(float(xcoord) / (float(ycoord) + .01))));
        setServos(FSpeed, max(FSpeed - 50 - turnFactor, 10));
      } else {
        int turnFactor = int(abs(60 * atan(float(xcoord) / (float(ycoord) + .01))));
        setServos(max(FSpeed - 50 - turnFactor, 10), FSpeed);
      }
    }
  }

  // Step the servo speeds closer to their goal speeds
  updateServos();

  // Wait a while.
  delay(100);
}

Code for the IR transmitter remote:

#include <TimerOne.h>

void setup() {
  // Ready pin 9 for output.
  pinMode(9, OUTPUT);

  // Initialize Timer1 and set a 26.32 us period.
  // This translates to a frequency of around 38 kHz.
  Timer1.initialize(26.32);

  // Set up PWM on pin 9 with a 50% duty cycle.
  Timer1.pwm(9, 512);
}

void loop() {
  // Nothing needs to happen here.
  delay(100);
}
8.2 Component Pictures

Figure 2: IR Transmitter
Figure 3: Robot, top view
Figure 4: Robot, battery compartment
Figure 5: Robot, angle view
Figure 6: Robot, profile view
Figure 7: Robot, rear view
Figure 8: Breadboard and Arduino
Figure 9: Robot, sensor tower close-up