2. Definition of Algorithms
Definition: An algorithm is a finite set of unambiguous instructions that, if followed, accomplish a particular task.
• An algorithm should have a finite number of steps
• It should accept input and produce the desired output
• The algorithm should terminate
3. An algorithm should satisfy the following
criteria
• Input: the algorithm should have zero or more inputs
• Output: the algorithm should produce at least one output, the desired result
• Definiteness: each instruction should be clear and unambiguous
• Finiteness: the algorithm must terminate after a finite number of steps
• Effectiveness: the instructions should be simple and should transform the given input into the desired output
• A program is an expression of an algorithm in a programming language
4. Algorithm Specification
• We can use natural languages like English
• A graphic representation is called a flowchart
• Another representation is pseudocode, which resembles C or Pascal
5. 1. Comments begin with //
2. Blocks are indicated with matching braces { }
3. An identifier begins with a letter
4. Simple data types such as integer, char, float and Boolean (true or false) can be used
5. Values are assigned to variables using the assignment statement
<variable> := <expression>
6. The logical operators (and, or, not) and the relational operators <, ≤, =, ≠, >, ≥ are provided
6. 7. The following looping statements are employed: for, while, repeat-until
8. The following conditional statements can be used:
if <condition> then <statement>
if <condition> then <statement 1> else <statement 2>
9. Array indices start at zero; array elements are accessed using [], e.g. A[i]
10. Input and output are done using the instructions read and write
11. There is only one type of procedure: Algorithm. The heading takes the form
Algorithm Name(<parameter list>)
7. Loop contd..
For loop
for variable := value1 to value2 [step stepval] do
{
<statement 1>
...........
<statement n>
}
12. Example
//Purpose: To find the maximum of the array elements
//Input: An array a of integers of size n
//Output: The maximum element present in the array
Algorithm max(a, n)
{
res := a[0];
for i := 1 to n-1 do
{
if (a[i] > res) then
res := a[i];
}
return res;
}
Note that the for loop advances i automatically, so no explicit i := i + 1 is needed.
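The pseudocode above translates almost directly into C. The following is a minimal sketch (array_max is a hypothetical name), assuming the array holds at least one element:

```c
#include <stddef.h>

/* Sketch in C of the max pseudocode above. Assumes n >= 1. */
int array_max(const int a[], size_t n)
{
    int res = a[0];                 /* res := a[0] */
    for (size_t i = 1; i < n; i++) {
        if (a[i] > res)             /* compare current element with res */
            res = a[i];             /* keep the larger value */
    }
    return res;
}
```

The only structural difference from the pseudocode is that C's for loop bundles the initialization, test and increment into the header.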
13. //Purpose: To compute the GCD of two numbers using consecutive integer checking
//Input: Two positive integers m and n
//Output: GCD of m and n
Algorithm GCD(m, n)
{
t := min(m, n);
while (t ≥ 1) do
{
if (m mod t = 0) then
{
if (n mod t = 0) then
return t;
}
t := t - 1;
}
}
The candidate t starts at min(m, n) and is decremented until it divides both m and n; the first such t is the GCD.
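A minimal C sketch of consecutive integer checking (gcd_cic is a hypothetical name), assuming both inputs are at least 1 so the loop always finds a divisor:

```c
/* Consecutive integer checking: decrement t from min(m, n) until
 * t divides both m and n. Assumes m >= 1 and n >= 1. */
unsigned gcd_cic(unsigned m, unsigned n)
{
    unsigned t = (m < n) ? m : n;   /* t := min(m, n) */
    while (t >= 1) {
        if (m % t == 0 && n % t == 0)
            return t;               /* first common divisor found */
        t = t - 1;
    }
    return 1;                       /* not reached for valid inputs */
}
```

Since t = 1 divides every integer, the loop is guaranteed to terminate for positive inputs.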
14. How to Analyze Algorithms
• Analysis of algorithms, or performance analysis, refers to the task of determining how much computing time and storage an algorithm requires.
• It involves two phases: 1. A priori analysis 2. A posteriori analysis
• A priori analysis: the algorithm's computing time is obtained in terms of its order of magnitude, before execution.
• A posteriori analysis: statistics about the algorithm's space and time are collected during execution.
• This is a challenging area which sometimes requires great mathematical skill.
15. Testing
Testing consists of two phases:
• Debugging: the process of executing the program on sample data sets and verifying that correct results are obtained.
• Profiling (performance measurement): the process of executing a correct program on data sets and measuring the time and space it takes to compute the results.
16. Analysis of algorithms
The efficiency of an algorithm depends on
1. Space efficiency 2. Time efficiency
Space efficiency: the amount of memory required to run the program completely and efficiently.
Space complexity depends on:
• Program space: space required to store the compiled program.
• Data space: space required to store constants and variables.
• Stack space: space required to store return addresses, passed parameters, etc.
17. Time complexity
• The time T(P) taken by a program P is the sum of its compile time and its run time.
• We may assume that a compiled program will run several times without recompilation.
• Consequently we consider only the run time of the program, denoted t_P.
18. Contd..
Time efficiency: a measure of how fast the algorithm executes.
Time complexity depends on:
• Speed of the computer
• Choice of programming language
• Compiler used
• Choice of algorithm
• Number of inputs/outputs
• Operation count
• Step count
Time efficiency mainly depends on the size of the input n, and is hence expressed in terms of n.
19. • Basic operation: the operation that contributes most to the running time of the algorithm, i.e. its most time-consuming operation.
• Time efficiency depends on the number of times the basic operation is executed:
T(n) ≈ b · C(n)
• T(n) is the running time of the algorithm
• n is the size of the input
• b is the execution time of the basic operation
• C(n) is the number of times the basic operation is executed
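As an illustration, the max algorithm from slide 12 can be instrumented to count its basic operation, the comparison a[i] > res. A sketch (count_basic_ops is a hypothetical name):

```c
#include <stddef.h>

/* Count executions of the basic operation (the comparison a[i] > res)
 * in the max algorithm. The loop runs for i = 1 .. n-1, so the
 * comparison executes C(n) = n - 1 times and T(n) ~= b * (n - 1). */
size_t count_basic_ops(const int a[], size_t n)
{
    int res = a[0];
    size_t count = 0;
    for (size_t i = 1; i < n; i++) {
        count++;                    /* one basic operation per iteration */
        if (a[i] > res)
            res = a[i];
    }
    return count;                   /* equals n - 1 */
}
```

The count depends only on n, not on the values in the array, which is why C(n) can be written purely as a function of the input size here.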
20. Order of growth
The efficiency of an algorithm can be analyzed by considering the highest-order term in n. As n increases, the faster-growing functions dominate:

n    log n   n log n   n^2    n^3     2^n            n!
1    0       0         1      1       2              1
2    1       2         4      8       4              2
4    2       8         16     64      16             24
8    3       24        64     512     256            40320
16   4       64        256    4096    65536          ≈2×10^13
32   5       160       1024   32768   4294967296     very large
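The power-of-two rows in the table can be reproduced with integer arithmetic. A sketch using a simple integer base-2 logarithm (lg is a hypothetical helper, valid for powers of two):

```c
/* Integer log2 for powers of two: count how many halvings reach 1.
 * For n = 16: lg(16) = 4, so n log n = 64 and 2^n = 65536,
 * matching the order-of-growth table above. */
unsigned long long lg(unsigned long long n)
{
    unsigned long long r = 0;
    while (n > 1) {                 /* halve until 1, counting halvings */
        n /= 2;
        r++;
    }
    return r;
}
```

Computing a few rows this way makes it obvious that 2^n leaves every polynomial column behind long before n reaches 32.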
21. Asymptotic notations
Asymptotic notations are used to compare the order of growth of algorithms. The three notations used are:
• Big oh (O) notation
• Big omega (Ω) notation
• Big theta (Θ) notation
22. • Big "oh": Big-O, commonly written as O, is an asymptotic notation for the worst case, or ceiling of growth, of a given function.
• It provides an asymptotic upper bound on the growth rate of an algorithm's running time.
• f(n) = O(g(n)) iff there exist positive constants c and n₀ such that f(n) ≤ c · g(n) for all n ≥ n₀.
• Example: 3n + 2 = O(n), since 3n + 2 ≤ 4n for all n ≥ 2; here g(n) = n, c = 4 and n₀ = 2.
Similarly, 3n + 3 ≤ 4n for all n ≥ 3.
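The witness constants from the example can be checked numerically. A sketch (holds_big_oh is a hypothetical helper) that scans a range of n for f(n) = 3n + 2 against c · g(n) = 4n:

```c
/* Check the Big-O witness from the slide: 3n + 2 <= 4n for every
 * n in [n0, limit], with c = 4 and g(n) = n. Returns 1 if the
 * bound holds throughout, 0 at the first failure. */
int holds_big_oh(unsigned long n0, unsigned long limit)
{
    for (unsigned long n = n0; n <= limit; n++) {
        if (3 * n + 2 > 4 * n)
            return 0;               /* the bound fails at this n */
    }
    return 1;                       /* 3n + 2 <= 4n held throughout */
}
```

A scan like this is not a proof, but it is a quick sanity check that the chosen c and n₀ are consistent; at n = 1 the bound fails (5 > 4), which is exactly why n₀ = 2 is needed.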
23. • Big-Omega: commonly written as Ω, is an asymptotic notation for the best case, or a floor on the growth rate, of a given function.
• It provides an asymptotic lower bound on the growth rate of an algorithm's running time.
• f(n) = Ω(g(n)) iff there exist positive constants c and n₀ such that f(n) ≥ c · g(n) for all n ≥ n₀.
• Example: 3n + 3 = Ω(n), since 3n + 3 ≥ 3n for all n ≥ 1.
24. Big Theta: commonly written as Θ, is an asymptotic notation denoting an asymptotically tight bound on the growth rate of an algorithm's running time.
• f(n) = Θ(g(n)) iff there exist positive constants c₁, c₂ and n₀ such that c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀.
• ∴ f(n) = Θ(g(n)) implies f(n) = O(g(n)) as well as f(n) = Ω(g(n)).
• f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) AND f(n) = Ω(g(n)).
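Combining the O and Ω examples from the previous slides gives a Theta bound for f(n) = 3n + 3, which can be sanity-checked the same way (holds_theta is a hypothetical helper):

```c
/* Check the two-sided Theta bound for f(n) = 3n + 3 with c1 = 3,
 * c2 = 4, n0 = 3: verify 3n <= 3n + 3 <= 4n for every n in
 * [n0, limit], showing 3n + 3 = Theta(n). */
int holds_theta(unsigned long n0, unsigned long limit)
{
    for (unsigned long n = n0; n <= limit; n++) {
        unsigned long f = 3 * n + 3;
        if (!(3 * n <= f && f <= 4 * n))
            return 0;               /* one side of the bound fails */
    }
    return 1;
}
```

Note that the lower bound 3n ≤ 3n + 3 holds for all n, but the upper bound 3n + 3 ≤ 4n first holds at n = 3, so n₀ = 3 is the smallest valid threshold for this pair of constants.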