Information Theory and Coding
ECEg- 6028
Objective
• The objective of the course is to expose students to the
principles and practice of information theory, covering
both theoretical and applied issues of recognized
importance in data/information compression,
transmission, storage and processing.
• To gain knowledge of modulation and demodulation
• To understand different types of coding techniques
• To understand how these codes are used in communication
Contents
• Introduction to communication system and information;
measure of information;
• review of discrete sources; source coding;
• discrete memoryless channel (DMC) capacity; introduction to coding for reliable
communication and storage;
• modulation/ demodulation; maximum likelihood decoding;
review of algebra;
• algebraic codes: linear block codes; cyclic codes;
• BCH codes (Bose Chaudhuri Hocquenghem), Burst
codes;
• burst error correcting codes; convolutional codes; structural
properties of convolutional codes;
• Maximum likelihood decoding; Viterbi algorithm,
performance bounds for convolutional codes, burst error
correcting convolutional codes.
Reference Materials
• Text :
• Elements of Information Theory, 2nd Edition, T. M.
Cover and J. A. Thomas, Wiley Inter-science, 2006.
• References:
1. Information Theory and Reliable Communication,
Robert G. Gallager, Wiley Text Books, 1968.
2. Relevant papers in the development of information
theory.
Content
• Introduction
• Measure of Information
• Discrete Memoryless Channels
• Source Coding
Introduction
• Information theory provides a quantitative measure of the
information contained in message signals.
• It allows us to determine the capacity of a communication
system to transfer this information from source to destination.
Measure of Information
A. Information Source
• An information source is an object that produces events whose
outcomes are selected at random according to a probability
distribution.
• A practical source of information in a communication system is a
device that produces messages; these can be either analog or discrete.
• The set of source symbols is called the source alphabet and the
elements of the set are called symbols or letters.
• Information sources can be classified as either having memory or
being memoryless.
• A source with memory is one for which the current symbol depends on
the previous symbols.
• A memoryless source is one for which each symbol produced is
independent of the previous symbols.
B. Information content of a Discrete Memoryless System
• The amount of information contained in an event is closely
related to its uncertainty.
• Messages with a high probability of occurrence convey relatively
little information.
• If a message is certain (its probability of occurrence is 1), it
conveys zero information.
• A mathematical measure of information should be a function of the
probability of the outcome and should satisfy the following
axioms:
– Information should be proportional to the uncertainty of
an outcome.
– Information contained in independent outcomes should
add.
Information Content of a Symbol
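• In standard form, the information content (self-information) of a
symbol xi that occurs with probability P(xi) is
I(xi) = log2 [1 / P(xi)] = − log2 P(xi)   bits
• Note that I(xi) = 0 when P(xi) = 1 and I(xi) > 0 when 0 < P(xi) < 1,
so the less probable a symbol is, the more information its
occurrence conveys.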
Average Information or Entropy
• In a practical communication system, we usually transmit long
sequences of symbols from an information source.
• Thus, we are more interested in the average information that a
source produces than the information content of a single symbol.
• The mean value of I(xi) over the alphabet of source X with ‘m’
different symbols is given by
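H(X) = E[I(xi)] = − Σ (i = 1 to m) P(xi) log2 P(xi)   b/symbol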
• The quantity H(X) is called the Entropy of source X. It is a
measure of the average information content per source symbol.
• The source entropy H(X) can be considered as the average
amount of uncertainty within source X that is resolved by use
of the alphabet.
• The source entropy H(X) satisfies the following relation:
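0 ≤ H(X) ≤ log2 m
where m is the size of the alphabet of X. The lower bound holds when
one symbol has probability 1, and the upper bound holds when all
symbols are equally likely.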
Information Rate
• If the time rate at which source X emits symbols is ‘r’
(symbols/s), the information rate R of the source is given by
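R = r H(X)   b/s
As a quick illustration, the short Python sketch below computes H(X)
and R for a small source; the probabilities and symbol rate are
made-up example values, not taken from the slides.

```python
import math

def entropy(probs):
    """Entropy H(X) in bits/symbol for a list of symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical source distribution
r = 1000                            # hypothetical symbol rate, symbols/s

H = entropy(probs)                  # 1.75 b/symbol for this distribution
R = r * H                           # information rate in b/s
print(f"H(X) = {H} b/symbol, R = {R} b/s")
```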
Discrete Memoryless Channels
Channel Representation:
• A Communication channel is the path or medium through which the
symbols flow to the receiver.
• A discrete memoryless channel (DMC) is a statistical model with an input
X and an output Y.
• During each unit of time (signalling interval), the channel accepts an
input symbol from X, and in response it generates an output symbol from
Y.
• The channel is "discrete" when the alphabets of X and Y are both
finite. It is "memoryless" when the current output depends on only
the current input and not on any of the previous inputs.
• A diagram of a DMC with ‘m’ inputs and ‘n’ outputs is illustrated in
Figure below.
• The input X consists of input symbols x1, x2, ... , xm. The a priori
probabilities of these source symbols P(xi) are assumed to be known.
• The output Y consists of output symbols y1, y2, ... , yn.
• Each possible input-to-output path is indicated along with a conditional
probability P(yj/xi), where P(yj/xi) is the conditional probability of
obtaining output yj given that the input is xi, and is called a channel
transition probability.
Channel Matrix:
• A channel is completely specified by the complete set of transition
probabilities.
• Accordingly, the channel of the figure above is often specified by the matrix
of transition probabilities [P(Y/X)], given by
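In standard form, with row i corresponding to input xi and column j to output yj,

[P(Y/X)] =
  | P(y1/x1)  P(y2/x1)  ...  P(yn/x1) |
  | P(y1/x2)  P(y2/x2)  ...  P(yn/x2) |
  |    ...       ...             ...  |
  | P(y1/xm)  P(y2/xm)  ...  P(yn/xm) |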
• The matrix [P(Y/X)] is called the channel matrix.
• Since each input to the channel results in some output, each row
of the channel matrix must sum to unity, that is,
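Σ (j = 1 to n) P(yj/xi) = 1   for i = 1, 2, ... , m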
Special Channels:
1. Lossless Channel:
– A channel described by a channel matrix with only one
nonzero element in each column is called a lossless
channel.
– An example of a lossless channel is shown in Figure below,
and the corresponding channel matrix is shown in equation
below.
– It can be shown that in the lossless channel no source
information is lost in transmission.
2. Deterministic Channel:
• A channel described by a channel matrix with only one nonzero
element in each row is called a deterministic channel.
• An example of a deterministic channel and the corresponding
channel matrix is shown below.
• Note that since each row has only one nonzero element, this
element must be unity.
• Thus, when a given source symbol is sent in the deterministic
channel, it is clear which output symbol will be received.
3. Noiseless Channel:
• A channel is called noiseless if it is both lossless and
deterministic.
• A noiseless channel is shown in the figure below. The channel
matrix has only one element in each row and in each column,
and this element is unity.
• Note that the input and output alphabets are of the same size;
that is, m = n for the noiseless channel.
4. Binary Symmetric Channel:
• The binary symmetric channel (BSC) is defined by the channel
diagram shown below, and its channel matrix is given by
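In standard form, with crossover probability p,

[P(Y/X)] = | 1 − p    p   |
           |   p    1 − p |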
• The channel has two inputs (x1 = 0, x2 = 1) and two outputs
(y1 = 0, y2 = 1).
• The channel is symmetric because the probability of
receiving a 1 if a 0 is sent is the same as the probability of
receiving a 0 if a 1 is sent. This common transition probability
is denoted by p.
Mutual Information
A. Conditional and Joint Entropies:
• Using the input probabilities P(xi), output probabilities P(yj),
transition probabilities P(yj/xi), and joint probabilities P(xi, yj),
we can define the following entropy functions for a
channel with ‘m’ inputs and ‘n’ outputs:
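In standard form these are:
H(X) = − Σi P(xi) log2 P(xi)
H(Y) = − Σj P(yj) log2 P(yj)
H(X/Y) = − Σj Σi P(xi, yj) log2 P(xi/yj)
H(Y/X) = − Σj Σi P(xi, yj) log2 P(yj/xi)
H(X, Y) = − Σj Σi P(xi, yj) log2 P(xi, yj)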
• These entropies can be interpreted as follows: H(X) is the
average uncertainty of the channel input, and H(Y) is the
average uncertainty of the channel output.
• The conditional entropy H(X/Y) is a measure of the average
uncertainty remaining about the channel input after the channel
output has been observed; it is sometimes called the
equivocation of X with respect to Y.
• The conditional entropy H(Y/X) is the average uncertainty of
the channel output given that X was transmitted. The joint
entropy H(X, Y) is the average uncertainty of the
communication channel as a whole.
• Two useful relationships among these entropies are
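H(X, Y) = H(X/Y) + H(Y)
H(X, Y) = H(Y/X) + H(X)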
B. Mutual Information:
• The mutual information I(X; Y) of a channel is defined by
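I(X; Y) = H(X) − H(X/Y)   b/symbol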
• Since H(X) represents the uncertainty about the channel input
before the channel output is observed and H(X/ Y) represents the
uncertainty about the channel input after the channel output is
observed, the mutual information I(X;Y) represents the
uncertainty about the channel input that is resolved by observing
the channel output.
• Properties of I(X;Y) :
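In standard form:
I(X; Y) = I(Y; X)
I(X; Y) ≥ 0
I(X; Y) = H(Y) − H(Y/X)
I(X; Y) = H(X) + H(Y) − H(X, Y)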
Channel Capacity
A. Channel Capacity per Symbol Cs:
• The channel capacity per symbol of a DMC is defined as
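Cs = max over {P(xi)} of I(X; Y)   b/symbol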
• where the maximization is over all possible input probability distributions
{P(Xi)} on X. Note that the channel capacity Cs is a function of only the
channel transition probabilities that define the channel.
B. Channel Capacity per Second C:
• If r symbols are being transmitted per second, then the maximum rate of
transmission of information per second is rCs. This is the channel capacity
per second and is denoted by C (b/s):
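C = r Cs   b/s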
C. Capacities of Special Channels:
1. Lossless Channel:
• For a lossless channel, H(X/Y) = 0 and
I(X; Y) = H(X)
• Thus, the mutual information (information transfer) is equal to the input
(source) entropy, and no source information is lost in transmission.
• Consequently, the channel capacity per symbol is
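Cs = max over {P(xi)} of H(X) = log2 m
where m is the number of input symbols.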
2. Deterministic Channel:
• For a deterministic channel, H(Y/X) = 0 for all input distributions P(xi), and
I(X; Y) = H(Y)
• Thus, the information transfer is equal to the output entropy. The channel
capacity per symbol is
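Cs = max over {P(xi)} of H(Y) = log2 n
where n is the number of output symbols.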
3. Noiseless Channel:
• Since a noiseless channel is both lossless and deterministic, we have
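I(X; Y) = H(X) = H(Y)   and   Cs = log2 m = log2 n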
4. Binary Symmetric Channel:
• For the BSC, the mutual information is
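I(X; Y) = H(Y) + p log2 p + (1 − p) log2 (1 − p)
and the channel capacity, attained when the two inputs are equally likely, is
Cs = 1 + p log2 p + (1 − p) log2 (1 − p)
As a quick numerical check, the small Python sketch below evaluates this
capacity for a hypothetical crossover probability p:

```python
import math

def bsc_capacity(p):
    """Capacity (b/symbol) of a binary symmetric channel with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.1))   # ~0.531 b/symbol
```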
Additive White Gaussian Noise Channel
• In a continuous channel an information source produces a continuous signal
x(t).
• The set of possible signals is considered as an ensemble of waveforms
generated by some ergodic random process.
A. Differential Entropy:
• The average amount of information per sample value of x(t) is measured by
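H(X) = − ∫ fX(x) log2 fX(x) dx   (integral over all x)
where fX(x) is the probability density function of X.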
• The entropy H(X) defined by equation above is known as the differential
entropy of X.
• The average mutual information in a continuous channel is defined (by
analogy with the discrete case) as
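I(X; Y) = H(X) − H(X/Y) = H(Y) − H(Y/X)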
B. Additive White Gaussian Noise Channel
• In an additive white Gaussian noise (AWGN) channel, the channel output
Y is given by
Y = X + n
• where X is the channel input and n is an additive band-limited white
Gaussian noise with zero mean and variance σ².
• The capacity Cs of an AWGN channel is given by
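Cs = (1/2) log2 (1 + S/N)   b/sample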
• where S/N is the signal-to-noise ratio at the channel output. If the channel
bandwidth B Hz is fixed, then the output y(t) is also a band-limited signal
completely characterized by its periodic sample values taken at the Nyquist
rate 2B samples/s. Then the capacity C (b/s) of the AWGN channel is given
by
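C = 2B · Cs = B log2 (1 + S/N)   b/s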
• The equation above is known as the Shannon-Hartley law.
Source Coding
A conversion of the output of a discrete memoryless source (DMS)
into a sequence of binary symbols (binary code word) is called
source coding. The device that performs this conversion is called
the source encoder (figure below).
• An objective of source coding is to minimize the average bit
rate required for representation of the source by reducing the
redundancy of the information source.
A. Code Length and Code Efficiency:
• Let X be a DMS with finite entropy H(X) and an alphabet {x1, ... , xm}
with corresponding probabilities of occurrence P(xi) (i = 1, ... , m).
• Let the binary code word assigned to symbol xi by the encoder have
length ni, measured in bits. The length of a code word is the number
of binary digits in the code word.
• The average code word length L per source symbol is given by
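L = Σ (i = 1 to m) P(xi) ni   b/symbol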
• The parameter L represents the average number of bits per source
symbol used in the source coding process.
• The code efficiency η is defined as
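η = Lmin / L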
• where Lmin is the minimum possible value of L. When η approaches
unity, the code is said to be efficient.
• The code redundancy γ is defined as
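γ = 1 − η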
B. Source Coding Theorem:
• The source coding theorem states that for a DMS X with entropy
H(X), the average code word length L per symbol is bounded as
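L ≥ H(X)
so the minimum achievable average length is Lmin = H(X), and the code
efficiency can be written as η = H(X) / L.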
C. Classification of Codes:
• Classification of codes is best illustrated by an example.
• Consider Table below where a source of size 4 has been encoded
in binary codes with symbol 0 and 1.
1. Fixed-Length Codes: A fixed-length code is one whose code word
length is fixed. Code 1 and code 2 of Table above are fixed-length
codes with length 2.
2. Variable-Length Codes: A variable-length code is one whose code
word length is not fixed. All codes of Table above except codes 1
and 2 are variable-length codes.
3. Distinct Codes: A code is distinct if each code word is
distinguishable from other code words. All codes of Table above
except code 1 are distinct codes; notice the codes for x1 and x3.
4. Prefix-Free Codes: A code in which no code word can be formed
by adding code symbols to another code word is called a prefix-
free code. Thus, in a prefix-free code no code word is a prefix of
another. Codes 2, 4, and 6 of Table above are prefix-free codes.
5. Uniquely Decodable Codes:
• A distinct code is uniquely decodable if the original source
sequence can be reconstructed perfectly from the encoded binary
sequence.
• Note that code 3 of Table above is not a uniquely decodable code.
• For example, the binary sequence 1001 may correspond to the
source sequences x2x3x2 or x2x1x1x2.
• A sufficient condition to ensure that a code is uniquely decodable
is that no code word is a prefix of another.
• Thus, the prefix-free codes 2, 4, and 6 are uniquely decodable
codes.
• Note that the prefix-free condition is not a necessary condition
for unique decodability.
• For example, code 5 of Table above does not satisfy the prefix-
free condition, and yet it is uniquely decodable since the bit 0
indicates the beginning of each code word of the code.
6. Instantaneous Codes:
• A uniquely decodable code is called an instantaneous code if the end of any
code word is recognizable without examining subsequent code symbols.
• The instantaneous codes have the property previously mentioned that no code
word is a prefix of another code word.
• For this reason, prefix-free codes are sometimes called instantaneous codes.
7. Optimal Codes:
• A code is said to be optimal if it is instantaneous and has minimum average
length L for a given source with a given probability assignment for the source
symbols.
D. Kraft Inequality:
• Let X be a DMS with alphabet {xi} (i = 1, 2, ... , m). Assume that the length of the
assigned binary code word corresponding to xi is ni.
• A necessary and sufficient condition for the existence of an instantaneous binary
code is
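Σ (i = 1 to m) 2^(−ni) ≤ 1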
• which is known as the Kraft inequality. Note that the Kraft inequality assures
us of the existence of an instantaneously decodable code with code word
lengths that satisfy the inequality.
Entropy Coding
• The design of a variable-length code such that its average code
word length approaches the entropy of the DMS is often referred
to as entropy coding.
• In this section we present two examples of
entropy coding.
A. Shannon-Fano Coding:
• An efficient code can be obtained by the following simple
procedure, known as the Shannon-Fano algorithm:
– List the source symbols in order of decreasing probability.
– Partition the set into two sets that are as close to equi-probable
as possible, and assign 0 to the upper set and 1 to the lower set.
– Continue this process, each time partitioning the sets with as
nearly equal probabilities as possible until further partitioning
is not possible.
• An example of Shannon-Fano encoding is shown in Table below.
• Note that in Shannon-Fano encoding, ambiguity may arise in
the choice of approximately equi-probable sets.
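As an illustration of the procedure above, here is a minimal Python
sketch of Shannon-Fano coding; the symbol names and probabilities in
the example call are made-up values, not those of the Table.

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs. Returns {symbol: codeword}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # find the split point that makes the two subsets as close to equi-probable as possible
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_i, best_diff = i, diff
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:          # assign 0 to the upper set
            codes[s] += "0"
        for s, _ in lower:          # assign 1 to the lower set
            codes[s] += "1"
        split(upper)
        split(lower)

    split(symbols)
    return codes

# hypothetical example source
print(shannon_fano([("x1", 0.4), ("x2", 0.3), ("x3", 0.2), ("x4", 0.1)]))
```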
B. Huffman Encoding:
• In general, Huffman encoding results in an optimum code. Thus, it
is the code that has the highest efficiency. The Huffman encoding
procedure is as follows:
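In its standard form, the procedure is:
– List the source symbols in order of decreasing probability.
– Combine the two symbols of lowest probability into a single new
symbol whose probability is the sum of the two, and assign 0 and 1
to the two branches.
– Repeat the previous step, reordering by probability after each
combination, until only one composite symbol (of probability 1)
remains.
– Read off the code word of each original symbol by tracing the
assigned 0s and 1s back from the final composite symbol.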
• An example of Huffman encoding is shown in Table below.
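As an illustration, the following is a minimal Python sketch of Huffman
coding using the standard heap-based construction; the example symbols
and probabilities are made-up values, not those of the Table.

```python
import heapq
import itertools

def huffman(symbols):
    """symbols: list of (symbol, probability) pairs. Returns {symbol: codeword}."""
    counter = itertools.count()          # tie-breaker so the heap never compares dicts
    heap = [(p, next(counter), {s: ""}) for s, p in symbols]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # lowest-probability group
        p2, _, c2 = heapq.heappop(heap)  # second-lowest group
        for s in c1:                     # prepend the branch bit for each group
            c1[s] = "0" + c1[s]
        for s in c2:
            c2[s] = "1" + c2[s]
        heapq.heappush(heap, (p1 + p2, next(counter), {**c1, **c2}))
    return heap[0][2]

# hypothetical example source
print(huffman([("x1", 0.4), ("x2", 0.3), ("x3", 0.2), ("x4", 0.1)]))
```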