📡 Shannon-Fano Coding Quiz
Undergraduate Communication Engineering Assessment
📋 12 Questions
⏱️ Estimated Time: 25-30 minutes
🎯 Passing Score: 70%
📚 Information Theory & Coding
1. Who developed the Shannon-Fano coding algorithm?
a) Claude Shannon alone
b) Claude Shannon and Robert Fano independently
c) David Huffman
d) Alan Turing and Claude Shannon
2. In what year was the Shannon-Fano coding technique first published?
a) 1945
b) 1946
c) 1948-1949 (Shannon's paper 1948, Fano's report 1949)
d) 1952
3. What is the first step in the Shannon-Fano coding algorithm?
a) Assign binary digits 0 and 1 to symbols
b) Sort symbols in decreasing order of probability
c) Divide the list into two equal parts by count
d) Calculate the entropy of the source
4. When splitting the sorted symbol list in Shannon-Fano coding, what criterion is used for the division?
a) The total probability of both parts should be as close to each other as possible
b) Split exactly in the middle by number of symbols
c) Group symbols with the same probability together
d) Random division to ensure code uniqueness
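As a study aid, the sort-and-split procedure covered by questions 3 and 4 can be sketched in a few lines of Python (an illustrative implementation, not part of the quiz):

```python
def shannon_fano(probs):
    """probs: dict mapping symbol -> probability. Returns symbol -> code string."""
    # Step 1: sort symbols in decreasing order of probability.
    symbols = sorted(probs, key=probs.get, reverse=True)
    codes = {s: "" for s in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(probs[s] for s in group)
        # Step 2: find the split point where the two parts' total
        # probabilities are as close to each other as possible.
        best_i, best_diff, running = 1, float("inf"), 0.0
        for i in range(1, len(group)):
            running += probs[group[i - 1]]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_diff, best_i = diff, i
        left, right = group[:best_i], group[best_i:]
        # Step 3: append 0 to the upper part, 1 to the lower part, and recurse.
        for s in left:
            codes[s] += "0"
        for s in right:
            codes[s] += "1"
        split(left)
        split(right)

    split(symbols)
    return codes

print(shannon_fano({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}))
# → {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```

Note that this source reproduces exactly the codewords used in question 10.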
5. Given symbols with probabilities P(A)=0.4, P(B)=0.3, P(C)=0.2, P(D)=0.1, what is the first split in Shannon-Fano coding?
a) {A} and {B, C, D}
b) {A, B, C} and {D}
c) {A, D} and {B, C}
d) {A, B} and {C, D}
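A quick way to verify the split in question 5 is to compare the totals of every possible prefix split of the sorted list (Python sketch):

```python
# Probabilities assumed already sorted in decreasing order: A, B, C, D.
p = [0.4, 0.3, 0.2, 0.1]
for i in range(1, len(p)):
    left, right = sum(p[:i]), sum(p[i:])
    print(f"split after {i} symbol(s): {left:.1f} vs {right:.1f}, diff {abs(left - right):.1f}")
```

The smallest difference (0.4 vs 0.6, diff 0.2) occurs when splitting after the first symbol, i.e. {A} against {B, C, D}.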
6. What is the primary difference between Shannon-Fano and Huffman coding?
a) Shannon-Fano uses fixed-length codes while Huffman uses variable-length
b) Shannon-Fano is lossy while Huffman is lossless
c) Shannon-Fano builds the tree top-down while Huffman builds bottom-up
d) Shannon-Fano requires sorted input while Huffman does not
7. For a source with entropy H(S) = 2.19 bits/symbol, what is the guaranteed bound for the average code word length L(C) of a Shannon-Fano code?
a) L(C) = H(S) exactly
b) H(S) ≤ L(C) < H(S) + 1
c) L(C) ≤ H(S)
d) L(C) = H(S) + 2
8. Given symbol probabilities P(X1)=0.25, P(X2)=0.25, P(X3)=0.2, P(X4)=0.15, P(X5)=0.15, what is the code length for X1 using Shannon's method (li = ⌈-log2(pi)⌉)?
a) 2 bits
b) 3 bits
c) 1 bit
d) 4 bits
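Question 8's formula li = ⌈-log2(pi)⌉ can be checked directly on the given probabilities (Python):

```python
import math

# Shannon's code-length assignment: l_i = ceil(-log2(p_i)).
for p in (0.25, 0.25, 0.2, 0.15, 0.15):
    print(p, math.ceil(-math.log2(p)))
# For p = 0.25, -log2(0.25) = 2 exactly, so l = 2 bits.
```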
9. Why is Shannon-Fano coding considered a "prefix code"?
a) Because all codes start with the same prefix
b) Because the code length is prefixed to each codeword
c) Because it uses prefix notation for binary representation
d) Because no codeword is a prefix of any other codeword, ensuring instantaneous decoding
10. For a source with symbols {A, B, C, D} with probabilities {0.4, 0.3, 0.2, 0.1}, what is the average code length if the Shannon-Fano codes are A=0, B=10, C=110, D=111?
a) 1.8 bits/symbol
b) 1.9 bits/symbol
c) 2.0 bits/symbol
d) 2.1 bits/symbol
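The arithmetic behind question 10 is the probability-weighted sum of codeword lengths (quick Python check):

```python
# L(C) = sum of p_i * len(code_i) for the codes given in the question.
probs = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}
codes = {"A": "0", "B": "10", "C": "110", "D": "111"}
L = sum(probs[s] * len(codes[s]) for s in probs)
print(f"{L:.1f} bits/symbol")  # 0.4*1 + 0.3*2 + 0.2*3 + 0.1*3 = 1.9
```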
11. What is the code efficiency (η) formula for Shannon-Fano coding?
a) η = L(C) / H(S)
b) η = 1 - H(S)/L(C)
c) η = H(S) / L(C)
d) η = H(S) × L(C)
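Applying question 11's efficiency formula to the question-10 source gives a concrete value (illustrative check; the entropy works out to H(S) ≈ 1.846 bits/symbol):

```python
import math

p = [0.4, 0.3, 0.2, 0.1]
lengths = [1, 2, 3, 3]  # lengths of the codes 0, 10, 110, 111
H = -sum(pi * math.log2(pi) for pi in p)          # source entropy
L = sum(pi * li for pi, li in zip(p, lengths))    # average code length
print(f"H = {H:.3f} bits, L = {L:.1f} bits, η = {H / L:.1%}")
```

Since L(C) ≥ H(S), the efficiency η = H(S)/L(C) is at most 1 (here about 97.2%).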
12. In which scenario does Shannon-Fano coding achieve optimal (minimum) average code length?
a) When symbol probabilities are dyadic (powers of 1/2)
b) When all symbols have equal probability
c) When there are only two symbols
d) Shannon-Fano always achieves optimal code length
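Question 12's dyadic case can be verified numerically. Using an assumed dyadic source {0.5, 0.25, 0.125, 0.125} (chosen for illustration; every probability is a power of 1/2), the split totals can always be made exactly equal, each length hits -log2(pi), and L(C) = H(S):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]
lengths = [int(-math.log2(pi)) for pi in p]       # 1, 2, 3, 3 — all exact
H = -sum(pi * math.log2(pi) for pi in p)          # source entropy
L = sum(pi * li for pi, li in zip(p, lengths))    # average code length
print(H, L)  # → 1.75 1.75, so efficiency is exactly 100%
```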