
    JNTU ONLINE EXAMINATIONS [Mid 2 - dc]

    1. Two binary random variables X and Y are distributed according to the joint distribution given as P(X=Y=0) = P(X=0, Y=1) = P(X=Y=1) = 1/3. Then, [01D01]  a. H(X) = H(Y)  b. H(X) = 2.H(Y)  c. H(Y) = 2.H(X)  d. H(X) + H(Y) = 1.

    2. An independent discrete source transmits letters from an alphabet consisting of A and B with respective probabilities 0.6 and 0.4. If the consecutive letters are statistically independent, and two-symbol words are transmitted, then the probability of the words with different symbols is [01D02]

    a. 0.52  b. 0.36  c. 0.48  d. 0.24
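
    A quick check of this probability (a sketch added for illustration; the 0.6 is the truncated value inferred so that the two letter probabilities sum to 1):

    # Two-letter words from independent letters with P(A) = 0.6, P(B) = 0.4.
    pA, pB = 0.6, 0.4
    p_diff = pA * pB + pB * pA   # word AB or word BA
    print(round(p_diff, 2))      # 0.48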

    3. A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits required for error free transmission of this source, in bits/symbol, is [01M01]  a. 0.75  b. 0.81  c. 0.65  d. 0.55
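
    The figure is the source entropy (a minimal Python check, assuming the stated probabilities):

    import math

    # Binary memoryless source with P(1) = 0.25, P(0) = 0.75.
    # Entropy = minimum average bits/symbol for error-free encoding.
    p = [0.25, 0.75]
    H = -sum(pi * math.log2(pi) for pi in p)
    print(round(H, 2))   # 0.81 bits/symbol (about 1622.6 bits/sec at 2000 symbols/sec)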

    4. Which of the following channel matrices represent a symmetric channel? [01M02]

    a.

    b.

    c.

    d.

    5. The capacity of the channel with the channel Matrix

    where the xi's are transmitted messages and the yj's are received messages is [01M03]

    a. log3 bits  b. log5 bits  c. log4 bits  d. 1 bit

    6. Information rate of a source is [01S01]  a. the entropy of the source measured in bits/message  b. the entropy of the source measured in bits/sec  c. a measure of the uncertainty of the communication system  d. maximum when the source is continuous

    7. If `a` is an element of a Field `F`, then its additive inverse is [01S02]  a. 0  b. -a  c. a  d. 1

    8. The minimum number of elements that a field can have is [01S03]a. 3b. 2c. 4d. 1

    9. Which of the following is correct? [01S04]  a. The syndrome of a received Block coded word depends on the transmitted code word.  b. The syndrome of a received Block coded word depends on the received code word.  c. The syndrome of a received Block coded word depends on the error pattern.

    d. The syndrome for a received Block coded word under error free reception consists of all 1's.


    0. If the output of a continuous source is limited to an average power of 2, then the Maximum entropy of the source is [01S05]  a.

    b.

    c.

    d.

    1. A Convolutional encoder of code rate 1/2 consists of a two-stage shift register. The Generator sequence of the top adder is (1,1,1) and that of the bottom adder is (1,0,1). The constraint length of the encoder is [02D01]  a. 2  b. 3  c. 4

    d. 5

    2. The Parity Check Matrix of a (6,3) Systematic Linear Block code is

    If the Syndrome vector computed for the received code word is [1 1 0], then for error correction, which of the bits of the received code word is to be complemented? [02D02]  a. 2  b. 3  c. 4  d. 5

    3. The minimum number of bits per message required to encode the output of a source transmitting four different messages with probabilities 0.5, 0.25, 0.125 and 0.125 is [02M01]

    a. 2b. 1c. 1.5d. 1.75
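
    This minimum is the source entropy (a small Python check added for illustration):

    import math

    # Entropy of the four-message source = minimum average bits/message.
    probs = [0.5, 0.25, 0.125, 0.125]
    H = -sum(p * math.log2(p) for p in probs)
    print(H)   # 1.75 bits/message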

    4. A Communication channel is represented by the channel Matrix given as

    In the above matrix, rows correspond to the Transmitter X and the columns correspond to the Receiver Y. Then, the Conditional entropy H(Y/X) in bits/message is [02M02]  a. zero

    b. log 5c. log 3d. 3

    5. The Channel Matrix of a Noiseless channel [02M03]a. consists of a single nonzero number in each columnb. consists of a single nonzero number in each rowc. is an Identity Matrixd. is a square Matrix

    6. Entropy of a source is [02S01]  a. Average amount of information conveyed by the communication system  b. Average amount of information transferred by the channel  c. Average amount of information available with the source  d. Average amount of information conveyed by the source to the receiver

    7. Relative to Hard decision decoding, soft decision decoding results in [02S02]a. better bit error probabilityb. better coding gainc. less circuit complexityd. lesser coding gain

    8. Which of the following is the essential requirement of a source coding scheme? [02S03]  a. Comma free nature of the code words  b. A Minimum Hamming distance of 3  c. Error detection and correction capability  d. The received code word should be compatible with a Matched filter.

    9. The transition probabilities for a BSC will be represented using [02S04]a. Joint Probability Matrixb. State diagramc. Conditional Probability Matrix

    d. Trellis diagram


    0. A Field is [02S05]a. an Abelian group under additionb. a group with 0 as the multiplicative identity for its membersc. a group with 1 as the additive identity for its membersd. a group with 0 as the additive inverse for its members

    1. The constraint length of a convolutional encoder of code rate 1/3 is 5. If the input of the encoder is a 5 bit message sequence, the length of the output code word in bits is [03D01]  a. 33  b. 27  c. 24  d. 30

    2. A communication channel is represented by its channel matrix, with rows representing the messages associated with the source and the columns representing the messages associated with the receiver, given as

    Its capacity in bits is [03D02]a. log 4b. log 3c. log 12d. log 7

    3. A Binary Erasure channel has P(0/0) = P(1/1) = p; P(k/0) = P(k/1) = q. Its Capacity in bits/symbol is [03M01]a. pb. qc. pqd. p/q

    4. When a pair of dice is thrown, the average amount of information contained in the message "The sum of the faces is …", in bits, is [03M02]  a. 0.75  b. 0.86  c. 0.96  d. 0.68

    5. A source emits messages A and B with probability 0.8 and 0.2 respectively. The redundancy provided by the optimum source coding scheme for the above Source is [03M03]  a. 72 %  b. 27 %  c. 45 %

    d. 55 %

    6. Information content of a message [03S01]a. increases with its certainty of occurrenceb. independent of the certainty of occurrencec. increases with its uncertainty of occurrenced. is the logarithm of its certainty of occurrence

    7. Under error free reception, the syndrome vector computed for the received cyclic code word consists of [03S02]  a. all ones  b. alternate 1's and 0's starting with a 1  c. alternate 0's and 1's starting with a 0  d. all zeros

    8. A continuous source will have maximum entropy associated if the pdf associated with its output is [03S03]a. Poisson

    b. Exponentialc. Rayleighd. Gaussian

    9. Variable length source coding provides better coding efficiency, if all the messages of the source are [03S04]a. Equiprobableb. with different transmission probabilityc. discretely transmittedd. continuously transmitted

    0. Shannon's Limit deals with [03S05]  a. maximum information content of a message  b. maximum entropy associated with a source  c. maximum capacity of a channel  d. maximum bit rate of a source


    1. The Parity Check Matrix of a (6,3) Systematic Linear Block code is

    If the Syndrome vector computed for the received code word is [0 1 1], then for error correction, which of the bits of the received code word is to be complemented? [04D01]  a. 2  b. 3  c. 4  d. 5

    2. A (7,4) Cyclic code has a generator polynomial given as 1 + x + x^3. If the error pattern is 0001000, the corresponding syndrome vector is [04D02]  a. 001  b. 010  c. 100  d. 110
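
    The syndrome is the remainder of the error polynomial divided by g(x) (a GF(2) sketch added for illustration; bits are written low-order coefficient first, so the result 1 + x reads as 110):

    # Remainder of e(x) = x^3 modulo g(x) = 1 + x + x^3 over GF(2).
    # Polynomials are stored as integers, bit i = coefficient of x^i.
    def gf2_mod(a, b):
        while a and a.bit_length() >= b.bit_length():
            a ^= b << (a.bit_length() - b.bit_length())
        return a

    e = 0b1000      # x^3
    g = 0b1011      # 1 + x + x^3
    print(bin(gf2_mod(e, g)))   # 0b11, i.e. 1 + x  ->  syndrome 110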

    3. The Memory length of a convolutional encoder is 3. If a 5 bit message sequence is applied as the input for the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [04M01]  a. 4  b. 3  c. 5  d. 6

    4. The code words of a systematic (6,3) Linear Block code are 001110, 010011, 011101, 100101, 101011, 110110, 111000. Which of the following is also a code word of the Code? [04M02]  a. 000111  b. 000101

    c. 000000d. 101101

    5. The syndrome S(x) of a cyclic code is given by the Remainder of the division [V(x) + E(x)] / g(x), where V(x) is the transmitted code polynomial, E(x) is the error polynomial and g(x) is the generator polynomial. The S(x) is also equal to [04M03]  a. Remainder of V(x)/g(x)  b. Remainder of E(x)/g(x)  c. Remainder of [V(x).E(x)]/g(x)  d. Remainder of g(x)/V(x)

    6. Source 1 is transmitting two messages with probabilities 0.2 and 0.8 and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then [04S01]  a. Maximum uncertainty is associated with Source 1  b. Maximum uncertainty is associated with Source 2

    c. Both the sources 1 and 2 are having maximum amount of uncertainty associatedd. There is no uncertainty associated with either of the two sources .

    7. A source X and the receiver Y are connected by a noise free channel. Its capacity is [04S02]a. Max H(X)b. Max H(X/Y)c. Max H(Y/X)d. Max H(X,Y)

    8. The entropy measure of a continuous source is a [04S03]a. Relative measureb. Absolute measurec. Linear Measured. Non-Linear Measure

    9. Which of the following is correct? [04S04]

    a. FEC is used for error control after receiver makes a decision about the received bitb. ARQ is used for error control after receiver makes a decision about the received bitc. FEC is used for error control when the receiver is unable to make a decision about the received bitd. FEC and ARQ are not used for error correction

    0. Error free communication may be possible by [04S05]a. reducing redundancy during transmissionb. increasing transmission power to the required levelc. providing redundancy during transmissiond. increasing the channel band width

    1. For the data word 1010 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x^3, the code polynomial is [05D01]

    a. 1 + x + x^3 + x^5

    b. 1 + x + x^3 + x^4

    c. 1 + x^2 + x^3 + x^4

    d. 1 + x + x^2 + x^5
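
    Non-systematic cyclic encoding is the GF(2) product c(x) = d(x)·g(x) (a sketch added for illustration; it assumes the data word 1010 is read with its leftmost bit as the constant term, i.e. d(x) = 1 + x^2):

    # c(x) = d(x) * g(x) over GF(2); bit i of an integer = coefficient of x^i.
    def gf2_mul(a, b):
        r = 0
        while b:
            if b & 1:
                r ^= a
            a <<= 1
            b >>= 1
        return r

    d = 0b0101      # 1 + x^2   (data word 1010, leftmost bit = x^0)
    g = 0b1011      # 1 + x + x^3
    print(bin(gf2_mul(d, g)))   # 0b100111, i.e. 1 + x + x^2 + x^5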


    2. The output of a continuous source is a uniform random variable in the range … . The entropy of the source in

    bits/sample is [05D02]  a. 4  b. 2  c. 8  d. 1

    3. For the source X transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8, Maximum coding efficiency can be obtained by using [05M01]  a. Convolutional codes  b. Only Shannon-Fano method  c. Either of the Shannon-Fano and Huffman methods  d. Block codes

    4. In Modulo-7 addition, 6+1 is equal to [05M02]a. 7b. 5c. 4d. 0

    5. A source is transmitting two messages A and B with probabilities 3/4 and 1/4 respectively. The coding efficiency of the first order extension of the source is [05M03]  a. 89 %  b. 77 %  c. 92 %  d. 81 %

    6. The noise characteristic of a communication channel is given as

    Rows represent the source and columns represent the receiver. The Channel is a [05M04]  a. Noise free channel  b. Asymmetric channel  c. Symmetric channel  d. Deterministic channel

    7. The source coding efficiency can be increased by [05S01]a. using source extensionb. increasing the entropy of the sourcec. decreasing the entropy of the sourced. using binary coding

    8. The capacity of a channel with infinite band width is [05S02]a. infinite because of infinite band widthb. finite because of increase in noise powerc. infinite because of infinite noise powerd. finite because of finite message word length

    9. The Hamming Weight of the (6,3) Linear Block coded word 101011 is [05S03]  a. 3  b. 4  c. 5  d. 2

    0. The cascade of two Binary Symmetric Channels is a [05S04]a. symmetric Binary channelb. asymmetric quaternary channelc. symmetric quaternary channeld. asymmetric Binary channel

    1. In a (6,3) systematic Linear Block code, the number of `6` bit code words that are not useful is [06D01]a. 56b. 64c. 8d. 45

    2. The Parity check Matrix H of a (6,3) Linear systematic Block code is          Then [06D02]

    a. C.H^T = [1], where C is the code word of the code.

    b. if the syndrome vector S computed for the received code word is [1 1 0], the third bit of the received code word is in error.

    c. The syndrome vector S of the received code word is same as C.H^T

    d. The syndrome vector is S = [1 1 1] under error free reception

    3. In a Binary Symmetric channel, a transmitted 0 is received as 0 with a probability of 1/8. Then, the transition probability of the transmitted 0 is [06M01]


    a. 1/8b. 7/8c. 6/8d. 5/8

    4. A source transmitting `n` number of messages is connected to a noise free channel. The capacity of the channel is[06M02]a. n bits/symbol

    b. n^2 bits/symbol  c. log n bits/symbol  d. 2n bits/symbol

    5. There are four binary words given as 0000, 0001, 0011, 0111. Which of these can not be a member of the parity check matrix of a (15,11) linear Block code? [06M03]

    a. 0000,0001b. 0000c. 0011d. 0111

    6. If X is the transmitter and Y is the receiver and if the channel is the noise free, then, the mutual information I(X,Y)equal to [06S01]a. Joint entropy of the source and receiverb. Entropy of the sourcec. Conditional Entropy of the receiver, given the sourced. Conditional Entropy of the source, given the receiver

    7. Which of the following is correct? [06S02]a. Source coding introduces redundancyb. Channel coding is an efficient way of representing the output of a sourcec. ARQ scheme of error control is applied after the receiver makes a decision about the received bit

    d. ARQ scheme of error control is applied when the receiver is unable to make a decision about the received bit.

    8. Which of the following is an FEC scheme? [06S03]a. Shanon-Fano encodingb. Huffman encodingc. Non-systematic cyclic codesd. Duo-binary encoding

    9. A discrete source X is transmitting m messages and is connected to the receiver Y through a symmetric channel. The capacity of the channel is given as [06S04]  a. log m - H(X/Y) bits/symbol  b. log m bits/symbol  c. log m - H(Y/X) bits/symbol  d. H(X) + H(Y) - H(X,Y) bits/symbol

    0. If the received code word of a (6,3) linear Block code is 100111 with an error in the … bit, the corresponding error

    pattern will be [06S05]  a. 100000  b. 000001  c. 001000  d. 000010

    1. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x + x^3, the code polynomial is [07D01]

    a. 1 + x + x^3 + x^5

    b. 1 + x^2 + x^3 + x^5

    c. 1 + x^2 + x^3 + x^4

    d. 1 + x^4 + x^5

    2. The output of a source is band limited to 6 KHz. It is sampled at a rate of 2 KHz above the Nyquist rate. If the entropy of

    the source is 2 bits/sample, then the entropy of the source in bits/sec is [07D02]  a. 24 Kbps  b. 28 Kbps  c. 12 Kbps  d. 32 Kbps
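
    A quick arithmetic check (added for illustration): the Nyquist rate for a 6 KHz signal is 12 K samples/sec, so the sampling rate here is 14 K samples/sec.

    # Entropy rate = (sampling rate) x (entropy per sample).
    fs = 2 * 6_000 + 2_000    # 14000 samples/sec (2 KHz above Nyquist)
    print(fs * 2)             # 28000 bits/sec -> 28 Kbps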

    3. When two fair dice are thrown simultaneously, the information content of the message `the sum of the faces is 12`, in bits, is [07M01]  a. 1  b. 5.17  c. 4.17  d. 3.58
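
    The value follows from the probability of that outcome (a small check added for illustration):

    import math

    # P(sum of two fair dice = 12) = 1/36; information = -log2(p).
    print(round(-math.log2(1 / 36), 2))   # 5.17 bits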

    4. The encoder of a (7,4) systematic cyclic code with generating polynomial g(x) = 1 + x^2 + x^3 is basically a [07M02]  a. 4 stage shift register  b. 3 stage shift register  c. 11 stage shift register


    d. 22 stage shift register

    5. A received code word of a (7,4) systematic cyclic code 1000011 is corrected as 1000111. The corresponding error pattern is [07M03]  a. 0000100  b. 0001000  c. 0000010  d. 0010001

    6. The product of 5 and 6 in Modulo-7 multiplication is [07S01]a. 30b. 1c. 2d. 3

    7. Which of the following can be the generating polynomial for a (7,4) systematic Cyclic code? [07S02]

    a. x^3 + x + 1

    b. x^4 + x^3 + 1

    c. x^7 + x^4 + x^3 + 1

    d. x^5 + x^2 + 1

    8. The time domain behavior of a convolutional encoder of code rate 1/3 is defined in terms of a set of [07S03]a. 3 step responsesb. 3 impulse responsesc. 3 ramp responsesd. 3 sinusoidal responses

    9. Which of the following is correct? [07S04]  a. In an (n,k) block code, each code word is the cyclic shift of another codeword of the code.  b. In an (n,k) systematic cyclic code, the sum of two code words is another code word of the code.  c. In a convolutional encoder, the constraint length of the encoder is equal to the tail of the message sequence + …  d. Source encoding reduces the probability of transmission errors

    0. A linear block code with Hamming distance 5 is [07S05]a. Single error correcting and double error detecting codeb. double error detecting codec. Triple error correcting coded. Double error correcting code

    1. The channel capacity of a BSC with transition probability 1/2 is [08D01]a. 0 bitsb. 1bitc. 2 bitsd. infinity

    2. White noise of PSD … W/Hz is applied to an ideal LPF with a one sided band width of 1 Hz. The two sided output noise power of the channel is [08D02]  a. four times the input PSD  b. thrice the input PSD  c. twice the input PSD  d. same as the input PSD

    3. A convolutional encoder of code rate 1/2 is a 3 stage shift register with a message word length of 6. The code word length obtained from the encoder (in bits) is [08M01]  a. 9  b. 18  c. 27  d. 36

    4. A source X with entropy 2 bits/message is connected to the receiver Y through a Noise free channel. The conditional entropy of the source, given the receiver, is H(X/Y) and the joint entropy of the source and the receiver is H(X,Y). Then

    [08M02]  a. H(X,Y) = 2 bits/message  b. H(X/Y) = 2 bits/message  c. H(X,Y) = 0 bits/message  d. H(X/Y) = 1 bit/message

    5. A channel with independent input and output acts as [08M03]a. lossless networkb. resistive networkc. channel with maximum capacityd. Gaussian channel

    6. Automatic Repeat Request is a [08S01]a. Source coding schemeb. error correction scheme

    c. error control scheme  d. data conversion scheme


    7. Channel coding [08S02]a. avoids redundancyb. reduces transmission efficiencyc. increases signal power level relative to channel noised. results in reduced transmission band width requirement

    8. The information content available with a source is referred to as [08S03]a. Mutual informationb. Trans informationc. capacityd. entropy

    9. In a Linear Block code [08S04]a. the received power varies linearly with that of the transmitted power

    b. the encoder satisfies super position principlec. parity bits of the code word are the linear combination of the message bitsd. the communication channel is a linear system

    0. In Modulo-4 arithmetic, the product of 3 and 2 is [08S05]a. 6b. 3c. 2d. 4

    1. For the data word 1110 in a (7,4) non-systematic cyclic code with the generator polynomial 1 + x^2 + x^3, the code polynomial is [09D01]

    a. 1 + x + x^3 + x^5

    b. 1 + x^2 + x^3 + x^5

    c. 1 + x^2 + x^3 + x^4

    d. 1 + x + x^5

    2. In a (7,4) systematic Linear Block code, the number of `7` bit code words that are not useful for the user is [09D02]  a. 16  b. 112  c. 128  d. 96

    3. Which of the following is a valid source coding scheme for a source transmitting four messages? [09M01]a. 0,00,001,110b. 1,11,111,1110c. 0,10,110,111d. 1,01,001,0010

    4. A system has a band width of 4 KHz and an S/N ratio of 28 at the input to the Receiver. If the band width of the channel is doubled, then [09M02]

    a. Capacity of the channel gets doubled  b. Capacity of the channel gets squared  c. S/N ratio at the input of the receiver gets halved  d. S/N ratio at the input of the receiver gets doubled

    5. The Memory length of a convolutional encoder is 4. If a 5 bit message sequence is applied as the input for the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [09M03]  a. 4  b. 3  c. 5  d. 6

    6. In Modulo-5 multiplication, the product of 4 and 3 is [09S01]a. 12b. 7c. 2d. 3

    7. Which of the following provides minimum redundancy in coding? [09S02]  a. (15,11) linear block code  b. Shannon-Fano encoding  c. (6,3) systematic cyclic code  d. Convolutional code

    8. If C is the channel capacity, S is the signal input of the channel and … is the input noise PSD, then which of the

    following is Shannon's limit? [09S03]  a.

    b.

    c.

    d.


    9. A communication channel is fed with an input signal x(t) and the noise in the channel is negative. The power received at the receiver input is [09S04]  a. Signal power - Noise power  b. Signal power / Noise power  c. Signal power + Noise Power  d. Signal Power x Noise Power

    0. The fundamental limit on the average number of bits/source symbol is [09S05]a. Information content of the messageb. Entropy of the sourcec. Mutual Informationd. Channel capacity

    1. The Parity Check Matrix of a (6,3) Systematic Linear Block code is

    If the Syndrome vector computed for the received code word is [0 1 0], then for error correction, which of the bits of the received code word is to be complemented? [10D01]  a. 2  b. 3  c. 4  d. 5

    2. White noise of PSD … is applied to an ideal LPF with a one sided band width of B Hz. The filter provides a gain of 2. If the

    output power of the filter is 8…, then the value of B in Hz is [10D02]  a. 2  b. 4  c. 6  d. 8

    3. The Memory length of a convolutional encoder is 5. If a 6 bit message sequence is applied as the input for the encoder, then for the last message bit to come out of the encoder, the number of extra zeros to be applied to the encoder is [10M01]  a. 4  b. 3  c. 5  d. 6

    4. A source is transmitting four messages with equal probability. Then, for optimum Source coding efficiency, [10M02]  a. necessarily, variable length coding schemes should be used  b. Variable length coding schemes need not necessarily be used  c. Fixed length coding schemes should not be used  d. Convolutional codes should be used

    5. Which of the following is a valid source coding scheme for a source transmitting five messages? [10M03]a. 0,00,110,1110,1111b. 1,11,001,0001,0000c. 0,10,1110,110,1111d. 1,01,001,0010, 1111

    6. In Modulo-7 addition, 6 + 4 is equal to [10S01]a. 10b. 2c. 3d. 5

    7. Which of the following provides minimum redundancy in coding? [10S02]a. (6,3) linear block codeb. (15,11) linear block code

    c. (6,3) systematic cyclic coded. Convolutional code

    8. Which of the following involves the effect of the communication channel? [10S03]a. Information content of a messageb. Entropy of the sourcec. Mutual informationd. information rate of the source

    9. Which of the following can be the generating polynomial for a (7,4) systematic Cyclic code? [10S04]

    a. x^3 + x^2 + 1

    b. x^4 + x^3 + 1

    c. x^7 + x^4 + x^3 + 1

    d. x^5 + x^2 + 1

    0. Which of the following provides the facility to recognize the error at the receiver? [10S05]  a. Shannon-Fano Encoding


    b. ARQc. FECd. differential encoding

    1. Which of the following coding schemes is linear ? [11D01]a. C = { 00,01,10,}b. C = {000,111,110}c. C = {000,110,111,001}d. C = {111,110,011,101}

    2. If the transition probability of messages 0 and 1 in a communication system is 0.1, the noise matrix of the corresponding Communication channel is [11D02]

    a.

    b.

    c.

    d.

    3. In a BSC, the rate of information transmission over the channel decreases as [11M01]a. transmission probability approaches 0.5b. transmission probability approaches 1c. transition probability approaches 0.5d. transition probability approaches 1

    4. A source X is connected to a receiver R through a lossless channel. Then [11M02]

    a. H(Y/X) = 0b. H(X,Y) = 0c. H(X) = I(X,Y)d. H(X/Y)= I(X,Y)

    5. Which of the following is a valid source coding scheme for a source transmitting four messages? [11M03]a. 0,00,110,1110b. 1,11,001,0001c. 0,10,1110,110d. 1,01,001,0010

    6. The Hamming distance of a triple error correcting code is [11S01]a. 5b. 6c. 7d. 8

    7. A channel whose i/p is x_i and output is y_j is deterministic if [11S02]

    a.

    b.

    c.

    d.

    8. If a memoryless source of information rate R is connected to a channel with a channel capacity C, then on which of the following statements is the channel coding for the output of the source based? [11S03]  a. R must be greater than or equal to C  b. R must be exactly equal to C  c. R must be less than or equal to C  d. Minimum number of bits required to encode the output of the source is its entropy

    9. Which of the following is correct? [11S04]a. Source coding reduces transmission efficiencyb. Channel coding improves transmission efficiencyc. Entropy of a source is a measure of uncertainty of its outputd. Cyclic code is an ARQ scheme of error control

    0. The minimum source code word length of the message of a source is equal to [11S05]a. its entropy measured in bits/secb. the channel capacityc. its entropy measured in bits/messaged. the sampling rate required for the source

    1. If the transition probability of messages 0 and 1 in a communication system is 0.2, the noise matrix of the corresponding Communication channel is [12D01]

    a.


    b.

    c.

    d.

    2. In a BSC, if the transition probability of the messages 0 and 1 is P, and if they are of equal transmission probability, then the probability of these symbols to appear at the channel output is [12D02]  a. P, P  b. 1/2, 1/2  c. 1, 1  d. P, 1 - P

    3. The number of bits to be used by the efficient source encoder to encode the output of the source is equal to [12M01]  a. the information rate of the source  b. entropy of the source  c. channel capacity  d. information content of each message

    4. A source X is connected to a receiver R through a deterministic channel. Then [12M02]a. H(X/Y) = 0b. H(Y/X) = 0c. H(X) = I(X,Y)d. H(X/Y)= I(X,Y)

    5. Which of the following can be valid source coding scheme for a source transmitting 3 messages? [12M03]a. 0,00,110b. 1,01,001c. 0,10,101d. 1,01,011

    6. For an (n,k) cyclic code, E(x) is the error polynomial, g(x) is the generator Polynomial, R(x) is the received codepolynomial and C(x) is the transmitted code polynomial. Then, the Syndrome polynomial S(x) is [12S01]a. Remainder of C(x)/g(x)b. Remainder of E(x)/g(x)c. E(x).g(x)d. R(x) + g(x)

    7. If … is the input noise PSD and S is the input signal power for a communication channel of capacity C, then which of

    the following is Shannon's Limit? [12S02]  a.

    b.

    c.d.

    8. The Hamming distance of an error correcting code capable of correcting 4 errors is [12S03]a. 8b. 9c. 7d. 6

    9. BCH codes capable of correcting single error are [12S04]a. Systematic Linear Block codesb. Cyclic Hamming codesc. Convolutional codesd. Non-Systematic Linear Block codes

    0. Which of the following provides the facility to recognize the error at the receiver? [12S05]a. Shanon-Fano Encodingb. Parity Check codesc. Huffman encodingd. differential encoding

    1. The output of a source is a continuous random variable uniformly distributed over (0,2). The entropy of the source in bits/sample is [13D01]  a. 4  b. 2  c. 1.5  d. 1
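
    For a uniform density the differential entropy is log2 of the interval length (a short check added for illustration):

    import math

    # Differential entropy of U(a, b) is log2(b - a).
    a, b = 0.0, 2.0
    print(math.log2(b - a))   # 1.0 bit/sample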

    2. An AWGN low pass channel with 4 KHz band width is fed with white noise of PSD = 10^… W/Hz. The two sided noise

    power at the output of the channel is [13D02]

    a. 4 nW  b. 2 nW


    c. 6 nWd. 8 nW

    3. A system has a band width of 3 KHz and an S/N ratio of 29 dB at the input of the receiver. If the band width of the channel gets doubled, then [13M01]  a. its capacity gets doubled  b. the corresponding S/N ratio gets doubled  c. its capacity gets halved  d. the corresponding S/N ratio gets halved

    4. A source is transmitting two symbols A and B with probabilities 7/8 and 1/8 respectively. The average source code length can be decreased by [13M02]  a. reducing transmission probability  b. increasing transmission probability

    c. using noise free channeld. using pair coding

    5. Non-Uniqueness of Huffman encoding results in [13M03]a. different coding efficienciesb. different average code word lengthsc. different entropiesd. different sets of source code words

    6. Shannon's limit is for [13S01]  a. maximum entropy of source  b. maximum information rate of a source  c. maximum information content of a message  d. maximum capacity of a communication channel under infinite band width

    7. In Modulo-6 addition, the sum of 1 and 5 is [13S02]

    a. 4b. 1c. 2d. 0

    8. FEC and ARQ schemes of error control can be applied for the outputs of [13S03]a. Binary symmetric channel onlyb. Binary Erasure channel onlyc. Binary Erasure channel, and Binary symmetric channel respectivelyd. Binary symmetric channel, and Binary erasure channel respectively

    9. The Hamming distance of an (n,k) systematic cyclic code is [13S04]  a. the weight of any non-zero code word  b. the weight of a code word consisting of all 1's  c. the weight of the code word consisting of alternate 1's and 0's  d. the minimum of weights of all non zero code words of the code

    0. Which of the following is affected by the communication channel? [13S05]  a. Information content of a message  b. Entropy of the source  c. Mutual information  d. information rate of the source

    1. The maximum average amount of information content, measured in bits/sec, associated with the output of a discrete information source transmitting 8 messages at 2000 messages/sec is [14D01]  a. 6 Kbps  b. 3 Kbps  c. 16 Kbps  d. 4 Kbps
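
    The maximum is reached for equiprobable messages (a short check added for illustration):

    import math

    # Maximum entropy of an 8-message source is log2(8) = 3 bits/message.
    print(math.log2(8) * 2000)   # 6000.0 bits/sec -> 6 Kbps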

    2. Which of the following coding schemes is linear ? [14D02]a. C = {00, 01,10,11}

    b. C = {01,10,11}c. C = {110,111,001}d. C = {000,110,011}

    3. A communication source is connected to a receiver using a communication channel such that the uncertainty about the transmitted symbol at the receiver, after knowing the received symbol, is zero. Then, the information gained by the observer at the receiver is [14M01]  a. same as the entropy of the source  b. same as the entropy of the receiver  c. same as the joint entropy of the source and the receiver  d. same as the conditional entropy of the source, given the receiver

    4. X(t) and n(t) are the signal and the noise, each band limited to 2B Hz, applied to a communication channel band limited to B Hz. Then, the minimum number of samples/sec that should be transmitted to recover the input of the channel at the output is [14M02]  a. 2B  b. 4B  c. B


    d. 6B

    5. The upper limit on the minimum distance of a linear block code is [14S01]a. minimum weight of the non-zero code word of the codeb. minimum number of errors that can be correctedc. maximum number of errors that can be correctedd. maximum weight of the non-zero code word of the code

    6. Information rate of a source can be used to [14S02]a. differentiate between two sourcesb. to find the entropy in bits/message of a sourcec. correct the errors at the receiving sided. design the matched filter for the receiver

    7. A source is transmitting only one message. Then [14S03]a. the reception of the message conveys zero information to the userb. the reception of the message conveys maximum information to the userc. the message received will be corrupted by noised. the channel capacity required is infinite

    8. If C is the code word and H is the Parity check Matrix of an (n,k) linear block code, then, [14S04]a. each code word of the (n,k) code is orthogonal to the code word of its dual codeb. each code word of the (n,k) code is orthogonal to the code word of the same codec. C.H = [0]

    d. H^T.C = 0

    9. Theoretically, the entropy of a continuous random variable is [14S05]  a. infinity  b. zero  c. unity

    d. finite, but >0 and >1.

    0. A convolutional encoder has a constraint length of 4 and, for each input bit, a two bit word is the output of the encoder. If the input message is of length 5, the exact code rate of the encoder is [15D01]  a. 50 %  b. 31.25 %  c. 45.3 %  d. 23.3 %
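
    A quick check of the exact rate (added for illustration; it assumes the usual convention that constraint length 4 means 3 tail zeros are appended to flush the register):

    # Effective rate of a rate-1/2 encoder for a 5-bit message with 3 flush bits.
    L, K, n = 5, 4, 2
    print(L / (n * (L + K - 1)))   # 0.3125 -> 31.25 %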

    1. In a message conveyed through a sequence of independent dots and dashes, the probability of occurrence of a dash is one third of that of a dot. The information content of a word with two dashes, in bits, is [15D02]  a. 2  b. 4  c. 8  d. 16
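
    A short check (added for illustration): P(dash) = P(dot)/3 with the two probabilities summing to 1 gives P(dash) = 1/4.

    import math

    # Independent symbols: the information of two dashes adds.
    p_dash = 1 / 4
    print(2 * -math.log2(p_dash))   # 4.0 bits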

    2. The voice frequency modulating signal of a PCM system is quantized into 16 levels. If the signal is band limited to 3 KHz, the minimum symbol rate of the system is [15M01]  a. 48 kilosymbols/sec  b. 6 kilosymbols/sec  c. 96 kilosymbols/sec  d. 3 kilosymbols/sec

    3. A source is transmitting four messages with probabilities 1/2, 1/4, 1/8 and 1/8. To have 100 % transmission efficiency, the average source code word length of the message of the source should be [15M02]  a. 2 bits  b. 1.75 bits  c. 3 bits  d. 3.75 bits

    4. A source is transmitting six messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, and 1/32. Then [15M03]  a. Channel coding will reduce the average source code word length.  b. Two different source code word sets can be obtained using Huffman coding.  c. Source coding improves the error performance of the communication system.  d. Two different source code word sets can be obtained using Shannon-Fano coding

    5. The average source code word length per bit can be decreased by [15S01]a. increasing the entropy of the sourceb. extending the order of the sourcec. using a channel with very large capacityd. increasing the transmitted power

    6. Trade-off between Band width and Signal to Noise ratio results in [15S02]  a. Shannon's limit  b. the concept of transmitting the given information using various combinations of signal power and band width  c. a noise free channel  d. minimum redundancy

    7. Binary erasure channel is an example of [15S03]a. Wide band channel


    b. severe effect of channel noise on the message transmittedc. narrowband channeld. symmetric channel

    8. In a symmetric channel, [15S04]a. transmission errors will be lessb. rows and columns of the corresponding channel matrix are identical, except for permutationc. required transmission power will be lessd. rows and columns of the corresponding channel matrix are identical on permutation basis

    9. Which of the following is correct? [15S05]a. Mutual information of a communication system is same as the entropy of the sourceb. Mutual information of a communication system is the information gained by the observerc. Mutual information is independent of the channel characteristics

    d. Mutual information of a communication system is same as the entropy of the receiver

    0. A source generates three symbols with probabilities of 0.25, 0.25 and 0.5 at a rate of 3000 symbols/sec. Assuming

    independent generation of symbols, the most efficient source encoder would have an average bit rate of [16D01]a. 6000 bpsb. 4500 bpsc. 3000 bpsd. 1500 bps
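
    The most efficient encoder approaches the source entropy rate (a small check added for illustration):

    import math

    # Entropy of {0.25, 0.25, 0.5} times the symbol rate.
    probs = [0.25, 0.25, 0.5]
    H = -sum(p * math.log2(p) for p in probs)   # 1.5 bits/symbol
    print(H * 3000)                              # 4500.0 bps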

    1. A memoryless source emits 2000 binary symbols/sec and each symbol has a probability of 0.25 to be equal to 1 and 0.75 to be equal to 0. The minimum number of bits/sec required for error free transmission of this source is [16D02]  a. 1500  b. 1622  c. 1734  d. 1885

    2. In a communication system, due to noise in the channel, an average of one symbol in each 100 received is incorrect. The symbol transmission rate is 1000. The number of bits in error in the received symbols is [16M01]  a. 1  b. 10  c. 100  d. 20

3. Which of the following is a valid source coding scheme for a source transmitting six messages? [16M02]
a. 0, 10, 110, 1100, 1111, 11000
b. 1, 11, 101, 1100, 11010, 11001
c. 0, 10, 110, 1110, 11110, 11111
d. 1, 10, 100, 1110, 11110, 11111

4. The encoder of a (15,11) systematic cyclic code requires [16M03]
a. 4-bit shift register and 3 modulo-2 adders
b. 4-bit shift register and 4 modulo-2 adders
c. 3-bit shift register and 3 modulo-2 adders
d. 3-bit shift register and 4 modulo-2 adders

5. The distance between any code word and the all-zero code word of an (n,k) linear block code is referred to as [16S01]
a. Hamming distance of the code
b. code rate of the code
c. redundancy of the code
d. Hamming weight of the code word

6. As per the source coding theorem, it is not possible to find any uniquely decodable code whose average length is [16S02]
a. less than the entropy of the source
b. greater than the entropy of the source
c. equal to the number of messages from the source
d. equal to the efficiency of transmission of the source

7. The coding efficiency due to second-order extension of a source [16S03]
a. is more
b. is less
c. remains unaltered
d. cannot be computed

8. Exchange between bandwidth and signal-to-noise ratio can be justified based on [16S04]
a. Shannon's limit
b. Shannon's source coding theorem
c. the Hartley-Shannon law
d. Shannon's channel coding theorem

9. A source X is connected to a receiver Y through a noise-free channel. Its capacity is [16S05]
a. maximum of H(X/Y)
b. maximum of H(X)
c. maximum of H(Y/X)
d. maximum of H(X,Y)

0. A zero-memory source emits two messages A and B with probabilities of 0.8 and 0.2 respectively. The entropy of the second-order extension of the source is [17D01]
a. 1.56 bits/message
b. 0.72 bits/message
c. 0.78 bits/message
d. 1.44 bits/message
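As a check on the question above, the entropy of the second-order extension of a memoryless source is twice the per-symbol entropy. The snippet below is an illustrative sketch, not part of the original paper.

import math
from itertools import product

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = {'A': 0.8, 'B': 0.2}
pair_probs = [p[a] * p[b] for a, b in product(p, repeat=2)]   # AA, AB, BA, BB
print(entropy(pair_probs))        # ~1.44 bits per extended symbol
print(2 * entropy(p.values()))    # same value, since successive symbols are independent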

1. A signal amplitude X is a uniform random variable in the range (-1,1). Its differential entropy is [17D02]
a. 2 bits/sample
b. 4 bits/sample
c. 1 bit/sample
d. 3 bits/sample
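The differential entropy of a uniform random variable on (a, b) is log2(b - a), so for (-1, 1) it is log2(2) = 1 bit/sample. The few lines below are only an illustrative numerical check, not part of the original paper.

import math

a, b = -1.0, 1.0
f = 1.0 / (b - a)                   # value of the uniform pdf inside (a, b)
h = -(b - a) * f * math.log2(f)     # integral of -f*log2(f) over (a, b)
print(h, math.log2(b - a))          # 1.0 1.0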

2. A communication channel is so noisy that the output Y of the channel is statistically independent of the input X. Then [17M01]
a. H(X,Y) = H(X).H(Y)
b. H(X,Y) = H(X)+H(Y)
c. H(X/Y) = H(Y/X)
d. H(X) = H(Y)

3. A transmitting terminal has 128 characters and the data sent from the terminal consist of independent sequences of equiprobable characters. The entropy of the above terminal in bits/character is [17M02]
a. 10
b. 7
c. 1.44
d. 14

4. For a (7,4) systematic cyclic code, the generator polynomial is 1+x+x^3. Then, the syndrome vector corresponding to the error pattern 0000010 is [17M03]
a. 100
b. 010
c. 011
d. 111
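A worked check for the question above: the syndrome of a cyclic code is the remainder of the received (here, error) polynomial divided by g(x) over GF(2). The sketch below is illustrative and not part of the original paper; reading the error pattern 0000010 as e(x) = x^5 is an assumption about the bit-ordering convention used in the question.

def gf2_mod(dividend, divisor):
    # Remainder of GF(2) polynomial division; coefficients listed lowest degree first.
    r = list(dividend)
    dd = len(divisor) - 1                      # degree of the divisor
    for i in range(len(r) - 1, dd - 1, -1):
        if r[i]:
            for j, c in enumerate(divisor):
                r[i - dd + j] ^= c
    return r[:dd]

g = [1, 1, 0, 1]                 # g(x) = 1 + x + x^3
e = [0, 0, 0, 0, 0, 1, 0]        # single error -> e(x) = x^5
print(gf2_mod(e, g))             # [1, 1, 1] -> syndrome 111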

5. Which of the following is correct? [17S01]
a. A noise-free channel is not a deterministic channel
b. For a noise-free channel, H(X/Y) = 1
c. The channel matrix of a noise-free channel is an identity matrix
d. A noise-free channel is of infinite capacity

6. In a communication system, information lost in the channel is measured using [17S02]
a. H(X/Y)
b. I(X,Y)
c. H(X) - H(X/Y)
d. H(Y/X)

7. Capacity of a BSC with infinite bandwidth is not infinity, because [17S03]
a. noise power in the channel varies linearly with the bandwidth
b. noise power in the channel varies inversely with the bandwidth
c. noise power in the channel is independent of the bandwidth
d. noise power in the channel will not affect the signal power

8. For a noise-free channel, I(X,Y) is equal to [17S04]
a. entropy of the source, given the receiver
b. entropy of the receiver, given the source
c. entropy of the source
d. joint entropy of the source and the receiver

9. The output of a continuous source is a Gaussian random variable with variance 2 and is band limited to fm Hz. The maximum entropy of the source is [17S05]
a. - d. [formula options missing in the source]

0. If the generator polynomial of a (7,4) non-systematic cyclic code is given as g(x) = 1+x+x^2+x^4, then the binary word corresponding to x^2.g(x) + g(x) is [18D01]
a. 1101110
b. 1101001
c. 1010101
d. 1010111
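The binary word asked for above can be obtained by multiplying and adding polynomials over GF(2), i.e. x^2.g(x) + g(x) = (x^2 + 1).g(x). The sketch below is illustrative only, with coefficients listed lowest degree first.

def gf2_mul(a, b):
    # Multiply two GF(2) polynomials given as coefficient lists (lowest degree first).
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def gf2_add(a, b):
    # Add two GF(2) polynomials (bitwise XOR of coefficients).
    n = max(len(a), len(b))
    a, b = a + [0] * (n - len(a)), b + [0] * (n - len(b))
    return [x ^ y for x, y in zip(a, b)]

g = [1, 1, 1, 0, 1]                     # g(x) = 1 + x + x^2 + x^4
v = gf2_add(gf2_mul([0, 0, 1], g), g)   # x^2.g(x) + g(x)
print(''.join(map(str, v)))             # 1101001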

1. A (7,4) systematic cyclic code has a generator polynomial g(x) = 1+x+x^3, and the code polynomial is V(x) = x + ... Then, the remainder of the division V(x)/g(x) is [18D02]
a. zero
b. syndrome vector
c. received code word
d. received and corrected code word

2. The output of a continuous source is a uniform random variable over (0,1). Then [18M01]
a. the absolute entropy of the source is zero
b. the output of the source is a Gaussian random variable
c. the relative entropy of the source is zero
d. the source is a discrete memoryless source

3. The generator sequence of an adder in a convolutional encoder is (1,1,1,1). It is its response to an input sequence of [18M02]
a. 0, 1, 0, 0, ...
b. 0, 0, 1, 0, 0, 0, 1, 0, ...
c. 1, 0, 0, 0, ...
d. 1, 1, 1, ...

4. In a communication system, the average amounts of uncertainty associated with the source, the sink, and the source and sink jointly are 1.0613, 1.5 and 2.432 bits/message respectively. Then the information transferred by the channel connecting the source and sink, in bits, is [18M03]
a. 0.1293
b. 4.9933
c. 1.945
d. 2.8707
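A short arithmetic check for the question above, using I(X;Y) = H(X) + H(Y) - H(X,Y); illustrative only, not part of the original paper.

H_X, H_Y, H_XY = 1.0613, 1.5, 2.432    # bits/message, as given in the question
print(round(H_X + H_Y - H_XY, 4))      # 0.1293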

5. The efficiency of transmission of information can be measured by [18S01]
a. comparing the entropy of the source and the maximum limit for the rate of transmission of information over the channel
b. comparing the maximum limit for the rate of transmission of information over the channel and the conditional entropy of the receiver, given the source
c. comparing the actual rate of transmission and the maximum limit for the rate of transmission of information over the channel
d. comparing the entropy of the source and the information content of each individual message of the source

6. Binary erasure channel is the mathematical modeling of [18S02]
a. the effect of channel noise resulting in an incorrect decision about the transmitted message bit
b. the inability of the receiver to make a decision about the received message bit in the background of noise
c. the error correction mechanism at the receiving side
d. the error detection mechanism at the receiving side

7. In which of the following matrices is the sum of each row equal to one? [18S03]
a. joint probability matrix
b. channel matrix
c. conditional probability matrix of the source, given the receiver
d. generator matrix

8. If T is the code vector and H is the parity check matrix of a linear block code, then the code is defined by the set of code vectors for which [18S04]
a. T.H^T = 0
b. H^T.T = 0
c. H.T = 0
d. [expression missing in the source] = 0

9. Which of the following is correct? [18S05]
a. The entropy measure of a continuous source is not an absolute measure
b. A binary symmetric channel is a noise-free channel
c. The channel capacity of a symmetric channel is always 1 bit/symbol
d. Self-information and mutual information are one and the same

0. A BSC has a transition probability of P. The cascade of two such channels is [19D01]
a. an asymmetric channel with transition probability 2P
b. a symmetric channel with transition probability P^2
c. an asymmetric channel with transition probability P(1 - P)
d. a symmetric channel with transition probability 2P(1 - P)
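The cascade result in the question above follows from multiplying the two channel matrices; the off-diagonal entries of the product are P(1 - P) + (1 - P)P = 2P(1 - P). The snippet below is an illustrative check with an arbitrary value of P, not part of the original paper.

import numpy as np

P = 0.1                                  # illustrative value only
bsc = np.array([[1 - P, P],
                [P, 1 - P]])             # rows: input 0/1, columns: output 0/1
print(bsc @ bsc)                         # off-diagonal entries equal 2P(1 - P)
print(2 * P * (1 - P))                   # 0.18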

1. A source is transmitting four messages with probabilities of 0.5, 0.25, 0.125 and 0.125. By using Huffman coding, the percentage reduction in the average source code word length is [19D02]
a. 10 %
b. 20 %
c. 12.5 %
d. 25 %
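For the question above, a Huffman code for the probabilities 0.5, 0.25, 0.125, 0.125 assigns code word lengths 1, 2, 3, 3, giving an average of 1.75 bits/message against 2 bits for a fixed-length code, i.e. a 12.5 % reduction. The sketch below is illustrative only and not part of the original paper.

import heapq

def huffman_lengths(probs):
    # Return the Huffman code word length for each probability.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]   # (prob, tie-breaker, members)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, m1 = heapq.heappop(heap)
        p2, _, m2 = heapq.heappop(heap)
        for i in m1 + m2:
            lengths[i] += 1          # each merge adds one bit to every member's code
        heapq.heappush(heap, (p1 + p2, count, m1 + m2))
        count += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)                    # [1, 2, 3, 3]
avg = sum(p * l for p, l in zip(probs, lengths))    # 1.75 bits/message
print(lengths, avg, 100 * (2 - avg) / 2)            # ... 12.5 (% reduction)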

2. The parity polynomial in the generation of a systematic (7,4) cyclic code for the data word 1 1 0 0 is 1+x^2. The corresponding code word is [19M01]
a. 1 1 0 0 1 0 1
b. 1 1 1 0 0 1 0
c. 1 1 1 0 1 0 0
d. 1 1 0 1 0 1 0

3. Which of the following are prefix-free codes? [19M02]
a. 0, 01, 011, 0001
b. 1, 01, 001, 0001
c. 0, 1, 00, 11
d. 00, 01, 00, 11
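A code is prefix-free (and hence instantaneously decodable) if no code word is a prefix of another; the helper below, illustrative only and not part of the original paper, checks the option sets of the question above.

def is_prefix_free(code_words):
    # True if no code word is a prefix of (or equal to) another code word.
    for i, a in enumerate(code_words):
        for j, b in enumerate(code_words):
            if i != j and b.startswith(a):
                return False
    return True

print(is_prefix_free(['0', '01', '011', '0001']))   # False: '0' is a prefix of '01'
print(is_prefix_free(['1', '01', '001', '0001']))   # True
print(is_prefix_free(['0', '1', '00', '11']))       # False: '0' is a prefix of '00'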

4. Which of the following is a single-error-correcting perfect code? [19M03]
a. (6,3) systematic cyclic code
b. (7,4) systematic linear block code
c. (15,11) Hamming code
d. convolutional code of code rate 1/2

5. If X is the transmitted message and Y is the received message, then the average information content of the pair (X,Y) is equal to the average information of Y plus [19S01]
a. the average information of X
b. the average information of Y after X is known
c. the average information of X after Y is known
d. the mutual information of (X,Y)

6. The entropy H( ) is [19S02]
a. information of X2 to someone who knows X1
b. information of X2 without knowing X1
c. mutual information of X1 and X2
d. effect of noise in receiving X1 as X2

7. If X and Y are the transmitter and the receiver in a BSC, P(X = i/Y = j) measures [19S03]
a. uncertainty about the received bit based on the transmitted bit
b. certainty about the received bit based on the transmitted bit
c. certainty about the transmitted bit based on the received bit
d. uncertainty about the transmitted bit based on the received bit

8. If X and Y are related in a one-to-one manner, then H(X/Y) in bits is [19S04]
a. 1
b. log m, m being the number of messages of the source
c. 0
d. 0.5

9. If the output of the channel is independent of the input, then [19S05]
a. maximum information is conveyed over the channel
b. no information is transmitted over the channel
c. no errors will occur during transmission
d. information loss is zero

0. The parity check matrix of a linear block code is [matrix missing in the source]. Its Hamming distance is [20D01]
a. 4
b. 3
c. 6
d. 5

1. A source with equally likely outputs is connected to a communication channel with channel matrix [matrix missing in the source]. The columns of the matrix represent the probability that a transmitted bit is identified as 0, a transmitted bit is unidentified, and a transmitted bit is identified as 1, respectively. Then, the probability that the bit is not identified is [20D02]
a. 0.4
b. 0.6
c. 0.3
d. 0.2

2. The Hamming distance of the code vectors Ci and Cj is [20M01]

a. weight of [expression missing in the source]
b. minimum of the weights of Ci and Cj
c. weight of [expression missing in the source]
d. sum of the weights of Ci and Cj

3. The minimum number of parity bits required for a single-error-correcting linear block code for 11 data bits is [20M02]
a. 3
b. 5
c. 6
d. 4
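The answer to the question above follows from the Hamming bound for single-error correction, 2^r >= k + r + 1; the loop below (illustrative only, not part of the original paper) finds the smallest such r for k = 11.

k = 11
r = 1
while 2 ** r < k + r + 1:
    r += 1
print(r)    # 4 -> a (15,11) Hamming code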

4. A source X with a symbol rate of 1000 symbols/sec is connected to a receiver Y using a BSC with transition probability [value missing in the source]. The messages of the source are equally likely. Then, the rate of information transmission over the channel in bits per second is [20M03]

a. - d. [formula options missing in the source]
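For a BSC with equally likely inputs, the rate of information transmission is the symbol rate times 1 - H(p), where H(p) is the binary entropy of the transition probability. Since the probability value and the answer options did not survive extraction, the sketch below uses an assumed p = 0.1 purely for illustration; it is not part of the original paper.

import math

def binary_entropy(p):
    # Binary entropy function in bits.
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

symbol_rate = 1000      # symbols/sec, as given in the question
p = 0.1                 # assumed transition probability (not from the source)
print(symbol_rate * (1 - binary_entropy(p)))   # ~531 bits/sec for p = 0.1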

5. Entropy coding is a [20S01]
a. variable-length coding scheme
b. fixed-length coding scheme
c. channel coding scheme
d. differential coding scheme

6. Which of the following is correct? [20S02]
a. Mutual information is symmetric about transmitted and received pairs
b. Binary erasure channel is a symmetric channel
c. The channel matrix gives the joint probabilities of the transmitted and received pairs
d. The channel capacity of a noise-free channel is zero

7. For a BSC with transition probability P, the bit error probability is [20S03]
a. 1 - P
b. P
c. 2P
d. 2(1 - P)

8. A (4,3) parity check code can [20S04]
a. correct all single error patterns
b. detect all double error patterns
c. detect all triple error patterns
d. correct all single error patterns and detect all double error patterns

9. A source of information rate 80 Kbps is connected to a communication channel of capacity 66.6 Kbps. Then [20S05]
a. error-free transmission is not possible
b. channel coding results in error-free transmission
c. source coding will make the errors corrected at the receiver
d. mutual information becomes maximum
