
Research Article

Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

H. D. Arora¹ and Anjali Dhiman²

¹ Department of Mathematics, Amity Institute of Applied Sciences, Amity University, Noida 201301, India
² Department of Mathematics, Ideal Institute of Management and Technology, GGSIP University, Delhi 110092, India

Correspondence should be addressed to Anjali Dhiman; [email protected]

Received 3 March 2015; Accepted 1 April 2015

Academic Editor: Ram U. Verma

Copyright © 2015 H. D. Arora and A. Dhiman. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is carried out in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length, subject to a given constraint on codeword lengths, has to be found. In this paper, we introduce a mean codeword length of order \(\alpha\) and type \(\beta\) for 1:1 codes and analyze the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with the fuzzy information measure is established.

1. Introduction

In the 1940s, a new branch of mathematics was introduced as Information Theory. Information Theory considers the problems of how to process information, how to store information, how to retrieve information, and hence how to make decisions. It deals with how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied information measures, and Hartley [3] showed that information measures are logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his research paper "A Mathematical Theory of Communication." Wiener [5] also obtained results similar to Shannon's. A large number of applications of Information Theory have been developed in various fields of the social, physical, and biological sciences, for example, economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer sciences, and fuzzy sets. Later, Rényi [6] introduced the entropy of order \(r\), of which Shannon entropy is the limiting case.

Fuzzy set theory was introduced by Zadeh [7]. It relates to the uncertainty occurring in human cognitive processes and has found applications in engineering, the biological sciences, and business. Zadeh also introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.

Let \(A\) be a fuzzy set in the universe of discourse \(\Omega\). We define the membership function \(\mu_A : \Omega \to [0, 1]\), where \(\mu_A(x)\) is the degree of membership of each \(x \in \Omega\).

The following properties, to be satisfied by any fuzzy entropy, were given by De Luca and Termini [8] (see the numerical sketch below):

(1) Fuzzy entropy is minimum iff the set is crisp.
(2) Fuzzy entropy is maximum when every membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) Fuzzy entropy of a set is the same as that of its complement.
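As an illustration (a sketch added here, not part of the original paper), the following Python snippet evaluates De Luca and Termini's fuzzy entropy, \(H(A) = -\sum_i [\mu_A(x_i)\log\mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i))]\), on hypothetical membership values and checks properties (1)-(4) numerically:

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set given by membership values mu."""
    def h(m):
        return 0.0 if m in (0.0, 1.0) else -(m * math.log(m) + (1 - m) * math.log(1 - m))
    return sum(h(m) for m in mu)

crisp = [0.0, 1.0, 1.0, 0.0]
fuzzy = [0.3, 0.7, 0.5, 0.4]
sharpened = [0.2, 0.8, 0.5, 0.3]      # each value pushed toward 0 or 1
complement = [1 - m for m in fuzzy]

print(fuzzy_entropy(crisp))                                           # 0.0: minimum for a crisp set
print(fuzzy_entropy([0.5] * 4))                                       # maximum: all memberships at 0.5
print(fuzzy_entropy(sharpened) < fuzzy_entropy(fuzzy))                # True: sharpening decreases entropy
print(math.isclose(fuzzy_entropy(complement), fuzzy_entropy(fuzzy)))  # True: complement invariance
```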

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is carried out in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length, subject to a given constraint on codeword lengths, has to be found. Let \(X\) be a random variable taking values \(x_i\), \(i = 1, 2, \ldots, n\), that generates messages to be transmitted, encoded over the set \(\{a_1, a_2, \ldots, a_D\}\). This set is called the code alphabet, and the sequence assigned to each \(x_i\) is called a codeword. The codeword lengths \(l_i\) associated with the \(x_i\) satisfy Kraft's inequality

\[D^{-l_1} + D^{-l_2} + \cdots + D^{-l_n} \le 1, \tag{1}\]
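For illustration (not from the paper), a minimal Python check of Kraft's inequality for a candidate set of codeword lengths:

```python
def satisfies_kraft(lengths, D=2):
    """Kraft's inequality (1): sum over i of D**(-l_i) <= 1."""
    return sum(D ** -l for l in lengths) <= 1.0

# Binary lengths {1, 2, 3, 3}: 1/2 + 1/4 + 1/8 + 1/8 = 1, so a prefix code exists.
print(satisfies_kraft([1, 2, 3, 3]))  # True
print(satisfies_kraft([1, 1, 2]))     # False: 1/2 + 1/2 + 1/4 > 1
```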

where \(D\) is the size of the code alphabet. Further, codes are chosen to minimize the average codeword length

\[L = \sum_{i=1}^{n} l_i p_i, \tag{2}\]

where \(p_i\) is the probability of occurrence of \(x_i\). Shannon's [4] noiseless coding theorem for uniquely decipherable codes,

\[\frac{H(P)}{\log D} \le L < \frac{H(P)}{\log D} + 1, \tag{3}\]

gives the lower bound on \(L\) in terms of Shannon's entropy \(H(P)\).
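A minimal sketch (illustrative, with a hypothetical dyadic distribution) of the Shannon bounds in (3) for \(D = 2\):

```python
import math

def shannon_bounds(p, lengths, D=2):
    """Return (H(P)/log D, L, H(P)/log D + 1) from eqs. (2) and (3)."""
    L = sum(li * pi for li, pi in zip(lengths, p))
    H = -sum(pi * math.log(pi) for pi in p if pi > 0)  # natural log, as in (3)
    lower = H / math.log(D)
    return lower, L, lower + 1

# A dyadic source achieves the lower bound exactly: L = H(P)/log 2 = 1.75 bits.
print(shannon_bounds([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3]))
```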

Campbell [9] proposed a special exponentiated mean codeword length of order \(\alpha\) for uniquely decipherable codes,

\[L_\alpha = \frac{\alpha}{1-\alpha} \log_D \left[\sum_{i=1}^{n} p_i D^{(1-\alpha) l_i/\alpha}\right], \tag{4}\]

and proved a noiseless coding theorem for it. Campbell showed that \(L_\alpha\) lies between \(R_\alpha(P)\) and \(R_\alpha(P) + 1\), where

\[R_\alpha(P) = \frac{1}{1-\alpha} \log_D \left[\sum_{i=1}^{n} p_i^\alpha\right], \quad \alpha > 0,\ \alpha \neq 1,\]

is Rényi's measure of entropy of order \(\alpha\). As \(\alpha \to 1\), it is easily shown that \(L_\alpha \to L\) and \(R_\alpha(P)\) approaches \(H(P)\).

For uniquely decipherable codes, a weighted average length was given by Guiasu and Picard [10]:

\[L = \frac{\sum_{i=1}^{n} u_i l_i p_i}{\sum_{i=1}^{n} u_i p_i}. \tag{5}\]

This is defined as the average cost of transmitting the letters \(x_i\) with probabilities \(p_i\) and utilities \(u_i\).
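A short numerical illustration (added here; the distribution, code, and parameter value are hypothetical) of Campbell's bound \(R_\alpha(P) \le L_\alpha < R_\alpha(P) + 1\):

```python
import math

def campbell_length(p, lengths, alpha, D=2):
    """Campbell's exponentiated mean codeword length of order alpha, eq. (4)."""
    s = sum(pi * D ** ((1 - alpha) * li / alpha) for pi, li in zip(p, lengths))
    return (alpha / (1 - alpha)) * math.log(s, D)

def renyi_entropy(p, alpha, D=2):
    """Renyi's entropy of order alpha in base D."""
    return math.log(sum(pi ** alpha for pi in p), D) / (1 - alpha)

p, lengths, alpha = [0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3], 0.5
R, L_a = renyi_entropy(p, alpha), campbell_length(p, lengths, alpha)
print(R <= L_a < R + 1)  # True for this dyadic code
```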

A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11], for which Longo [12] gave the lower bound on the useful mean codeword length. Further, a noiseless coding theorem was verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order \(\alpha\) in terms of useful measures of information of order \(\alpha\). The following information measure was introduced by Belis and Guiasu:

\[H(P, U) = -\sum_{i=1}^{n} u_i p_i \log p_i. \tag{6}\]

Similarly, a quantitative-qualitative information measure was given by Taneja and Tuteja:

\[H\left(\frac{P}{Q}, U\right) = \sum_{i=1}^{n} p_i u_i \log\left(\frac{p_i}{q_i}\right). \tag{7}\]

Further, Bhaker and Hooda gave the following information measures:

\[H\left(\frac{P}{Q}, U\right) = \frac{\sum_{i=1}^{n} u_i p_i \log(p_i/q_i)}{\sum_{i=1}^{n} u_i p_i},\]

\[H_\alpha\left(\frac{P}{Q}, U\right) = \frac{1}{1-\alpha}\,\frac{\sum_{i=1}^{n} u_i p_i^\alpha q_i^{1-\alpha}}{\sum_{i=1}^{n} u_i p_i}, \quad \alpha \neq 1. \tag{8}\]
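As a sketch (not part of the paper), the first measure in (8) can be computed directly; the values of \(p\), \(q\), and \(u\) below are hypothetical:

```python
import math

def useful_inaccuracy(p, q, u):
    """Bhaker-Hooda weighted inaccuracy, the first measure in eq. (8)."""
    num = sum(ui * pi * math.log(pi / qi) for pi, qi, ui in zip(p, q, u))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return num / den

# Predicted distribution q deviates from the true p; utilities weight the letters.
print(useful_inaccuracy([0.5, 0.5], [0.75, 0.25], [1.0, 2.0]))  # about 0.327 nats
```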

Now, Baig and Dar [14, 15] proposed the following fuzzy information measure of order \(\alpha\) and type \(\beta\):

\[H^\beta_\alpha(A; U) = \frac{1}{2^{1-\alpha} - 1}\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta} - 1\right]. \tag{9}\]

For this information measure, they introduced the following mean codeword length of order \(\alpha\) and type \(\beta\):

\[L^\beta_\alpha(U) = \frac{1}{2^{1-\alpha} - 1}\left[\left\{\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} D^{((1-\alpha)/\alpha)\, l_i}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta}\right\}^\alpha - 1\right]. \tag{10}\]
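A minimal numerical sketch (illustrative only; the membership values, utilities, and codeword lengths are hypothetical) of the measure (9) and the length (10):

```python
def H_fuzzy(mu, u, alpha, beta):
    """Baig-Dar fuzzy information measure of order alpha and type beta, eq. (9)."""
    num = sum(ui ** beta * (m ** (alpha + beta - 1) + (1 - m) ** (alpha + beta - 1))
              for m, ui in zip(mu, u))
    den = sum(ui ** beta * (m + (1 - m)) ** beta for m, ui in zip(mu, u))
    return (num / den - 1) / (2 ** (1 - alpha) - 1)

def L_fuzzy(mu, u, lengths, alpha, beta, D=2):
    """Baig-Dar mean codeword length of order alpha and type beta, eq. (10)."""
    num = sum(ui ** beta * (m ** beta + (1 - m) ** beta) * D ** ((1 - alpha) / alpha * li)
              for m, ui, li in zip(mu, u, lengths))
    den = sum(ui ** beta * (m + (1 - m)) ** beta for m, ui in zip(mu, u))
    return ((num / den) ** alpha - 1) / (2 ** (1 - alpha) - 1)

mu, u, lengths = [0.3, 0.7, 0.5], [1.0, 2.0, 1.5], [1, 2, 2]
print(H_fuzzy(mu, u, 0.5, 1.0), L_fuzzy(mu, u, lengths, 0.5, 1.0))
```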

Choudhary and Kumar [16] proved some noiseless coding theorems on generalized R-norm entropy. Also, Choudhary and Kumar [17] proposed some coding theorems on generalized Havrda-Charvat and Tsallis entropy. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy.


Jain and Tuteja [23] introduced a coding theorem connected with useful entropy of order \(\alpha\) and type \(\beta\). Tuli [24] introduced mean codeword lengths and their correspondence with entropy measures. Tuli and Sharma [25] proved some new coding theorems and consequently developed some new weighted fuzzy mean codeword lengths corresponding to well-known measures of weighted fuzzy entropy.

In the next section, we prove the fuzzy noiseless coding theorem for binary 1:1 codes and hence show that such codes are less constrained than uniquely decipherable codes.

2. Fuzzy Noiseless Coding Theorem for 1:1 Codes

Theorem 1. The mean codeword length of order \(\alpha\) and type \(\beta\) for a binary 1:1 code satisfies

\[H^\beta_\alpha(A; U) \le L(1{:}1)^\beta_\alpha(U), \tag{11}\]

where

\[H^\beta_\alpha(A; U) = \frac{1}{2^{1-\alpha} - 1}\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta} - 1\right],\]

\[L(1{:}1)^\beta_\alpha(U) = \frac{1}{2^{1-\alpha} - 1}\left[\left\{\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta}\right\}^\alpha - 1\right]. \tag{12}\]

Proof. Consider the following fuzzy information measure:

\[H^\beta_\alpha(A; U) = \frac{1}{2^{1-\alpha} - 1}\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta} - 1\right]. \tag{13}\]

Corresponding to this fuzzy information measure, we introduce the mean codeword length for a 1:1 code:

\[L(1{:}1)^\beta_\alpha(U) = \frac{1}{2^{1-\alpha} - 1}\left[\left\{\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta}\right\}^\alpha - 1\right]. \tag{14}\]

Using Hölder's inequality,

\[\sum_{i=1}^{n} x_i y_i \ge \left(\sum_{i=1}^{n} x_i^p\right)^{1/p} \left(\sum_{i=1}^{n} y_i^q\right)^{1/q}, \tag{15}\]

valid for all \(x_i, y_i > 0\), \(i = 1, 2, \ldots, n\), where \(1/p + 1/q = 1\) with \(p < 1\) (\(p \neq 0\)) and \(q < 0\), or \(q < 1\) (\(q \neq 0\)) and \(p < 0\).

π‘₯𝑖=

(𝑒𝑖{πœ‡π΄(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))})𝛼𝛽/(π›Όβˆ’1) 2βˆ’log((𝑖+2)/2)

βˆ‘π‘›

𝑖=1 (𝑒𝑖 {πœ‡π΄ (π‘₯𝑖) + (1 βˆ’ πœ‡π΄ (π‘₯𝑖))})𝛼𝛽/(π›Όβˆ’1) ,

𝑦𝑖

=

𝑒𝛽/(1βˆ’π›Ό)𝑖

(πœ‡(𝛼+π›½βˆ’1)/(1βˆ’π›Ό)𝐴

(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))(𝛼+π›½βˆ’1)/(1βˆ’π›Ό)

)

βˆ‘π‘›

𝑖=1 (𝑒𝑖 {πœ‡π΄ (π‘₯𝑖) + (1 βˆ’ πœ‡π΄ (π‘₯𝑖))})𝛽/(1βˆ’π›Ό) .

(16)

Then (15) becomes

\[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1}\right\} 2^{-\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1}\right\}} \ge \left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}}\right]^{\alpha/(\alpha-1)} \cdot \left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}}\right]^{1/(1-\alpha)}. \tag{17}\]


Now, we use the following condition:

\[\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1}\right\} D^{-l_i} \le \sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}. \tag{18}\]
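As an aside, condition (18) plays the role of a generalized Kraft inequality for the weighted fuzzy quantities. An illustrative numeric check (hypothetical data; the 1:1-code lengths are taken as \(l_i = \log_2((i+2)/2)\), matching the exponents used in (12)):

```python
import math

def condition_18(mu, u, lengths, beta, D=2):
    """Generalized Kraft-type condition (18): LHS <= RHS."""
    lhs = sum(ui ** beta * (m ** (beta - 1) + (1 - m) ** (beta - 1)) * D ** (-li)
              for m, ui, li in zip(mu, u, lengths))
    rhs = sum(ui ** beta * (m ** beta + (1 - m) ** beta) for m, ui in zip(mu, u))
    return lhs <= rhs

mu, u = [0.3, 0.7, 0.5, 0.6], [1.0, 2.0, 1.5, 1.0]
lengths = [math.log2((i + 2) / 2) for i in range(1, 5)]  # 1:1-code lengths
print(condition_18(mu, u, lengths, beta=1.0))  # True for this data
```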

Thus, the inequality reduces to

\[\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}}\right]^{\alpha/(1-\alpha)} \ge \left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}}\right]^{1/(1-\alpha)}. \tag{19}\]

Now, raising both sides to the power \(1 - \alpha\) (which is positive for \(0 < \alpha < 1\)) and subtracting 1 from both sides, we get

\[\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}}\right]^{\alpha} - 1 \ge \frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\}} - 1. \tag{20}\]

Hence, multiplying both sides by \(1/(2^{1-\alpha} - 1) > 0\), we get

\[\frac{1}{2^{1-\alpha} - 1}\left[\left\{\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta}\right\}^\alpha - 1\right] \ge \frac{1}{2^{1-\alpha} - 1}\left[\frac{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right\}}{\sum_{i=1}^{n} u_i^\beta \left\{\mu_A(x_i) + (1-\mu_A(x_i))\right\}^\beta} - 1\right]. \tag{21}\]

That is, \(L(1{:}1)^\beta_\alpha(U) \ge H^\beta_\alpha(A; U)\), or equivalently \(H^\beta_\alpha(A; U) \le L(1{:}1)^\beta_\alpha(U)\), which proves the result.
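A numeric spot-check of Theorem 1 (illustrative only; the membership values and utilities are hypothetical, and this data also satisfies condition (18)):

```python
import math

def H_fuzzy(mu, u, alpha, beta):
    """H^beta_alpha(A;U) from eq. (13)."""
    num = sum(ui ** beta * (m ** (alpha + beta - 1) + (1 - m) ** (alpha + beta - 1))
              for m, ui in zip(mu, u))
    den = sum(ui ** beta * (m + (1 - m)) ** beta for m, ui in zip(mu, u))
    return (num / den - 1) / (2 ** (1 - alpha) - 1)

def L_one_to_one(mu, u, alpha, beta):
    """L(1:1)^beta_alpha(U) from eq. (14), with l_i = log2((i+2)/2)."""
    num = sum(ui ** beta * (m ** beta + (1 - m) ** beta)
              * 2 ** ((1 - alpha) / alpha * math.log2((i + 2) / 2))
              for i, (m, ui) in enumerate(zip(mu, u), start=1))
    den = sum(ui ** beta * (m + (1 - m)) ** beta for m, ui in zip(mu, u))
    return ((num / den) ** alpha - 1) / (2 ** (1 - alpha) - 1)

mu, u, alpha, beta = [0.3, 0.7, 0.5, 0.6], [1.0, 2.0, 1.5, 1.0], 0.5, 1.0
print(H_fuzzy(mu, u, alpha, beta) <= L_one_to_one(mu, u, alpha, beta))  # True
```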

Theorem 2. The mean codeword length of order \(\alpha\) and type \(\beta\) for a binary 1:1 code satisfies

\[L(1{:}1)^\beta_\alpha(U) < H^\beta_\alpha(A; U)\, 2^{1-\alpha} + 1. \tag{22}\]

Proof. We have

\[2^{-\log((i+2)/2)} \le \frac{\mu_A^\alpha(x_i) + (1-\mu_A(x_i))^\alpha}{\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)\right)\Big/\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\right)}. \tag{23}\]

That is,

\[\log\left(\frac{i+2}{2}\right) \le \log\left[\left(\frac{\mu_A^\alpha(x_i) + (1-\mu_A(x_i))^\alpha}{\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)\right)\Big/\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\right)}\right)^{-1} \cdot 2\right] \tag{24}\]


or

\[2^{\log((i+2)/2)} < \left(\frac{\mu_A^\alpha(x_i) + (1-\mu_A(x_i))^\alpha}{\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)\right)\Big/\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\right)}\right)^{-1} \cdot 2 \tag{25}\]

or

\[2^{((1-\alpha)/\alpha)\log((i+2)/2)} < \left(\frac{\mu_A^\alpha(x_i) + (1-\mu_A(x_i))^\alpha}{\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)\right)\Big/\left(\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\right)}\right)^{(\alpha-1)/\alpha} \cdot 2^{(1-\alpha)/\alpha}. \tag{26}\]

Hence, multiplying both sides by \(u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\Big/\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)\) and summing over \(i = 1, 2, \ldots, n\), we get

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽 2((1βˆ’π›Ό)/𝛼) log((𝑖+2)/2))

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽)

< (

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›Ό+π›½βˆ’1𝐴

(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛼+π›½βˆ’1

)

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽)

)

1/𝛼

β‹… 2(1βˆ’π›Ό)/𝛼

(27)

or

\[\left(\frac{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)}\right)^{\alpha} < \frac{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)}{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)} \cdot 2^{1-\alpha} \tag{28}\]

or

\[\left(\frac{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)}\right)^{\alpha} - 1 < \left(\frac{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1}\right)}{\sum_{i=1}^{n} u_i^\beta\left(\mu_A^\beta(x_i) + (1-\mu_A(x_i))^\beta\right)} - 1\right) \cdot 2^{1-\alpha} + 2^{1-\alpha} - 1. \tag{29}\]

Since 21βˆ’π›Ό βˆ’ 1 > 0 for 0 < 𝛼 < 1 and after suitable operations,we get

121βˆ’π›Ό βˆ’ 1

[

[

[

(

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽 2((1βˆ’π›Ό)/𝛼) log((𝑖+2)/2))

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽)

)

𝛼

βˆ’ 1]]

]

<

121βˆ’π›Ό βˆ’ 1

[

[

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›Ό+π›½βˆ’1𝐴

(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛼+π›½βˆ’1

)

βˆ‘π‘›

𝑖=1 𝑒𝛽

𝑖(πœ‡π›½

𝐴(π‘₯𝑖) + (1 βˆ’ πœ‡

𝐴(π‘₯𝑖))𝛽)

βˆ’ 1]

]

β‹… 21βˆ’π›Ό + 21βˆ’π›Ό βˆ’ 121βˆ’π›Ό βˆ’ 1

(30)


or we can write \(L(1{:}1)^\beta_\alpha(U) < H^\beta_\alpha(A; U)\, 2^{1-\alpha} + 1\), which proves the result.
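A matching spot-check of the upper bound (22) on the same hypothetical data used after Theorem 1 (computed inline so the snippet stands alone):

```python
mu, u, alpha, beta = [0.3, 0.7, 0.5, 0.6], [1.0, 2.0, 1.5, 1.0], 0.5, 1.0
den = sum(ui ** beta * (m + (1 - m)) ** beta for m, ui in zip(mu, u))
H_num = sum(ui ** beta * (m ** (alpha + beta - 1) + (1 - m) ** (alpha + beta - 1))
            for m, ui in zip(mu, u))
# 2**(((1-alpha)/alpha) * log2((i+2)/2)) simplifies to ((i+2)/2)**((1-alpha)/alpha).
L_num = sum(ui ** beta * (m ** beta + (1 - m) ** beta) * ((i + 2) / 2) ** ((1 - alpha) / alpha)
            for i, (m, ui) in enumerate(zip(mu, u), start=1))
H = (H_num / den - 1) / (2 ** (1 - alpha) - 1)
L = ((L_num / den) ** alpha - 1) / (2 ** (1 - alpha) - 1)
print(H <= L < H * 2 ** (1 - alpha) + 1)  # True: bounds (11) and (22) both hold here
```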

3. Conclusion

In modern data transmission and storage systems, error correcting codes are the key ingredients that help in achieving a high degree of reliability. Over a noiseless channel, the problem is the efficient coding of messages: we have to maximize the number of messages that can be sent through the channel in a given time. Thus, we have introduced a mean codeword length of order \(\alpha\) and type \(\beta\) for 1:1 codes and analyzed the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with this fuzzy information measure have been established.

4. Future Research Endeavors

Decision-making problems in general utilize information from the knowledge base of experts, in view of their perception of the system at hand. Besides mathematical studies, algorithms for application in decision-making can also be discussed in extensions of the above research. Other fuzzy information measures can be examined in the light of the fuzzy noiseless coding theorem for binary 1:1 codes and compared. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-norm entropy measures.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] H. Nyquist, "Certain factors affecting telegraph speed," Bell System Technical Journal, vol. 3, pp. 324–346, 1924.

[2] H. Nyquist, "Certain topics in telegraph transmission theory," Transactions of the American Institute of Electrical Engineers, vol. 47, no. 2, pp. 617–644, 1928.

[3] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, no. 3, pp. 535–563, 1928.

[4] C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.

[5] N. Wiener, Cybernetics, MIT Press and John Wiley & Sons, New York, NY, USA, 1948.

[6] A. Rényi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561, University of California Press, 1961.

[7] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, pp. 338–353, 1965.

[8] A. De Luca and S. Termini, "A definition of a non-probabilistic entropy in the setting of fuzzy sets," Information and Control, vol. 20, no. 4, pp. 301–312, 1972.

[9] L. L. Campbell, "A coding theorem and Rényi's entropy," Information and Control, vol. 8, pp. 423–429, 1965.

[10] S. Guiasu and C.-F. Picard, "Borne inférieure de la longueur utile de certains codes," Comptes Rendus de l'Académie des Sciences, Série A, vol. 273, pp. 248–251, 1971.

[11] M. Belis and S. Guiasu, "A quantitative-qualitative measure of information in cybernetic systems," IEEE Transactions on Information Theory, vol. 14, no. 4, pp. 593–594, 1968.

[12] G. Longo, Quantitative-Qualitative Measures of Information, Springer, New York, NY, USA, 1972.

[13] Gurdial and F. Pessoa, "On useful information of order α," Journal of Combinatorics, Information & System Sciences, vol. 2, no. 4, pp. 158–162, 1977.

[14] M. A. K. Baig and M. J. Dar, "Some coding theorems on fuzzy entropy function depending upon parameters R and V," IOSR Journal of Mathematics, vol. 9, no. 6, pp. 119–123, 2014.

[15] M. A. K. Baig and M. J. Dar, "Fuzzy coding theorem on generalized fuzzy cost measure," Asian Journal of Fuzzy and Applied Mathematics, vol. 2, pp. 28–34, 2014.

[16] A. Choudhary and S. Kumar, "Some more noiseless coding theorems on generalized R-norm entropy," Journal of Mathematics Research, vol. 3, no. 1, pp. 125–130, 2011.

[17] A. Choudhary and S. Kumar, "Some coding theorems on generalized Havrda-Charvat and Tsallis entropy," Tamkang Journal of Mathematics, vol. 43, no. 3, pp. 437–444, 2012.

[18] H. C. Taneja and P. K. Bhatia, "On a generalized mean codeword length for the best 1:1 code," Soochow Journal of Mathematics, vol. 16, pp. 1–6, 1990.

[19] O. Parkash and P. K. Sharma, "Noiseless coding theorems corresponding to fuzzy entropies," Southeast Asian Bulletin of Mathematics, vol. 27, no. 6, pp. 1073–1080, 2004.

[20] O. Parkash and P. K. Sharma, "A new class of fuzzy coding theorems," Caribbean Journal of Mathematical and Computing Sciences, vol. 12, pp. 1–10, 2002.

[21] O. Parkash, "A new parametric measure of fuzzy entropy," in Proceedings of the 7th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU '98), pp. 1732–1737, Paris, France, July 1998.

[22] P. Gupta, Niranjan, and H. Arora, "On best 1:1 codes for generalized quantitative-qualitative measure of inaccuracy," African Journal of Mathematics and Computer Science Research, vol. 4, no. 3, pp. 159–163, 2011.

[23] P. Jain and R. K. Tuteja, "On a coding theorem connected with useful entropy of order α and type β," Kybernetika, vol. 23, no. 5, pp. 420–427, 1987.

[24] R. K. Tuli, "Mean codeword lengths and their correspondence with entropy measures," International Journal of Engineering and Natural Sciences, vol. 4, pp. 175–180, 2010.

[25] R. K. Tuli and C. S. Sharma, "Applications of measures of fuzzy entropy to coding theory," American Journal of Mathematics and Sciences, vol. 1, pp. 119–124, 2012.
