Research Article
Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes
H. D. Arora1 and Anjali Dhiman2
1Department of Mathematics, Amity Institute of Applied Sciences, Amity University, Noida 201301, India
2Department of Mathematics, Ideal Institute of Management and Technology, GGSIP University, Delhi 110092, India
Correspondence should be addressed to Anjali Dhiman; [email protected]
Received 3 March 2015; Accepted 1 April 2015
Academic Editor: Ram U. Verma
Copyright © 2015 H. D. Arora and A. Dhiman. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we introduce a mean codeword length of order $\alpha$ and type $\beta$ for 1:1 codes and analyze the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with a fuzzy information measure is established.
1. Introduction
In the 1940s, a new branch of mathematics, Information Theory, was introduced. It considers the problems of how to process, store, and retrieve information, and hence of decision-making; it deals with the study of how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied information measures, and Hartley [3] observed that information measures are logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his research paper "A Mathematical Theory of Communication." Wiener [5] also considered results similar to Shannon's. A large number of applications of Information Theory have since been developed in the social, physical, and biological sciences, for example, in economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer sciences, and fuzzy sets. Later, Rényi [6] introduced the entropy of order $\alpha$, of which Shannon entropy is the limiting case as $\alpha \to 1$.
Fuzzy set theory was introduced by Zadeh [7]. It models the uncertainty occurring in human cognitive processes and has found applications in engineering, the biological sciences, and business. Zadeh introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.
Let $A$ be a fuzzy set in the universe of discourse $\Omega$. We define the membership function $\mu_A : \Omega \to [0, 1]$, where $\mu_A(x)$ is the degree of membership of each $x \in \Omega$.

The following properties, given by De Luca and Termini [8], are to be satisfied by fuzzy entropy:
(1) Fuzzy entropy is minimum iff the set is crisp.
(2) Fuzzy entropy is maximum when each membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) Fuzzy entropy of a set is the same as that of its complement.
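These four properties can be checked numerically against the classical De Luca-Termini entropy $H(A) = -\sum_x [\mu_A(x)\log_2\mu_A(x) + (1-\mu_A(x))\log_2(1-\mu_A(x))]$, their standard definition (not stated explicitly in this paper); the membership grades below are assumed example data:

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy: -sum of m*log2(m) + (1-m)*log2(1-m)."""
    def h(m):
        if m == 0.0 or m == 1.0:   # a crisp element contributes nothing
            return 0.0
        return -(m * math.log2(m) + (1 - m) * math.log2(1 - m))
    return sum(h(m) for m in mu)

fuzzy = [0.3, 0.8, 0.6, 0.1]
assert fuzzy_entropy([0.0, 1.0, 1.0, 0.0]) == 0.0                    # (1) minimum for a crisp set
assert fuzzy_entropy([0.5] * 4) == 4.0                               # (2) maximum at grade 0.5
assert fuzzy_entropy([0.2, 0.9, 0.7, 0.05]) < fuzzy_entropy(fuzzy)   # (3) sharpening lowers it
assert abs(fuzzy_entropy([1 - m for m in fuzzy]) - fuzzy_entropy(fuzzy)) < 1e-9  # (4) complement
```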
Hindawi Publishing Corporation, International Journal of Mathematics and Mathematical Sciences, Volume 2015, Article ID 258675, 6 pages, http://dx.doi.org/10.1155/2015/258675

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is introduced in Information Theory, electrical engineering, mathematics, and computer sciences for the transmission of data through reliable and efficient methods. We have to consider how messages can be coded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time; thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. Let $X$ be a random variable with values $x_i$, $i = 1, 2, \ldots, N$, that generates messages represented by the set $\{a_1, a_2, \ldots, a_D\}$ which need to be transmitted. This set is called the code alphabet, and the sequence assigned to each $x_i$ is called a codeword. The codeword length $n_i$ associated with $x_i$ satisfies Kraft's inequality:

$$D^{-n_1} + D^{-n_2} + \cdots + D^{-n_N} \le 1, \quad (1)$$
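As an illustration, (1) can be verified directly for a concrete code; the binary prefix code below is an assumed example:

```python
def kraft_sum(lengths, D=2):
    """Left-hand side of Kraft's inequality (1): sum of D**-n_i."""
    return sum(D**-n for n in lengths)

# The binary prefix code {0, 10, 110, 111} meets (1) with equality:
assert kraft_sum([1, 2, 3, 3]) == 1.0
# Lengths {1, 1, 2} violate (1), so no uniquely decipherable binary code has them:
assert kraft_sum([1, 1, 2]) > 1
```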
where $D$ is the size of the alphabet. Further, codes are chosen to minimize the average codeword length, given by

$$L = \sum_{i=1}^{N} p_i n_i, \quad (2)$$

where $p_i$ is the probability of occurrence of $x_i$. Now, Shannon's [4] noiseless coding theorem for uniquely decipherable codes is given by

$$\frac{H(P)}{\log D} \le L < \frac{H(P)}{\log D} + 1, \quad (3)$$

giving the lower bound on $L$ in terms of Shannon's entropy $H(P)$.
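A sketch of the bound (3) for the binary case, using the standard Shannon code lengths $n_i = \lceil -\log_2 p_i \rceil$ (the probabilities are assumed example data):

```python
import math

def shannon_entropy(p):
    """H(P) = -sum of p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.4, 0.3, 0.2, 0.1]
n = [math.ceil(-math.log2(pi)) for pi in p]   # Shannon code lengths
L = sum(pi * ni for pi, ni in zip(p, n))      # mean codeword length (2)
H = shannon_entropy(p)

assert sum(2**-ni for ni in n) <= 1           # Kraft's inequality (1) holds
assert H <= L < H + 1                         # bound (3) with D = 2, so log D = 1
```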
Campbell [9] proposed a special exponentiated mean codeword length of order $\alpha$ for uniquely decipherable codes, given by

$$L_\alpha = \frac{\alpha}{1-\alpha} \log_D \left[ \sum_{i=1}^{N} p_i D^{(1-\alpha)n_i/\alpha} \right], \quad (4)$$

and proved a noiseless coding theorem. Campbell showed that its lower bound lies between $H_\alpha(P)$ and $H_\alpha(P) + 1$, where

$$H_\alpha(P) = (1-\alpha)^{-1} \log_D \left[ \sum_{i=1}^{N} p_i^\alpha \right]; \quad \alpha > 0,\ \alpha \ne 1.$$

This is Rényi's measure of entropy of order $\alpha$. As $\alpha \to 1$, it is easily shown that $L_\alpha \to L$ and $H_\alpha(P)$ approaches
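The following sketch evaluates (4) against Rényi's entropy and checks the limiting behaviour as $\alpha \to 1$; the distribution and code lengths are assumed example data:

```python
import math

def renyi_entropy(p, alpha, D=2):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), base D."""
    return math.log(sum(pi**alpha for pi in p), D) / (1 - alpha)

def campbell_length(p, n, alpha, D=2):
    """Campbell's exponentiated mean codeword length (4)."""
    return (alpha / (1 - alpha)) * math.log(
        sum(pi * D**((1 - alpha) * ni / alpha) for pi, ni in zip(p, n)), D)

p = [0.4, 0.3, 0.2, 0.1]
n = [1, 2, 3, 3]                 # lengths of a binary prefix code
assert campbell_length(p, n, 0.5) >= renyi_entropy(p, 0.5)   # Campbell's lower bound
# As alpha -> 1, L_alpha tends to the ordinary mean length sum(p_i * n_i) = 1.9:
assert abs(campbell_length(p, n, 0.9999) - 1.9) < 1e-3
```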
$H(P)$. For uniquely decipherable codes, a weighted average length was given by Guiasu and Picard [10]:

$$L = \frac{\sum_{i=1}^{N} u_i p_i n_i}{\sum_{i=1}^{N} u_i p_i}. \quad (5)$$

This is defined as the average cost of transmitting the letters $x_i$ with probabilities $p_i$ and utilities $u_i$.
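A minimal sketch of (5); the utilities below are assumed example data:

```python
def weighted_mean_length(p, u, n):
    """Guiasu-Picard weighted mean codeword length (5)."""
    return (sum(ui * pi * ni for ui, pi, ni in zip(u, p, n))
            / sum(ui * pi for ui, pi in zip(u, p)))

p = [0.5, 0.25, 0.125, 0.125]
n = [1, 2, 3, 3]
u = [1.0, 3.0, 1.0, 2.0]     # utilities of the letters
# With equal utilities the weighted length reduces to the ordinary mean length (2):
assert weighted_mean_length(p, [1, 1, 1, 1], n) == sum(pi * ni for pi, ni in zip(p, n))
# Here the higher-utility letters carry longer codewords, so the weighted cost rises:
assert weighted_mean_length(p, u, n) > weighted_mean_length(p, [1, 1, 1, 1], n)
```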
A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11], for which Longo [12] gave the lower bound for the useful mean codeword length. Further, a noiseless coding theorem was verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order $\alpha$ in terms of useful measures of information of order $\alpha$. The following information measure was introduced by Belis and Guiasu:

$$H(P, U) = -\sum_{i=1}^{N} u_i p_i \log p_i. \quad (6)$$
Similarly, a quantitative-qualitative information measure was given by Taneja and Tuteja:

$$H\!\left(\frac{P}{Q}, U\right) = \sum_{i=1}^{N} p_i u_i \log\left(\frac{p_i}{q_i}\right). \quad (7)$$
Further, Bhaker and Hooda gave the following information measures:

$$H\!\left(\frac{P}{Q}, U\right) = \frac{\sum_{i=1}^{N} u_i p_i \log(p_i/q_i)}{\sum_{i=1}^{N} u_i p_i},$$

$$H_\alpha\!\left(\frac{P}{Q}, U\right) = \frac{1}{1-\alpha} \cdot \frac{\sum_{i=1}^{N} u_i p_i^{\alpha} q_i^{1-\alpha}}{\sum_{i=1}^{N} u_i p_i}, \quad \alpha \ne 1. \quad (8)$$
Now, Baig and Dar [14, 15] proposed a new fuzzy information measure of order $\alpha$ and type $\beta$:

$$H^{\beta}_{\alpha}(A; U) = \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} - 1 \right]. \quad (9)$$
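Measure (9) can be evaluated directly; in the sketch below the membership grades and utilities are assumed example data, and the check confirms that a crisp set yields zero:

```python
def fuzzy_inaccuracy(mu, u, alpha, beta):
    """H^beta_alpha(A; U) of eq. (9); mu holds membership grades, u utilities."""
    num = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
              for ui, m in zip(u, mu))
    den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
    return (num / den - 1) / (2**(1 - alpha) - 1)

u = [1.0, 2.0, 1.5]
# A crisp set (all grades 0 or 1) carries no fuzzy inaccuracy:
assert fuzzy_inaccuracy([0.0, 1.0, 1.0], u, alpha=0.5, beta=1.0) == 0.0
# A genuinely fuzzy set gives a positive value:
assert fuzzy_inaccuracy([0.3, 0.8, 0.6], u, alpha=0.5, beta=1.0) > 0.0
```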
And for this information measure they introduced the following mean code length of order $\alpha$ and type $\beta$:

$$L^{\beta}_{\alpha}(U) = \frac{1}{2^{1-\alpha}-1} \left[ \left\{ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} D^{((1-\alpha)/\alpha) n_i}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} \right\}^{\alpha} - 1 \right]. \quad (10)$$
Choudhary and Kumar [16] proved a noiseless coding theorem on generalized R-norm entropy, and also proposed some coding theorems on generalized Havrda-Charvat and Tsallis entropy [17]. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy.
Jain and Tuteja [23] introduced a coding theorem connected with useful entropy of order $\alpha$ and type $\beta$. Tuli [24] introduced mean codeword lengths and their correspondence with entropy measures. Tuli and Sharma [25] proved some new coding theorems and consequently developed some new weighted fuzzy mean codeword lengths corresponding to the well-known measures of weighted fuzzy entropy.

In the next section, we prove the fuzzy noiseless theorem for 1:1 codes of binary size and hence prove that they are less constrained.
2. Fuzzy Noiseless Coding Theorem for 1:1 Codes

Theorem 1. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies
$$H^{\beta}_{\alpha}(A; U) \le L(1{:}1)^{\beta}_{\alpha}(U), \quad (11)$$

where

$$H^{\beta}_{\alpha}(A; U) = \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} - 1 \right],$$

$$L(1{:}1)^{\beta}_{\alpha}(U) = \frac{1}{2^{1-\alpha}-1} \left[ \left\{ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} \right\}^{\alpha} - 1 \right]. \quad (12)$$
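As a numerical sanity check of Theorem 1, the sketch below evaluates both sides of (11) for assumed example data, reading the logarithm in (12) as base 2 so that $2^{((1-\alpha)/\alpha)\log((i+2)/2)} = ((i+2)/2)^{(1-\alpha)/\alpha}$:

```python
def H(mu, u, alpha, beta):
    """H^beta_alpha(A; U) of eq. (12)."""
    num = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
              for ui, m in zip(u, mu))
    den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
    return (num / den - 1) / (2**(1 - alpha) - 1)

def L11(mu, u, alpha, beta):
    """L(1:1)^beta_alpha of eq. (12), with the i-th factor ((i+2)/2)**((1-alpha)/alpha)."""
    num = sum(ui**beta * (m**beta + (1 - m)**beta) * ((i + 2) / 2)**((1 - alpha) / alpha)
              for i, (ui, m) in enumerate(zip(u, mu), start=1))
    den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
    return ((num / den)**alpha - 1) / (2**(1 - alpha) - 1)

mu = [0.3, 0.8, 0.6, 0.1]    # membership grades (example data)
u = [1.0, 2.0, 1.5, 1.0]     # utilities (example data)
assert H(mu, u, 0.5, 1.0) <= L11(mu, u, 0.5, 1.0)   # inequality (11)
```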
Proof. Consider the following fuzzy information measure:

$$H^{\beta}_{\alpha}(A; U) = \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} - 1 \right]. \quad (13)$$

Corresponding to this fuzzy information measure, we introduce the following mean codeword length for a 1:1 code:

$$L(1{:}1)^{\beta}_{\alpha}(U) = \frac{1}{2^{1-\alpha}-1} \left[ \left\{ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} \right\}^{\alpha} - 1 \right]. \quad (14)$$
Using Hölder's inequality,

$$\sum_{i=1}^{N} x_i y_i \ge \left( \sum_{i=1}^{N} x_i^{p} \right)^{1/p} \left( \sum_{i=1}^{N} y_i^{q} \right)^{1/q} \quad (15)$$

for all $x_i, y_i > 0$, $i = 1, 2, \ldots, N$; $1/p + 1/q = 1$; $p < 1\ (\ne 0)$, $q < 0$, or $q < 1\ (\ne 0)$, $p < 0$. Let $p = (\alpha-1)/\alpha$, $q = 1-\alpha$,

$$x_i = \frac{\left( u_i \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\} \right)^{\alpha\beta/(\alpha-1)} 2^{-\log((i+2)/2)}}{\sum_{i=1}^{N} \left( u_i \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\} \right)^{\alpha\beta/(\alpha-1)}},$$

$$y_i = \frac{u_i^{\beta/(1-\alpha)} \left( \mu_A^{(\alpha+\beta-1)/(1-\alpha)}(x_i) + (1-\mu_A(x_i))^{(\alpha+\beta-1)/(1-\alpha)} \right)}{\sum_{i=1}^{N} \left( u_i \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\} \right)^{\beta/(1-\alpha)}}. \quad (16)$$
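The reverse Hölder inequality (15), valid for conjugate exponents with $p < 1$ and $q < 0$, can be spot-checked numerically; the positive values below are arbitrary assumed data:

```python
# Reverse Hoelder inequality (15) for p = 1/2, q = -1, so that 1/p + 1/q = 1.
x = [0.2, 1.5, 0.7, 3.0]
y = [1.1, 0.4, 2.0, 0.6]
p, q = 0.5, -1.0

lhs = sum(xi * yi for xi, yi in zip(x, y))
rhs = (sum(xi**p for xi in x)**(1 / p)) * (sum(yi**q for yi in y)**(1 / q))
assert lhs >= rhs    # the inequality sign is reversed relative to the case p > 1
```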
Then (15) becomes

$$\frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1} \right\} 2^{-\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1} \right\}} \ge \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} \right]^{\alpha/(\alpha-1)} \cdot \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} \right]^{1/(1-\alpha)}. \quad (17)$$
Now, we use the following condition:

$$\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta-1}(x_i) + (1-\mu_A(x_i))^{\beta-1} \right\} D^{-n_i} \le \sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}. \quad (18)$$
Thus, the inequality reduces to

$$\left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} \right]^{\alpha/(1-\alpha)} \ge \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} \right]^{1/(1-\alpha)}. \quad (19)$$
Now, we raise both sides to the power $(1-\alpha)$, and we get

$$\left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} \right]^{\alpha} - 1 \ge \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\}} - 1. \quad (20)$$
Hence, we multiply both sides by $1/(2^{1-\alpha}-1) > 0$, and we get

$$\frac{1}{2^{1-\alpha}-1} \left[ \left\{ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right\} 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} \right\}^{\alpha} - 1 \right] \ge \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right\}}{\sum_{i=1}^{N} u_i^{\beta} \left\{ \mu_A(x_i) + (1-\mu_A(x_i)) \right\}^{\beta}} - 1 \right]. \quad (21)$$
That is, $L(1{:}1)^{\beta}_{\alpha}(U) \ge H^{\beta}_{\alpha}(A; U)$. Hence, $H^{\beta}_{\alpha}(A; U) \le L(1{:}1)^{\beta}_{\alpha}(U)$, which proves the result.
Theorem 2. The mean codeword length of order $\alpha$ and type $\beta$ for a binary 1:1 code satisfies

$$L(1{:}1)^{\beta}_{\alpha}(U) < H^{\beta}_{\alpha}(A; U)\, 2^{1-\alpha} + 1. \quad (22)$$
Proof. We have

$$2^{-\log((i+2)/2)} \le \frac{\mu_A^{\alpha}(x_i) + (1-\mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right) \right) \big/ \left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right)}. \quad (23)$$

That is,

$$\log\left(\frac{i+2}{2}\right) \le \log\left[ \left( \frac{\mu_A^{\alpha}(x_i) + (1-\mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right) \right) \big/ \left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right)} \right)^{-1} \cdot 2 \right] \quad (24)$$
or

$$2^{\log((i+2)/2)} < \left( \frac{\mu_A^{\alpha}(x_i) + (1-\mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right) \right) \big/ \left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right)} \right)^{-1} \cdot 2 \quad (25)$$

or

$$2^{((1-\alpha)/\alpha)\log((i+2)/2)} < \left( \frac{\mu_A^{\alpha}(x_i) + (1-\mu_A(x_i))^{\alpha}}{\left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right) \right) \big/ \left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right)} \right)^{(\alpha-1)/\alpha} \cdot 2^{(1-\alpha)/\alpha}. \quad (26)$$
Hence, we multiply both sides by $\left( u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right) \big/ \left( \sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) \right)$ and, summing over $i = 1, 2, \ldots, N$, we get

$$\frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} < \left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} \right)^{1/\alpha} \cdot 2^{(1-\alpha)/\alpha} \quad (27)$$
or

$$\left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} \right)^{\alpha} < \left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} \right) \cdot 2^{1-\alpha} \quad (28)$$
or

$$\left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} \right)^{\alpha} - 1 < \left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} - 1 \right) \cdot 2^{1-\alpha} + 2^{1-\alpha} - 1. \quad (29)$$
Since $2^{1-\alpha} - 1 > 0$ for $0 < \alpha < 1$, after suitable operations we get

$$\frac{1}{2^{1-\alpha}-1} \left[ \left( \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right) 2^{((1-\alpha)/\alpha)\log((i+2)/2)}}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} \right)^{\alpha} - 1 \right] < \frac{1}{2^{1-\alpha}-1} \left[ \frac{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\alpha+\beta-1}(x_i) + (1-\mu_A(x_i))^{\alpha+\beta-1} \right)}{\sum_{i=1}^{N} u_i^{\beta} \left( \mu_A^{\beta}(x_i) + (1-\mu_A(x_i))^{\beta} \right)} - 1 \right] \cdot 2^{1-\alpha} + \frac{2^{1-\alpha}-1}{2^{1-\alpha}-1} \quad (30)$$
or we can write $L(1{:}1)^{\beta}_{\alpha}(U) < H^{\beta}_{\alpha}(A; U)\, 2^{1-\alpha} + 1$, which proves the result.
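A numerical spot-check of the bound (22) for assumed example data, with the logarithm in (12) read as base 2:

```python
mu = [0.3, 0.8, 0.6, 0.1]    # membership grades (example data)
u = [1.0, 2.0, 1.5, 1.0]     # utilities (example data)
alpha, beta = 0.5, 1.0

den = sum(ui**beta * (m + (1 - m))**beta for ui, m in zip(u, mu))
num_H = sum(ui**beta * (m**(alpha + beta - 1) + (1 - m)**(alpha + beta - 1))
            for ui, m in zip(u, mu))
H = (num_H / den - 1) / (2**(1 - alpha) - 1)

# The factor 2**(((1-alpha)/alpha)*log2((i+2)/2)) equals ((i+2)/2)**((1-alpha)/alpha).
num_L = sum(ui**beta * (m**beta + (1 - m)**beta) * ((i + 2) / 2)**((1 - alpha) / alpha)
            for i, (ui, m) in enumerate(zip(u, mu), start=1))
L = ((num_L / den)**alpha - 1) / (2**(1 - alpha) - 1)

assert H <= L < H * 2**(1 - alpha) + 1   # lower bound (11) and upper bound (22) both hold here
```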
3. Conclusion
In modern data transmission and storage systems, the key ingredients that help in achieving a high degree of reliability are error correcting codes. The noiseless channel faces the problem of efficient coding of messages, and hence we have to maximize the number of messages that can be sent through a channel in a given time. Thus, we have introduced a mean codeword length of order $\alpha$ and type $\beta$ for 1:1 codes and analyzed the relationship between average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with a fuzzy information measure have been established.
4. Future Research Endeavors
Decision-making problems in general utilize information from the knowledge base of experts in view of their perception of the system. Besides mathematical studies, algorithms for application in decision-making can also be discussed as an extension of the above research. Other fuzzy information measures can be compared in the light of the fuzzy noiseless theorem and 1:1 codes of binary size. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-norm entropy measures.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] H. Nyquist, "Certain factors affecting telegraph speed," Bell System Technical Journal, pp. 324–325, 1924.
[2] H. Nyquist, "Certain topics in telegraph transmission theory," Transactions of the American Institute of Electrical Engineers, vol. 47, no. 2, pp. 617–644, 1928.
[3] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, no. 3, pp. 535–563, 1928.
[4] C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, no. 3, pp. 379–656, 1948.
[5] N. Wiener, Cybernetics, MIT Press, John Wiley & Sons, New York, NY, USA, 1948.
[6] A. Rényi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561, University of California Press, 1961.
[7] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, pp. 338–353, 1965.
[8] A. De Luca and S. Termini, "A definition of a non-probabilistic entropy in the setting of fuzzy sets," Information and Control, vol. 20, no. 4, pp. 301–312, 1972.
[9] L. L. Campbell, "A coding theorem and Rényi's entropy," Information and Control, vol. 8, pp. 423–429, 1965.
[10] S. Guiasu and C.-F. Picard, "Borne inférieure de la longueur utile de certains codes," Comptes Rendus de l'Académie des Sciences, vol. 273, pp. 248–251, 1971.
[11] M. Belis and S. Guiasu, "A quantitative-qualitative measure of information in cybernetic systems," IEEE Transactions on Information Theory, vol. 14, no. 4, pp. 593–594, 1968.
[12] G. Longo, Quantitative-Qualitative Measures of Information, Springer, New York, NY, USA, 1972.
[13] F. Gurdial and F. Pessoa, "On useful information of order α," Journal of Combinatorics, Information & System Sciences, vol. 2, no. 4, pp. 158–162, 1977.
[14] M. A. K. Baig and M. J. Dar, "Some coding theorems on fuzzy entropy function depending upon parameter R and G," IOSR Journal of Mathematics, vol. 9, no. 6, pp. 119–123, 2014.
[15] M. A. K. Baig and M. J. Dar, "Fuzzy coding theorem on generalized fuzzy cost measure," Asian Journal of Fuzzy and Applied Mathematics, vol. 2, pp. 28–34, 2014.
[16] A. Choudhary and S. Kumar, "Some more noiseless coding theorem on generalized R-norm entropy," Journal of Mathematics Research, vol. 3, no. 1, pp. 125–130, 2011.
[17] A. Choudhary and S. Kumar, "Some coding theorems on generalized Havrda-Charvat and Tsallis entropy," Tamkang Journal of Mathematics, vol. 43, no. 3, pp. 437–444, 2012.
[18] H. C. Taneja and P. K. Bhatia, "On a generalized mean codeword length for the best 1:1 code," Soochow Journal of Mathematics, vol. 16, pp. 1–6, 1990.
[19] O. Parkash and P. K. Sharma, "Noiseless coding theorems corresponding to fuzzy entropies," Southeast Asian Bulletin of Mathematics, vol. 27, no. 6, pp. 1073–1080, 2004.
[20] O. Parkash and P. K. Sharma, "A new class of fuzzy coding theorems," Caribbean Journal of Mathematical and Computing Sciences, vol. 12, pp. 1–10, 2002.
[21] O. Parkash, "A new parametric measure of fuzzy entropy," in Proceedings of the 7th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU '98), pp. 1732–1737, Paris, France, July 1998.
[22] P. Gupta, Niranjan, and H. Arora, "On best 1:1 codes for generalized quantitative-qualitative measure of inaccuracy," African Journal of Mathematics and Computer Science Research, vol. 4, no. 3, pp. 159–163, 2011.
[23] P. Jain and R. K. Tuteja, "On a coding theorem connected with useful entropy of order α and type β," Kybernetika, vol. 23, no. 5, pp. 420–427, 1987.
[24] R. K. Tuli, "Mean codeword lengths and their correspondence with entropy measures," International Journal of Engineering and Natural Sciences, vol. 4, pp. 175–180, 2010.
[25] R. K. Tuli and C. S. Sharma, "Applications of measures of fuzzy entropy to coding theory," American Journal of Mathematics and Sciences, vol. 1, pp. 119–124, 2012.