ABSTRACT
A parametric 'useful' mean length $L_\alpha^\beta(U;P)$ is defined as

$$L_\alpha^\beta(U;P) = \frac{1}{\alpha-1}\left[1 - \sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-n_i(\alpha-1)/\alpha}\right],$$

and its bounds are derived in terms of a 'useful' information measure for the incomplete power distribution $p^\beta$.
Key words: Tsallis's entropy, 'useful' information measure, utilities and power probabilities.
1. INTRODUCTION
Consider the following model for a random experiment $S$,

$$S_N = [E;\, P;\, U] \qquad\ldots(1.1)$$

where $E = (E_1, E_2, \ldots, E_N)$ is a finite system of events happening with respective probabilities $P = (p_1, p_2, \ldots, p_N)$, $p_i \ge 0$, $\sum_{i=1}^{N} p_i = 1$, and credited with utilities $U = (u_1, u_2, \ldots, u_N)$, $u_i > 0$. Belis and Guiasu [6] introduced the 'useful' information measure of this scheme,

$$H(U;P) = -\sum u_i p_i \log p_i, \qquad\ldots(1.2)$$

where $H(U;P)$ reduces to Shannon's [26] entropy when the utility aspect of the scheme is ignored, i.e., when $u_i = 1$ for each $i$. Throughout the paper, $\sum$ will stand for $\sum_{i=1}^{N}$ unless otherwise stated, and logarithms are taken to base $D$ $(D > 1)$.
Guiasu and Picard [10] considered the problem of encoding the outcomes in (1.1) by means of a prefix code with codewords $w_1, w_2, \ldots, w_N$ having lengths $n_1, n_2, \ldots, n_N$ and satisfying Kraft's [18] inequality

$$\sum_{i=1}^{N} D^{-n_i} \le 1, \qquad\ldots(1.3)$$
where $D$ is the size of the code alphabet. The 'useful' mean length $L(U;P)$ of the code was defined as

$$L(U;P) = \frac{\sum u_i n_i p_i}{\sum u_i p_i}, \qquad\ldots(1.4)$$

and the authors obtained bounds for it in terms of $H(U;P)$. Longo [19], Gurdial and Pessoa [12], Autar and Khan [4], and Hooda [14] have studied generalized coding theorems by considering different generalized measures of (1.2) and (1.4) under the condition (1.3) of unique decipherability.
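As a concrete illustration of (1.3) and (1.4), the following sketch (not part of the paper; the probabilities, utilities and codeword lengths are arbitrary sample values) computes the 'useful' mean length of a binary prefix code:

```python
# Illustrative sketch (not from the paper): the 'useful' mean length (1.4)
# of a binary prefix code whose lengths satisfy Kraft's inequality (1.3).
# The probabilities, utilities and lengths are arbitrary sample values.

def kraft_sum(lengths, D=2):
    """Left-hand side of Kraft's inequality (1.3)."""
    return sum(D ** -ni for ni in lengths)

def useful_mean_length(p, u, lengths):
    """L(U;P) = sum(u_i n_i p_i) / sum(u_i p_i), equation (1.4)."""
    num = sum(ui * ni * pi for ui, ni, pi in zip(u, lengths, p))
    den = sum(ui * pi for ui, pi in zip(u, p))
    return num / den

p = [0.5, 0.25, 0.125, 0.125]   # probabilities, sum to 1
u = [4.0, 1.0, 2.0, 1.0]        # utilities u_i > 0
n = [1, 2, 3, 3]                # codeword lengths of a binary prefix code

assert kraft_sum(n) <= 1        # (1.3): a prefix code with these lengths exists
print(useful_mean_length(p, u, n))
```

With all utilities equal to 1 the value collapses to Shannon's mean length $\sum n_i p_i$, in line with the remark after (1.2).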
In this paper, we study some coding theorems by considering a new information measure depending on the parameters $\alpha$ and $\beta$ and a utility function. Our motivation for studying this new measure is that it generalizes both the 'useful' information measures already existing in the literature and Tsallis's [29] entropy.
2. CODING THEOREMS
In this section, we define a generalized 'useful' information measure as:

$$H_\alpha^\beta(U;P) = \frac{1}{\alpha-1}\left[1 - \frac{\sum u_i p_i^{\alpha\beta}}{\sum u_i p_i^\beta}\right], \qquad\ldots(2.1)$$

where $\alpha > 0\ (\ne 1)$, $\beta > 0$ and $\sum p_i = 1$.
(iii) When $u_i = 1$ for each $i$ and $\beta = 1$, (2.1) reduces to the entropy considered by Tsallis [29], i.e.

$$H_\alpha(P) = \frac{1}{\alpha-1}\left[1 - \sum p_i^\alpha\right]. \qquad\ldots(2.4)$$
(v) When $u_i = 1$ for each $i$, i.e., when the utility aspect is ignored, and $\alpha \to 1$, the measure (2.1) reduces to the entropy considered by Mathur and Mitter [21] for the $\beta$-power distribution, i.e.

$$H^\beta(P) = -\frac{\sum p_i^\beta \log p_i^\beta}{\sum p_i^\beta}. \qquad\ldots(2.6)$$
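The reductions (iii) and (v) can be checked numerically. The sketch below (illustrative values, not from the paper) evaluates (2.1) and compares it with (2.4) and with (2.6); note that the $\alpha \to 1$ limit of (2.1) emerges in natural-logarithm units, so the comparison uses $\ln$:

```python
# Numerical check (illustrative): the measure (2.1) reduces to Tsallis's
# entropy (2.4) when u_i = 1 and beta = 1, and tends to the Mathur-Mitter
# beta-entropy (2.6) as alpha -> 1 (the limit produces the natural log).
import math

def H(p, u, alpha, beta):
    """Generalized 'useful' information measure (2.1)."""
    num = sum(ui * pi ** (alpha * beta) for ui, pi in zip(u, p))
    den = sum(ui * pi ** beta for ui, pi in zip(u, p))
    return (1.0 - num / den) / (alpha - 1.0)

def tsallis(p, alpha):
    """Tsallis entropy (2.4)."""
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def mathur_mitter(p, beta):
    """beta-entropy (2.6); natural log, as produced by the alpha -> 1 limit."""
    den = sum(pi ** beta for pi in p)
    return -sum(pi ** beta * math.log(pi ** beta) for pi in p) / den

p = [0.5, 0.3, 0.2]
ones = [1.0, 1.0, 1.0]

# (iii): u_i = 1, beta = 1  ->  Tsallis entropy
assert abs(H(p, ones, alpha=2.0, beta=1.0) - tsallis(p, 2.0)) < 1e-12

# (v): u_i = 1, alpha -> 1  ->  Mathur-Mitter beta-entropy
assert abs(H(p, ones, alpha=1.0 + 1e-7, beta=2.0) - mathur_mitter(p, 2.0)) < 1e-5
```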
Further, consider the following definition.

Definition: The generalized 'useful' mean length $L_\alpha^\beta(U;P)$ with respect to the 'useful' information measure (2.1) is defined as:

$$L_\alpha^\beta(U;P) = \frac{1}{\alpha-1}\left[1 - \sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-n_i(\alpha-1)/\alpha}\right], \qquad\ldots(2.9)$$

where $\alpha > 0\ (\ne 1)$, $\beta > 0$, $p_i > 0$, $\sum p_i = 1$, $i = 1, 2, \ldots, N$.
(i) For $\beta = 1$, (2.9) reduces to the new 'useful' mean length

$$L_\alpha(U;P) = \frac{1}{\alpha-1}\left[1 - \sum \frac{u_i^{1/\alpha}\, p_i}{\sum u_i p_i}\, D^{-n_i(\alpha-1)/\alpha}\right]. \qquad\ldots(2.10)$$
(ii) For $u_i = 1$ for each $i$, (2.9) becomes the new optimal code length

$$L_\alpha^\beta(P) = \frac{1}{\alpha-1}\left[1 - \sum \frac{p_i^\beta}{\sum p_i^\beta}\, D^{-n_i(\alpha-1)/\alpha}\right]. \qquad\ldots(2.11)$$
(iii) For $\beta = 1$, $u_i = 1$ and $\alpha \to 1$, (2.9) reduces to the mean length $L$ considered by Shannon [26], i.e.

$$L = \sum n_i p_i. \qquad\ldots(2.12)$$
(iv) For $\beta = 1$ and $u_i = 1$ for each $i$, (2.9) becomes the new optimal code length

$$L_\alpha(P) = \frac{1}{\alpha-1}\left[1 - \sum p_i\, D^{-n_i(\alpha-1)/\alpha}\right]. \qquad\ldots(2.13)$$
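The limiting case (iii) can be verified numerically. In natural units the $\alpha \to 1$ limit of (2.9) carries a factor $\ln D$, which the paper's convention of base-$D$ logarithms absorbs; the sketch below (illustrative values) makes that explicit:

```python
# Numerical check (illustrative): with beta = 1 and u_i = 1, the alpha -> 1
# limit of the mean length (2.9) recovers Shannon's mean length (2.12),
# up to the factor ln D that base-D logarithms absorb.
import math

D = 2
p = [0.5, 0.25, 0.25]
n = [1, 2, 2]            # binary prefix-code lengths, Kraft sum exactly 1

def L_mean(alpha):
    """(2.9) with beta = 1 and u_i = 1, i.e. the special case (2.13)."""
    s = sum(pi * D ** (-ni * (alpha - 1) / alpha) for pi, ni in zip(p, n))
    return (1 - s) / (alpha - 1)

shannon = sum(ni * pi for ni, pi in zip(n, p))   # Shannon mean length (2.12)
assert abs(L_mean(1 + 1e-7) - math.log(D) * shannon) < 1e-5
```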
$$\sum x_i y_i \ge \left(\sum x_i^p\right)^{1/p} \left(\sum y_i^q\right)^{1/q}. \qquad\ldots(2.16)$$

Taking

$$y_i = \frac{u_i^{1/(1-\alpha)}\, p_i^{\alpha\beta/(1-\alpha)}}{\left(\sum u_i p_i^\beta\right)^{1/(1-\alpha)}},$$

$p = 1 - \frac{1}{\alpha}$ and $q = 1 - \alpha$ in (2.16), and using (1.3), we obtain the result (2.14) after simplification, since $\frac{1}{\alpha-1} > 0$ for $\alpha > 1$.
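The substitution step can be sanity-checked numerically: for $\alpha > 1$ the exponents $p = 1 - 1/\alpha \in (0,1)$ and $q = 1 - \alpha < 0$ are conjugate ($1/p + 1/q = 1$), and the reverse Hölder inequality (2.16) holds for positive sequences. A sketch with arbitrary positive sample values:

```python
# Numerical sanity check (illustrative): the reverse Holder inequality (2.16),
# sum(x_i y_i) >= (sum x_i^p)^(1/p) * (sum y_i^q)^(1/q),
# holds for positive x_i, y_i when p = 1 - 1/alpha is in (0, 1) and
# q = 1 - alpha < 0 (alpha > 1); note that 1/p + 1/q = 1.
alpha = 2.0
p_exp = 1.0 - 1.0 / alpha      # = 0.5, in (0, 1)
q_exp = 1.0 - alpha            # = -1.0, negative conjugate exponent
assert abs(1 / p_exp + 1 / q_exp - 1.0) < 1e-12

x = [0.7, 1.3, 2.1]            # arbitrary positive sequences
y = [0.4, 0.9, 1.6]

lhs = sum(xi * yi for xi, yi in zip(x, y))
rhs = sum(xi ** p_exp for xi in x) ** (1 / p_exp) * \
      sum(yi ** q_exp for yi in y) ** (1 / q_exp)
assert lhs >= rhs
```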
Theorem 2.2. For every code with lengths $\{n_i\}$, $i = 1, 2, \ldots, N$, $L_\alpha^\beta(U;P)$ can be made to satisfy

$$L_\alpha^\beta(U;P) < H_\alpha^\beta(U;P)\, D^{(1-\alpha)/\alpha} + \frac{1}{\alpha-1}\left[1 - D^{(1-\alpha)/\alpha}\right]. \qquad\ldots(2.18)$$
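The bound (2.18) can also be checked numerically by constructing the lengths of (2.19) explicitly. The sketch below is illustrative only and assumes the forms of (2.1), (2.9) and (2.19) as given above; it verifies both Kraft's inequality and the bound for a sample $\alpha > 1$:

```python
# Numerical check (illustrative, alpha > 1): lengths chosen by (2.19)
# make the mean length (2.9) satisfy the upper bound (2.18).
# D is the code-alphabet size; all formulas follow (2.1), (2.9), (2.19).
import math

def bound_check(p, u, alpha, beta, D=2):
    den = sum(ui * pi ** beta for ui, pi in zip(u, p))
    num = sum(ui * pi ** (alpha * beta) for ui, pi in zip(u, p))
    H = (1 - num / den) / (alpha - 1)          # measure (2.1)

    # integer lengths from (2.19): n_i = ceil(-log_D(u_i p_i^(alpha*beta)/num))
    n = [math.ceil(-math.log(ui * pi ** (alpha * beta) / num, D))
         for ui, pi in zip(u, p)]
    assert sum(D ** -ni for ni in n) <= 1 + 1e-12   # Kraft inequality (1.3)

    s = sum(ui ** (1 / alpha) * pi ** beta * D ** (-ni * (alpha - 1) / alpha)
            for ui, pi, ni in zip(u, p, n)) / den
    L = (1 - s) / (alpha - 1)                  # mean length (2.9)

    t = D ** ((1 - alpha) / alpha)
    return L, H * t + (1 - t) / (alpha - 1)    # left and right sides of (2.18)

L, upper = bound_check([0.5, 0.3, 0.2], [2.0, 1.0, 3.0], alpha=2.0, beta=1.0)
assert L < upper
```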
Proof: Let $n_i$ be the positive integer satisfying the inequality

$$-\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} \le n_i < -\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} + 1. \qquad\ldots(2.19)$$

Consider the intervals

$$\delta_i = \left[-\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}},\; -\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} + 1\right] \qquad\ldots(2.20)$$
of length 1. In every $\delta_i$, there lies exactly one positive integer $n_i$ such that

$$0 < -\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} \le n_i < -\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} + 1. \qquad\ldots(2.21)$$

It can be shown that the sequence $\{n_i\}$, $i = 1, 2, \ldots, N$, thus defined satisfies (1.3). From (2.21) we have

$$n_i < -\log_D \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}} + 1$$

$$\Rightarrow\quad D^{-n_i} > \frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}}\, D^{-1}$$
$$\Rightarrow\quad D^{-n_i(\alpha-1)/\alpha} > \left(\frac{u_i p_i^{\alpha\beta}}{\sum u_i p_i^{\alpha\beta}}\right)^{(\alpha-1)/\alpha} D^{(1-\alpha)/\alpha}. \qquad\ldots(2.22)$$

Multiplying both sides of (2.22) by $\frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}$, summing over $i = 1, 2, \ldots, N$ and simplifying, we obtain the result (2.18).
Suppose $\bar{n}_i$ is the unique integer between $n_i$ and $n_i + 1$; then $\{\bar{n}_i\}$ obviously satisfies (1.3). Since $\alpha > 0\ (\ne 1)$, we have
$$\sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-\bar{n}_i(\alpha-1)/\alpha} \le \sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-n_i(\alpha-1)/\alpha} < D \sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-\bar{n}_i(\alpha-1)/\alpha}. \qquad\ldots(2.25)$$
Since

$$\sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-n_i(\alpha-1)/\alpha} = \frac{\sum u_i p_i^{\alpha\beta}}{\sum u_i p_i^\beta},$$
(2.25) becomes

$$\sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-\bar{n}_i(\alpha-1)/\alpha} \le \frac{\sum u_i p_i^{\alpha\beta}}{\sum u_i p_i^\beta} < D \sum \frac{u_i^{1/\alpha}\, p_i^\beta}{\sum u_i p_i^\beta}\, D^{-\bar{n}_i(\alpha-1)/\alpha},$$

which gives the result (2.23).
References
[1] RAYEES AHMAD DAR and M.A.K. BAIG, Coding Theorems on Generalized Cost Measure, Journal of Inequalities in Pure and Applied Mathematics, Vol. 9, Iss. 1, Art. 8, (2008).
[2] J. ACZEL and Z. DAROCZY, Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind, Publ. Math. Debrecen, Vol. 10 (1963), 171-190.
[3] C. ARNDT, Information Measures: Information and its Description in Science and Engineering, Springer, Berlin, (2001).
[4] R. AUTAR and A.B. KHAN, On Generalized 'Useful' Information for Incomplete Distribution, J. of Comb. Information and Syst. Sci., 14(4) (1989), 187-191.
[5] M.A.K. BAIG and RAYEES AHMAD DAR, Some noiseless coding theorems of inaccuracy measure of order α and type β, Sarajevo Journal of Mathematics, Vol. 3(15) (2007), 137-143.
[6] M. BELIS and S. GUIASU, A Quantitative-Qualitative Measure of Information in Cybernetic Systems, IEEE Transactions on Information Theory, IT-14 (1968), 593-594.
[7] L.L.CAMPBELL, A coding theorem and Renyi’s entropy, Information
and Control, Vol. 8 (1965), 423-429.
[8] Z. DAROCZY, Generalized information functions, Information and Control, Vol. 16 (1970), 36-51.
[9] A. FEINSTEIN, Foundations of Information Theory, McGraw Hill, New York, (1956).
[10] S. GUIASU and C.F. PICARD, Borne inférieure de la longueur utile de certains codes, C.R. Acad. Sci. Paris, Ser. A-B 273 (1971), 248-251.
[11] S. GUIASU, Weighted Entropy, Reports on Math. Physics, 2 (1971), 165-179.
[12] GURDIAL and F. PESSOA, On Useful Information of Order α, J. Comb. Information and Syst. Sci., 2 (1977), 30-35.
[13] J. HAVRDA and F. CHARVAT, Quantification Method of Classification Processes, the Concept of Structural α-Entropy, Kybernetika, Vol. 3 (1967), 30-35.
[14] D.S. HOODA, A Coding Theorem on Generalized R-Norm Entropy,
Korean J. Comput. & Applied Math. Vol. 8 (2001), No. 3, 657-664.
[15] D.S. HOODA and A. RAM, Characterization of the generalized R-norm
entropy, Caribbean Journal of Mathematical & Computer Science, Vol.
8, (1998).
[16] D.S. HOODA and U.S. BHAKER, Mean Value Characterization of Useful Information Measures, Tamkang Journal of Mathematics, Vol. 24 (1993), 383-394.
[17] A.B. KHAN and HASSEEN AHMAD, Some noiseless coding theorems of entropy of order α of the power distribution P^β, Metron, 39(3-4) (1981), 87-94.
[18] L.G.KRAFT, A device for quantizing, grouping and coding amplitude
modulated pulses, MS Thesis, Electrical Engineering Department, MIT,
1949.
[19] G. LONGO, A Noiseless Coding Theorem for Sources Having Utilities,
SIAM J. Appl. Math., 30(4) , (1976) ,739-748.
[20] B. McMILLAN, Two inequalities implied by unique decipherability, IEEE Trans. Inform. Theory, IT-2 (1956), 115-116.
[21] J. MITTER and Y.D. MATHUR, Comparison of entropies of power distributions, ZAMM, 52 (1972), 239-240.
[22] OM PARKASH and P.K. SHARMA, Noiseless Coding Theorems Corresponding to Fuzzy Entropies, Southeast Asian Bulletin of Mathematics, 27 (2004), 1073-1080.
[23] S. PIRZADA and B.A. BHAT, Some more results in coding theory, J. KSIAM, Vol. 10, No. 2 (2006), 123-131.
[24] A. RENYI, On Measures of Entropy and Information, Proc. 4th Berkeley Symp. Math. Stat. Prob., Vol. 1 (1961), 547-561.
[25] L. K. ROY, Comparison of Renyi’s entropy of power distribution,
ZAMM, 56(1976), 217-218.
[26] C.E. SHANNON, A Mathematical Theory of Communication, Bell
System Tech., J., 27 (1948) 379-423, 623-636.
[27] O. SHISHA, Inequalities, Academic Press, New York, (1967).
[28] I.J. TANEJA and P. KUMAR, Relative Information of Type s, Csiszar's f-Divergence, and Information Inequalities, Information Sciences, 166 (2004), 105-125.
[29] C. TSALLIS, Possible Generalization of Boltzmann-Gibbs Statistics, J. Stat. Phys., 52 (1988), 479-487.
[30] R.K. TUTEJA and P. GUPTA, On a Coding Theorem connected with 'useful' entropy of order α and type β, Int. J. Math. and Mathematical Sci., Vol. 12 (1989), 193-198.