
IJITE Vol.01 Issue-07 (December, 2013) ISSN: 2321-1776

JOINT AND CONDITIONAL R-NORM INFORMATION MEASURE

*Dr. Meenakshi Darolia

The present paper studies the joint and conditional R-norm information measures of two random variables ξ and η having probability distributions P and Q over the sets X = {x_1, x_2, ..., x_n} and Y = {y_1, y_2, ..., y_m} respectively. The R-norm information of the random variables is denoted by H_R(ξ) = H_R(P) and H_R(η) = H_R(Q), where

p_i = P(\xi = x_i), \quad i = 1, 2, \ldots, n, \qquad q_j = P(\eta = y_j), \quad j = 1, 2, \ldots, m

are the probabilities of the possible values of the random variables. Similarly, we consider a two-dimensional discrete random variable (ξ, η) with joint probability distribution Π = (π_11, π_12, ..., π_nm), where π_ij = P(ξ = x_i, η = y_j), i = 1, 2, ..., n, j = 1, 2, ..., m, is the joint probability for the values (x_i, y_j) of (ξ, η). We shall denote the conditional probabilities by p_ij = P(ξ = x_i | η = y_j) and q_ji = P(η = y_j | ξ = x_i), so that π_ij = p_ij q_j = q_ji p_i, and the marginal probabilities by

p_i = \sum_{j=1}^{m} \pi_{ij} \qquad \text{and} \qquad q_j = \sum_{i=1}^{n} \pi_{ij}.

*Assistant Professor, Isharjyot Degree College Pehova

International Journal in IT and Engineering


http://www.ijmr.net


DEFINITION: The joint R-norm information measure for R ∈ ℝ⁺, R ≠ 1, is given by

H_R(\xi, \eta) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right] \qquad (1.1)
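As an illustrative numerical sketch (not part of the paper), the measure (1.1) can be computed directly from a joint probability table; the helper names `r_norm` and `r_norm_joint` and the example values are ours:

```python
import numpy as np

# H_R(P) = R/(R-1) * [1 - (sum_i p_i^R)^(1/R)], valid for R > 0, R != 1.
def r_norm(p, R):
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

# Joint measure (1.1): the same formula applied to the flattened joint table.
def r_norm_joint(pi, R):
    return r_norm(np.asarray(pi, dtype=float).ravel(), R)

# Example joint distribution pi_ij over a 2x3 alphabet (illustrative values).
pi = np.array([[0.10, 0.20, 0.10],
               [0.25, 0.15, 0.20]])
print(r_norm_joint(pi, 2.0))  # H_R(xi, eta) at R = 2
# Proposition 1 (symmetry): transposing the table leaves the value unchanged.
print(np.isclose(r_norm_joint(pi, 2.0), r_norm_joint(pi.T, 2.0)))  # True
```

Because (1.1) depends on the joint probabilities only through the multiset {π_ij}, swapping the roles of ξ and η (transposing the table) cannot change its value.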

Proposition 1: H_R(ξ, η) is symmetric in ξ and η.

Proof: The joint R-norm information measure is defined by

H_R(\xi, \eta) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right]

= \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} P^{R}(\xi = x_i, \eta = y_j)\right)^{1/R}\right]

= \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m}\sum_{i=1}^{n} P^{R}(\eta = y_j, \xi = x_i)\right)^{1/R}\right]

= H_R(\eta, \xi)

This implies that H_R(ξ, η) is symmetric in ξ, η.


Proposition 2: If ξ and η are stochastically independent, then the following holds:

H_R(\xi, \eta) = H_R(\xi) + H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta) \qquad (1.2)

Proof: Since the joint R-norm information measure for R ∈ ℝ⁺, R ≠ 1, is given by

H_R(\xi, \eta) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right] = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} P^{R}(\xi = x_i, \eta = y_j)\right)^{1/R}\right] \qquad (1.3)

Since ξ and η are stochastically independent, (1.3) becomes

H_R(\xi, \eta) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} P^{R}(\xi = x_i)\right)^{1/R}\left(\sum_{j=1}^{m} P^{R}(\eta = y_j)\right)^{1/R}\right]

Since (\sum_i p_i^R)^{1/R} = 1 - \frac{R-1}{R} H_R(\xi) and (\sum_j q_j^R)^{1/R} = 1 - \frac{R-1}{R} H_R(\eta), this gives

H_R(\xi, \eta) = \frac{R}{R-1}\left[1 - \left(1 - \frac{R-1}{R} H_R(\xi)\right)\left(1 - \frac{R-1}{R} H_R(\eta)\right)\right] = H_R(\xi) + H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta)

Thus finally

H_R(\xi, \eta) = H_R(\xi) + H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta) \qquad (1.4)


In the limiting case R → 1 we find the additive form of Shannon's information measure for independent random variables; i.e., when R → 1 in (1.4), the coefficient (R−1)/R of the product term tends to 0, and we get

H_R(\xi, \eta) = H_R(\xi) + H_R(\eta)
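The identity (1.4) and the R → 1 limit can be checked numerically. This is an illustrative sketch; the function name `r_norm` and the example distributions are ours, not the paper's:

```python
import numpy as np

def r_norm(p, R):
    # H_R(P) = R/(R-1) * [1 - (sum_i p_i^R)^(1/R)]; helper name is ours.
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

p = np.array([0.5, 0.3, 0.2])      # distribution of xi (example values)
q = np.array([0.6, 0.4])           # distribution of eta
pi = np.outer(p, q)                # independence: pi_ij = p_i * q_j
R = 2.5

joint = r_norm(pi.ravel(), R)
rhs = r_norm(p, R) + r_norm(q, R) - (R - 1.0) / R * r_norm(p, R) * r_norm(q, R)
print(np.isclose(joint, rhs))      # identity (1.4): True

# As R -> 1, the product term vanishes and H_R tends to Shannon entropy (nats).
shannon = -np.sum(p * np.log(p))
print(np.isclose(r_norm(p, 1.0 + 1e-6), shannon, atol=1e-4))  # True
```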

To construct a conditional R-norm information measure we can use a direct and an indirect method. The indirect method leads to the next definition.

DEFINITION: The average subtractive conditional R-norm information of η given ξ for R ∈ ℝ⁺, R ≠ 1, is defined as

H_R(\eta/\xi) = H_R(\xi, \eta) - H_R(\xi) \qquad (1.5)

= \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right] - \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}\right]

= \frac{R}{R-1}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right] \qquad (1.6)

Rearranging, H_R(ξ, η) = H_R(η/ξ) + H_R(ξ): the joint measure decomposes into the conditional measure plus the marginal measure.


A direct way to construct a conditional R-norm information measure is the following.

DEFINITION: The average conditional R-norm information of η given ξ is for R ∈ ℝ⁺, R ≠ 1, defined as

{}^{*}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.7)

or alternatively

{}^{**}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.8)

The two conditional measures given in (1.7) and (1.8) differ in the way the probabilities p_i are incorporated. The expression (1.7) is a true mathematical expectation over ξ, since *H_R(η/ξ) = Σ_i p_i H_R(η/ξ = x_i), whereas the expression (1.8), in which the p_i sit inside the R-th root, is not.
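The two measures can be computed from a joint table as follows; this is a sketch assuming the forms of (1.7) and (1.8) above, with function names of our own choosing:

```python
import numpy as np

def r_norm(p, R):
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

def cond_star(pi, R):
    """(1.7): *H_R(eta/xi), a true expectation over xi of the per-row
    R-norm information of the conditional distributions q_ji."""
    pi = np.asarray(pi, dtype=float)
    p = pi.sum(axis=1)                 # marginal p_i of xi
    q_cond = pi / p[:, None]           # rows hold q_ji = pi_ij / p_i
    inner = np.sum(q_cond ** R, axis=1) ** (1.0 / R)
    return R / (R - 1.0) * (1.0 - np.sum(p * inner))

def cond_star2(pi, R):
    """(1.8): **H_R(eta/xi); here p_i is inside the R-th root,
    so the result is not an expectation."""
    pi = np.asarray(pi, dtype=float)
    p = pi.sum(axis=1)
    q_cond = pi / p[:, None]
    inner = np.sum(p * np.sum(q_cond ** R, axis=1))
    return R / (R - 1.0) * (1.0 - inner ** (1.0 / R))

pi = np.array([[0.30, 0.10],
               [0.10, 0.50]])          # a dependent joint distribution
print(cond_star(pi, 2.0), cond_star2(pi, 2.0))  # the two values differ
```

On a dependent joint distribution such as this one the two definitions give different numbers; they coincide when ξ and η are independent.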
Theorem: If ξ and η are statistically independent random variables, then for R ∈ ℝ⁺:

(1) \quad H_R(\eta/\xi) = \frac{R}{R-1}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right]

(2) \quad H_R(\eta/\xi) = H_R(\xi, \eta) - H_R(\xi) = H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta)

(3) \quad {}^{*}H_R(\eta/\xi) = H_R(\eta)

(4) \quad {}^{**}H_R(\eta/\xi) = H_R(\eta)

Proof: (I) Since the average subtractive conditional R-norm information of η given ξ is for R ∈ ℝ⁺ defined as

H_R(\eta/\xi) = \frac{R}{R-1}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R}\right] \qquad (1.9)

Substituting π_ij = p_ij q_j in (1.9), we get

H_R(\eta/\xi) = \frac{R}{R-1}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n}\sum_{j=1}^{m} (p_{ij} q_j)^{R}\right)^{1/R}\right] \qquad (1.10)

Since ξ and η are stochastically independent, p_ij = p_i, thus (1.10) becomes

H_R(\eta/\xi) = \frac{R}{R-1}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right]

which proves (1).

(II) Since we know that if ξ and η are stochastically independent, then the following holds:

H_R(\xi, \eta) = H_R(\xi) + H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta)

so that

H_R(\xi, \eta) - H_R(\xi) = H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta) \qquad (1.11)


And we know

H_R(\eta/\xi) = H_R(\xi, \eta) - H_R(\xi) \qquad (1.12)

Using (1.12) in (1.11), we get

H_R(\eta/\xi) = H_R(\xi, \eta) - H_R(\xi) = H_R(\eta) - \frac{R-1}{R} H_R(\xi) H_R(\eta)

(III) Since the average conditional R-norm information of η given ξ for R ∈ ℝ⁺ is defined as

{}^{*}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.13)

Substituting q_ji = q_j in (1.13), we get

{}^{*}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] = \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] = H_R(\eta), \quad \text{since } \sum_{i=1}^{n} p_i = 1 \qquad (1.14)

Hence {}^{*}H_R(\eta/\xi) = H_R(\eta).

(IV) Since the average conditional R-norm information of η given ξ for R ∈ ℝ⁺ is defined as

{}^{**}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.15)

Substituting q_ji = q_j in (1.15), we get

{}^{**}H_R(\eta/\xi) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] = \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] = H_R(\eta), \quad \text{since } \sum_{i=1}^{n} p_i = 1

Hence {}^{**}H_R(\eta/\xi) = H_R(\eta).
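Parts (2)-(4) of this theorem can be verified numerically under independence. An illustrative sketch, with our own helper names and example distributions:

```python
import numpy as np

def r_norm(p, R):
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

# Under independence pi_ij = p_i * q_j; check parts (2)-(4) at R = 3.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.7, 0.3])
pi = np.outer(p, q)
R = 3.0

subtractive = r_norm(pi.ravel(), R) - r_norm(p, R)          # H_R(eta/xi)
expected = r_norm(q, R) - (R - 1.0) / R * r_norm(p, R) * r_norm(q, R)
print(np.isclose(subtractive, expected))                    # part (2): True

q_cond = pi / pi.sum(axis=1, keepdims=True)                 # rows equal q here
f = np.sum(q_cond ** R, axis=1)                             # sum_j q_ji^R
star = R / (R - 1) * (1 - np.sum(p * f ** (1 / R)))         # (1.7)
star2 = R / (R - 1) * (1 - np.sum(p * f) ** (1 / R))        # (1.8)
print(np.isclose(star, r_norm(q, R)))                       # part (3): True
print(np.isclose(star2, r_norm(q, R)))                      # part (4): True
```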

From this theorem we may conclude that the measure H_R(η/ξ), which is obtained by the formal difference between the joint and the marginal information measure, does not satisfy requirement (I). Therefore it is less attractive than the two other measures. In the next theorem we consider requirement (II) for the conditional information measures *H_R(η/ξ) and **H_R(η/ξ).

Theorem: If ξ and η are discrete random variables, then for R ∈ ℝ⁺ the following results hold:

(I) \quad {}^{*}H_R(\eta/\xi) \le H_R(\eta)

(II) \quad {}^{**}H_R(\eta/\xi) \le H_R(\eta)

(III) \quad {}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi)

(IV) \quad {}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi) \le H_R(\eta)

The equality signs hold if ξ and η are independent.

Proof:


(I) Here we consider two cases.

Case I: when R > 1. We know by [4] that for R > 1

\left[\sum_{j=1}^{m}\left(\sum_{i=1}^{n} x_{ij}\right)^{R}\right]^{1/R} \le \sum_{i=1}^{n}\left(\sum_{j=1}^{m} x_{ij}^{R}\right)^{1/R} \qquad (1.16)

Setting x_ij = π_ij ≥ 0 in (1.16), we have

\left[\sum_{j=1}^{m}\left(\sum_{i=1}^{n} \pi_{ij}\right)^{R}\right]^{1/R} \le \sum_{i=1}^{n}\left(\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R} \qquad (1.17)

Since

q_j = \sum_{i=1}^{n} \pi_{ij} \qquad \text{and} \qquad \pi_{ij} = p_i q_{ji} \qquad (1.18)

Using (1.18) in (1.17), we get

\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \le \sum_{i=1}^{n}\left(\sum_{j=1}^{m} (q_{ji} p_i)^{R}\right)^{1/R} \qquad (1.19)

It can be written as

\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \le \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}

so that

1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \le 1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \qquad (1.20)

But \frac{R}{R-1} > 0 if R > 1. Multiplying both sides of (1.20) by \frac{R}{R-1}, we get

\frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] \qquad (1.21)

But the left-hand side is {}^{*}H_R(\eta/\xi) and the right-hand side is H_R(\eta). Thus (1.21) becomes

{}^{*}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } R > 1 \qquad (1.22)


Case II: when 0 < R < 1. We know by [4] that for 0 < R < 1

\left[\sum_{j=1}^{m}\left(\sum_{i=1}^{n} x_{ij}\right)^{R}\right]^{1/R} \ge \sum_{i=1}^{n}\left(\sum_{j=1}^{m} x_{ij}^{R}\right)^{1/R} \qquad (1.23)

Setting x_ij = π_ij ≥ 0 in (1.23), we have

\left[\sum_{j=1}^{m}\left(\sum_{i=1}^{n} \pi_{ij}\right)^{R}\right]^{1/R} \ge \sum_{i=1}^{n}\left(\sum_{j=1}^{m} \pi_{ij}^{R}\right)^{1/R} \qquad (1.24)

Since q_j = \sum_{i=1}^{n} \pi_{ij} and \pi_{ij} = p_i q_{ji}, (1.24) becomes

\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \ge \sum_{i=1}^{n}\left(\sum_{j=1}^{m} (q_{ji} p_i)^{R}\right)^{1/R} = \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}

so that

1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \ge 1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \qquad (1.25)

But \frac{R}{R-1} < 0 if 0 < R < 1. Multiplying both sides of (1.25) by \frac{R}{R-1} reverses the inequality, and we get

\frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] \qquad (1.26)

But the left-hand side is {}^{*}H_R(\eta/\xi) and the right-hand side is H_R(\eta). Thus (1.26) becomes

{}^{*}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } 0 < R < 1 \qquad (1.27)

Thus from (1.22) and (1.27), we get

{}^{*}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } R \in \mathbb{R}^{+}


(II) Here we consider two cases.

Case I: when R > 1. From Jensen's inequality for R > 1 we find

q_j^{R} = \left(\sum_{i=1}^{n} p_i q_{ji}\right)^{R} \le \sum_{i=1}^{n} p_i q_{ji}^{R} \qquad (1.28)

After summation over j and raising both sides of (1.28) to the power \frac{1}{R}, we have

\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \le \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \qquad (1.29)

Using \frac{R}{R-1} > 0 as R > 1, (1.29) becomes

\frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] \qquad (1.30)

But the left-hand side is {}^{**}H_R(\eta/\xi) and the right-hand side is H_R(\eta). Thus (1.30) becomes

{}^{**}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } R > 1 \qquad (1.31)

Case II: when 0 < R < 1. From Jensen's inequality for 0 < R < 1 we find

q_j^{R} = \left(\sum_{i=1}^{n} p_i q_{ji}\right)^{R} \ge \sum_{i=1}^{n} p_i q_{ji}^{R} \qquad (1.32)

After summation over j and raising both sides of (1.32) to the power \frac{1}{R}, we have

\left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R} \ge \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \qquad (1.33)

Using \frac{R}{R-1} < 0 as 0 < R < 1, (1.33) becomes

\frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \left(\sum_{j=1}^{m} q_j^{R}\right)^{1/R}\right] \qquad (1.34)

But the left-hand side is {}^{**}H_R(\eta/\xi) and the right-hand side is H_R(\eta). Thus (1.34) becomes

{}^{**}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } 0 < R < 1 \qquad (1.35)

Thus from (1.31) and (1.35), we get

{}^{**}H_R(\eta/\xi) \le H_R(\eta) \quad \text{for } R \in \mathbb{R}^{+}

(III) Here we consider two cases.

Case I: when R > 1. We know from Jensen's inequality, since x \mapsto x^{1/R} is concave for R > 1,

\sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \le \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \quad \text{for } R > 1 \qquad (1.36)

Hence

1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \le 1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \qquad (1.37)

Using \frac{R}{R-1} > 0 if R > 1, (1.37) becomes

\frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.38)

But the left-hand side is {}^{**}H_R(\eta/\xi) and the right-hand side is {}^{*}H_R(\eta/\xi). Thus (1.38) becomes

{}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi) \quad \text{for } R > 1 \qquad (1.39)

Case II: when 0 < R < 1. We know from Jensen's inequality, since x \mapsto x^{1/R} is convex for 0 < R < 1,

\sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \ge \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \quad \text{for } 0 < R < 1 \qquad (1.40)

Hence

1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \ge 1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R} \qquad (1.41)

Using \frac{R}{R-1} < 0 if 0 < R < 1, multiplying (1.41) by \frac{R}{R-1} reverses the inequality, and we get

\frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i \sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \le \frac{R}{R-1}\left[1 - \sum_{i=1}^{n} p_i \left(\sum_{j=1}^{m} q_{ji}^{R}\right)^{1/R}\right] \qquad (1.42)

Thus (1.42) becomes

{}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi) \quad \text{for } 0 < R < 1 \qquad (1.43)

Thus from (1.39) and (1.43), we get

{}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi) \quad \text{for } R \in \mathbb{R}^{+}

(IV) From (I) and (III), we have {}^{*}H_R(\eta/\xi) \le H_R(\eta) and {}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi). Thus finally we find

{}^{**}H_R(\eta/\xi) \le {}^{*}H_R(\eta/\xi) \le H_R(\eta) \qquad (1.44)

HENCE PROVED
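The inequality chain (1.44) can be exercised numerically on random joint distributions, across both regimes of R. This is an illustrative sketch; the helper names and the random-test setup are ours:

```python
import numpy as np

def r_norm(p, R):
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

def conditionals(pi, R):
    """Return (*H_R, **H_R) of eta given xi for a joint table pi."""
    p = pi.sum(axis=1)
    q_cond = pi / p[:, None]                  # rows hold q_ji
    f = np.sum(q_cond ** R, axis=1)           # sum_j q_ji^R per row
    star = R / (R - 1) * (1 - np.sum(p * f ** (1 / R)))    # (1.7)
    star2 = R / (R - 1) * (1 - np.sum(p * f) ** (1 / R))   # (1.8)
    return star, star2

rng = np.random.default_rng(0)
for R in (0.5, 2.0, 5.0):                     # both 0 < R < 1 and R > 1
    for _ in range(100):
        pi = rng.random((3, 4))
        pi /= pi.sum()                        # random joint distribution
        star, star2 = conditionals(pi, R)
        h_eta = r_norm(pi.sum(axis=0), R)     # marginal H_R(eta)
        assert star2 <= star + 1e-9 and star <= h_eta + 1e-9  # chain (1.44)
print("inequality chain (1.44) verified")
```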

REFERENCES

[1]. ACZEL, J. AND DAROCZY, Z. (1975), On Measures of Information and their Characterizations, Academic Press, New York.
[2]. ARIMOTO, S. (1971), Information-theoretical considerations on estimation problems, Inform. Contr. 19, 181-194.
[3]. BECKENBACH, E.F. AND BELLMAN, R. (1971), Inequalities, Springer-Verlag, Berlin.
[4]. BOEKEE, D.E. AND VAN DER LUBBE, J.C.A. (1979), Some aspects of error bounds in feature selection, Pattern Recognition 11, 353-360.
[5]. CAMPBELL, L.L. (1965), A coding theorem and Renyi's entropy, Information and Control 8, 423-429.
[6]. DAROCZY, Z. (1970), Generalized information functions, Inform. Contr. 16, 36-51.
[7]. BOEKEE, D.E. AND VAN DER LUBBE, J.C.A. (1980), The R-norm information measure, Information and Control 45, 136-155.
[8]. GYORFI, L. AND NEMETZ, T. (1975), On the dissimilarity of probability measures, in Proceedings Colloq. on Information Theory, Keszthely, Hungary.
[9]. HARDY, G.H., LITTLEWOOD, J.E., AND POLYA, G. (1973), Inequalities, Cambridge Univ. Press, London/New York.
[10]. HAVRDA, J. AND CHARVAT, F. (1967), Quantification method of classification processes: concept of structural a-entropy, Kybernetika 3, 30-35.
[11]. NATH, P. (1975), On a coding theorem connected with Renyi's entropy, Information and Control 29, 234-242.
[12]. SHISHA, O., Inequalities, Academic Press, New York.
[13]. RENYI, A. (1961), On measures of entropy and information, in Proceedings Fourth Berkeley Symp. Math. Statist. Probability, No. 1, pp. 547-561.
[14]. SINGH, R.P. (2008), Some coding theorems for weighted entropy of order …, Journal of Pure and Applied Mathematics Science, New Delhi.
