# Product and Quotient of Independent Gauss Hypergeometric Variables

1 Introduction

A random variable X is said to have a beta (type 1) distribution with parameters $\alpha$ and $\beta$ if its probability density function (pdf) is given by

$$B_1(x; \alpha, \beta) = \frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha, \beta)}, \quad 0 < x < 1, \qquad (1)$$

where $\alpha > 0$ and $\beta > 0$, and

$$B(\alpha, \beta) = \int_0^1 t^{\alpha-1}(1-t)^{\beta-1}\, dt = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}, \quad \mathrm{Re}(\alpha) > 0,\ \mathrm{Re}(\beta) > 0,$$

denotes the beta function. The beta distribution is very versatile, and a variety of uncertainties can be usefully modeled by it. Many of the finite-range distributions encountered in practice can easily be transformed into the standard beta distribution. Several univariate and matrix-variate generalizations of this distribution are given in Gordy [1], Gupta and Nagar [2], Johnson, Kotz and Balakrishnan [3], McDonald and Xu [4], and Nagar and Zarrazola [5]. A natural univariate generalization of the beta distribution is the Gauss hypergeometric distribution defined by Armero and Bayarri [6]. The random variable X is said to have a Gauss hypergeometric distribution, denoted by $X \sim GH(\alpha, \beta, \gamma, \xi)$, if its density function is given by

$$f_{GH}(x; \alpha, \beta, \gamma, \xi) = C(\alpha, \beta, \gamma, \xi)\, \frac{x^{\alpha-1}(1-x)^{\beta-1}}{(1+\xi x)^{\gamma}}, \quad 0 < x < 1, \qquad (2)$$

where $\alpha > 0$, $\beta > 0$, $-\infty < \gamma < \infty$ and $\xi > -1$. The normalizing constant $C(\alpha, \beta, \gamma, \xi)$ is given by

$$\{C(\alpha, \beta, \gamma, \xi)\}^{-1} = B(\alpha, \beta)\, {}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi), \qquad (3)$$

where ${}_2F_1$ represents the Gauss hypergeometric function (Luke [7]). Note that the Gauss hypergeometric function ${}_2F_1$ in (3) can be expanded in series form if $-1 < \xi < 1$. However, if $\xi > 1$, then (7) can be used to rewrite ${}_2F_1$ so that the absolute value of its argument is less than one.
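As a numerical illustration (not part of the original paper), the normalizing constant (3), and the use of (7) when $\xi > 1$, can be checked with scipy; the parameter values below are arbitrary.

```python
from scipy.integrate import quad
from scipy.special import beta as beta_fn, hyp2f1

def gh_norm_const(a, b, g, xi):
    """C(alpha, beta, gamma, xi) from (3).  For xi > 1 the 2F1 argument
    -xi lies outside (-1, 1), so relation (7) is used to move it inside."""
    if xi <= 1:
        f = hyp2f1(a, g, a + b, -xi)
    else:
        f = (1 + xi) ** (-g) * hyp2f1(b, g, a + b, xi / (1 + xi))
    return 1.0 / (beta_fn(a, b) * f)

def gh_pdf(x, a, b, g, xi):
    """Gauss hypergeometric density (2)."""
    return gh_norm_const(a, b, g, xi) * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

# the density should integrate to 1, also in the xi > 1 branch
for xi in (0.5, 2.5):
    total, _ = quad(gh_pdf, 0, 1, args=(2.0, 3.0, 1.5, xi))
    print(xi, total)  # total close to 1
```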

The above distribution was suggested by Armero and Bayarri [6] in connection with the prior distribution of the parameter $\rho$, $0 < \rho < 1$, which represents the traffic intensity in an M/M/1 queueing system. A brief introduction to this distribution is given in the encyclopedic work of Johnson, Kotz and Balakrishnan [3, p. 253]. In the context of Bayesian analysis of unreported Poisson count data, while deriving the marginal posterior distribution of the reporting probability $p$, Fader and Hardie [8] have shown that $q = 1 - p$ has a Gauss hypergeometric distribution. The Gauss hypergeometric distribution has also been used by Dauxois [9] to introduce conjugate priors in Bayesian inference for linear growth birth and death processes. Sarabia and Castillo [10] have pointed out that this distribution is a conjugate prior for the binomial distribution.

The Gauss hypergeometric distribution reduces to a beta type 1 distribution when either $\gamma$ or $\xi$ equals zero. Further, for $\gamma = \alpha + \beta$ and $\xi = 1$, the Gauss hypergeometric distribution simplifies to the beta type 3 distribution given by the density (Cardeno, Nagar and Sanchez [11], Sanchez and Nagar [12])

$$B_3(x; \alpha, \beta) = \frac{2^{\alpha} x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha, \beta)(1+x)^{\alpha+\beta}}, \quad 0 < x < 1, \qquad (4)$$

where $\alpha > 0$ and $\beta > 0$. For $\gamma = \alpha + \beta$ and $\xi = -(1-\lambda)$ it reduces to the three-parameter generalized beta distribution defined by the density

$$B_1(x; \alpha, \beta; \lambda) = \frac{\lambda^{\alpha} x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha, \beta)\,[1-(1-\lambda)x]^{\alpha+\beta}}, \quad 0 < x < 1, \qquad (5)$$

where $\alpha > 0$, $\beta > 0$ and $\lambda > 0$. This distribution was defined and used by Libby and Novick [13] for utility function fitting. The beta distribution sometimes does not provide sufficient flexibility as a prior for the probability of success in a binomial distribution. Among various properties, the distribution defined by the density (5) can account more flexibly for heavy tails or skewness, and it reduces to the ordinary beta (type 1) distribution for $\lambda = 1$. The resulting posterior distribution in this case is a four-parameter type of beta. Chen and Novick [14] provided tables as evidence of its usefulness. Several properties and special cases of this distribution are given in Johnson, Kotz and Balakrishnan [3, p. 251]. For further results and properties, the reader is referred to Aryal and Nadarajah [15], Nadarajah [16], Nagar and Rada-Mora [17], Pham-Gia and Duong [18], and Sarabia and Castillo [10].
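The two special cases above can be verified numerically. A quick sketch (illustrative parameter values, not from the paper) comparing the GH density against (4) and (5):

```python
import numpy as np
from scipy.special import beta as beta_fn, hyp2f1

def gh_pdf(x, a, b, g, xi):
    """Gauss hypergeometric density (2) with constant (3)."""
    c = 1.0 / (beta_fn(a, b) * hyp2f1(a, g, a + b, -xi))
    return c * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

def b3_pdf(x, a, b):
    """Beta type 3 density (4)."""
    return 2**a * x**(a - 1) * (1 - x)**(b - 1) / (beta_fn(a, b) * (1 + x)**(a + b))

def gb_pdf(x, a, b, lam):
    """Three-parameter generalized beta density (5)."""
    return lam**a * x**(a - 1) * (1 - x)**(b - 1) / (beta_fn(a, b) * (1 - (1 - lam) * x)**(a + b))

x = np.linspace(0.05, 0.95, 19)
a, b, lam = 2.0, 3.0, 0.4
err3 = np.max(np.abs(gh_pdf(x, a, b, a + b, 1.0) - b3_pdf(x, a, b)))
err5 = np.max(np.abs(gh_pdf(x, a, b, a + b, -(1 - lam)) - gb_pdf(x, a, b, lam)))
print(err3, err5)  # both near machine precision
```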

In this article, we derive the distributions of the product and the quotient of two independent random variables when at least one of them is Gauss hypergeometric. We also study several properties of this distribution, including Renyi and Shannon entropies.

2 Some Known Definitions and Results

In this section, we give definitions and results that are used in subsequent sections. Throughout this work we use the Pochhammer symbol $(a)_n$ defined by $(a)_n = a(a+1)\cdots(a+n-1) = (a)_{n-1}(a+n-1)$ for $n = 1, 2, \ldots$, with $(a)_0 = 1$. The integral representation of the Gauss hypergeometric function is given as

$${}_2F_1(a, b; c; z) = \frac{\Gamma(c)}{\Gamma(a)\Gamma(c-a)} \int_0^1 \frac{t^{a-1}(1-t)^{c-a-1}}{(1-zt)^{b}}\, dt, \quad \mathrm{Re}(c) > \mathrm{Re}(a) > 0. \qquad (6)$$

Note that, by expanding $(1-zt)^{-b}$, $|zt| < 1$, in (6) and integrating term by term with respect to $t$, the series expansion of ${}_2F_1$ can be obtained as

$${}_2F_1(a, b; c; z) = \sum_{r=0}^{\infty} \frac{(a)_r (b)_r}{(c)_r} \frac{z^r}{r!}, \quad |z| < 1.$$

The Gauss hypergeometric function ${}_2F_1(a, b; c; z)$ satisfies Euler's (Pfaff's) relation

$${}_2F_1(a, b; c; z) = (1-z)^{-b}\, {}_2F_1\!\left(c-a, b; c; \frac{z}{z-1}\right). \qquad (7)$$

For properties and further results the reader is referred to Luke .
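For $|z| < 1$ the series above can be summed directly; a small illustrative check against scipy's implementation (arbitrary argument values):

```python
from scipy.special import hyp2f1

def hyp2f1_series(a, b, c, z, terms=80):
    """Partial sum of the 2F1 series, updating successive terms via the
    Pochhammer recurrence (a)_{r+1} = (a)_r (a + r)."""
    total, term = 0.0, 1.0
    for r in range(terms):
        total += term
        term *= (a + r) * (b + r) / ((c + r) * (r + 1)) * z
    return total

a, b, c, z = 0.5, 1.2, 2.3, 0.4
print(hyp2f1_series(a, b, c, z), hyp2f1(a, b, c, z))  # agree
```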

The Appell first hypergeometric function $F_1$ is defined by

$$F_1(a, b_1, b_2; c; z_1, z_2) = \sum_{r=0}^{\infty} \sum_{s=0}^{\infty} \frac{(a)_{r+s}(b_1)_r (b_2)_s}{(c)_{r+s}} \frac{z_1^r z_2^s}{r!\, s!}, \qquad (8)$$

where $|z_1| < 1$ and $|z_2| < 1$. It is straightforward to show that

$$F_1(a, b_1, 0; c; z_1, z_2) = {}_2F_1(a, b_1; c; z_1), \qquad (9)$$

where ${}_2F_1$ is the Gauss hypergeometric series. Using the results

$$\int_0^1 t^{a+r+s-1}(1-t)^{c-a-1}\, dt = B(a+r+s, c-a) = B(a, c-a)\,\frac{(a)_{r+s}}{(c)_{r+s}},$$

where $\mathrm{Re}(c) > \mathrm{Re}(a) > 0$, and

$$(1-z_i t)^{-b_i} = \sum_{r=0}^{\infty} (b_i)_r \frac{(z_i t)^r}{r!}, \quad |z_i t| < 1,$$

in (8), one obtains

$$F_1(a, b_1, b_2; c; z_1, z_2) = \frac{\Gamma(c)}{\Gamma(a)\Gamma(c-a)} \int_0^1 t^{a-1}(1-t)^{c-a-1}(1-z_1 t)^{-b_1}(1-z_2 t)^{-b_2}\, dt, \qquad (10)$$

where $\mathrm{Re}(c) > \mathrm{Re}(a) > 0$. Note that for $b_1 = 0$, $F_1$ reduces to a ${}_2F_1$ function. For properties and further results on these functions the reader is referred to Srivastava and Karlsson [19].
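Both the double series (8) and the Euler-type integral representation (10) of $F_1$ are available numerically through mpmath; a quick consistency check with illustrative parameter values:

```python
import mpmath as mp

a, b1, b2, c = mp.mpf('1.5'), mp.mpf('0.7'), mp.mpf('1.1'), mp.mpf('3.2')
z1, z2 = mp.mpf('0.3'), mp.mpf('0.5')

# Euler-type integral representation (10)
integral = mp.gamma(c) / (mp.gamma(a) * mp.gamma(c - a)) * mp.quad(
    lambda t: t**(a - 1) * (1 - t)**(c - a - 1)
              * (1 - z1 * t)**(-b1) * (1 - z2 * t)**(-b2), [0, 1])

series = mp.appellf1(a, b1, b2, c, z1, z2)  # double series (8)
print(series, integral)  # agree
```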

Lemma 2.1. Let

$$I(a, b, d, p, q; \theta_1, \theta_2, z) = \int_0^1 \frac{v^{a-1}(1-zv)^{b-1}(1-v)^{d-1}}{(1+\theta_1 zv)^{p}(1+\theta_2 v)^{q}}\, dv. \qquad (11)$$

Then, for $a > 0$, $d > 0$ and $0 < z < 1$, we have

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

Proof. Writing

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]

in (11) and substituting $v = 1 - w$, we obtain

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]. (12)

Now, using the definition of the Appell's first hypergeometric function, we get the desired result.

3 Properties

The graph of the Gauss hypergeometric density for different values of the parameters is shown in Figure 1.

[FIGURE 1 OMITTED]

The k-th moment of a random variable X having the Gauss hypergeometric distribution, obtained in Armero and Bayarri [6], can be calculated as

$$E(X^k) = \frac{B(\alpha+k, \beta)}{B(\alpha, \beta)}\, \frac{{}_2F_1(\alpha+k, \gamma; \alpha+\beta+k; -\xi)}{{}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi)}, \qquad (13)$$

from which the expected value and the variance of this distribution are obtained as

$$E(X) = \frac{\alpha}{\alpha+\beta}\, \frac{{}_2F_1(\alpha+1, \gamma; \alpha+\beta+1; -\xi)}{{}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi)}, \qquad \mathrm{var}(X) = E(X^2) - [E(X)]^2,$$

where, from (13), $E(X^2) = \dfrac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}\, \dfrac{{}_2F_1(\alpha+2, \gamma; \alpha+\beta+2; -\xi)}{{}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi)}$.
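The moment formula (13) can be cross-checked against direct numerical integration. The closed form below is our reading of (13), namely a ratio of beta functions times a ratio of ${}_2F_1$ values, so treat it as an assumption:

```python
from scipy.integrate import quad
from scipy.special import beta as beta_fn, hyp2f1

def gh_moment(k, a, b, g, xi):
    """E[X^k] for X ~ GH(a, b, g, xi); our reading of (13)."""
    return (beta_fn(a + k, b) * hyp2f1(a + k, g, a + b + k, -xi)) / \
           (beta_fn(a, b) * hyp2f1(a, g, a + b, -xi))

a, b, g, xi = 2.0, 3.0, 1.5, 0.5
c = 1.0 / (beta_fn(a, b) * hyp2f1(a, g, a + b, -xi))
pdf = lambda x: c * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

m1, _ = quad(lambda x: x * pdf(x), 0, 1)       # E[X] by quadrature
m2, _ = quad(lambda x: x**2 * pdf(x), 0, 1)    # E[X^2] by quadrature
print(gh_moment(1, a, b, g, xi), m1)
print(gh_moment(2, a, b, g, xi), m2)
```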

Moreover, the cumulative distribution function (CDF) can be derived in terms of special functions as shown in the following theorem.

Theorem 3.1. Let $X \sim GH(\alpha, \beta, \gamma, \xi)$. Then, the CDF of X is given by

$$P(X \le x) = \frac{x^{\alpha}}{\alpha B(\alpha, \beta)}\, \frac{F_1(\alpha, 1-\beta, \gamma; \alpha+1; x, -\xi x)}{{}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi)}, \quad 0 < x < 1.$$

Proof. The CDF of X is evaluated as

$$P(X \le x) = C(\alpha, \beta, \gamma, \xi) \int_0^x \frac{t^{\alpha-1}(1-t)^{\beta-1}}{(1+\xi t)^{\gamma}}\, dt.$$

By making the substitution u = t/x, we can express the above integral as

$$C(\alpha, \beta, \gamma, \xi)\, x^{\alpha} \int_0^1 \frac{u^{\alpha-1}(1-u)^{(\alpha+1)-\alpha-1}}{(1-xu)^{1-\beta}(1+\xi xu)^{\gamma}}\, du.$$

Now, using the integral representation (10) of [F.sub.1], substituting for C([alpha], [beta], [gamma], [xi]) from (3) and simplifying, we obtain the desired result.
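The closed form of Theorem 3.1 can be checked numerically with mpmath's `appellf1` (we read the $\epsilon$ in the scanned statement as $\xi$; parameter values are illustrative):

```python
import mpmath as mp

def gh_cdf(x, a, b, g, xi):
    """CDF of GH(a, b, g, xi) as in Theorem 3.1 (our reading)."""
    num = x**a * mp.appellf1(a, 1 - b, g, a + 1, x, -xi * x)
    den = a * mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi)
    return num / den

a, b, g, xi = 2.0, 2.5, 1.5, 0.5
c = 1 / (mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi))
# direct numerical integration of the density (2) up to x = 0.6
direct = mp.quad(lambda t: c * t**(a - 1) * (1 - t)**(b - 1) / (1 + xi * t)**g,
                 [0, 0.6])
print(gh_cdf(0.6, a, b, g, xi), direct)  # agree
```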

The following theorem gives a generalized beta type 2 distribution derived from the Gauss hypergeometric distribution.

Theorem 3.2. Let $X \sim GH(\alpha, \beta, \gamma, \xi)$. Then, the pdf of the random variable $Y = X/(1-X)$ is given by

$$C(\alpha, \beta, \gamma, \xi)\, \frac{y^{\alpha-1}(1+y)^{\gamma-(\alpha+\beta)}}{[1+(1+\xi)y]^{\gamma}}, \quad y > 0, \qquad (14)$$

where $C(\alpha, \beta, \gamma, \xi)$ is the normalizing constant given by (3).

Proof. Making the transformation $Y = X/(1-X)$, with the Jacobian $J(x \to y) = (1+y)^{-2}$, in (2), we get the desired result.

As expected, if in the density (14) we take $\gamma = 0$ or $\xi = 0$, with $\beta > \gamma$, then we obtain the beta type 2 density.
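A numerical sanity check of the density (14), with the denominator read as $[1+(1+\xi)y]^{\gamma}$ since $1+\xi x = [1+(1+\xi)y]/(1+y)$ under $x = y/(1+y)$ (illustrative parameter values):

```python
import mpmath as mp

a, b, g, xi = 2.0, 3.0, 1.5, 0.5
c = 1 / (mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi))

# density of Y = X/(1 - X) when X ~ GH(a, b, g, xi), per (14)
pdf_y = lambda y: c * y**(a - 1) * (1 + y)**(g - (a + b)) / (1 + (1 + xi) * y)**g

total = mp.quad(pdf_y, [0, mp.inf])
print(total)  # ~ 1
```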

4 Entropies

In this section, exact forms of Renyi and Shannon entropies are obtained for the Gauss hypergeometric distribution defined in Section 1.

Let $(\mathcal{X}, \mathfrak{B}, P)$ be a probability space. Consider a pdf f associated with P, dominated by a $\sigma$-finite measure $\mu$ on $\mathcal{X}$. Denote by $H_{SH}(f)$ the well-known Shannon entropy introduced in Shannon [20]. It is defined by

$$H_{SH}(f) = -\int_{\mathcal{X}} f(x) \log f(x)\, d\mu. \qquad (15)$$

One of the main extensions of the Shannon entropy was defined by Renyi [21]. This generalized entropy measure is given by

$$H_R(\eta, f) = \frac{\log G(\eta)}{1-\eta}, \quad \eta > 0,\ \eta \neq 1, \qquad (16)$$

where

$$G(\eta) = \int_{\mathcal{X}} f^{\eta}\, d\mu.$$

The additional parameter $\eta$ is used to describe complex behavior in probability models and the associated process under study. Renyi entropy is monotonically decreasing in $\eta$, while Shannon entropy (15) is obtained from (16) in the limit $\eta \uparrow 1$. For details see Nadarajah and Zografos [22], Zografos [23], and Zografos and Nadarajah [24].

First, we give the following lemma useful in deriving these entropies.

Lemma 4.1. Let $g(\alpha, \beta, \gamma, \xi) = \lim_{\eta \to 1} h(\eta)$, where

$$h(\eta) = \frac{d}{d\eta}\, {}_2F_1(\eta(\alpha-1)+1, \eta\gamma; \eta(\alpha+\beta-2)+2; -\xi). \qquad (17)$$

Then, for $-1 < \xi < 1$, we have

$$g(\alpha, \beta, \gamma, \xi) = \sum_{j=1}^{\infty} \frac{(\alpha)_j (\gamma)_j}{(\alpha+\beta)_j} \frac{(-\xi)^j}{j!} \Big\{ (\alpha-1)\big[\psi(\alpha+j)-\psi(\alpha)\big] + \gamma\big[\psi(\gamma+j)-\psi(\gamma)\big] - (\alpha+\beta-2)\big[\psi(\alpha+\beta+j)-\psi(\alpha+\beta)\big] \Big\} \qquad (18)$$

and for $\xi \ge 1$,

$$g(\alpha, \beta, \gamma, \xi) = (1+\xi)^{-\gamma} \Big[ -\gamma \log(1+\xi)\, {}_2F_1\Big(\beta, \gamma; \alpha+\beta; \frac{\xi}{1+\xi}\Big) + \sum_{j=1}^{\infty} \frac{(\beta)_j (\gamma)_j}{(\alpha+\beta)_j} \frac{1}{j!} \Big(\frac{\xi}{1+\xi}\Big)^{j} \Big\{ (\beta-1)\big[\psi(\beta+j)-\psi(\beta)\big] + \gamma\big[\psi(\gamma+j)-\psi(\gamma)\big] - (\alpha+\beta-2)\big[\psi(\alpha+\beta+j)-\psi(\alpha+\beta)\big] \Big\} \Big], \qquad (19)$$

where $\psi(z) = \Gamma'(z)/\Gamma(z)$ is the digamma function.

Proof. Using the series expansion of ${}_2F_1$, for $-1 < \xi < 1$, we write

$$h(\eta) = \sum_{j=1}^{\infty} \Delta_j(\eta)\, \frac{d}{d\eta} \log \Delta_j(\eta), \qquad (20)$$

where

$$\Delta_j(\eta) = \frac{(\eta(\alpha-1)+1)_j\, (\eta\gamma)_j}{(\eta(\alpha+\beta-2)+2)_j}\, \frac{(-\xi)^j}{j!}.$$

Now, differentiating the logarithm of $\Delta_j(\eta)$ with respect to $\eta$, we arrive at

$$\frac{d}{d\eta} \log \Delta_j(\eta) = (\alpha-1)\big[\psi(\eta(\alpha-1)+1+j) - \psi(\eta(\alpha-1)+1)\big] + \gamma\big[\psi(\eta\gamma+j) - \psi(\eta\gamma)\big] - (\alpha+\beta-2)\big[\psi(\eta(\alpha+\beta-2)+2+j) - \psi(\eta(\alpha+\beta-2)+2)\big]. \qquad (21)$$

Finally, substituting (21) in (20) and taking $\eta \to 1$, we obtain (18). To obtain (19), we use (7) to write

$${}_2F_1(\eta(\alpha-1)+1, \eta\gamma; \eta(\alpha+\beta-2)+2; -\xi) = (1+\xi)^{-\eta\gamma}\, {}_2F_1\Big(\eta(\beta-1)+1, \eta\gamma; \eta(\alpha+\beta-2)+2; \frac{\xi}{1+\xi}\Big) \qquad (22)$$

and proceed similarly.

Theorem 4.1. For the Gauss hypergeometric distribution defined by the pdf (2), the Renyi and Shannon entropies are given by

$$H_R(\eta, f) = \frac{1}{1-\eta} \Big[ \eta \log C(\alpha, \beta, \gamma, \xi) + \log B(\eta(\alpha-1)+1, \eta(\beta-1)+1) + \log {}_2F_1(\eta(\alpha-1)+1, \eta\gamma; \eta(\alpha+\beta-2)+2; -\xi) \Big] \qquad (23)$$

and

$$H_{SH}(f) = -\log C(\alpha, \beta, \gamma, \xi) - (\alpha-1)\psi(\alpha) - (\beta-1)\psi(\beta) + (\alpha+\beta-2)\psi(\alpha+\beta) - \frac{g(\alpha, \beta, \gamma, \xi)}{{}_2F_1(\alpha, \gamma; \alpha+\beta; -\xi)}, \qquad (24)$$

respectively, where for $-1 < \xi < 1$, $g(\alpha, \beta, \gamma, \xi)$ is given by (18), and for $\xi \ge 1$, $g(\alpha, \beta, \gamma, \xi)$ is given by (19).

Proof. For $\eta > 0$ and $\eta \neq 1$, using the density of X given by (2), we have

$$G(\eta) = [C(\alpha, \beta, \gamma, \xi)]^{\eta} \int_0^1 \frac{x^{\eta(\alpha-1)}(1-x)^{\eta(\beta-1)}}{(1+\xi x)^{\eta\gamma}}\, dx = [C(\alpha, \beta, \gamma, \xi)]^{\eta}\, B(\eta(\alpha-1)+1, \eta(\beta-1)+1)\, {}_2F_1(\eta(\alpha-1)+1, \eta\gamma; \eta(\alpha+\beta-2)+2; -\xi),$$

where the last equality has been obtained by using (3). Now, taking the logarithm of $G(\eta)$ and using (16), we get (23). The Shannon entropy is obtained from (23) by taking $\eta \uparrow 1$ and using L'Hopital's rule.
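The entropy expressions can be probed numerically: $G(\eta)$ by quadrature, $H_R$ from (16), and the Shannon entropy directly from (15). As stated, $H_R(\eta, f)$ approaches $H_{SH}(f)$ as $\eta \to 1$ and decreases in $\eta$ (illustrative parameter values):

```python
import mpmath as mp

a, b, g, xi = 2.0, 3.0, 1.5, 0.5
c = 1 / (mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi))
f = lambda x: c * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

def renyi(eta):
    G = mp.quad(lambda x: f(x)**eta, [0, 1])   # G(eta) from (16)
    return mp.log(G) / (1 - eta)

shannon = -mp.quad(lambda x: f(x) * mp.log(f(x)), [0, 1])  # (15)
print(renyi(0.999), shannon)    # nearly equal
print(renyi(0.5) > renyi(2.0))  # monotonically decreasing in eta
```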

5 Distribution of The Product

In this section, we obtain distributional results for the product of two independent random variables involving the Gauss hypergeometric distribution.

Theorem 5.1. Let $X_1$ and $X_2$ be independent, $X_i \sim GH(\alpha_i, \beta_i, \gamma_i, \xi_i)$, $i = 1, 2$. Then, the pdf of $Z = X_1X_2$ is given by

$$f(z) = \frac{C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, B(\beta_1, \beta_2)\, z^{\alpha_1-1}(1-z)^{\beta_1+\beta_2-1}}{(1+\xi_1 z)^{\gamma_1}(1+\xi_2)^{\gamma_2}} \sum_{r=0}^{\infty} \frac{(\alpha_1+\beta_1-\alpha_2-\gamma_1)_r (\beta_2)_r}{(\beta_1+\beta_2)_r} \frac{(1-z)^r}{r!}\, F_1\Big(\beta_2+r, \gamma_1, \gamma_2; \beta_1+\beta_2+r; \frac{1-z}{1+\xi_1 z}, \frac{\xi_2(1-z)}{1+\xi_2}\Big), \quad 0 < z < 1.$$

Proof. By independence, the joint pdf of $X_1$ and $X_2$ is given by

$$f(x_1, x_2) = \prod_{i=1}^{2} C(\alpha_i, \beta_i, \gamma_i, \xi_i)\, \frac{x_i^{\alpha_i-1}(1-x_i)^{\beta_i-1}}{(1+\xi_i x_i)^{\gamma_i}}, \quad 0 < x_1, x_2 < 1. \qquad (25)$$

Transforming $Z = X_1X_2$, $X_2 = X_2$, with the Jacobian $J(x_1, x_2 \to z, x_2) = 1/x_2$, we obtain the joint pdf of Z and $X_2$ as

$$f(z, x_2) = C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, \frac{z^{\alpha_1-1} x_2^{\alpha_2-\alpha_1-1}(1-x_2)^{\beta_2-1}(1-z/x_2)^{\beta_1-1}}{(1+\xi_1 z/x_2)^{\gamma_1}(1+\xi_2 x_2)^{\gamma_2}}, \quad 0 < z < x_2 < 1. \qquad (26)$$

To find the marginal pdf of Z, we integrate (26) with respect to $x_2$ to get

$$f(z) = C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, z^{\alpha_1-1} \int_z^1 \frac{x_2^{\alpha_2-\alpha_1-1}(1-x_2)^{\beta_2-1}(1-z/x_2)^{\beta_1-1}}{(1+\xi_1 z/x_2)^{\gamma_1}(1+\xi_2 x_2)^{\gamma_2}}\, dx_2, \quad 0 < z < 1. \qquad (27)$$

In (27), the change of variable $w = (1-x_2)/(1-z)$ yields

$$f(z) = \frac{C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, z^{\alpha_1-1}(1-z)^{\beta_1+\beta_2-1}}{(1+\xi_1 z)^{\gamma_1}(1+\xi_2)^{\gamma_2}} \int_0^1 w^{\beta_2-1}(1-w)^{\beta_1-1}\big[1-(1-z)w\big]^{\alpha_2-\alpha_1-\beta_1+\gamma_1} \Big[1-\frac{(1-z)w}{1+\xi_1 z}\Big]^{-\gamma_1} \Big[1-\frac{\xi_2(1-z)w}{1+\xi_2}\Big]^{-\gamma_2} dw,$$

where the last step has been obtained by expanding $[1-(1-z)w]^{\alpha_2-\alpha_1-\beta_1+\gamma_1}$ in a power series. Finally, applying (10), we obtain the desired result.
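The product density can also be obtained by direct numerical marginalization of the joint pdf, which gives an independent check that it integrates to one (illustrative parameter values; the GH densities are built from (2) and (3)):

```python
import mpmath as mp

def gh_pdf(a, b, g, xi):
    """Return the GH(a, b, g, xi) density (2) as a callable."""
    c = 1 / (mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi))
    return lambda x: c * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

f1 = gh_pdf(2.0, 3.0, 1.5, 0.5)
f2 = gh_pdf(1.5, 2.5, 0.8, 0.3)

# density of Z = X1*X2: integrate the joint pdf of (Z, X2) over x2 in (z, 1)
f_z = lambda z: mp.quad(lambda x2: f1(z / x2) * f2(x2) / x2, [z, 1])

total = mp.quad(f_z, [0, 1])
print(total)  # ~ 1
```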

Corollary 5.1.1. Let $X_1 \sim GH(\alpha_1, \beta_1, \gamma_1, \xi_1)$ and $X_2 \sim B_1(\alpha_2, \beta_2)$ be independent. Then, the pdf of $Z = X_1X_2$ is

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

Proof. Substituting $\gamma_2 = 0$ in Theorem 5.1 and using (9), we get the desired result.

Corollary 5.1.2. Let the random variables $X_1$ and $X_2$ be independent, $X_1 \sim B_1(\alpha_1, \beta_1)$ and $X_2 \sim B_3(\alpha_2, \beta_2)$. If $\alpha_2 = \alpha_1 + \beta_1$, then the pdf of $Z = X_1X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

The graphs of the pdf of $Z = X_1X_2$, where $X_1$ and $X_2$ are independent, $X_1 \sim GH(\alpha_1, \beta_1, \gamma_1, \xi_1)$ and $X_2 \sim B_1(\alpha_2, \beta_2)$, for different values of $(\alpha_1, \beta_1, \gamma_1, \xi_1, \alpha_2, \beta_2)$ are shown in Figure 2. Further, Figure 3 depicts graphs of the density of $Z = X_1X_2$, where $X_1$ and $X_2$ are independent, $X_1 \sim B_1(\alpha_1, \beta_1)$ and $X_2 \sim B_3(\alpha_1+\beta_1, \beta_2)$, for different values of $(\alpha_1, \beta_1, \beta_2)$.

[FIGURE 2 OMITTED]

[FIGURE 3 OMITTED]

Theorem 5.2. Let the random variables $X_1$ and $X_2$ be independent, $X_1 \sim GH(\alpha_1, \beta_1, \gamma_1, \xi_1)$ and $X_2 \sim B_2(\alpha_2, \beta_2)$. Then, the pdf of $Z = X_1X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

Proof. Since [X.sub.1] and [X.sub.2] are independent, their joint pdf is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII]

Now consider the transformation $Z = X_1X_2$, $W = 1 - X_1$, whose Jacobian is $J(x_1, x_2 \to w, z) = 1/(1-w)$. Thus, we obtain the joint pdf of W and Z as

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII], (28)

where $z > 0$ and $0 < w < 1$. Finally, integrating with respect to $w$ using (10) and substituting for $K_2$ in (28), we obtain the desired result.

Corollary 5.2.1. Let the random variables $X_1$ and $X_2$ be independent, $X_1 \sim B_1(\alpha_1, \beta_1; \lambda_1)$ and $X_2 \sim B_2(\alpha_2, \beta_2)$. Then, the pdf of $Z = X_1X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

Corollary 5.2.2. Let the random variables $X_1$ and $X_2$ be independent, $X_1 \sim B_1(\alpha_1, \beta_1)$ and $X_2 \sim B_2(\alpha_2, \beta_2)$. Then, the pdf of $Z = X_1X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII],

Corollary 5.2.3. Let the random variables $X_1$ and $X_2$ be independent, $X_1 \sim B_3(\alpha_1, \beta_1)$ and $X_2 \sim B_2(\alpha_1, \beta_1)$. Then, the pdf of $Z = X_1X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

6 Distribution of The Quotient

In this section, we obtain distributional results for the quotient of two independent random variables involving the Gauss hypergeometric distribution.

In the following theorem, we consider the case where both the random variables are distributed as Gauss hypergeometric.

Theorem 6.1. Let the random variables $X_1$ and $X_2$ be independent, $X_i \sim GH(\alpha_i, \beta_i, \gamma_i, \xi_i)$, $i = 1, 2$. Then, the pdf of $Z = X_1/X_2$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII],

for $0 < z \le 1$, and

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII],

for z > 1.

Proof. The joint pdf of $X_1$ and $X_2$ is given by (25). Now, transforming $Z = X_1/X_2$, $V = X_2$, with the Jacobian $J(x_1, x_2 \to z, v) = v$, we obtain the joint pdf of Z and V as

$$f(z, v) = C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, \frac{z^{\alpha_1-1} v^{\alpha_1+\alpha_2-1}(1-zv)^{\beta_1-1}(1-v)^{\beta_2-1}}{(1+\xi_1 zv)^{\gamma_1}(1+\xi_2 v)^{\gamma_2}}, \qquad (29)$$

where $0 < v < 1$ for $0 < z \le 1$, and $0 < v < 1/z$ for $z > 1$. For $0 < z \le 1$, the marginal pdf of Z is obtained by integrating (29) over $0 < v < 1$. Thus, the pdf of Z, for $0 < z \le 1$, is obtained as

$$f(z) = C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, z^{\alpha_1-1}\, I(\alpha_1+\alpha_2, \beta_1, \beta_2, \gamma_1, \gamma_2; \xi_1, \xi_2, z). \qquad (30)$$

Now, using Lemma 2.1 in the density (30), we get the desired result. For $z > 1$, the density of Z is given by

$$f(z) = C(\alpha_1, \beta_1, \gamma_1, \xi_1)\, C(\alpha_2, \beta_2, \gamma_2, \xi_2)\, z^{-(\alpha_2+1)}\, I\Big(\alpha_1+\alpha_2, \beta_2, \beta_1, \gamma_2, \gamma_1; \xi_2, \xi_1, \frac{1}{z}\Big), \qquad (31)$$

where the last line has been obtained by substituting w = vz. Finally, using Lemma 2.1 and substituting for [K.sub.1], we obtain the pdf of Z for z > 1.
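As with the product, the quotient density can be checked by direct numerical marginalization over the two ranges of z (illustrative parameter values):

```python
import mpmath as mp

def gh_pdf(a, b, g, xi):
    """GH(a, b, g, xi) density (2) as a callable."""
    c = 1 / (mp.beta(a, b) * mp.hyp2f1(a, g, a + b, -xi))
    return lambda x: c * x**(a - 1) * (1 - x)**(b - 1) / (1 + xi * x)**g

f1 = gh_pdf(2.0, 3.0, 1.5, 0.5)
f2 = gh_pdf(1.5, 2.5, 0.8, 0.3)

# density of Z = X1/X2: integrate v * f1(z*v) * f2(v) over 0 < v < min(1, 1/z)
f_z = lambda z: mp.quad(lambda v: f1(z * v) * f2(v) * v,
                        [0, 1 if z <= 1 else 1 / z])

total = mp.quad(f_z, [0, 1]) + mp.quad(f_z, [1, mp.inf])
print(total)  # ~ 1
```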

Theorem 6.2. Let the random variables $X_1$ and $X_2$ be independent, $X_i \sim GH(\alpha_i, \beta_i, \gamma_i, \xi_i)$, $i = 1, 2$. Then, the pdf of $T = X_1/(X_1+X_2)$ is given by

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII],

for $0 < t \le 1/2$, and

[MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII],

for $1/2 < t < 1$.

Proof. Making the transformation $T = Z/(1+Z)$, with the Jacobian $J(z \to t) = (1-t)^{-2}$, in Theorem 6.1, we get the desired result.

Acknowledgment

This research work was supported by the Comite para el Desarrollo de la Investigacion (CODI), Universidad de Antioquia, research grant no. IN560CE.

References

[1] Michael B. Gordy. Computationally convenient distributional assumptions for common-value auctions. Computational Economics, 12(1), 61-78 (1998).

[2] A. K. Gupta and D. K. Nagar. Matrix Variate Distributions. Chapman & Hall/CRC Monographs and Surveys in Pure and Applied Mathematics, 104, Chapman & Hall/CRC, Boca Raton, FL, 2000.

[3] N. L. Johnson, S. Kotz and N. Balakrishnan. Continuous Univariate Distributions, Vol. 2, Second edition. Wiley Series in Probability and Mathematical Statistics, John Wiley & Sons, Inc., New York, 1995.

[4] J. B. McDonald and Y. J. Xu. A generalization of the beta distribution with applications. Journal of Econometrics, 66(1-2), 133-152 (1995).

[5] D. K. Nagar and E. Zarrazola. Distributions of the product and the quotient of independent Kummer-beta variables. Scientiae Mathematicae Japonicae, 61(1), 109-117 (2005).

[6] C. Armero and M. Bayarri. Prior assessments for predictions in queues. The Statistician, 43(1), 139-153 (1994).

[7] Y. L. Luke. The Special Functions and Their Approximations, Vol. I. Mathematics in Science and Engineering, Vol. 53, Academic Press, New York-London, 1969.

[8] Peter S. Fader and Bruce G. S. Hardie. A note on modelling underreported Poisson counts. Journal of Applied Statistics, 27(8), 953-964 (2000).

[9] J.-Y. Dauxois. Bayesian inference for linear growth birth and death processes. Journal of Statistical Planning and Inference, 121(1), 1-19 (2004).

[10] Jose Maria Sarabia and Enrique Castillo. Bivariate distributions based on the generalized three-parameter beta distribution. In: Advances in Distribution Theory, Order Statistics, and Inference, 85-110, Stat. Ind. Technol., Birkhauser Boston, Boston, MA, 2006.

[11] Liliam Cardeno, Daya K. Nagar and Luz Estela Sanchez. Beta type 3 distribution and its multivariate generalization. Tamsui Oxford Journal of Mathematical Sciences, 21(2), 225-241 (2005).

[12] Luz E. Sanchez and Daya K. Nagar. Distributions of the product and quotient of independent beta type 3 variables. Far East Journal of Theoretical Statistics, 17(2), 239-251 (2005).

[13] D. L. Libby and M. R. Novick. Multivariate generalized beta distributions with applications to utility assessment. Journal of Educational Statistics, 7(4), 271-294 (1982).

[14] J. J. Chen and M. R. Novick. Bayesian analysis for binomial models with generalized beta prior distributions. Journal of Educational Statistics, 9, 163-175 (1984).

[15] Gokarna Aryal and Saralees Nadarajah. Information matrix for beta distributions. Serdica Mathematical Journal, 30(4), 513-526 (2004).

[16] Saralees Nadarajah. Sums, products and ratios of generalized beta variables. Statistical Papers, 47(1), 69-90 (2006).

[17] Daya K. Nagar and Erika Alejandra Rada-Mora. Properties of multivariate beta distributions. Far East Journal of Theoretical Statistics, 24(1), 73-94 (2008).

[18] T. Pham-Gia and Q. P. Duong. The generalized beta- and F-distributions in statistical modelling. Mathematical and Computer Modelling, 12(12), 1613-1625 (1989).

[19] H. M. Srivastava and P. W. Karlsson. Multiple Gaussian Hypergeometric Series. Halsted Press [John Wiley & Sons], New York, 1985.

[20] C. E. Shannon. A mathematical theory of communication. Bell System Technical Journal, 27, 379-423, 623-656 (1948).

[21] A. Renyi. On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, University of California Press, Berkeley, CA, 547-561 (1961).

[22] S. Nadarajah and K. Zografos. Expressions for Renyi and Shannon entropies for bivariate distributions. Information Sciences, 170(2-4), 173-189 (2005).

[23] K. Zografos. On maximum entropy characterization of Pearson's type II and VII multivariate distributions. Journal of Multivariate Analysis, 71(1), 67-75 (1999).

[24] K. Zografos and S. Nadarajah. Expressions for Renyi and Shannon entropies for multivariate distributions. Statistics & Probability Letters, 71(1), 71-84 (2005).

Daya K. Nagar (1) and Danilo Bedoya Valencia (2)

(1) Ph.D. in Science, nagar@matematicas.udea.edu.co, Full Professor, Departamento de Matematicas, Universidad de Antioquia, Calle 67, No. 53-108, Medellin, Colombia.

(2) M.Sc. in Mathematics, danilo.bv@gmail.com, doctoral student in Engineering (Systems and Informatics), Escuela de Sistemas, Universidad Nacional de Colombia, Carrera 80, No. 65-223, Nucleo Robledo, Medellin, Colombia.

Received: 03-Jun-2010 / Revised: 27-Oct-2011 / Accepted: 22-Nov-2011. Comments and/or discussions on this article are welcome.
COPYRIGHT 2011 Universidad EAFIT