Achievability of the Rate $\frac{1}{2}\log(1+\varepsilon_s)$ in the Discrete-Time Poisson Channel
A simple lower bound to the capacity of the discrete-time Poisson channel with average energy $\varepsilon_s$ is derived. The rate $\frac{1}{2}\log(1+\varepsilon_s)$ is shown to be the generalized mutual information of a modified minimum-distance decoder, when the input follows a gamma distribution of parameter $1/2$ and mean $\varepsilon_s$.
Authors: **Alfonso Martínez (Centrum Wiskunde & Informatica, The Netherlands)**
I. INTRODUCTION

Consider a memoryless discrete-time channel whose output $Y$ is distributed according to a Poisson distribution of parameter $X$, the channel input. By construction, the output is a non-negative integer, and the input a non-negative real number. The channel transition probability $W(y|x)$ is thus given by

$$W(y|x) = \frac{e^{-x} x^{y}}{y!}. \tag{1}$$

This model, the discrete-time Poisson (DTP) channel, appears often in the analysis of optical communication channels. In this case, one can identify the input with a signal energy and the output with an integer number of quanta of energy.

Let $P_X(x)$ denote the probability density function of the channel input. We assume that the input energy is constrained, i.e. $\mathrm{E}[X] \le \varepsilon_s$, where $\mathrm{E}[\cdot]$ denotes the expectation operator and $\varepsilon_s$ is the average energy. Random variables are denoted by capital letters, and their realizations by small letters.

An exact formula for the capacity $C(\varepsilon_s)$ of the DTP channel is not known. Recently, Lapidoth and Moser [1] derived the following lower bound,

$$C(\varepsilon_s) \ge \log\!\left(\left(1+\frac{1}{\varepsilon_s}\right)^{1+\varepsilon_s}\sqrt{\varepsilon_s}\right) - \left(1 + \sqrt{\frac{\pi}{24\,\varepsilon_s}}\right). \tag{2}$$

Observe that this bound diverges for vanishing $\varepsilon_s$. Capacity is given in nats, and the logarithms are in base $e$.

A closed-form expression for the mutual information $I(X;Y)$ achieved by an input with a gamma distribution of parameter $\nu$ was derived by Martinez in [2], namely

$$I(X;Y) = \int_0^1 \left(\varepsilon_s - \frac{\left(1-\nu^{\nu}\bigl(\nu+\varepsilon_s(1-u)\bigr)^{-\nu}\right)u^{\nu-1}}{1-u}\right)\frac{du}{\log u} + (\varepsilon_s+\nu)\log\frac{\varepsilon_s+\nu}{\nu} + \varepsilon_s\bigl(\psi(\nu+1)-1\bigr), \tag{3}$$

where $\psi(y)$ is Euler's digamma function. For $\nu = 1/2$, numerical evaluation of the mutual information gives a rate which would seem to exceed $\frac{1}{2}\log(1+\varepsilon_s)$ for all values of $\varepsilon_s$. In this paper, we prove that the rate $\frac{1}{2}\log(1+\varepsilon_s)$ is indeed achievable by this input distribution. The analysis uses a suboptimum minimum-distance decoder, similar in spirit to Lapidoth's analysis of nearest-neighbor decoding [3].

II. MAIN RESULT

Let the input $X$ follow a gamma distribution of parameter $1/2$ and mean $\varepsilon_s$, that is,

$$P_X(x) = \frac{1}{\sqrt{2\pi\varepsilon_s x}}\, e^{-\frac{x}{2\varepsilon_s}}. \tag{4}$$

This choice led to good lower and upper bounds in [1] and [2] respectively.

We consider a maximum-metric decoder; the codeword metric is given by the product of symbol metrics $q(x,y)$ over all channel uses. The optimum maximum-likelihood decoder, for which $q(x,y) = W(y|x)$, is somewhat unwieldy to analyze (Eq. (3) gives the exact mutual information). We consider instead a symbol decoding metric of the form

$$q(x,y) = e^{-ax - \frac{y^2}{x}}, \tag{5}$$

where $a = 1 + \frac{1}{\varepsilon_s}$. The reasons for this choice of $a$ will become apparent later. Clearly, the decoder is unchanged if we replace the symbol metric $q(x,y)$ by a symbol distance $d(x,y) = -\log q(x,y)$, and select the codeword with smallest total distance, summed over all channel uses.
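As a side illustration (an addition to this write-up, not part of the original paper), the following Python sketch simulates the DTP channel of Eq. (1) under the gamma input of Eq. (4) and checks that the distance $-\log q(x,y)$ induces the same decisions as the squared-distance form derived in Eq. (6) below; the energy value `es = 2.0` and all variable names are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
es = 2.0                    # average energy eps_s (illustrative value)
a = 1.0 + 1.0 / es          # metric parameter of Eq. (5)
n = 200_000

# Gamma input of parameter 1/2 and mean eps_s: shape 1/2, scale 2*eps_s,
# which is exactly the density P_X(x) of Eq. (4).
x = rng.gamma(shape=0.5, scale=2.0 * es, size=n)
y = rng.poisson(x)          # channel law W(y|x) of Eq. (1)

print(x.mean())             # ~ eps_s: the average-energy constraint E[X] <= eps_s
print(y.mean())             # ~ eps_s as well, since E[Y|X] = X

# Distance d(x,y) = -log q(x,y) = a*x + y^2/x versus the form of Eq. (6):
# the two differ by 2*y*sqrt(a), which does not depend on x and therefore
# cannot change which codeword attains the smallest total distance.
d1 = a * x + y**2 / x
d2 = (y - np.sqrt(a) * x) ** 2 / x
print(np.allclose(d1 - d2, 2 * np.sqrt(a) * y))   # True
```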
This alternative formulation is reminiscent of minimum-distance, or nearest-neighbor, decoding. Indeed, the metric in Eq. (5) is equivalent to a minimum-distance decoder which uses the distance

$$d(x,y) = \frac{(y-\sqrt{a}\,x)^2}{x} = \frac{y^2}{x} + ax - 2y\sqrt{a}. \tag{6}$$

The term $-2y\sqrt{a}$ is common to all symbols $x$ and can be removed, since it does not affect the decision. For $a = 1$, the distance in Eq. (6) naturally arises from a Gaussian approximation to the channel output, whereby the channel output is modeled as a Gaussian random variable of mean $x$ and variance $x$. This approximation is suggested by the fact that a Poisson random variable of mean $x$ approaches a Gaussian random variable of mean and variance $x$ for large $x$.

Minimum-distance decoders were considered by Lapidoth [3] in his analysis of additive non-Gaussian-noise channels. For our channel model, even though the noise is neither additive (it is signal-dependent) nor Gaussian, techniques similar to the ones used in [3] can be applied. More specifically, since we have a mismatched decoder, we determine the generalized mutual information [4]. For a given decoding metric $q(x,y)$ and a positive number $s$, it can be proved [4] that the following rate, the generalized mutual information, is achievable:

$$I_{\text{GMI}}(s) = \mathrm{E}\left[\log\frac{q(X,Y)^s}{\int P_X(x')\,q(x',Y)^s\,dx'}\right]. \tag{7}$$

The expectation is carried out according to $P_X(x)W(y|x)$. This quantity is obviously a lower bound to the channel capacity. Our main result is

Theorem 1. In the discrete-time Poisson channel with average signal energy $\varepsilon_s$, the rate $\frac{1}{2}\log(1+\varepsilon_s)$ is achievable.

This rate is reminiscent of the capacity of a real-valued Gaussian channel with average signal-to-noise ratio $\varepsilon_s$. As in that channel, the rate is achieved by a form of minimum-distance decoding; in contrast, the input follows a gamma distribution rather than a Gaussian one.

Proof: We evaluate the generalized mutual information $I_{\text{GMI}}(s)$ for an input distributed according to the gamma density in Eq. (4). First, we evaluate the integral in the denominator [5, Eq. 3.471-15],

$$\int_0^\infty \frac{e^{-\frac{x'}{2\varepsilon_s} - asx' - \frac{sy^2}{x'}}}{\sqrt{2\pi\varepsilon_s x'}}\, dx' = \frac{e^{-y\sqrt{\frac{2s(1+2a\varepsilon_s s)}{\varepsilon_s}}}}{\sqrt{1+2a\varepsilon_s s}}. \tag{8}$$

Further, using the expressions for the first two moments of the Poisson distribution, namely

$$\sum_y W(y|x)\,y = x, \qquad \sum_y W(y|x)\,y^2 = x^2 + x \tag{9}$$

(the moment generating function of a Poisson random variable of mean $x$ is readily computed to be $e^{x(e^t-1)}$; the first two moments are its first two derivatives, evaluated at $t = 0$), together with the input constraint $\int P_X(x)\,x\,dx = \varepsilon_s$, we can explicitly carry out the expectation in Eq. (7),

$$I_{\text{GMI}}(s) = \int P_X(x)\sum_y W(y|x)\log q(x,y)^s\,dx - \int P_X(x)\sum_y W(y|x)\log\left(\int P_X(x')\,q(x',y)^s\,dx'\right)dx \tag{10}$$

$$= s\int P_X(x)\sum_y W(y|x)\left(-ax-\frac{y^2}{x}\right)dx - \int P_X(x)\sum_y W(y|x)\left(-y\sqrt{\frac{2s(1+2a\varepsilon_s s)}{\varepsilon_s}} - \log\sqrt{1+2a\varepsilon_s s}\right)dx \tag{11}$$

$$= -s\bigl((a+1)\varepsilon_s + 1\bigr) + \sqrt{2\varepsilon_s s(1+2a\varepsilon_s s)} + \frac{1}{2}\log(1+2a\varepsilon_s s). \tag{12}$$

Choosing $\hat{s} = \frac{2\varepsilon_s}{(a-1)^2\varepsilon_s^2 + 2\varepsilon_s(a+1) + 1}$, the first two summands cancel out. And for $a = 1+\frac{1}{\varepsilon_s}$ we have that $2a\hat{s} = 1$, and therefore

$$I_{\text{GMI}}(\hat{s}) = \frac{1}{2}\log(1+\varepsilon_s). \tag{13}$$
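As a numerical sanity check (again an addition for this write-up, not part of the original proof), one can evaluate the closed form of Eq. (12) at $\hat{s}$ and confirm both the cancellation of the first two summands and the value $\frac{1}{2}\log(1+\varepsilon_s)$ of Eq. (13); the helper name `gmi` and the sample energies are arbitrary.

```python
import numpy as np

def gmi(s, a, es):
    """Closed form of Eq. (12) for the metric of Eq. (5)."""
    return (-s * ((a + 1) * es + 1)
            + np.sqrt(2 * es * s * (1 + 2 * a * es * s))
            + 0.5 * np.log1p(2 * a * es * s))

for es in [0.1, 1.0, 10.0, 100.0]:
    a = 1.0 + 1.0 / es
    # s_hat makes the first two summands of Eq. (12) cancel
    s_hat = 2 * es / ((a - 1) ** 2 * es**2 + 2 * es * (a + 1) + 1)
    assert np.isclose(2 * a * s_hat, 1.0)                     # so 1 + 2*a*es*s_hat = 1 + es
    assert np.isclose(gmi(s_hat, a, es), 0.5 * np.log1p(es))  # Eq. (13)
```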
The same rate, $\frac{1}{2}\log(1+\varepsilon_s)$, is also achievable by a decoder with $a = 1$. In this case, we have to replace the generalized mutual information by the alternative expression $I_{\text{LM}}$ [4], given by

$$I_{\text{LM}} = \mathrm{E}\left[\log\frac{a(X)\,q(X,Y)^s}{\int P_X(x')\,a(x')\,q(x',Y)^s\,dx'}\right]. \tag{14}$$

As for $I_{\text{GMI}}$, $s$ is a non-negative number; $a(x)$ is a weighting function. Setting $a(x) = e^{-\frac{s}{\varepsilon_s}x}$, we have that $I_{\text{LM}}$ is given by Eq. (11), thus proving the achievability.

The bound provided in this paper is simpler and tighter than Eq. (2). It would be interesting to extend Theorem 1 to channel models $Y = S(X) + Z$, where $S(X)$ corresponds to the case considered here and $Z$ is some additive noise with a Poisson or a geometric distribution. A different input distribution and another modified decoding metric are likely required in either case.

REFERENCES

[1] A. Lapidoth and S. M. Moser, "Bounds on the capacity of the discrete-time Poisson channel," in Proceedings of the 41st Allerton Conference on Communication, Control, and Computing, October 2003.

[2] A. Martinez, "Spectral efficiency of optical direct detection," J. Opt. Soc. Am. B, vol. 24, no. 4, pp. 739–749, April 2007.

[3] A. Lapidoth, "Nearest neighbor decoding for additive non-Gaussian noise channels," IEEE Trans. Inf. Theory, vol. 42, no. 5, pp. 1520–1529, September 1996.

[4] A. Ganti, A. Lapidoth, and İ. E. Telatar, "Mismatched decoding revisited: General alphabets, channels with memory, and the wide-band limit," IEEE Trans. Inf. Theory, vol. 46, no. 7, pp. 2315–2328, November 2000.

[5] I. S. Gradshteyn and I. M. Ryzhik, Tables of Integrals, Series, and Products, 6th ed., A. Jeffrey, Ed. Academic Press, 2000.