The Poisson Channel at Low Input Powers

Reading time: 6 minutes

📝 Original Info

  • Title: The Poisson Channel at Low Input Powers
  • ArXiv ID: 0810.3564
  • Date: 2008-10-21
  • Authors: Researchers from original ArXiv paper

📝 Abstract

The asymptotic capacity at low input powers of an average-power limited or an average- and peak-power limited discrete-time Poisson channel is considered. For a Poisson channel whose dark current is zero or decays to zero linearly with its average input power $E$, capacity scales like $E\log\frac{1}{E}$ for small $E$. For a Poisson channel whose dark current is a nonzero constant, capacity scales, to within a constant, like $E\log\log\frac{1}{E}$ for small $E$.


📄 Full Content

We consider the discrete-time memoryless Poisson channel whose input $x$ is in the set $\mathbb{R}_0^+$ of nonnegative reals and whose output $y$ is in the set $\mathbb{Z}_0^+$ of nonnegative integers. Conditional on the input $X = x$, the output $Y$ has a Poisson distribution of mean $(\lambda + x)$, where $\lambda \ge 0$ is called the dark current. We denote the Poisson distribution of mean $\xi$ by $P_\xi(\cdot)$, so
$$P_\xi(y) = e^{-\xi}\,\frac{\xi^y}{y!}, \qquad y \in \mathbb{Z}_0^+.$$

With this notation the channel law $W(\cdot|\cdot)$ is given by
$$W(y|x) = P_{\lambda+x}(y), \qquad x \in \mathbb{R}_0^+,\ y \in \mathbb{Z}_0^+. \tag{1}$$

This channel is often used to model pulse-amplitude modulated optical communication with a direct-detection receiver [1]. Here the input $x$ is proportional to the product of the transmitted light intensity and the pulse duration; the output $y$ models the number of photons arriving at the receiver during the pulse duration; and $\lambda$ models the average number of extraneous counts that appear in $y$ in addition to those associated with the illumination $x$.
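To make the channel law concrete, here is a minimal Python sketch (my own illustration, not code from the paper) of the pmf $P_\xi(\cdot)$ and of drawing an output $Y$ given an input $x$ and dark current $\lambda$; all function names and parameter values are hypothetical.

```python
# Illustrative sketch of the discrete-time Poisson channel (not from the paper).
import math
import random

def poisson_pmf(xi: float, y: int) -> float:
    """P_xi(y) = exp(-xi) * xi**y / y!  for y = 0, 1, 2, ..."""
    if xi == 0.0:
        return 1.0 if y == 0 else 0.0
    # Work in log-space for numerical stability at larger y.
    return math.exp(-xi + y * math.log(xi) - math.lgamma(y + 1))

def channel_output(x: float, dark_current: float) -> int:
    """Sample Y ~ Poisson(dark_current + x), i.e. W(.|x) = P_{lambda + x}(.)."""
    mean = dark_current + x
    # Knuth's multiplication method; adequate for the small means of interest here.
    threshold, k, prod = math.exp(-mean), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

if __name__ == "__main__":
    print(poisson_pmf(0.5, 0))                      # e**-0.5 ~= 0.607
    print(channel_output(x=0.3, dark_current=0.1))  # a random photon count
```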

The average-power constraint on the input is
$$\mathsf{E}[X] \le E, \tag{2}$$
where $E > 0$ is the maximum allowed average power.

The peak-power constraint on the input is that, with probability one,
$$X \le A. \tag{3}$$
When no peak-power constraint is imposed, we write $A = \infty$.
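As a small, self-contained check (with hypothetical parameter values, not taken from the paper), the two constraints can be verified directly for a candidate discrete input distribution:

```python
# Check the average-power constraint E[X] <= E and the peak-power constraint
# X <= A (with probability one) for a discrete input distribution.
def satisfies_constraints(mass_points, probs, E, A=float("inf")) -> bool:
    average_power = sum(x * p for x, p in zip(mass_points, probs))
    peak_ok = all(x <= A for x in mass_points)
    return average_power <= E and peak_ok

# Example: on-off input X = zeta with probability p, X = 0 otherwise,
# so E[X] = p * zeta; choosing zeta = A and p = E / A meets both constraints.
E, A = 0.01, 1.0
zeta, p = A, E / A
print(satisfies_constraints([0.0, zeta], [1.0 - p, p], E, A))  # True
```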

No analytic expression for the capacity of the Poisson channel is known. In [1] Shamai showed that capacity-achieving input distributions are discrete, with numbers of mass points that depend on $E$ and $A$. In [2,3] Lapidoth and Moser derived the asymptotic capacity of the Poisson channel in the regime where both the average and peak powers tend to infinity with their ratio fixed.

In the present paper, we seek the asymptotic capacity of the Poisson channel when the average input power tends to zero. The peak-power constraint, when considered, is held constant and hence does not tend to zero with the average power. We consider two different cases for the dark current λ. The first case is when the dark current tends to zero proportionally with the average power. This corresponds to the wide-band regime where the pulse duration tends to zero. The second case is when the dark current is constant. This corresponds to the regime where the transmitter is weak.

Our lower bounds on channel capacity are all based on binary inputs. In some cases we show that this is asymptotically optimal. Our upper bounds are derived using the duality expression (see [4] and references therein). An efficient way to compute asymptotic capacities at low average input powers is to compute the capacity per unit cost [5]. However, we shall see that, apart from one case (Equation (7)), the capacity per unit cost does not exist, namely, the capacity tends to zero more slowly than linearly with the average power.
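A one-line arithmetic illustration of that last point (mine, not the paper's): if capacity behaves like $E\log\frac{1}{E}$, then the ratio $C/E \approx \log\frac{1}{E}$ diverges as $E \downarrow 0$, so no finite capacity per unit cost can exist.

```python
# Illustration: E*log(1/E) shrinks to zero, but more slowly than linearly,
# so the per-unit-cost ratio log(1/E) grows without bound as E -> 0.
import math

for E in (1e-1, 1e-3, 1e-6, 1e-12):
    scaling = E * math.log(1.0 / E)
    print(f"E={E:.0e}   E*log(1/E)={scaling:.3e}   ratio C/E ~ {scaling / E:.2f}")
```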

Among the results in this paper, the special case of zero dark current has been derived independently in [6,7].

The rest of the paper is arranged as follows: in Section 2 we state the results of this paper; in Section 3 we prove the lower bounds; and in Section 4 we sketch the proofs for the upper bounds.

Let $C(\lambda, E, A)$ denote the capacity of the Poisson channel with dark current $\lambda$ under Constraints (2) and (3):
$$C(\lambda, E, A) = \sup_{Q} I(Q, W),$$
where the supremum is over all input distributions $Q$ satisfying (2) and (3).

When λ is proportional to E, the asymptotic capacity of the Poisson channel as E ↓ 0 is given in the following proposition. Note that this also includes the case where the dark current is the constant zero.

Recall that, for any $\alpha, \beta > 0$, the sum of two independent random variables with the Poisson distributions $P_\alpha(\cdot)$ and $P_\beta(\cdot)$ has the Poisson distribution $P_{\alpha+\beta}(\cdot)$. Thus, we can produce any Poisson channel with nonzero dark current by adding noise to a Poisson channel with zero dark current. Consequently, the capacity is nonincreasing in the dark current:
$$C(\lambda', E, A) \le C(\lambda, E, A), \qquad \lambda' \ge \lambda \ge 0.$$
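The Poisson-sum identity used in this argument is easy to confirm numerically; the following small check (not part of the paper) convolves $P_\alpha$ with $P_\beta$ and compares the result with $P_{\alpha+\beta}$:

```python
# Verify numerically that convolving P_alpha with P_beta gives P_{alpha+beta},
# which is why independent Poisson noise at the output acts as an added dark current.
import math

def poisson_pmf(xi: float, y: int) -> float:
    return math.exp(-xi + y * math.log(xi) - math.lgamma(y + 1))

alpha, beta = 0.4, 0.7
for y in range(5):
    conv = sum(poisson_pmf(alpha, k) * poisson_pmf(beta, y - k) for k in range(y + 1))
    print(f"y={y}:  convolution={conv:.6f}   P_(alpha+beta)={poisson_pmf(alpha + beta, y):.6f}")
```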

Thus, to prove Proposition 1, we only need to show the following two bounds:

We shall prove (4) in Section 3.1 and shall sketch a proof for (5) in Section 4.1.


Remark 2. Because the pure-loss bosonic channel with coherent input states and direct detection reduces to a Poisson channel, the lower bound (4) and the achievability of its left-hand side using binary signaling combine with (6) to show that the asymptotic (quantum-receiver) capacity of the pure-loss bosonic channel is achievable with binary modulation (on-off keying) and direct detection.

For a Poisson channel with constant nonzero dark current, we have the following result, stated as (7) and (8).

The proof of (7) is a simple application of the formula for capacity per unit cost [5, Theorem 2]. The proof of the lower bound in (8) is in Section 3.2; a sketch of the proof of the upper bound in (8) is in Section 4.2.
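The following is a hedged numerical sketch of the tool invoked in the proof of (7), namely the capacity-per-unit-cost formula of [5]: it amounts to maximizing $D(P_{\lambda+x}\,\|\,P_\lambda)/x$ over admissible inputs $0 < x \le A$, using the closed form $D(P_a\,\|\,P_b) = a\ln\frac{a}{b} - a + b$ (in nats). It is an illustration of the formula, not a restatement of (7); the grid search and parameter values are my own assumptions.

```python
# Hedged numerical sketch of the capacity-per-unit-cost computation of [5]:
# maximize D(P_{lambda+x} || P_lambda) / x over 0 < x <= A.
import math

def poisson_kl(a: float, b: float) -> float:
    """Relative entropy D(P_a || P_b) between two Poisson laws, in nats."""
    return a * math.log(a / b) - a + b

def capacity_per_unit_cost(dark_current: float, A: float, grid: int = 10_000) -> float:
    """Crude grid search for sup_{0 < x <= A} D(P_{lambda+x} || P_lambda) / x."""
    best = 0.0
    for i in range(1, grid + 1):
        x = A * i / grid
        best = max(best, poisson_kl(dark_current + x, dark_current) / x)
    return best

print(capacity_per_unit_cost(dark_current=1.0, A=1.0))  # nats per unit input power
```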

The achievability results in this section are obtained by choosing binary input distributions and then computing the resulting mutual informations. We denote by $Q_b$ the binary distribution
$$Q_b(\zeta) = p, \qquad Q_b(0) = 1 - p, \tag{9}$$
where $\zeta > 0$ and $p \in (0, 1)$. If we choose the parameters $\zeta$ and $p$ in such a way that Constraints (2) and (3) are satisfied, then
$$C(\lambda, E, A) \ge I(Q_b, W).$$
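As a concrete companion to this recipe, the mutual information achieved by the on-off input can be evaluated by direct summation over the output alphabet. The snippet below is an illustrative sketch with hypothetical parameter choices, not the paper's derivation of (4):

```python
# Numerically evaluate I(Q_b, W) in nats for the on-off input Q_b(zeta)=p, Q_b(0)=1-p.
import math

def poisson_pmf(xi: float, y: int) -> float:
    if xi == 0.0:
        return 1.0 if y == 0 else 0.0
    return math.exp(-xi + y * math.log(xi) - math.lgamma(y + 1))

def binary_input_mi(zeta: float, p: float, dark_current: float, y_max: int = 200) -> float:
    mi = 0.0
    for y in range(y_max + 1):
        w_on = poisson_pmf(dark_current + zeta, y)   # W(y | zeta)
        w_off = poisson_pmf(dark_current, y)         # W(y | 0)
        out = p * w_on + (1.0 - p) * w_off           # output distribution under Q_b
        if w_on > 0.0:
            mi += p * w_on * math.log(w_on / out)
        if w_off > 0.0:
            mi += (1.0 - p) * w_off * math.log(w_off / out)
    return mi

E = 1e-3
# Choose zeta = 1 and p = E so that E[X] = p * zeta = E; the result is on the
# order of E * log(1/E) for small E, consistent with the scaling in Proposition 1.
print(binary_input_mi(zeta=1.0, p=E, dark_current=0.0))
```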

In this subsection we shall derive Inequality (4). To this end, we compute the mutual information $I(Q_b, W)$ for the input distribution $Q_b$ given by (9):


…(Full text truncated)…


Reference

This content is AI-processed based on ArXiv data.
