Fundamentals of the Backoff Process in 802.11: Dichotomy of the Aggregation

Reading time: 6 minutes

📝 Original Info

  • Title: Fundamentals of the Backoff Process in 802.11: Dichotomy of the Aggregation
  • ArXiv ID: 0904.4155
  • Date: 2010-08-23
  • Authors: J. Cho and Y. Jiang

📝 Abstract

This paper discovers fundamental principles of the backoff process that governs the performance of IEEE 802.11. A simple principle founded upon regular variation theory is that the backoff time has a truncated Pareto-type tail distribution with an exponent of $(\log \gamma)/\log m$ (where $m$ is the multiplicative factor and $\gamma$ is the collision probability). This reveals that the per-node backoff process is heavy-tailed in the strict sense for $\gamma>1/m^2$, and paves the way for the following unifying result. The state-of-the-art theory on the superposition of heavy-tailed processes is applied to establish a dichotomy exhibited by the aggregate backoff process, emphasizing the importance of the time-scale on which the backoff processes are viewed. While aggregation on normal time-scales leads to a Poisson process, on coarse time-scales the aggregate is approximated by a new limiting process possessing long-range dependence (LRD). This dichotomy turns out to be instrumental in formulating short-term fairness, extending existing formulas to an arbitrary population, and elucidating the absence of LRD in practical situations. A refined wavelet analysis is conducted to strengthen this argument.
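The tail exponent and the heavy-tail threshold above can be checked numerically. The sketch below assumes binary exponential backoff ($m = 2$) and works with the exponent magnitude $\alpha = -(\log \gamma)/\log m$; the function names are illustrative, not from the paper:

```python
import math

def backoff_tail_exponent(gamma, m=2.0):
    """Magnitude of the Pareto-type tail exponent of the per-node backoff
    time: alpha = -log(gamma)/log(m), from the tail P[B > x] ~ x^{(log gamma)/log m}."""
    return -math.log(gamma) / math.log(m)

def is_strictly_heavy_tailed(gamma, m=2.0):
    """Heavy-tailed in the strict sense (infinite variance) iff gamma > 1/m^2,
    equivalently alpha < 2."""
    return gamma > 1.0 / m ** 2

# With binary exponential backoff (m = 2), the critical collision
# probability is 1/m^2 = 0.25: alpha crosses 2 exactly there.
alpha = backoff_tail_exponent(0.3)       # alpha < 2: strictly heavy-tailed
boundary = backoff_tail_exponent(0.25)   # alpha at the threshold
```

For instance, $\gamma = 0.5$ with $m = 2$ gives $\alpha = 1$, a tail so heavy that even the mean diverges in the untruncated Pareto approximation; the truncation at the retry limit is what keeps moments finite in practice.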


📄 Full Content

Since its introduction, the performance of IEEE 802.11 has attracted a lot of research attention, and the center of that attention has been throughput [6], [27]. Recently, other critical performance aspects of 802.11 have also burst onto the scene, including short-term fairness [12], [26] and delay [38]. It goes without saying that there has been phenomenal growth in the number of Skype and IPTV users [14], [15], and it is reported in [23] that an ever-increasing percentage of these users in the US connect to the Internet through wireless connections. Remarkably, it is found in [15] that jitter is more negatively correlated with Skype call duration than delay is, i.e., Skype users tend to hang up their calls earlier under large jitter. This finding empirically testifies that the large jitter of access networks annoys Skype users, to say nothing of its impact on QoS (quality of service). This quantified dissatisfaction of users motivates a thorough understanding of the delay and jitter performance of 802.11.

This work was supported in part by “Centre for Quantifiable Quality of Service in Communication Systems, Centre of Excellence” appointed by The Research Council of Norway, and funded by The Research Council, NTNU and UNINETT. A part of this work was done when J. Cho was with EPFL, Switzerland. A preliminary abstract version of this work appeared at ACM SIGMETRICS Workshop on Mathematical Performance Modeling and Analysis (MAMA'09).

J. Cho and Y. Jiang are with the Centre for Quantifiable Quality of Service in Communication Systems, Norwegian University of Science and Technology (NTNU), NO-7491 Trondheim, Norway (email: {jeongwoo,jiang}@q2s.ntnu.no).

For throughput analysis, Kumar et al., in the seminal paper [27], axiomatized several remarkable observations based on a fixed point equation (FPE), advancing the state of the art toward more systematic models and paving the way for a more comprehensive understanding of 802.11. Above all, one of the key findings of [27], already adopted in the field [28], [34], is that the full interference model, also called the single-cell model [27], in 802.11 networks leads to the backoff synchrony property [31], which implies that the backoff process can be completely separated and analyzed through the FPE technique. Another observation in [27] was that if the collision probability γ is constant, one can derive the so-called Bianchi's formula by appealing to the renewal reward theorem [13], without the Markov chain analysis in [6].
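The FPE technique can be illustrated with a small numerical sketch. Below, a damped fixed-point iteration couples a Bianchi-style attempt probability with the collision probability γ = 1 − (1 − τ)^(n−1); the initial window W = 32 and K = 5 doubling stages are assumed example values, and the function names are illustrative rather than taken from [27]:

```python
def attempt_prob(p, W=32, K=5):
    """Bianchi-style per-slot attempt probability tau of a saturated node,
    given a constant collision probability p, initial contention window W,
    and K window-doubling stages (W and K are assumed example values)."""
    return 2 * (1 - 2 * p) / ((1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** K))

def solve_fpe(n, iters=200):
    """Damped fixed-point iteration: tau = attempt_prob(gamma),
    gamma = 1 - (1 - tau)^(n - 1), for n contending nodes."""
    gamma = 0.3  # start away from p = 0.5, where the formula degenerates
    for _ in range(iters):
        tau = attempt_prob(gamma)
        gamma = 0.5 * gamma + 0.5 * (1 - (1 - tau) ** (n - 1))
    return tau, gamma

tau, gamma = solve_fpe(n=10)  # self-consistent (tau, gamma) pair
```

The damping (averaging the old and new γ) is a standard trick to keep the iteration from oscillating; at convergence the returned pair satisfies both equations simultaneously.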

An intriguing notion, called short-term fairness, has been introduced in some recent works [5], [12], [26], defining P[z|ζ] as the probability that other nodes transmit z packets while a tagged node is transmitting ζ packets. It can be easily seen that this notion pertains to a purely backoff-related argument, again owing to the backoff synchrony property in the full interference model [27]. The two papers [5], [12], in the course of deriving equations for P[z|ζ], assumed that the sum of the backoff values generated per packet, which we denote by Ω, is uniformly and exponentially distributed, respectively. Specifically, despite the same situation where two nodes contend for the medium, the former [5] assumed that Ω is uniformly distributed, because the initial backoff is uniformly distributed over the set $\{0, 1, \cdots, 2b_0 - 1\}$ where $2b_0$ is the initial contention window, and observed in [5, Fig. 2] that the expression for P[z|ζ] derived under the uniform assumption on Ω closely matches the testbed data measured in their experiments; the latter [12] likewise observed in [12, Fig. 5(a)] that their testbed data closely match the expression for P[z|ζ] derived under the exponential assumption on Ω. Q1: "What makes the two observations different?" (to be answered in Section III)
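The two distributional assumptions on Ω can be compared with a simple Monte Carlo sketch of P[z|ζ]. The race model below, where each packet costs an i.i.d. total backoff drawn from `dist()`, is an illustrative simplification, not the derivation in [5] or [12]:

```python
import random

def short_term_fairness_pmf(zeta, dist, trials=20000, zmax=10):
    """Monte Carlo estimate of P[z | zeta]: the probability that the other
    node completes z packets while the tagged node completes zeta, when each
    packet costs an i.i.d. total backoff Omega drawn via dist()
    (simplified two-node race model, assumed for illustration)."""
    counts = [0] * (zmax + 1)
    for _ in range(trials):
        # time at which the tagged node finishes its zeta-th packet
        t_done = sum(dist() for _ in range(zeta))
        # count the other node's completions strictly before that time
        t, z = dist(), 0
        while t < t_done:
            z += 1
            t += dist()
        if z <= zmax:
            counts[z] += 1
    return [c / trials for c in counts]

# Exponential vs. uniform assumptions on Omega (both with mean 1):
exp_pmf = short_term_fairness_pmf(2, lambda: random.expovariate(1.0))
uni_pmf = short_term_fairness_pmf(2, lambda: random.uniform(0.0, 2.0))
```

Under the exponential assumption with two symmetric nodes, this race reduces to a negative binomial law, P[z|ζ] = C(ζ+z−1, z)(1/2)^(ζ+z), so for example P[0|2] = 1/4; the uniform assumption produces a visibly different shape, which mirrors the discrepancy that Q1 asks about.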

In addition, the two works [5], [12] acquired the expression of P[z|ζ] only for the two-node case. A more general formula for an arbitrary number of nodes should deepen our appreciation of short-term fairness. It is natural to ask the following pertinent question: Q2: "Can we develop a general model for short-term fairness?" (to be answered in Corollaries 1 & 2)

As interest in the delay performance of 802.11 grows, so does the number of fundamental questions that we face. In [1], it was argued based on simulation results that the access delay in 802.11 closely follows a Poisson distribution. The authors showed that the number of successful packet transmissions by any node in the network over a time interval has a probability distribution that is close to Poisson, in the sense of an upper-bounded distribution distance. This raises an intriguing question: Q3: "Is there a Poissonian property? If yes, what is the cause?" (to be answered in Theorem 1)

Another case in point is found in a recent work [34] that extends the access delay analysis in the seminal paper of Kwak et al. [28] and makes an attempt at analyzing higher-order moments by applying the FPE technique
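The Poisson-vs-LRD dichotomy can be glimpsed in simulation by superposing many heavy-tailed renewal processes and measuring the index of dispersion of the counts at different time-scales. The sketch below uses an illustrative Pareto renewal model with tail exponent α = 1.5, not the paper's exact backoff process:

```python
import random

def pareto_interarrival(alpha=1.5, xm=1.0):
    """Pareto inter-arrival time with tail exponent alpha < 2 (infinite
    variance): a stand-in for a per-node renewal process in the
    heavy-tailed backoff regime (illustrative model)."""
    return xm / ((1.0 - random.random()) ** (1.0 / alpha))

def simulate_superposition(n_sources, horizon):
    """Event times of n independent heavy-tailed renewal processes, superposed."""
    events = []
    for _ in range(n_sources):
        t = pareto_interarrival()
        while t < horizon:
            events.append(t)
            t += pareto_interarrival()
    return events

def bin_counts(events, horizon, window):
    """Arrival counts per window of the given width."""
    counts = [0] * int(horizon / window)
    for e in events:
        counts[min(int(e / window), len(counts) - 1)] += 1
    return counts

def dispersion(counts):
    """Index of dispersion (variance/mean): ~1 for Poisson counts,
    growing with the window size under long-range dependence."""
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / len(counts)
    return var / mean

events = simulate_superposition(n_sources=50, horizon=2000.0)
fine = dispersion(bin_counts(events, 2000.0, window=1.0))
coarse = dispersion(bin_counts(events, 2000.0, window=100.0))
```

On the fine scale, the superposition of many independent sparse streams tends toward Poisson (the classical superposition limit), while on the coarse scale the heavy-tailed inter-arrivals inflate the dispersion as the window grows, consistent with the normal-vs-coarse time-scale dichotomy discussed above.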

…(Full text truncated)…


Reference

This content is AI-processed based on ArXiv data.
