Distributed Joint Source-Channel Coding on a Multiple Access Channel with Side Information

R. Rajesh and Vinod Sharma
Dept. of Electrical Communication Engg.
Indian Institute of Science, Bangalore, India
Email: rajesh@pal.ece.iisc.ernet.in, vinod@ece.iisc.ernet.in

October 31, 2018

Abstract

We consider the problem of transmission of several distributed sources over a multiple access channel (MAC) with side information at the sources and the decoder. Source-channel separation does not hold for this channel. Sufficient conditions are provided for transmission of sources with a given distortion. The source and/or the channel could have continuous alphabets (thus Gaussian sources and Gaussian MACs are special cases). Various previous results are obtained as special cases. We also provide several good joint source-channel coding schemes for a discrete/continuous source and discrete/continuous alphabet channel. Channels with feedback and fading are also considered.

Keywords: Multiple access channel, side information, lossy joint source-channel coding, channels with feedback, fading channels.

1 Introduction

In this report we consider the transmission of various sources over a multiple access channel (MAC). We survey the results available when the system may have side information at the sources and/or at the decoder. We also consider a MAC with feedback or when the channel experiences time-varying fading. This system does not satisfy source-channel separation ([21]); thus for optimum transmission one needs to consider joint source-channel coding, and we will provide several good joint source-channel coding schemes. Although this topic has been studied for the last several decades, one recent motivation is the problem of estimating a random field via sensor networks.
Sensor nodes have limited computational and storage capabilities and very limited energy [3]. These sensor nodes need to transmit their observations to a fusion center which uses this data to estimate the sensed random field. Since transmission is very energy intensive, it is important to minimize it. The proximity of the sensing nodes to each other induces high correlations between the observations of adjacent sensors. One can exploit these correlations to compress the transmitted data significantly. Furthermore, some of the nodes can be more powerful and can act as cluster heads ([6]). Neighboring nodes can first transmit their data to a cluster head which can further compress the information before transmission to the fusion center. The transmission of data from sensor nodes to their cluster head is usually through a MAC. At the fusion center the underlying physical process is estimated. The main trade-off possible is between the rates at which the sensors send their observations and the distortion incurred in the estimation at the fusion center. The availability of side information at the encoders and/or the decoder can reduce the rate of transmission ([82],[31]).

The above considerations open up new interesting problems in multiuser information theory, and the quest for finding the optimal performance for various models of sources, channels and side information has made this an active area of research. The optimal solution is unknown except in a few simple cases. In this report a joint source-channel coding approach is discussed under various assumptions on side information and distortion criteria. Sufficient conditions for transmission of discrete/continuous alphabet sources over a discrete/continuous alphabet MAC are given. These results generalize the previous results available on this problem.
The report is organized as follows. Section 2 provides the background and surveys the related literature. Transmission of distributed sources over a MAC with side information is considered in Section 3. The source and channel alphabets can be continuous or discrete. Several previous results are recovered as special cases in Section 4. Section 5 considers the important case of transmission of discrete correlated sources over a Gaussian MAC (GMAC) and presents a new coding scheme. Section 6 discusses several joint source-channel coding schemes for transmission of Gaussian sources over a GMAC and compares their performance. It also suggests coding schemes for general continuous sources over a GMAC. Transmission of correlated sources over orthogonal channels is considered in Section 7. Section 8 discusses a MAC with feedback. A MAC with multipath fading is addressed in Section 9. Section 10 provides practical schemes for joint source-channel coding. Section 11 gives directions for future research and Section 12 concludes the report.

2 Background

In the following we survey the related literature. Ahlswede ([1]) and Liao ([47]) obtained the capacity region of a discrete memoryless MAC with independent inputs. Cover, El Gamal and Salehi in [21] made further significant progress by providing sufficient conditions for transmitting losslessly correlated observations over a MAC. They proposed a 'correlation preserving' scheme for transmitting the sources. This mapping is extended in [2] to a more general system with several principal sources and several side information sources subject to cross observations at the encoders. However, a single-letter characterization of the capacity region is still unknown. Indeed, Dueck [25] proved that the conditions given in [21] are only sufficient and may not be necessary.
In [40] a single-letter upper bound for the problem is obtained. It is also shown in [21] that source-channel separation does not hold in this case. The authors in [65] obtain a condition for separation to hold in a multiple access channel. The capacity region for the distributed lossless source coding problem is given in the classic paper by Slepian and Wolf ([69]). Cover ([20]) extended the Slepian-Wolf results to an arbitrary number of discrete, ergodic sources using a technique called 'random binning'. Other related papers on this problem are [8],[2].

Inspired by the Slepian-Wolf results, Wyner and Ziv [82] obtained the rate distortion function for source coding with side information at the decoder. Unlike in the lossless case, it is shown that knowledge of the side information at the encoder, in addition to the decoder, permits transmission at a lower rate. The latter result, where the encoder and decoder both have side information, was first obtained by Gray and is known as the conditional rate distortion function (see [11]). Related work on coding with side information includes [7, 58, 24].

The lossy version of the Slepian-Wolf problem is called the multi-terminal source coding problem, and despite numerous attempts (e.g., [12],[54]) the exact rate region is not known except for a few special cases. The first major advancement was in Berger and Tung ([11]), where an inner and an outer bound on the rate distortion region were obtained. Lossy coding of continuous sources in the high resolution limit is given in [87], where an explicit single-letter bound is obtained. Gastpar ([32]) derived an inner and an outer bound with side information and proved the tightness of his bounds when the sources are conditionally independent given the side information. The authors in [72] obtain inner and outer bounds on the rate region with side information at the encoders and the decoder.
References [71],[64] extend the result in [72] by requiring the encoders to communicate over a MAC, i.e., they obtain sufficient conditions for transmission of correlated sources over a MAC with given distortion constraints. In [53] an achievable rate region for a MAC with correlated sources and feedback is given.

The distributed Gaussian source coding problem is discussed in [54],[76]. The exact rate region is provided in [76]. The capacity of a Gaussian MAC (GMAC) with feedback is given in [56]. In [44] a necessary condition and two sufficient conditions for transmitting a jointly Gaussian source over a GMAC are provided. It is shown that the amplify-and-forward scheme is optimal below a certain SNR determined by the source correlations. A performance comparison of the schemes given in [44] with a separation-based scheme is given in [63]. A GMAC under received power constraints is studied in [30] and it is shown that source-channel separation holds in this case. In [33] the authors discuss a joint source-channel coding scheme over a MAC and show the scaling behavior for the Gaussian channel.

A Gaussian sensor network in a distributed and collaborative setting is studied in [38]. The authors show that it is better to compress the local estimates than to compress the raw data. The scaling laws for a many-to-one data-gathering channel are discussed in [29]. It is shown that the transport capacity of the network scales as $O(\log N)$ when the number of sensors $N$ grows to infinity and the total average power remains fixed. The scaling laws for the problem without side information are discussed in [34], and it is shown that separating source coding from channel coding may require exponential growth, as a function of the number of sensors, in communication bandwidth.
A lower bound on the best achievable distortion, as a function of the number of sensors, the total transmit power, the degrees of freedom of the underlying process and the spatio-temporal communication bandwidth, is also given there. The joint source-channel coding problem also bears a relationship to the CEO problem [13]. In this problem, multiple encoders observe different noisy versions of a single information source and communicate them to a single decoder, called the CEO, which is required to reconstruct the source within a certain distortion. The Gaussian version of the CEO problem is studied in [55].

When Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA) or Frequency Division Multiple Access (FDMA) is used, a MAC becomes a system of orthogonal channels. These protocols, although suboptimal, are frequently used in practice and hence have been extensively studied ([23],[60]). Lossless transmission of correlated sources over orthogonal channels is addressed in [9]. The authors prove that source-channel separation holds for this system. They also obtain the exact rate region. Reference [83] extends these results to the lossy case and shows that separation holds for the lossy case too. Distributed scalar quantizers were designed for correlated Gaussian sources and independent Gaussian channels in [77]. The information-theoretic and communication features of a fading MAC are given in an excellent survey paper [14]. A survey of practical schemes for distributed source coding for sensor networks is given in [84]. Practical schemes for distributed source coding are also provided in [59],[19].

3 Transmission of correlated sources over a MAC

In this section we consider the transmission of memoryless dependent sources through a memoryless multiple access channel (Fig. 1). The sources and/or the channel input/output alphabets can be discrete or continuous.
Furthermore, side information may be available at the encoders and the decoder. Thus our system is very general and covers many systems studied over the years as special cases.

[Figure 1: Transmission of correlated sources over a MAC with side information]

We consider two sources $(U_1, U_2)$ and side information random variables $Z_1, Z_2, Z$ with a known joint distribution $F(u_1, u_2, z_1, z_2, z)$. Side information $Z_i$ is available to encoder $i$, $i \in \{1, 2\}$, and the decoder has side information $Z$. The random vector sequence $\{(U_{1n}, U_{2n}, Z_{1n}, Z_{2n}, Z_n),\ n \geq 1\}$ formed from the source outputs and the side information with distribution $F$ is independent and identically distributed (iid) in time. The sources transmit their codewords $X_i$ to a single decoder through a memoryless multiple access channel. The channel output $Y$ has distribution $p(y|x_1, x_2)$ if $x_1$ and $x_2$ are transmitted at that time. The decoder receives $Y$ and also has access to the side information $Z$. The encoders at the two users do not communicate with each other except via the side information. The decoder uses $Y$ and $Z$ to estimate the sensor observations $U_i$ as $\hat{U}_i$, $i \in \{1, 2\}$. It is of interest to find encoders and a decoder such that $\{U_{1n}, U_{2n},\ n \geq 1\}$ can be transmitted over the given MAC with $E[d_1(U_1, \hat{U}_1)] \leq D_1$ and $E[d_2(U_2, \hat{U}_2)] \leq D_2$, where $d_i$ are non-negative distortion measures and $D_i$ are the given distortion constraints. If the distortion measures are unbounded we assume that $u_i^*$, $i = 1, 2$, exist such that $E[d_i(U_i, u_i^*)] < \infty$, $i = 1, 2$. Source-channel separation does not hold in this case.

For discrete sources a common distortion measure is the Hamming distance: $d(x, x') = 1$ if $x \neq x'$, and $d(x, x') = 0$ if $x = x'$.
For continuous alphabet sources the most common distortion measure is $d(x, x') = (x - x')^2$. We will denote $\{U_{ij},\ j = 1, 2, \ldots, n\}$ by $U_i^n$, $i = 1, 2$.

Definition 1: The source $(U_1^n, U_2^n)$ can be transmitted over the multiple access channel with distortions $D \triangleq (D_1, D_2)$ if for any $\epsilon > 0$ there is an $n_0$ such that for all $n > n_0$ there exist encoders $f_{E,i}^n : \mathcal{U}_i^n \times \mathcal{Z}_i^n \to \mathcal{X}_i^n$, $i \in \{1, 2\}$, and a decoder $f_D^n : \mathcal{Y}^n \times \mathcal{Z}^n \to (\hat{\mathcal{U}}_1^n, \hat{\mathcal{U}}_2^n)$ such that

$$\frac{1}{n} E\Big[\sum_{j=1}^{n} d(U_{ij}, \hat{U}_{ij})\Big] \leq D_i + \epsilon, \quad i \in \{1, 2\},$$

where $(\hat{U}_1^n, \hat{U}_2^n) = f_D(Y^n, Z^n)$ and $\mathcal{U}_i, \mathcal{Z}_i, \mathcal{Z}, \mathcal{X}_i, \mathcal{Y}, \hat{\mathcal{U}}_i$ are the sets in which $U_i, Z_i, Z, X_i, Y, \hat{U}_i$ take values.

We denote the joint distribution of $(U_1, U_2)$ by $p(u_1, u_2)$ and let $p(y|x_1, x_2)$ be the transition probabilities of the MAC. Since the MAC is memoryless, $p(y^n | x_1^n, x_2^n) = \prod_{j=1}^{n} p(y_j | x_{1j}, x_{2j})$. $X \leftrightarrow Y \leftrightarrow Z$ will indicate that $\{X, Y, Z\}$ form a Markov chain. Now we state the main theorem.

Theorem 1: A source can be transmitted over the multiple access channel with distortions $(D_1, D_2)$ if there exist random variables $(W_1, W_2, X_1, X_2)$ such that

1. $p(u_1, u_2, z_1, z_2, z, w_1, w_2, x_1, x_2, y) = p(u_1, u_2, z_1, z_2, z)\, p(w_1|u_1, z_1)\, p(w_2|u_2, z_2)\, p(x_1|w_1)\, p(x_2|w_2)\, p(y|x_1, x_2)$, and

2. there exists a function $f_D : \mathcal{W}_1 \times \mathcal{W}_2 \times \mathcal{Z} \to (\hat{\mathcal{U}}_1 \times \hat{\mathcal{U}}_2)$ such that $E[d(U_i, \hat{U}_i)] \leq D_i$, $i = 1, 2$, where $(\hat{U}_1, \hat{U}_2) = f_D(W_1, W_2, Z)$, and the constraints

$$I(U_1, Z_1; W_1 | W_2, Z) < I(X_1; Y | X_2, W_2, Z),$$
$$I(U_2, Z_2; W_2 | W_1, Z) < I(X_2; Y | X_1, W_1, Z), \qquad (1)$$
$$I(U_1, U_2, Z_1, Z_2; W_1, W_2 | Z) < I(X_1, X_2; Y | Z)$$

are satisfied, where $\mathcal{W}_i$ are the sets in which $W_i$ take values.
In Theorem 1 the encoding scheme involves distributed quantization $(W_1, W_2)$ of the sources $(U_1, U_2)$ and the side information $Z_1, Z_2$, followed by a correlation preserving mapping to the channel codewords $(X_1, X_2)$. The decoding approach involves first decoding $(W_1, W_2)$ and then obtaining the estimates $(\hat{U}_1, \hat{U}_2)$ as a function of $(W_1, W_2)$ and the decoder side information $Z$. The proof of the theorem is given in Appendix A.

If the channel alphabets are continuous (e.g., GMAC), then in addition to the conditions in Theorem 1 certain power constraints $E[X_i^2] \leq P_i$, $i = 1, 2$, are also needed. For discrete sources, to recover the results for lossless transmission one can use the Hamming distance as the distortion measure.

If source-channel separation holds then one can talk about the capacity region of the channel. For example, when there is no side information $Z_1, Z_2, Z$ and the sources are independent, we obtain the rate region

$$R_1 \leq I(X_1; Y | X_2), \quad R_2 \leq I(X_2; Y | X_1), \quad R_1 + R_2 \leq I(X_1, X_2; Y). \qquad (2)$$

This is the well known rate region of a MAC ([23]). Other special cases will be provided in Sec. 4.

In Theorem 1 it is possible to include other distortion constraints. For example, in addition to the bounds on $E[d(U_i, \hat{U}_i)]$ one may want a bound on the joint distortion $E[d((U_1, U_2), (\hat{U}_1, \hat{U}_2))]$. Then the only modification needed in the statement of the above theorem is to include this as an additional condition in defining $f_D$. If we only want to estimate a function $g(U_1, U_2)$ at the decoder, and not $(U_1, U_2)$ themselves, then again one can use the techniques in the proof of Theorem 1 to obtain sufficient conditions.
Depending upon $g$, the conditions needed may be weaker than those needed in (1).

The main problem in using Theorem 1 is in obtaining good source-channel coding schemes providing $(W_1, W_2, X_1, X_2)$ which satisfy the conditions in the theorem for a given source $(U_1, U_2)$ and channel. A substantial part of this report will be devoted to this problem.

3.1 Extension to multiple sources

The above results can be generalized to the multiple ($\geq 2$) source case. Let $S = \{1, 2, \ldots, M\}$ be the set of sources with joint distribution $p(u_1, u_2, \ldots, u_M)$.

Theorem 2: Sources $(U_i^n,\ i \in S)$ can be communicated in a distributed fashion over the memoryless multiple access channel $p(y | x_i,\ i \in S)$ with distortions $(D_i,\ i \in S)$ if there exist auxiliary random variables $(W_i, X_i,\ i \in S)$ satisfying

1. $p(u_i, z_i, z, w_i, x_i, y,\ i \in S) = p(u_i, z_i, z,\ i \in S)\, p(y | x_i,\ i \in S) \prod_{j \in S} p(w_j | u_j, z_j)\, p(x_j | w_j)$, and

2. there exists a function $f_D : \prod_{j \in S} \mathcal{W}_j \times \mathcal{Z} \to (\hat{\mathcal{U}}_i,\ i \in S)$ such that $E[d(U_i, \hat{U}_i)] \leq D_i$, $i \in S$, and the constraints

$$I(U_A, Z_A; W_A | W_{A^c}, Z) < I(X_A; Y | X_{A^c}, W_{A^c}, Z) \quad \text{for all } A \subset S \qquad (3)$$

are satisfied (in case of continuous channel alphabets we also need the power constraints $E[X_i^2] \leq P_i$, $i = 1, \ldots, M$).

3.2 Example

We provide an example to show the reduction possible in transmission rates by exploiting the correlation between the sources, the side information and the permissible distortions. Consider $(U_1, U_2)$ with the joint distribution $P(U_1 = 0, U_2 = 0) = P(U_1 = 1, U_2 = 1) = 1/3$ and $P(U_1 = 1, U_2 = 0) = P(U_1 = 0, U_2 = 1) = 1/6$. If we use independent encoders which do not exploit the correlation among the sources then we need $R_1 \geq H(U_1) = 1$ bit and $R_2 \geq H(U_2) = 1$ bit for lossless coding of the sources.
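The entropy and mutual-information values used in this example, including the sum capacity of the binary adder MAC considered below, can be checked numerically. A minimal sketch (illustrative code, not part of the original report):

```python
from math import log2

# Joint pmf of (U1, U2) from the example: P(0,0)=P(1,1)=1/3, P(0,1)=P(1,0)=1/6.
p = {(0, 0): 1/3, (1, 1): 1/3, (0, 1): 1/6, (1, 0): 1/6}

def H(dist):
    """Entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(q * log2(q) for q in dist if q > 0)

H_joint = H(p.values())                                     # H(U1, U2)
pU1 = [sum(v for (u1, _), v in p.items() if u1 == b) for b in (0, 1)]
H_U1 = H(pU1)                                               # H(U1); H(U2) is the same by symmetry
H_U1_given_U2 = H_joint - H_U1                              # H(U1|U2) = H(U1,U2) - H(U2)

# Binary adder MAC Y = X1 + X2 (considered below in the example).
# Independent uniform inputs: sum capacity I(X1,X2;Y) = H(Y), pmf (1/4, 1/2, 1/4).
pY_indep = [1/4, 1/2, 1/4]
# With X1 = U1, X2 = U2: Y is a deterministic function of (X1, X2), so
# I(X1,X2;Y) = H(Y); here P(Y=0) = P(Y=1) = P(Y=2) = 1/3.
pY_joint = [sum(v for (u1, u2), v in p.items() if u1 + u2 == y) for y in (0, 1, 2)]

print(round(H_U1, 3), round(H_U1_given_U2, 3), round(H_joint, 3))  # 1.0 0.918 1.918
print(round(H(pY_indep), 3), round(H(pY_joint), 3))                # 1.5 1.585
```

These reproduce the figures quoted in the text: $H(U_i) = 1$, $H(U_1|U_2) = 0.918$, $H(U_1, U_2) = 1.918$, sum capacity 1.5 bits with independent inputs and 1.585 bits with $X_i = U_i$.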
If we use the coding scheme in [69], then $R_1 \geq H(U_1|U_2) = 0.918$ bits, $R_2 \geq H(U_2|U_1) = 0.918$ bits and $R_1 + R_2 \geq H(U_1, U_2) = 1.918$ bits suffice. Next consider a multiple access channel such that $Y = X_1 + X_2$, where $X_1$ and $X_2$ take values from the alphabet $\{0, 1\}$ and $Y$ takes values from the alphabet $\{0, 1, 2\}$. This channel does not satisfy the separation conditions ([65]). The sum capacity $C$ of such a channel with independent $X_1$ and $X_2$ is 1.5 bits, and if we use source-channel separation, the given sources cannot be transmitted losslessly because $H(U_1, U_2) > C$.

Now we use a joint source-channel code to improve the capacity of the channel. Take $X_1 = U_1$ and $X_2 = U_2$. Then the capacity of the channel is improved to $I(X_1, X_2; Y) = 1.585$ bits. This is still not enough to transmit the sources over the given MAC.

Next we exploit the side information. The side information random variables are generated as follows. $Z_1$ is generated from $U_2$ by using a binary symmetric channel (BSC) with crossover probability $p = 0.3$. Similarly $Z_2$ is generated from $U_1$ by using the same BSC. Let $Z = (Z_1, Z_2, V)$, where $V = U_1 \cdot U_2 \cdot N$, $N$ is a binary random variable with $P(N = 0) = P(N = 1) = 0.5$ independent of $U_1$ and $U_2$, and '$\cdot$' denotes the logical AND operation. This represents the case where the decoder can observe the encoder side information and also has some extra side information. Then, from (1), if we use just the side information $Z_1$ the sum rate for the sources needs to be 1.8 bits. By symmetry the same holds if we only have $Z_2$. If we use both $Z_1$ and $Z_2$ then a sum rate of 1.683 bits suffices. If only $V$ is used then the sum rate needed is 1.606 bits. So far we can still not transmit $(U_1, U_2)$ losslessly if we use the coding $U_i = X_i$, $i = 1, 2$. If all the information in $Z_1, Z_2, V$ is used then we need $R_1 + R_2 \geq 1.4120$ bits. Thus with the aid of $Z_1, Z_2, Z$ we can transmit $(U_1, U_2)$ losslessly over the MAC even with independent $X_1$ and $X_2$.

Next we take the distortion criterion to be the Hamming distance and the allowable distortion to be 4%. Then for compressing the individual sources without side information we need $R_i \geq H(p) - H(d) = 0.758$ bits, $i = 1, 2$, where $H(x) = -x\log_2(x) - (1-x)\log_2(1-x)$ is the binary entropy function, $p = 0.5$ and $d = 0.04$. Thus we still cannot transmit $(U_1, U_2)$ with this distortion when $(X_1, X_2)$ are independent. If $U_1$ and $U_2$ are encoded exploiting their correlations, $(X_1, X_2)$ can be correlated. Next assume the side information $Z = (Z_1, Z_2)$ to be available at the decoder only. Then we need $R_1 \geq I(U_1; W_1) - I(Z_1; W_1)$, where $W_1$ is an auxiliary random variable generated from $U_1$. $W_1$ and $Z_1$ are related by a cascade of a BSC with crossover probability 0.3 with a BSC with crossover probability 0.04. This implies that $R_1 \geq 0.6577$ bits and $R_2 \geq 0.6577$ bits.

4 Special Cases

In the following we provide several systems studied in the literature as special cases. The practically important special cases of the GMAC and orthogonal channels will be studied in detail in later sections, where we will discuss several specific joint source-channel coding schemes for these and compare their performance.

4.1 Lossless multiple access communication with correlated sources

Take $(Z_1, Z_2, Z) \perp (U_1, U_2)$ ($X \perp Y$ denotes that r.v. $X$ is independent of r.v. $Y$), $W_1 = U_1$ and $W_2 = U_2$, where $U_1, U_2$ are discrete sources. Then the constraints of (1) reduce to

$$H(U_1|U_2) < I(X_1; Y | X_2, U_2),$$
$$H(U_2|U_1) < I(X_2; Y | X_1, U_1), \qquad (4)$$
$$H(U_1, U_2) < I(X_1, X_2; Y),$$

where $X_1, X_2$ are the channel inputs, $Y$ is the channel output and $X_1 \leftrightarrow U_1 \leftrightarrow U_2 \leftrightarrow X_2$ is satisfied. These are the conditions obtained in [21].
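Returning to the example of Section 3.2: the two rate-distortion figures quoted there (0.758 and 0.6577 bits) can be reproduced from the binary entropy function. A minimal sketch, assuming the BSC(0.04) test channel implied by the 4% Hamming distortion and the standard crossover formula for cascaded BSCs:

```python
from math import log2

def h(x):
    """Binary entropy function H(x) in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

# Rate for a uniform binary source under Hamming distortion d = 0.04,
# without side information: R(d) = H(0.5) - H(d).
R_no_side_info = h(0.5) - h(0.04)

# Bound R1 >= I(U1;W1) - I(Z1;W1) when W1 and Z1 are related by a cascade
# of BSC(0.3) and BSC(0.04). The cascade of BSC(a) and BSC(b) is a BSC with
# crossover a*(1-b) + (1-a)*b. For uniform marginals this difference reduces
# to H(cascade) - H(0.04).
cascade = 0.3 * (1 - 0.04) + (1 - 0.3) * 0.04   # = 0.316
R_decoder_side_info = h(cascade) - h(0.04)

print(round(R_no_side_info, 3))       # 0.758
print(round(R_decoder_side_info, 4))  # 0.6577
```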
4.2 Lossy multiple access communication

Take $(Z_1, Z_2, Z) \perp (U_1, U_2)$. In this case the constraints in (1) reduce to

$$I(U_1; W_1 | W_2) < I(X_1; Y | X_2, W_2),$$
$$I(U_2; W_2 | W_1) < I(X_2; Y | X_1, W_1), \qquad (5)$$
$$I(U_1, U_2; W_1, W_2) < I(X_1, X_2; Y).$$

This is an immediate generalization of [21] to the lossy case.

4.3 Lossy distributed source coding with side information

The multiple access channel is taken as a dummy channel which reproduces its inputs. In this case we obtain that the sources can be coded with rates $R_1$ and $R_2$ to obtain the specified distortions at the decoder if

$$R_1 > I(U_1, Z_1; W_1 | W_2, Z),$$
$$R_2 > I(U_2, Z_2; W_2 | W_1, Z), \qquad (6)$$
$$R_1 + R_2 > I(U_1, U_2, Z_1, Z_2; W_1, W_2 | Z).$$

This recovers the result in [72], and generalizes the results in [82, 31, 69].

4.4 Correlated sources with lossless transmission over multiuser channels with receiver side information

If we consider $(Z_1, Z_2) \perp (U_1, U_2)$, $W_1 = U_1$ and $W_2 = U_2$ then we recover the conditions

$$H(U_1 | U_2, Z) < I(X_1; Y | X_2, U_2, Z),$$
$$H(U_2 | U_1, Z) < I(X_2; Y | X_1, U_1, Z), \qquad (7)$$
$$H(U_1, U_2 | Z) < I(X_1, X_2; Y | Z)$$

of Theorem 2.1 in [37].

4.5 Mixed Side Information

The aim is to determine the rate distortion function for transmitting a source $X$ with the aid of side information $(Y, Z)$ (the system in Fig. 1(c) of [27]). The encoder is provided with $Y$ and the decoder has access to both $Y$ and $Z$. This represents the mixed side information (MSI) system, which combines the conditional rate distortion system and the Wyner-Ziv system. It has the systems in Fig. 1(a) and (b) of [27] as special cases. The results for Fig. 1(c) can be recovered from our theorem if we take $X, Y, Z, W$ in [27] as $U_1 = X$, $Z = (Z, Y)$, $Z_1 = Y$, $W_1 = W$; $U_2$ and $Z_2$ are taken to be constants.
The achievable rate region is given by $R > I(X; W | Y, Z)$, where $W$ is a random variable with the property $W \leftrightarrow (X, Y) \leftrightarrow Z$ and for which there exists a decoder function such that the distortion constraints are met.

5 Discrete Alphabet Sources over a Gaussian MAC

This system is practically very useful. For example, in a sensor network, the observations sensed by the sensor nodes are discretized and then transmitted over a GMAC. The physical proximity of the sensor nodes makes their observations correlated. This correlation can be exploited to compress the transmitted data. We present a distributed 'correlation preserving' joint source-channel coding scheme yielding jointly Gaussian channel codewords, which will be shown to compress the data efficiently. This coding scheme was developed in [62].

Sufficient conditions for lossless transmission of two discrete sources $(U_1, U_2)$ (generating iid sequences in time) over a general MAC with no side information were obtained in (4) and are reproduced below for convenience:

$$H(U_1|U_2) < I(X_1; Y | X_2, U_2),$$
$$H(U_2|U_1) < I(X_2; Y | X_1, U_1), \qquad (8)$$
$$H(U_1, U_2) < I(X_1, X_2; Y),$$

where $X_1 \leftrightarrow U_1 \leftrightarrow U_2 \leftrightarrow X_2$ is satisfied.

In this section we further specialize the above results to lossless transmission of discrete correlated sources over an additive memoryless GMAC: $Y = X_1 + X_2 + N$, where $N$ is a Gaussian random variable independent of $X_1$ and $X_2$. The noise $N$ satisfies $E[N] = 0$ and $\mathrm{Var}(N) = \sigma_N^2$. We will also have the transmit power constraints $E[X_i^2] \leq P_i$, $i = 1, 2$. Since source-channel separation does not hold for this system, a joint source-channel coding scheme is needed for optimal performance. The dependence of the right-hand side of (8) on the input alphabets prevents us from getting a closed form expression for the admissibility criterion.
Therefore we relax the conditions by removing the dependence on the input alphabets. This will allow us to obtain good joint source-channel codes.

Lemma 1: Under our assumptions, $I(X_1; Y | X_2, U_2) \leq I(X_1; Y | X_2)$.

Proof: Let $\Delta \triangleq I(X_1; Y | X_2, U_2) - I(X_1; Y | X_2)$. Then

$$\Delta = H(Y | X_2, U_2) - H(Y | X_1, X_2, U_2) - [H(Y | X_2) - H(Y | X_1, X_2)].$$

Since the channel is memoryless, $H(Y | X_1, X_2, U_2) = H(Y | X_1, X_2)$. Thus $\Delta = H(Y | X_2, U_2) - H(Y | X_2) \leq 0$.

Therefore, from (8),

$$H(U_1|U_2) < I(X_1; Y | X_2, U_2) \leq I(X_1; Y | X_2), \qquad (9)$$
$$H(U_2|U_1) < I(X_2; Y | X_1, U_1) \leq I(X_2; Y | X_1), \qquad (10)$$
$$H(U_1, U_2) < I(X_1, X_2; Y). \qquad (11)$$

The relaxation of the upper bounds is only in (9) and (10) and not in (11). We show that the relaxed upper bounds are maximized if $(X_1, X_2)$ is jointly Gaussian and the correlation $\rho$ between $X_1$ and $X_2$ is high (the highest possible $\rho$ may not give the largest upper bound in the three inequalities in (9)-(11)).

Lemma 2: A jointly Gaussian distribution for $(X_1, X_2)$ maximizes $I(X_1; Y | X_2)$, $I(X_2; Y | X_1)$ and $I(X_1, X_2; Y)$ simultaneously.

Proof: Since $I(X_1, X_2; Y) = H(Y) - H(Y | X_1, X_2) = H(X_1 + X_2 + N) - H(N)$, it is maximized when $H(X_1 + X_2 + N)$ is maximized. This entropy is maximized when $X_1 + X_2$ is Gaussian with the largest possible variance $P_1 + P_2$. If $(X_1, X_2)$ is jointly Gaussian then so is $X_1 + X_2$. Next consider $I(X_1; Y | X_2)$. This equals $H(Y | X_2) - H(N) = H(X_1 + X_2 + N | X_2) - H(N) = H(X_1 + N | X_2) - H(N)$, which is maximized when $p(x_1 | x_2)$ is Gaussian, and this happens when $X_1, X_2$ are jointly Gaussian. A similar result holds for $I(X_2; Y | X_1)$.
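With jointly Gaussian inputs of powers $P_1, P_2$ and correlation $\rho$, the three relaxed bounds admit closed forms (these are derived as (13)-(15) below). A small sketch evaluating them, with illustrative values $P_1 = P_2 = \sigma_N^2 = 1$ (these values are not from the report):

```python
from math import log2, sqrt

def relaxed_bounds(P1, P2, var_N, rho):
    """Closed-form relaxed bounds for jointly Gaussian (X1, X2) with powers
    P1, P2 and correlation rho, over a GMAC with noise variance var_N."""
    b1 = 0.5 * log2(1 + P1 * (1 - rho**2) / var_N)        # bound on H(U1|U2)
    b2 = 0.5 * log2(1 + P2 * (1 - rho**2) / var_N)        # bound on H(U2|U1)
    b_sum = 0.5 * log2(1 + (P1 + P2 + 2 * rho * sqrt(P1 * P2)) / var_N)  # bound on H(U1,U2)
    return b1, b2, b_sum

# The individual bounds shrink with rho while the sum bound grows, which is
# the trade-off discussed after (15).
for rho in (0.0, 0.3, 0.6, 0.9):
    b1, b2, b_sum = relaxed_bounds(1.0, 1.0, 1.0, rho)
    print(f"rho={rho:.1f}  individual={b1:.3f}  sum={b_sum:.3f}")
```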
The difference between the bounds in (9) is

$$I(X_1; Y | X_2) - I(X_1; Y | X_2, U_2) = I(X_1 + N; U_2 | X_2). \qquad (12)$$

This difference is small if the correlation between $(U_1, U_2)$ is small. In that case $H(U_1|U_2)$ and $H(U_2|U_1)$ will be large and (9) and (10) can be active constraints. If the correlation between $(U_1, U_2)$ is large, $H(U_1|U_2)$ and $H(U_2|U_1)$ will be small and (11) will be the only active constraint. In this case the difference between the two bounds in (9) and (10) is large but not important. Thus, the outer bounds in (9) and (10) are close to the inner bounds whenever the constraints (9) and (10) are active. Often (11) will be the only active constraint. An advantage of the outer bounds in (9) and (10) is that we will be able to obtain a good source-channel coding scheme. Once $(X_1, X_2)$ are obtained we can check the sufficient conditions (8). If these are not satisfied for the $(X_1, X_2)$ obtained, we will increase the correlation $\rho$ between $(X_1, X_2)$ if possible (see details below). Increasing the correlation in $(X_1, X_2)$ will decrease the difference in (12) and increase the possibility of satisfying (8) when the outer bounds in (9) and (10) are satisfied.

We evaluate the (relaxed) rate region (9)-(11) for the Gaussian MAC with jointly Gaussian channel inputs $(X_1, X_2)$ under the transmit power constraints. For maximization of this region we need mean vector $[0\ 0]$ and covariance matrix

$$K_{X_1, X_2} = \begin{pmatrix} P_1 & \rho\sqrt{P_1 P_2} \\ \rho\sqrt{P_1 P_2} & P_2 \end{pmatrix},$$

where $\rho$ is the correlation between $X_1$ and $X_2$. Then (9)-(11) provide the relaxed constraints

$$H(U_1|U_2) < 0.5 \log\left[1 + \frac{P_1(1 - \rho^2)}{\sigma_N^2}\right], \qquad (13)$$
$$H(U_2|U_1) < 0.5 \log\left[1 + \frac{P_2(1 - \rho^2)}{\sigma_N^2}\right], \qquad (14)$$
$$H(U_1, U_2) < 0.5 \log\left[1 + \frac{P_1 + P_2 + 2\rho\sqrt{P_1 P_2}}{\sigma_N^2}\right]. \qquad (15)$$

The upper bounds in the first two inequalities (13) and (14) decrease as $\rho$ increases. But the third upper bound (15) increases with $\rho$, and often the third constraint is the limiting constraint. This motivates us to consider the GMAC with correlated jointly Gaussian inputs. The next lemma provides an upper bound on the correlation between $(X_1, X_2)$ in terms of the distribution of $(U_1, U_2)$.

Lemma 3: Let $(U_1, U_2)$ be the correlated sources and $X_1 \leftrightarrow U_1 \leftrightarrow U_2 \leftrightarrow X_2$, where $X_1$ and $X_2$ are jointly Gaussian. Then the correlation between $(X_1, X_2)$ satisfies $\rho^2 \leq 1 - 2^{-2I(U_1; U_2)}$.

Proof: Since $X_1 \leftrightarrow U_1 \leftrightarrow U_2 \leftrightarrow X_2$ is a Markov chain, by the data processing inequality $I(X_1; X_2) \leq I(U_1; U_2)$. Taking $X_1, X_2$ to be jointly Gaussian with zero mean, unit variance and correlation $\rho$, $I(X_1; X_2) = 0.5 \log_2\!\left(\frac{1}{1 - \rho^2}\right)$. This implies $\rho^2 \leq 1 - 2^{-2I(U_1; U_2)}$.

5.1 A Coding Scheme

In this section we develop a coding scheme for mapping the discrete alphabets into jointly Gaussian correlated codewords which also satisfy the Markov condition. The heart of the scheme is to approximate a jointly Gaussian distribution by a weighted sum of products of Gaussian marginals. Although this is stated in the following lemma for two dimensional vectors $(X_1, X_2)$, the result holds for any finite dimensional vectors.

Lemma 4: Any jointly Gaussian two dimensional density can be uniformly arbitrarily closely approximated by a weighted sum of products of marginal Gaussian densities:

$$\sum_{i=1}^{N} \frac{p_i}{\sqrt{2\pi c_{1i}}}\, e^{-\frac{1}{2 c_{1i}}(x_1 - a_{1i})^2}\, \frac{q_i}{\sqrt{2\pi c_{2i}}}\, e^{-\frac{1}{2 c_{2i}}(x_2 - a_{2i})^2}.$$
(16)

Proof: By the Stone-Weierstrass theorem ([39]), the class of functions (x_1, x_2) ↦ e^{−(x_1 − a_1)^2/(2c_1)} e^{−(x_2 − a_2)^2/(2c_2)} can be shown to be dense, under uniform convergence, in C_0, the set of all continuous functions f on ℜ^2 such that lim_{‖x‖→∞} |f(x)| = 0. Since the jointly Gaussian density

    (x_1, x_2) ↦ e^{ −(x_1^2 + x_2^2 − 2ρ x_1 x_2) / (2σ^2(1 − ρ^2)) }

is in C_0, it can be approximated arbitrarily closely, uniformly, by functions of the form (16).

From the above lemma we can form a sequence of functions f_n(x_1, x_2) of type (16) such that sup_{x_1,x_2} |f_n(x_1, x_2) − f(x_1, x_2)| → 0 as n → ∞, where f is a given jointly Gaussian density. Although the f_n are not guaranteed to be probability densities, due to uniform convergence they almost are, for large n. In the following lemma we assume that we have made the minor modification needed to ensure that f_n is a proper density for all large enough n. This lemma shows that obtaining (X_1, X_2) from such approximations can provide the (relaxed) upper bounds in (2)-(4) (we actually show this for the third inequality only, but it can be shown for the other inequalities in the same way). Let (X_1^m, X_2^m) and (X_1, X_2) be random variables with densities f_m and f, where sup_{x_1,x_2} |f_m(x_1, x_2) − f(x_1, x_2)| → 0 as m → ∞. Let Y_m and Y denote the corresponding channel outputs.

Lemma 5. For the random variables defined above, if {log f_m(Y_m), m ≥ 1} is uniformly integrable, then I(X_1^m, X_2^m; Y_m) → I(X_1, X_2; Y) as m → ∞.

Proof: Since I(X_1^m, X_2^m; Y_m) = H(Y_m) − H(Y_m | X_1^m, X_2^m) = H(Y_m) − H(N), it is sufficient to show that H(Y_m) → H(Y). From (X_1^m, X_2^m) →_d (X_1, X_2) and the independence of (X_1^m, X_2^m) from N, we get Y_m = X_1^m + X_2^m + N →_d X_1 + X_2 + N = Y.
Then f_m → f uniformly implies that f_m(Y_m) →_d f(Y). Since f_m(Y_m) ≥ 0, f(Y) ≥ 0 a.s. and log is continuous except at 0, we obtain log f_m(Y_m) →_d log f(Y). Then uniform integrability provides I(X_1^m, X_2^m; Y_m) → I(X_1, X_2; Y).

A set of sufficient conditions for uniform integrability of {log f_m(Y_m), m ≥ 1} is:

1. The number of components in (16) is upper bounded.
2. The variances of the component densities in (16) are upper bounded and lower bounded away from zero.
3. The means of the component densities in (16) lie in a bounded set.

From Lemma 4, a jointly Gaussian density with any correlation can be expressed as a linear combination of products of marginal Gaussian densities. But the coefficients p_i and q_i in (16) may be positive or negative. To realize our coding scheme, we would like the p_i's and q_i's to be non-negative. This introduces constraints on the Gaussian densities realizable by our coding scheme. For example, from Lemma 3, the correlation ρ between X_1 and X_2 cannot exceed √(1 − 2^{−2 I(U_1; U_2)}). There is also the question of finding a good linear combination of marginal densities to approximate the joint density for a given N in (16). This motivates an optimization procedure for finding the p_i, q_i, a_{1i}, a_{2i}, c_{1i} and c_{2i} in (16) that provide the best approximation to a given jointly Gaussian density. We illustrate this with an example. Consider U_1, U_2 binary. Let P(U_1 = 0, U_2 = 0) = p_{00}, P(U_1 = 0, U_2 = 1) = p_{01}, P(U_1 = 1, U_2 = 0) = p_{10} and P(U_1 = 1, U_2 = 1) = p_{11}. We can take

    f(X_1 = · | U_1 = 0) = p_{101} N(a_{101}, c_{101}) + p_{102} N(a_{102}, c_{102}) + ... + p_{10r_1} N(a_{10r_1}, c_{10r_1}),   (17)
    f(X_1 = · | U_1 = 1) = p_{111} N(a_{111}, c_{111}) + p_{112} N(a_{112}, c_{112}) + ... + p_{11r_2} N(a_{11r_2}, c_{11r_2}),   (18)
    f(X_2 = · | U_2 = 0) = p_{201} N(a_{201}, c_{201}) + p_{202} N(a_{202}, c_{202}) + ... + p_{20r_3} N(a_{20r_3}, c_{20r_3}),   (19)
    f(X_2 = · | U_2 = 1) = p_{211} N(a_{211}, c_{211}) + p_{212} N(a_{212}, c_{212}) + ... + p_{21r_4} N(a_{21r_4}, c_{21r_4}),   (20)

where N(a, b) denotes the Gaussian density with mean a and variance b. Let p be the vector with components p_{101}, ..., p_{10r_1}, p_{111}, ..., p_{11r_2}, p_{201}, ..., p_{20r_3}, p_{211}, ..., p_{21r_4}. Similarly we denote by a and c the vectors with components a_{101}, ..., a_{21r_4} and c_{101}, ..., c_{21r_4} in the same order.

Let f_ρ(x_1, x_2) be the jointly Gaussian density that we want to approximate. Let it have zero mean and covariance matrix K_{X_1,X_2} = [1, ρ; ρ, 1]. Let g_{p,a,c} be the sum of products of marginal densities with parameters p, a, c approximating f_ρ. The best g is obtained by solving the following minimization problem:

    min_{p,a,c} ∫ [ g_{p,a,c}(x_1, x_2) − f_ρ(x_1, x_2) ]^2 dx_1 dx_2   (21)

subject to

    (p_{00} + p_{01}) Σ_{i=1}^{r_1} p_{10i} a_{10i} + (p_{10} + p_{11}) Σ_{i=1}^{r_2} p_{11i} a_{11i} = 0,
    (p_{00} + p_{10}) Σ_{i=1}^{r_3} p_{20i} a_{20i} + (p_{01} + p_{11}) Σ_{i=1}^{r_4} p_{21i} a_{21i} = 0,
    (p_{00} + p_{01}) Σ_{i=1}^{r_1} p_{10i} (c_{10i} + a_{10i}^2) + (p_{10} + p_{11}) Σ_{i=1}^{r_2} p_{11i} (c_{11i} + a_{11i}^2) = 1,
    (p_{00} + p_{10}) Σ_{i=1}^{r_3} p_{20i} (c_{20i} + a_{20i}^2) + (p_{01} + p_{11}) Σ_{i=1}^{r_4} p_{21i} (c_{21i} + a_{21i}^2) = 1,
    Σ_{i=1}^{r_1} p_{10i} = 1,  Σ_{i=1}^{r_2} p_{11i} = 1,  Σ_{i=1}^{r_3} p_{20i} = 1,  Σ_{i=1}^{r_4} p_{21i} = 1,
    p_{10i} ≥ 0, c_{10i} ≥ 0 for i ∈ {1, 2, ..., r_1},   p_{11i} ≥ 0, c_{11i} ≥ 0 for i ∈ {1, 2, ..., r_2},
    p_{20i} ≥ 0, c_{20i} ≥ 0 for i ∈ {1, 2, ..., r_3},   p_{21i} ≥ 0, c_{21i} ≥ 0 for i ∈ {1, 2, ..., r_4}.
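Although the paper solves (21) in MATLAB, the objective being minimized is easy to sketch numerically. The following fragment (an illustrative sketch, not the authors' code; the function names and the truncated grid are our own choices) evaluates the L2 distance in (21) for a candidate mixture of products of marginal Gaussians:

```python
import numpy as np

def gauss(x, mean, var):
    """Scalar Gaussian density N(mean, var) evaluated at x."""
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def mixture_density(x1, x2, weights, params1, params2):
    """g_{p,a,c}(x1, x2): weighted sum of products of marginal Gaussians.
    weights[i] is the weight of component i; params1[i] = (a_1i, c_1i) and
    params2[i] = (a_2i, c_2i) give the mean/variance of its two marginals."""
    g = np.zeros_like(x1, dtype=float)
    for w, (a1, c1), (a2, c2) in zip(weights, params1, params2):
        g += w * gauss(x1, a1, c1) * gauss(x2, a2, c2)
    return g

def l2_objective(weights, params1, params2, rho, grid=np.linspace(-6, 6, 241)):
    """Grid approximation of the objective (21) against a zero-mean,
    unit-variance jointly Gaussian target with correlation rho."""
    x1, x2 = np.meshgrid(grid, grid)
    f = np.exp(-(x1**2 + x2**2 - 2 * rho * x1 * x2) / (2 * (1 - rho**2)))
    f /= 2 * np.pi * np.sqrt(1 - rho**2)
    g = mixture_density(x1, x2, weights, params1, params2)
    dx = grid[1] - grid[0]
    return np.sum((g - f) ** 2) * dx * dx

# A single product component (independent marginals) cannot capture rho != 0:
err_indep = l2_objective([1.0], [(0.0, 1.0)], [(0.0, 1.0)], rho=0.3)
err_match = l2_objective([1.0], [(0.0, 1.0)], [(0.0, 1.0)], rho=0.0)
```

Any off-the-shelf constrained optimizer can then be run over this objective subject to the moment constraints above.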
The above constraints are such that the resulting distribution g for (X_1, X_2) will satisfy E[X_i] = 0 and E[X_i^2] = 1, i = 1, 2. The above coding scheme will be used to obtain a codebook as follows. If user 1 produces U_1 = 0, then with probability p_{10i} encoder 1 obtains codeword X_1 from the distribution N(a_{10i}, c_{10i}). Similarly we obtain the codewords for U_1 = 1 and for user 2. Once we have found the encoder maps, the encoding and decoding are as described in the proof of Theorem 1 in Appendix A. The decoding is done by joint typicality of the received Y^n with (U_1^n, U_2^n). This coding scheme can be extended to any discrete alphabet case. We give an example below to illustrate it.

5.2 Example

Consider (U_1, U_2) with the joint distribution P(U_1 = 0, U_2 = 0) = P(U_1 = 1, U_2 = 1) = P(U_1 = 0, U_2 = 1) = 1/3, P(U_1 = 1, U_2 = 0) = 0, and power constraints P_1 = 3, P_2 = 4. Also consider a Gaussian multiple access channel with σ_N^2 = 1. If the sources are mapped into independent channel codewords, then the sum rate condition (15) with ρ = 0 should hold. The LHS evaluates to 1.585 bits whereas the RHS is 1.5 bits. Thus condition (15) is violated and hence the sufficient conditions in (8) are also violated. In the following we explore the possibility of using correlated (X_1, X_2) to see if we can transmit this source on the given MAC. The inputs (U_1, U_2) can be distributedly mapped to jointly Gaussian channel codewords (X_1, X_2) by the technique mentioned above. The maximum ρ satisfying (13) is 0.7024 and the maximum ρ satisfying (14) is 0.7874, while the minimum ρ satisfying (15) is 0.144. Thus, we can pick a ρ which satisfies (13)-(15). From Lemma 3, ρ is upper bounded by 0.546.
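The numbers quoted in this example can be reproduced with a short script (a sketch; base-2 logarithms are assumed throughout, consistent with rates in bits):

```python
import math

# Joint pmf of (U1, U2) from the example; P1 = 3, P2 = 4, sigma_N^2 = 1.
p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}
P1, P2, vN = 3.0, 4.0, 1.0

H = lambda probs: -sum(q * math.log2(q) for q in probs if q > 0)
H12 = H(p.values())                                   # H(U1, U2) = log2(3) ~ 1.585
H1 = H([p[(0, 0)] + p[(0, 1)], p[(1, 0)] + p[(1, 1)]])  # H(U1)
H2 = H([p[(0, 0)] + p[(1, 0)], p[(0, 1)] + p[(1, 1)]])  # H(U2)
H1g2, H2g1 = H12 - H2, H12 - H1                       # both equal 2/3 bit

# Which of the relaxed constraints (13)-(15) hold at a given rho:
ok = lambda r: (H1g2 < 0.5 * math.log2(1 + P1 * (1 - r**2) / vN),
                H2g1 < 0.5 * math.log2(1 + P2 * (1 - r**2) / vN),
                H12 < 0.5 * math.log2(1 + (P1 + P2 + 2*r*math.sqrt(P1*P2)) / vN))
grid = [i / 10000 for i in range(10001)]
max13 = max(r for r in grid if ok(r)[0])   # largest rho satisfying (13), ~0.7024
max14 = max(r for r in grid if ok(r)[1])   # largest rho satisfying (14), ~0.7874
min15 = min(r for r in grid if ok(r)[2])   # smallest rho satisfying (15), ~0.144

# Lemma 3 cap on the achievable Gaussian correlation:
I12 = H1 + H2 - H12
cap = math.sqrt(1 - 2.0 ** (-2 * I12))     # ~0.543 (quoted as 0.546 in the text)
```

The script also confirms that ρ = 0 violates (15) while ρ = 0.3 satisfies all three conditions.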
Therefore we want to obtain jointly Gaussian (X_1, X_2) satisfying X_1 ↔ U_1 ↔ U_2 ↔ X_2 with correlation ρ ∈ [0.144, 0.546]. If we pick a ρ that satisfies the original bounds, then we will be able to transmit the sources (U_1, U_2) reliably on this MAC. Without loss of generality the jointly Gaussian channel inputs required are chosen with mean vector [0 0] and covariance matrix K_{X_1,X_2} = [1, ρ; ρ, 1]. We choose ρ = 0.3, which meets all the conditions (13)-(15). Also, we choose r_1 = r_2 = r_3 = r_4 = 2. We solve the optimization problem (21) via MATLAB to get the function g. The normalized minimum distortion, defined as ∫[g_{p,a,c}(x_1, x_2) − f_ρ(x_1, x_2)]^2 dx_1 dx_2 / ∫ f_ρ^2(x_1, x_2) dx_1 dx_2, is 0.137% when the marginals are chosen as

    f(X_1 | U_1 = 0) = N(−0.0002, 0.9108),   f(X_1 | U_1 = 1) = N(−0.0001, 1.0446),
    f(X_2 | U_2 = 0) = N(−0.0021, 1.1358),   f(X_2 | U_2 = 1) = N(−0.0042, 0.7283).

The approximation (a cross section of the two-dimensional densities) is shown in Fig. 2. If we take ρ = 0.6, which violates Lemma 3, the approximation is shown in Fig. 3, and we can see there that the error is larger: the normalized distortion is now 10.5%.

The original upper bounds in (9) and (10) for this example with ρ = 0.3 are I(X_1; Y | X_2, U_2) = 0.792 and I(X_2; Y | X_1, U_1) = 0.996. Also, I(X_1; Y | X_2) = 0.949 and I(X_2; Y | X_1) = 1.107. Since H(U_1 | U_2) = H(U_2 | U_1) = 0.66, we conclude that the original bounds too are satisfied by the choice ρ = 0.3.

Figure 2: Cross section of the approximation of the joint Gaussian density, ρ = 0.3

6 Source-Channel Coding for Gaussian Sources over a Gaussian MAC

In this section we consider transmission of correlated Gaussian sources over a GMAC.
This is an important example of transmitting continuous alphabet sources over a GMAC. For example, one comes across it if a sensor network is sampling a Gaussian random field. Also, in the application of detection of change ([74]) by a sensor network, one often detects a change in the mean of the sensor observations, with the sensor observation noise being Gaussian.

Figure 3: Cross section of the approximation of the joint Gaussian density, ρ = 0.6

We will assume that (U_{1n}, U_{2n}) is jointly Gaussian with mean zero, variances σ_i^2, i = 1, 2, and correlation ρ. The distortion measure will be Mean Square Error (MSE). The (relaxed) sufficient conditions from (6) for transmission of the sources over the channel are given by (these continue to hold because Lemmas 1-3 are still valid)

    I(U_1; W_1 | W_2) < 0.5 log [ 1 + P_1(1 − ρ̃^2)/σ_N^2 ],
    I(U_2; W_2 | W_1) < 0.5 log [ 1 + P_2(1 − ρ̃^2)/σ_N^2 ],   (22)
    I(U_1, U_2; W_1, W_2) < 0.5 log [ 1 + (P_1 + P_2 + 2ρ̃√(P_1 P_2))/σ_N^2 ].

We consider three specific coding schemes to obtain W_1, W_2, X_1, X_2, where (W_1, W_2) satisfy the distortion constraints and (X_1, X_2) are jointly Gaussian with an appropriate ρ̃ such that (22) is satisfied. These coding schemes have been widely used. We also compare their performance.

6.1 Amplify and forward scheme

In the Amplify and Forward (AF) scheme the channel codes X_i are just scaled source symbols U_i. Since (U_1, U_2) are themselves jointly Gaussian, (X_1, X_2) will be jointly Gaussian and retain the dependence of the inputs (U_1, U_2). The scaling is done to ensure E[X_i^2] = P_i, i = 1, 2. For the single user case this coding is optimal [23]. At the decoder, U_1 and U_2 are directly estimated from Y as Û_i = E[U_i | Y], i = 1, 2.
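Under AF everything is jointly Gaussian, so E[U_i | Y] reduces to a ratio of second moments. A minimal numerical sketch (our own function, assuming unit-variance sources scaled as X_i = √(P_i) U_i / σ_i):

```python
import math

def af_mmse_distortions(P1, P2, var1, var2, rho, var_N):
    """LMMSE distortions of the AF decoder U_hat_i = E[U_i | Y] for
    X_i = sqrt(P_i) U_i / sigma_i and Y = X1 + X2 + N (jointly Gaussian):
    D_i = var(U_i) - cov(U_i, Y)^2 / var(Y)."""
    var_Y = P1 + P2 + 2 * rho * math.sqrt(P1 * P2) + var_N
    cov1 = math.sqrt(var1) * (math.sqrt(P1) + rho * math.sqrt(P2))
    cov2 = math.sqrt(var2) * (math.sqrt(P2) + rho * math.sqrt(P1))
    return var1 - cov1**2 / var_Y, var2 - cov2**2 / var_Y

# Parameters of the example of Section 5.2:
D1, D2 = af_mmse_distortions(3.0, 4.0, 1.0, 1.0, 0.3, 1.0)
```

The returned values agree with the closed-form distortions given in the text below.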
Because U_i and Y are jointly Gaussian, this estimate is linear and also satisfies the Minimum Mean Square Error (MMSE) and Maximum Likelihood (ML) criteria. The MMSE distortions for this encoding-decoding scheme are

    D_1 = σ_1^2 [ P_2(1 − ρ^2) + σ_N^2 ] / [ P_1 + P_2 + 2ρ√(P_1 P_2) + σ_N^2 ],
    D_2 = σ_2^2 [ P_1(1 − ρ^2) + σ_N^2 ] / [ P_1 + P_2 + 2ρ√(P_1 P_2) + σ_N^2 ].   (23)

Since encoding and decoding require minimal processing and delay in this scheme, if it satisfies the required distortion bounds D_i, it should be the scheme to implement. This scheme has been studied in [44] and found to be optimal below a certain SNR for the two-user symmetric case (P_1 = P_2, σ_1 = σ_2, D_1 = D_2). However, unlike in the single user case, here user 1 acts as interference for user 2 (and vice versa). Thus one should not expect this scheme to be optimal at high SNR. That this is indeed true was shown in [63]. It was also shown there that at high SNR, for P_1 ≠ P_2, it may indeed be better in AF to use less power than P_1, P_2. This can also be interpreted as using AF on U_1 − α_1 E[U_2 | U_1] and U_2 − α_2 E[U_1 | U_2] at the two encoders at high SNR, which reduces the correlation between the transmitted symbols.

6.2 Separation based scheme

Figure 4: Separation based scheme

In the separation based (SB) approach (Fig. 4) the jointly Gaussian sources are vector quantized to W_1^n and W_2^n. The quantized outputs are Slepian-Wolf encoded [69]. This produces codewords which are (asymptotically) independent. These independent codewords are encoded into capacity achieving Gaussian channel codes (X_1^n, X_2^n) with correlation ρ̃ = 0. This is a very natural scheme and has been considered by various authors ([21],[65],[23]). Since source-channel separation does not hold for this system, this scheme is not expected to be optimal.
But because this scheme decouples source coding from channel coding, it is preferable to a joint source-channel coding scheme with comparable performance.

6.3 Lapidoth-Tinguely scheme

In this scheme, obtained in [44], (U_1^n, U_2^n) are vector quantized to 2^{nR_1} and 2^{nR_2} vectors (Ũ_1^n, Ũ_2^n), where R_1 and R_2 will be specified below. Also, W_1^n, W_2^n are 2^{nR_1} and 2^{nR_2} codewords of length n obtained independently with distribution N(0, 1). For each ũ_i^n we pick the codeword w_i^n that is closest to it. This way we obtain Gaussian codewords W_1^n, W_2^n which retain the correlations of (U_1^n, U_2^n). X_1^n and X_2^n are obtained by scaling W_1^n, W_2^n to satisfy the transmit power constraints. We will call this the LT scheme. (U_1, U_2, W_1, W_2) are (approximately) jointly Gaussian with covariance matrix

    [ σ_1^2                    ρσ_1σ_2                  σ_1^2(1 − 2^{−2R_1})       ρσ_1σ_2(1 − 2^{−2R_2}) ]
    [ ρσ_1σ_2                  σ_2^2                    ρσ_1σ_2(1 − 2^{−2R_1})     σ_2^2(1 − 2^{−2R_2})   ]
    [ σ_1^2(1 − 2^{−2R_1})     ρσ_1σ_2(1 − 2^{−2R_1})   σ_1^2(1 − 2^{−2R_1})       ρ̃^2 σ_1σ_2 / ρ        ]
    [ ρσ_1σ_2(1 − 2^{−2R_2})   σ_2^2(1 − 2^{−2R_2})     ρ̃^2 σ_1σ_2 / ρ             σ_2^2(1 − 2^{−2R_2})  ].   (24)

In (24), ρ̃ = ρ √((1 − 2^{−2R_1})(1 − 2^{−2R_2})). We obtain (R_1, R_2) from (22). From I(U_1; W_1 | W_2) = H(W_1 | W_2) − H(W_1 | W_2, U_1) and the fact that the Markov chain W_1 ↔ U_1 ↔ U_2 ↔ W_2 holds, H(W_1 | W_2, U_1) = H(W_1 | U_1) and I(U_1; W_1 | W_2) = 0.5 log [ (1 − ρ̃^2) 2^{2R_1} ]. Thus from (22) we need R_1 and R_2 which satisfy

    R_1 ≤ 0.5 log [ P_1/σ_N^2 + 1/(1 − ρ̃^2) ],   (25)
    R_2 ≤ 0.5 log [ P_2/σ_N^2 + 1/(1 − ρ̃^2) ],   (26)
    R_1 + R_2 ≤ 0.5 log [ (σ_N^2 + P_1 + P_2 + 2ρ̃√(P_1 P_2)) / ((1 − ρ̃^2) σ_N^2) ].   (27)

The inequalities (25)-(27) are the same as in [44]. Thus we recover the conditions in [44] from our general result (1).
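For the symmetric case (P_1 = P_2 = P, σ_1 = σ_2 = σ_N = 1, S = P), the rate R = R_1 = R_2 meeting the sum-rate condition (27) with equality can be found by a simple fixed-point iteration, since ρ̃ = ρ(1 − 2^{−2R}) then depends on R. A sketch (the iteration is our own device, not from [44]):

```python
import math

def lt_symmetric_distortion(S, rho, n_iter=200):
    """Symmetric LT scheme: iterate R -> sum-rate bound (27) at equality,
    with rho_tilde = rho*(1 - 2^(-2R)); then evaluate the distortion (28)
    with sigma^2 = sigma_N^2 = 1."""
    R = 1.0
    for _ in range(n_iter):
        rt = rho * (1.0 - 2.0 ** (-2.0 * R))           # rho_tilde for R1 = R2 = R
        R = 0.25 * math.log2((1.0 + 2.0 * S * (1.0 + rt)) / (1.0 - rt * rt))
    rt = rho * (1.0 - 2.0 ** (-2.0 * R))
    q = 2.0 ** (-2.0 * R)
    D = q * (1.0 - rho**2 * (1.0 - q)) / (1.0 - rt * rt)   # distortion (28)
    return R, D

R_sym, D_sym = lt_symmetric_distortion(100.0, 0.75)
```

At S = 100 and ρ = 0.75 the resulting distortion is close to the high-SNR approximation σ^2 √((1 − ρ)/(2S)) derived in Sec. 6.4 below.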
Taking Û_i = E[U_i | W_1, W_2], i = 1, 2, we obtain the distortions

    D_1 = var(U_1 | W_1, W_2) = σ_1^2 2^{−2R_1} [ 1 − ρ^2 (1 − 2^{−2R_2}) ] / (1 − ρ̃^2),   (28)
    D_2 = var(U_2 | W_1, W_2) = σ_2^2 2^{−2R_2} [ 1 − ρ^2 (1 − 2^{−2R_1}) ] / (1 − ρ̃^2).   (29)

The minimum distortion is obtained when ρ̃ is such that the sum rate constraint (27) is met with equality. For the symmetric case, at the minimum distortion, R_1 = R_2.

6.4 Asymptotic performance of the three schemes

We compare the performance of the three schemes. These results are from [63]. For simplicity we consider the symmetric case: P_1 = P_2 = P, σ_1 = σ_2 = σ, D_1 = D_2 = D. We will denote the SNR P/σ_N^2 by S.

Consider the AF scheme. From (23),

    D(S) = σ^2 [ S(1 − ρ^2) + 1 ] / [ 2S(1 + ρ) + 1 ].   (30)

Thus D(S) decreases to σ^2(1 − ρ)/2 strictly monotonically, at rate O(1), as S → ∞. Also,

    lim_{S→0} [ D(S) − σ^2 ] / S = −σ^2 (1 + ρ)^2.   (31)

Thus D(S) → σ^2 at rate O(S) as S → 0.

Consider the SB scheme at high SNR. From [76], if each source is encoded at rate R then it can be decoded at the decoder with distortion

    D^2 = σ^4 [ 2^{−4R} (1 − ρ^2) + ρ^2 2^{−8R} ].   (32)

At high SNR, from the capacity result for independent inputs, we have R < 0.25 log S ([23]). Then from (32) we obtain

    D ≥ √( σ^4 (1 − ρ^2)/S + σ^4 ρ^2 / S^2 )   (33)

and this lower bound is achievable. As S → ∞, this lower bound approaches zero at rate O(1/√S). Thus this scheme outperforms AF at high SNR. At low SNR, R ≈ S/2 and hence from (32)

    D ≥ √( ρ^2 σ^4 2^{−4S} + σ^4 (1 − ρ^2) 2^{−2S} ).   (34)

Thus D → σ^2 as S → 0, at rate O(S^2) for high ρ and at rate O(S) for small ρ. Therefore we expect that at low SNR this scheme will be worse than AF at high ρ but comparable at low ρ.

Consider the LT scheme. In the high SNR region we assume ρ̃ = ρ, since R = R_1 = R_2 is sufficiently large. Then from (27), R ≈ 0.25 log [ 2S/(1 − ρ) ] and the distortion can be approximated by

    D ≈ σ^2 √( (1 − ρ)/(2S) ).   (35)

Therefore D → 0 as S → ∞ at rate O(1/√S), the same rate of convergence as for SB. However, the R.H.S. of (33) is greater than that of (35), and at low ρ the two are close. Thus at high SNR LT always outperforms SB, but the improvement is small for low ρ.

At low SNR,

    R ≈ S(1 + ρ̃)/2 − log(1 − ρ̃^2)/4,

and evaluating D from (28) we get

    D = σ^2 2^{−S̄} [ 1 − ρ^2 ( 1 − √(1 − ρ̃^2) 2^{−S̄} ) ] / √(1 − ρ̃^2),   (36)

where S̄ = S(1 + ρ̃). Therefore D → σ^2 as S → 0 at rate O(S^2) at high ρ and at rate O(S) at low ρ. These rates are the same as for SB. In fact, dividing the expression for D at low SNR for SB by that for LT, one can show that the two distortions tend to σ^2 at the same rate for all ρ.

The necessary conditions (NC) for transmission on the GMAC with distortions (D, D) in the symmetric case are ([44])

    D ≥ σ^2 [ S(1 − ρ^2) + 1 ] / [ 2S(1 + ρ) + 1 ],   for S ≤ ρ/(1 − ρ^2),
    D ≥ σ^2 √( (1 − ρ^2) / (2S(1 + ρ) + 1) ),          for S > ρ/(1 − ρ^2).   (37)

The above three schemes, along with (37), are compared below using exact computations. Figures 5 and 6 show the distortion as a function of SNR for unit variance jointly Gaussian sources with correlations ρ = 0.1 and ρ = 0.75.

Figure 5: SNR vs distortion performance, ρ = 0.1

Figure 6: SNR vs distortion performance, ρ = 0.75

From these plots we confirm the theoretical conclusions provided above.

6.5 Continuous sources over a GMAC

For general continuous alphabet sources (U_1, U_2) we vector quantize U_1^n, U_2^n into Ũ_1^n, Ũ_2^n.
Then, to obtain correlated Gaussian codewords (X_1^n, X_2^n), we can use two schemes adapted from the cases studied above: in the first we use the scheme developed in Sec. 5.1, and in the second we use the LT scheme explained in Sec. 6.3.

7 Correlated Sources over Orthogonal MAC

One standard way to use the MAC is via TDMA, FDMA, CDMA or Orthogonal Frequency Division Multiple Access (OFDMA) ([23, 60, 9]). These protocols, although suboptimal, are used due to practical considerations. They make a MAC a set of parallel orthogonal channels (for CDMA, this happens if we use orthogonal codes). We study transmission of correlated sources through such a system.

7.1 Transmission of correlated sources over orthogonal channels

Consider the setup in Fig. 1 when Y = (Y_1, Y_2) and p(y | x_1, x_2) = p(y_1, y_2 | x_1, x_2) = p(y_1 | x_1) p(y_2 | x_2). Then the conditions in (1) become

    I(U_1, Z_1; W_1 | W_2, Z) < I(X_1; Y_1 | W_2, Z) ≤ I(X_1; Y_1),   (38)
    I(U_2, Z_2; W_2 | W_1, Z) < I(X_2; Y_2 | W_1, Z) ≤ I(X_2; Y_2),   (39)
    I(U_1, U_2, Z_1, Z_2; W_1, W_2 | Z) < I(X_1, X_2; Y_1, Y_2 | Z) ≤ I(X_1; Y_1) + I(X_2; Y_2).   (40)

The outer bounds in (38)-(40) are attained if the channel codewords (X_1, X_2) are independent of each other. Also, the distributions of (X_1, X_2) maximizing these bounds do not depend on the distribution of (U_1, U_2). This implies that source-channel separation holds for this system even with side information (Z_1, Z_2, Z) (for the sufficient conditions (1)). Thus by choosing (X_1, X_2) which maximize the outer bounds in (38)-(40) we obtain the capacity region for this system, which is independent of the side conditions.
Also, for a GMAC this is obtained b y indep enden t Gaussian r.v.s X 1 and X 2 with distributions N (0 , P i ) , i = 1 , 2, where P i are the p o w er constraints. F ur- thermore, the L.H.S. of the in equalities are sim ultaneously minim ized when W 1 and W 2 are in d ep endent. Th us, the source co ding ( W 1 , W 2 ) on ( U 1 , Z 1 ) and ( U 2 , Z 2 ) can b e done as in Slepian-W olf co d ing (b y first v ector quan- tizing in case of contin uous v alued U 1 , U 2 ) but also taking into accoun t the fact that the side information Z is a v ail able at the deco der. In this section this co d ing sc heme will b e called SB. If w e take W 1 = U 1 and W 2 = U 2 and the side information ( Z 1 , Z 2 , Z ) ⊥ ( U 1 , U 2 ), we can reco v er the conditions in [9]. 7.2 Gaussian sources and orthogonal Gaussian cha nnels No w we consider the transmission of join tly Gaussian sour ces ov er orthogo- nal Gaussian c hannels. Initially it will also b e assu m ed that there is no s id e information ( Z 1 , Z 2 , Z ). No w ( U 1 , U 2 ) are zero mean join tly Gaussian random v ariables with v ari- ances σ 2 1 and σ 2 2 resp ectiv ely and correlation ρ . Then Y i = X i + N i , i = 1 , 2 where N i is Gaussian with zero mean and σ 2 N i v a riance. Also N 1 and N 2 are indep end en t of eac h other and also of ( U 1 , U 2 ). In th is scenario, the R.H.S. of the inequalities in (38)-(40) are maximized b y taking X i ∼ N (0 , P i ) , i = 1 , 2 ind ep endent of eac h other wher e P i is the a v erage transmit p ow er constraint on user i . Then I ( X i , Y i ) = 0 . 5 l og (1 + P i /σ 2 N i ) , i = 1 , 2. Based on the commen ts at th e end of sec. 7.1, for t wo u s ers, u sing th e results fr om [76] we obtain the n ecessary and su fficien t conditions f or trans- mission on an orthogonal GMA C with give n distortions D 1 and D 2 . W e can sp ecialize the ab o v e results to a TDMA, FDMA or CDMA based transmission sc heme. T he s p ecializatio n to T DMA is giv en h ere. 
Suppose source 1 uses the channel a fraction α of the time and user 2 the fraction 1 − α. In this case we can use average power P_1/α for the first user and P_2/(1 − α) for the second user whenever they transmit. The conditions (38)-(40) for the optimal scheme become

    I(U_1; W_1 | W_2) < 0.5 α log [ 1 + P_1/(α σ_{N_1}^2) ],   (41)
    I(U_2; W_2 | W_1) < 0.5 (1 − α) log [ 1 + P_2/((1 − α) σ_{N_2}^2) ],   (42)
    I(U_1, U_2; W_1, W_2) < 0.5 α log [ 1 + P_1/(α σ_{N_1}^2) ] + 0.5 (1 − α) log [ 1 + P_2/((1 − α) σ_{N_2}^2) ].   (43)

In the following we compare the performance of the AF scheme (explained in Sec. 6.1) with the SB scheme. Unlike in the GMAC, there is no interference between the two users when orthogonal channels are used. Therefore, in this case we expect AF to perform quite well. For AF, the minimum distortions (D_1, D_2) are

    D_1 = (σ_1 σ_{N_1})^2 [ P_2(1 − ρ^2) + σ_{N_2}^2 ] / [ P_1 P_2 (1 − ρ^2) + σ_{N_2}^2 P_1 + σ_{N_1}^2 P_2 + σ_{N_1}^2 σ_{N_2}^2 ],   (44)
    D_2 = (σ_2 σ_{N_2})^2 [ P_1(1 − ρ^2) + σ_{N_1}^2 ] / [ P_1 P_2 (1 − ρ^2) + σ_{N_2}^2 P_1 + σ_{N_1}^2 P_2 + σ_{N_1}^2 σ_{N_2}^2 ].   (45)

Thus, as P_1, P_2 → ∞, D_1 and D_2 tend to zero. We also see that D_1 and D_2 are minimized when the average powers used are P_1 and P_2. These conclusions are in contrast to the case of a GMAC, where the distortion for AF does not approach zero as P_1, P_2 → ∞ and the optimal powers needed may not be the maximum allowed averages P_1 and P_2 ([63]).

We compare the performance of AF with SB for the symmetric case where P_1 = P_2 = P, σ_1^2 = σ_2^2 = σ^2, D_1 = D_2 = D, σ_{N_1}^2 = σ_{N_2}^2 = σ_N^2. These results are from [61]. We denote the minimum distortions achieved by SB and AF by D(SB) and D(AF) respectively. σ^2 is taken to be unity without loss of generality. We denote P/σ_N^2 by S. Then

    D(SB) = √( (1 − ρ^2)/(1 + S)^2 + ρ^2/(1 + S)^4 ),
    D(AF) = [ S(1 − ρ^2) + 1 ] / [ 1 + 2S + S^2(1 − ρ^2) ].
(46)

We see from the above equations that when ρ = 0, D(SB) = D(AF) = 1/(1 + S). At high S, D(AF) ≈ 1/S and D(SB) ≈ √(1 − ρ^2)/S. Eventually both D(SB) and D(AF) tend to zero as S → ∞. When S → 0, both D(SB) and D(AF) go to σ^2. By squaring the equations in (46) we can show that D(AF) ≥ D(SB) for all S. But in [61] we have shown that D(AF) − D(SB) is small when S is small or large, or whenever ρ is small. D(AF) and D(SB) are plotted for ρ = 0.3 and ρ = 0.7, using exact computations, in Figs. 7 and 8.

Figure 7: SNR vs distortion performance, ρ = 0.3

Figure 8: SNR vs distortion performance, ρ = 0.7

The above results can be easily extended to the multiple source case. For SB, for the source coding part, the rate region for the multiple user case (under a symmetry assumption) is given in [75]. This can be combined with capacity achieving Gaussian channel codes over each independent channel to obtain the necessary and sufficient conditions for transmission. Let N be the number of sources, which are jointly Gaussian with zero mean and covariance matrix K_U. Let P be the symmetric power constraint. Let K_U have the same structure as given in [75]. Let C_{UY} = √P [1 ρ ... ρ] be a 1 × N vector. The minimum distortion achieved by the AF scheme is given as

    D(AF) = 1 − C_{UY} (P K_U + σ_N^2 I)^{−1} C_{UY}'.

7.3 Side information

Let us consider the case when side information Z_i is available at encoder i, i = 1, 2, and Z is available at the decoder. One use of the side information Z_i at the encoders is to increase the correlation between the sources. This can be done optimally (see [15]) by taking an appropriate linear combination of (U_i, Z_i) at encoder i. The following results are from [61]. We are not aware of any other result on the performance of joint source-channel schemes with side information.
We are currently working on obtaining similar results for the general MAC.

7.3.1 AF with side information

Side information at encoders only: A linear combination of the source outputs and side information, L_i = a_i U_i + b_i Z_i, i = 1, 2, is amplified and sent over the channel. We find the linear combinations which minimize the sum of distortions. For this we consider the following optimization problem:

    Minimize D(a_1, b_1, a_2, b_2) = E[(U_1 − Û_1)^2] + E[(U_2 − Û_2)^2]   (47)

subject to E[X_1^2] ≤ P_1, E[X_2^2] ≤ P_2, where Û_1 = E[U_1 | Y_1, Y_2] and Û_2 = E[U_2 | Y_1, Y_2].

Side information at decoder only: In this case the decoder side information Z is used in estimating (U_1, U_2) from (Y_1, Y_2). The optimal estimation rule is

    Û_1 = E[U_1 | Y_1, Y_2, Z],   Û_2 = E[U_2 | Y_1, Y_2, Z].   (48)

Side information at both encoder and decoder: Linear combinations of the sources are amplified as above and sent over the channel. To find the optimal linear combination, we solve an optimization problem similar to (47) with (Û_1, Û_2) as given in (48).

7.3.2 SB with side information

For a given (L_1, L_2) we use the source-channel coding scheme explained at the end of Sec. 7.1. The side information Z at the decoder reduces the source rate region. It is also used at the decoder in estimating (Û_1, Û_2). The linear combinations L_1 and L_2 are obtained which minimize (47) through this coding-decoding scheme.

7.3.3 Comparison of AF and SB with side information

We provide the comparison of AF with SB for U_1, U_2 ∼ N(0, 1). Also, we take side information with a specific structure which seems natural in this setup.
Let Z_1 = s_1 U_2 + V_1 and Z_2 = s_2 U_1 + V_2, where V_1, V_2 ∼ N(0, 1) are independent of each other and of the sources, and s_1 and s_2 are constants that can be interpreted as the side channel SNR. We also take Z = (Z_1, Z_2). We have compared AF and SB for different ρ and s_1, s_2 by explicitly computing the minimum (D_1 + D_2)/2 achievable. We take P_1 = P_2. For s_1 = s_2 = 0.5 and ρ = 0.4 we provide the results in Fig. 9. From the figure one sees that without side information the performance of AF and SB is very close across SNRs. The difference in their performance increases with side information for moderate values of SNR, because the effect of the side information is effectively to increase the correlation between the sources. Even in these cases, at low and high SNRs the performance of AF is close to that of SB. These observations are in conformity with our conclusions in the previous section.

Our other conclusions, based on computations not provided here, are the following. For the symmetric case, for SB, encoder-only side information reduces the distortion marginally. This happens because a distortion is incurred on (U_1, U_2) while making the linear combinations (L_1, L_2). For AF we actually see no improvement, and the optimal linear combination has b_1 = b_2 = 0. For decoder-only side information the performance is improved for both AF and SB, as the side information can be used to obtain better estimates of (U_1, U_2). Adding encoder side information further improves the performance only marginally for SB; the AF performance is not improved. In the asymmetric case some of these conclusions may not be valid.

8 MAC with feedback

In this section we consider a memoryless MAC with feedback. The channel output Y_{k−1} is available to the encoders at time k.
Gaarder and Wolf ([28]) showed that, unlike in the point to point case, feedback increases the capacity region of a discrete memoryless multiple-access channel. In [22] an achievable region is

    R_1 < I(X_1; Y | X_2, U),
    R_2 < I(X_2; Y | X_1, U),   (49)
    R_1 + R_2 < I(X_1, X_2; Y),

where p(u, x_1, x_2, y) = p(u) p(x_1 | u) p(x_2 | u) p(y | x_1, x_2).

Figure 9: AF and SB with both encoder and decoder side information

It was demonstrated in [78] that the same rate region is achievable if there is a feedback link to only one of the transmitters. This achievable region was improved in [16]. The achievable region for a MAC where each node receives possibly different channel feedback is derived in [17]. The feedback signal in their set-up is correlated with, but not identical to, the signal observed by the receiver. A simpler and larger rate region for the same set-up was obtained in [79]. Kramer ([43]) used the notion of 'directed information' to derive an expression for the capacity region of the MAC with feedback. However, no single letter expressions were obtained. If the users generate independent sequences, then the capacity region C_fb of the white Gaussian MAC is ([56])

    C_fb = ∪_{0 ≤ ρ ≤ 1} { (R_1, R_2) :
        R_1 ≤ 0.5 log [ 1 + P_1(1 − ρ^2)/σ_N^2 ],
        R_2 ≤ 0.5 log [ 1 + P_2(1 − ρ^2)/σ_N^2 ],   (50)
        R_1 + R_2 ≤ 0.5 log [ 1 + (P_1 + P_2 + 2ρ√(P_1 P_2))/σ_N^2 ] }.

The capacity region for a given ρ in (50) is the same as in (13)-(15) for a channel without feedback but with correlation ρ between the channel inputs (X_1, X_2). Thus the effect of feedback is to allow arbitrary correlation in (X_1, X_2). An achievable region for a GMAC with noisy feedback is provided in [46]. A Gaussian MAC with different feedback to different nodes is considered in [66]. An achievable region based on cooperation among the sources is also given.
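The gain from feedback in (50) can be illustrated numerically: for each ρ the pentagon allows a sum rate of min(sum of the individual bounds, sum bound), and the union takes the best ρ. A sketch (the grid search and example powers are our own choices):

```python
import math

def gmac_feedback_sum_rate(P1, P2, var_N, n_grid=10001):
    """Max sum rate over the union in (50): for each rho, R1 + R2 is
    limited by min(individual bounds summed, sum bound)."""
    best = 0.0
    for i in range(n_grid):
        rho = i / (n_grid - 1)
        r1 = 0.5 * math.log2(1 + P1 * (1 - rho**2) / var_N)
        r2 = 0.5 * math.log2(1 + P2 * (1 - rho**2) / var_N)
        rs = 0.5 * math.log2(1 + (P1 + P2 + 2 * rho * math.sqrt(P1 * P2)) / var_N)
        best = max(best, min(r1 + r2, rs))
    return best

# Example: P1 = P2 = 10, sigma_N^2 = 1.
no_fb = 0.5 * math.log2(1 + 20.0)              # sum capacity without feedback (rho = 0)
with_fb = gmac_feedback_sum_rate(10.0, 10.0, 1.0)
```

Here with_fb exceeds no_fb, reflecting the correlation that feedback allows between the channel inputs.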
Reference [80] obtains an achievable region when non-causal state information is available at both encoders. The authors also provide the capacity region for a Gaussian MAC with additive interference and feedback. It is found that feedback of the output enhances the capacity of the MAC with state. Interference, when causally known at the transmitters, can be exactly cancelled and hence has no impact on the capacity region of a two-user MAC. Thus the capacity region is the same as given in (50). In [50] it is shown that feedback does not increase the capacity of the Gelfand-Pinsker channel ([35]) and that feedforward does not improve the achievable rate-distortion performance in the Wyner-Ziv system ([82]).

A MAC with feedback and correlated sources (MACFCS) is studied in [53, 52]. This has a MAC with correlated sources and a MAC with feedback as special cases. A Gaussian MACFCS with a total average power constraint is considered in [52]. Different achievable rate regions and a capacity outer bound are given for the MACFCS in [53]. For the first achievable region a decode-and-forward based strategy is used, where the sources first exchange their data and then cooperate to send the full information to the destination. For two other achievable regions, Slepian-Wolf coding is performed first to remove the correlations among the source data, followed by coding for the MAC with feedback, or for the MAC disregarding the feedback. The authors also show that different coding strategies perform better under different source correlation structures. The transmission of bivariate Gaussian sources over a Gaussian MAC with feedback is analyzed in [45].
The authors show that for the symmetric case, for SNR below a threshold determined by the source correlation, feedback is useless and minimum distortion is achieved by uncoded transmission.

9 MAC with fading

A Gaussian MAC with a finite number of fading states is considered. We provide results when there are M independent sources. The channel state information (CSI) may be available at the receiver and/or the transmitters. Consider the channel ([14])

  Y_k = Σ_{l=1}^{M} h_{lk} X_{lk} + N_k    (51)

where X_{lk} is the channel input and h_{lk} is the fading value at time k for user l. The fading processes {h_{lk}, k ≥ 1} of all users are jointly stationary and ergodic, and the stationary distribution has a continuous bounded density. The fading processes of the different users are independent. {N_k} is the additive white Gaussian noise. All the users are power constrained to P, i.e., E[X_l²] ≤ P for all l. Since source-channel separation holds, we provide the capacity region of this channel.

9.1 CSI at receiver only

When the channel fading process {h_{lk}} is available at the receiver only, the achievable rate region is the set of rates (R_1, R_2, ..., R_M) satisfying

  Σ_{l∈S} R_l ≤ E[ log( 1 + Σ_{l∈S} ν_l P / σ² ) ]    (52)

for all subsets S of {1, 2, ..., M}, where ν_l = |h_l|² and σ² = E[N²]. The expectation is over the fading powers {ν_l}, l ∈ S. One performance measure is the normalized sum rate per user

  R = (1/M) Σ_{l=1}^{M} R_l = (1/M) E[ log( 1 + M P ((1/M) Σ_{l=1}^{M} ν_l) / σ² ) ]
    ≤ (1/M) log( 1 + M P ((1/M) Σ_{l=1}^{M} E[ν_l]) / σ² ),    (53)

where the inequality follows from Jensen's inequality, since log is concave. If E[ν_l] = 1 for each l, then the upper bound equals the capacity of the AWGN channel, (1/M) log[1 + MP/σ²]. Also, as M increases, if the {ν_l} are iid, then by the Law of Large Numbers (LLN) R will be close to this upper bound. Thus averaging over many users mitigates the effect of fading.
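The averaging effect in (53) is easy to see in a small Monte Carlo experiment. This is an illustrative sketch of ours, not from the report: we assume unit-mean exponential fading powers (Rayleigh amplitudes) and base-2 logarithms.

```python
import math
import random

random.seed(1)

def normalized_sum_rate(M, P=1.0, sigma2=1.0, trials=5_000):
    """Monte Carlo estimate of the per-user rate in (53) for M users
    with iid unit-mean exponential fading powers."""
    total = 0.0
    for _ in range(trials):
        mean_nu = sum(random.expovariate(1.0) for _ in range(M)) / M
        total += math.log2(1.0 + M * P * mean_nu / sigma2)
    return total / (trials * M)

def awgn_upper_bound(M, P=1.0, sigma2=1.0):
    """The Jensen upper bound (1/M) log(1 + MP/sigma^2) when E[nu] = 1."""
    return math.log2(1.0 + M * P / sigma2) / M

# The gap to the AWGN bound shrinks as the number of users M grows.
for M in (1, 4, 16, 64):
    print(M, normalized_sum_rate(M), awgn_upper_bound(M))
```

As M grows, the empirical average of the fading powers concentrates around 1, so the per-user rate approaches the AWGN bound, which is the LLN argument in the text.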
This is in contrast to time/frequency/space averaging. The capacity achieving distribution is iid Gaussian for each user, and the code for one user is independent of the code for another user (in other words, AF is optimal in this case).

9.2 CSI at both Transmitter and Receiver

The additional element introduced when CSI is provided to the transmitters, in addition to the receiver, is dynamic power control, which can be done in response to the changing channel state. Given a joint fading power ν = (ν_1, ..., ν_M), P_i(ν) denotes the transmit power allocated to user i. Let P̄_i be the average power constraint for user i. For a given power control policy P,

  C_f(P) = { R : R(S) ≤ E[ 0.5 log( 1 + Σ_{i∈S} ν_i P_i(ν) / σ² ) ] for all S ⊂ {1, 2, ..., M} }    (54)

denotes the achievable rate region. The capacity region is

  C(P̄) = ∪_{P∈F} C_f(P)    (55)

where F is the set of feasible power control policies,

  F ≡ { P : E[P_i(ν)] ≤ P̄_i, for i = 1, ..., M }.    (56)

Since the capacity region is convex, the above characterization implies that time sharing is not required.

The explicit characterization of the capacity region exploiting its polymatroid structure is given in [70]. For P̄_i = P for each i and each h_i having the same distribution, the optimal power control is such that only the user with the best channel transmits at a time. The instantaneous power assigned to the l-th user, observing the realization of the fading powers ν_1, ν_2, ..., ν_M, is

  P_l(ν_j, j = 1, ..., M) = 1/λ − 1/ν_l,  if ν_l > λ and ν_l > ν_j for all j ≠ l,
                          = 0,            otherwise,    (57)

where λ is chosen such that the average power constraint is satisfied. This is the well-known water-filling function ([36]) optimal for a single user. This strategy does not depend on the fading statistics except through the constant λ.
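The constant λ in (57) has no closed form in general; since the average spent power is decreasing in λ, it can be found by bisection on an empirical sample of fading powers. A minimal sketch, assuming M = 4 users with unit-mean exponential fading powers and P = 1 (our assumptions, not from the report):

```python
import random

random.seed(2)

# Draw the fading powers once so the bisection sees a fixed empirical sample.
M, N = 4, 20_000
SAMPLES = [[random.expovariate(1.0) for _ in range(M)] for _ in range(N)]

def avg_power(lam):
    """Average power spent by user 0 under policy (57): transmit
    1/lam - 1/nu only when nu exceeds lam AND is the best among all users."""
    total = 0.0
    for nu in SAMPLES:
        if nu[0] > lam and nu[0] == max(nu):
            total += 1.0 / lam - 1.0 / nu[0]
    return total / N

def solve_lambda(P, lo=1e-4, hi=50.0, iters=60):
    """Bisect for lam meeting the average power constraint E[P_l] = P;
    avg_power is continuous and decreasing in lam."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if avg_power(mid) > P:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = solve_lambda(P=1.0)
print(lam, avg_power(lam))  # avg_power(lam) should be close to 1.0
```

By symmetry the same λ serves all users; with unequal average powers one would bisect per user after normalizing by the Lagrange coefficients, as noted below.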
The capacity achieving distribution is Gaussian (thus AF for each user in its assigned slot is optimal). Unlike in the single user case, optimal power control may yield substantial gain in capacity. This happens because if M is large, with high probability at least one of the iid fading powers will be large, providing a good channel for the respective user at that time instant. The optimal strategy is also valid for unequal average powers; the only change is that the fading values are normalized by the Lagrange coefficients [41]. The extension of this strategy to frequency selective channels is given in [42]. An explicit characterization of the ergodic capacity region and a simple encoding-decoding scheme for a fading GMAC with common data are given in [48]. Optimum power allocation schemes are also provided.

10 Thoughts for practitioners

Practical schemes for distributed source coding, channel coding and joint source-channel coding for a MAC are of interest. The achievability proofs assume infinite length codewords and ignore delay and complexity, which makes them of limited interest in practical scenarios.

Reference [5] reviews a panorama of practical joint source-channel coding methods for single user systems. The techniques given are hierarchical protection, channel optimized vector quantizers (COVQ), self organizing hypercube (SOH), modulation organized vector quantizer and hierarchical modulation.

For lossless distributed source coding, Slepian-Wolf (S-W) ([69]) provide the rate region. The underlying idea for the construction of practical codes for this system is to exploit the duality between source and channel coding. The approach is to partition the space of all possible source outcomes into disjoint bins that are cosets of a good linear channel code.
Such constructions lead to constructive and non-asymptotic schemes. Wyner was the first to suggest such a scheme in [81]. Inspired by Wyner's scheme, Turbo/LDPC based practical code design is given in [4] for correlated binary sources. The correlation between the sources was modelled by a 'virtual' binary symmetric channel (BSC) with crossover probability p. The performance of this scheme is very close to the Slepian-Wolf limit H(p). S-W code designs using powerful turbo and LDPC codes for other correlation models and more than two sources are given in [18]. LDPC based codes were also proposed in [19], where a general iterative S-W decoding algorithm that incorporates the graphical structure of all the encoders and operates in a 'Turbo-like' fashion is proposed. Reference [51] proposes LDPC codes for the binary S-W coding problem with Maximum Likelihood (ML) decoding. This gives an upper bound on the performance with iterative decoding. They also show that a linear code for S-W source coding can be used to construct a channel code for a MAC with correlated additive white noise.

In Distributed Source Coding Using Syndromes (DISCUS) ([59]), Trellis Coded Modulation (TCM), Hamming codes and Reed-Solomon (RS) codes are used for S-W coding. For the Gaussian version of DISCUS, the source is first quantized and then discrete DISCUS is used at both encoder and decoder.

Source coding with a fidelity criterion, subject to the availability of side information, is addressed in [82]. First the source is quantized to the extent allowed by the fidelity requirement. Then S-W coding is used to remove the information already available at the decoder through the side information. Since S-W coding is based on channel codes, Wyner-Ziv coding can be interpreted as a source-channel coding problem.
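The coset-binning idea can be made concrete with a (7,4) Hamming code: the encoder sends only the 3-bit syndrome of its 7-bit word, and the decoder, given correlated side information differing in at most one bit (a toy stand-in for the 'virtual' BSC), recovers the word exactly. This is an illustrative sketch of syndrome binning in the spirit of Wyner's construction, not the actual code designs of [4] or [59]:

```python
# Columns of H are the numbers 1..7 in binary, so the syndrome of a
# weight-1 error pattern directly names the flipped position.
H = [[(j >> i) & 1 for j in range(1, 8)] for i in range(3)]

def syndrome(word):
    """3-bit syndrome H * word over GF(2) for a 7-bit word."""
    return tuple(sum(h * w for h, w in zip(row, word)) % 2 for row in H)

def sw_encode(x):
    """S-W encoder: send only the 3-bit syndrome (the bin index)."""
    return syndrome(x)

def sw_decode(s, y):
    """Decoder: the syndrome of e = x XOR y is s XOR syndrome(y); if e has
    weight <= 1, that syndrome is the (1-based) index of the flipped bit."""
    es = tuple(a ^ b for a, b in zip(s, syndrome(y)))
    pos = es[0] + 2 * es[1] + 4 * es[2]      # column index of H, 0 if e = 0
    x_hat = list(y)
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]                    # differs from x in one position
print(sw_decode(sw_encode(x), y) == x)       # True
```

Seven source bits are thus conveyed with three transmitted bits, which is exactly the bin-index compression that the S-W region promises when the side information is available at the decoder.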
The coding incurs a quantization loss due to source coding and a binning loss due to channel coding. To achieve the Wyner-Ziv limit, powerful codes need to be employed for both source coding and channel coding. It was shown in [88] that nested lattice codes can achieve the Wyner-Ziv limit asymptotically, for large dimensions. A practical nested lattice code implementation is provided in [67]. For the BSC correlation model, linear binary block codes are used for lossy Wyner-Ziv coding in [88, 68].

Lattice codes and trellis based codes ([26]) have been used for both source and channel coding for correlated Gaussian sources. A nested lattice construction based on similar sublattices for high correlation is proposed in [67]. Another approach to practical code construction is based on Slepian-Wolf coded nested quantization (SWC-NQ), which is nested quantization followed by binning. Asymptotic performance bounds of SWC-NQ are established in [49]. A combination of a scalar quantizer and a powerful S-W code is also used for nested Wyner-Ziv coding. Wyner-Ziv coding based on TCQ and LDPC codes is provided in [85]. A comparison of different approaches for both Wyner-Ziv coding and classical source coding is provided in [84]. Low density generator matrix (LDGM) codes are proposed for joint source-channel coding of correlated sources in [89]. Practical code constructions for the special case of the CEO problem are provided in [57, 86].

11 Directions for future research

In this report we have provided sufficient conditions for transmission of correlated sources over a MAC with specified distortions. It is of interest to find a single letter characterization of the necessary conditions and to establish the tightness of the sufficient conditions.
It is also of interest to extend the above results and coding schemes to sources correlated in time and to a MAC with memory. The error exponents are also of interest. Most of the achievability results in this report use random codes, which are inefficient because of large codeword lengths. It is desirable to obtain power efficient practical codes for side-information aware compression that perform very close to the optimal scheme. For the fading channels, fairness of the rates provided to different users, the delay experienced by the messages of different users and channel tracking are issues worth pondering. It is also desirable to find the performance of these schemes in terms of scaling behaviour in a network scenario. The combination of joint source-channel coding and network coding is also a new area of research. Another emerging area is the use of joint source-channel codes in MIMO systems and cooperative communication.

12 Conclusions

In this report, sufficient conditions are provided for transmission of correlated sources over a multiple access channel. Various previous results on this problem are obtained as special cases. Suitable examples are given to emphasize the superiority of joint source-channel coding schemes. Important special cases, correlated discrete sources over a GMAC and Gaussian sources over a GMAC, are discussed in more detail. In particular, a new joint source-channel coding scheme is presented for discrete sources over a GMAC. The performance of specific joint source-channel coding schemes for Gaussian sources is also compared. Practical schemes like TDMA, FDMA and CDMA are brought into this framework. We also consider a MAC with feedback and a fading MAC. Various practical schemes motivated by joint source-channel coding are also presented.
A Proof of Theorem 1

The coding scheme involves distributed quantization (W_1, W_2) of the sources and the side information (U_1, Z_1), (U_2, Z_2), followed by a correlation preserving mapping to the channel codewords. The decoding approach involves first decoding (W_1, W_2) and then obtaining the estimate (Û_1, Û_2) as a function of (W_1, W_2) and the decoder side information Z. We use the following lemmas in the proof.

Lemma 6 (Markov Lemma): Suppose X ↔ Y ↔ Z. If for a given (x^n, y^n) ∈ T_ε^n(X, Y), Z^n is drawn according to Π_{i=1}^n p(z_i | y_i), then with high probability (x^n, y^n, Z^n) ∈ T_ε^n(X, Y, Z) for n sufficiently large.

Lemma 7 (Extended Markov Lemma): Suppose W_1 ↔ U_1 Z_1 ↔ U_2 W_2 Z_2 Z and W_2 ↔ U_2 Z_2 ↔ U_1 W_1 Z_1 Z. If for a given (u_1^n, u_2^n, z_1^n, z_2^n, z^n) ∈ T_ε^n(U_1, U_2, Z_1, Z_2, Z), W_1^n and W_2^n are drawn respectively according to Π_{i=1}^n p(w_{1i} | u_{1i}, z_{1i}) and Π_{i=1}^n p(w_{2i} | u_{2i}, z_{2i}), then with high probability (u_1^n, u_2^n, z_1^n, z_2^n, z^n, W_1^n, W_2^n) ∈ T_ε^n(U_1, U_2, Z_1, Z_2, Z, W_1, W_2) for n sufficiently large.

The proofs of these lemmas are available in [10] and [73] respectively. We show the achievability of all points in the rate region (1).

Proof: Fix p(w_1 | u_1, z_1), p(w_2 | u_2, z_2), p(x_1 | w_1), p(x_2 | w_2) as well as f_D^n(.) satisfying the distortion constraints.

Codebook Generation: Let R'_i = I(U_i, Z_i; W_i) + δ, i ∈ {1, 2}, for some δ > 0. Generate 2^{nR'_i} codewords of length n, sampled iid from the marginal distribution p(w_i), i ∈ {1, 2}. For each w_i^n independently generate a sequence X_i^n according to Π_{j=1}^n p(x_{ij} | w_{ij}), i ∈ {1, 2}. Call these sequences x_i(w_i^n), i ∈ {1, 2}. Reveal the codebooks to the encoders and the decoder.
Encoding: For i ∈ {1, 2}, given the source sequences U_i^n and Z_i^n, the i-th encoder looks for a codeword W_i^n such that (U_i^n, Z_i^n, W_i^n) ∈ T_ε^n(U_i, Z_i, W_i) and then transmits X_i(W_i^n), where T_ε^n(.) is the set of ε-weakly typical sequences ([23]) of length n.

Decoding: Upon receiving Y^n, the decoder finds the unique pair (W_1^n, W_2^n) such that (W_1^n, W_2^n, x_1(W_1^n), x_2(W_2^n), Y^n, Z^n) ∈ T_ε^n. If it fails to find such a unique pair, the decoder declares an error and incurs a maximum distortion of d_max.

In the following we show that the probability of error for this encoding-decoding scheme tends to zero as n → ∞. An error can occur because of the following four events E1-E4. We show that P(Ei) → 0 for i = 1, 2, 3, 4.

E1: The encoders do not find the codewords. However, from rate distortion theory ([23], page 356), lim_{n→∞} P(E1) = 0 if R'_i > I(U_i, Z_i; W_i), i ∈ {1, 2}.

E2: The codewords are not jointly typical with Z^n. The probability of this event goes to zero by the Extended Markov Lemma (Lemma 7).

E3: There exists another codeword ŵ_1^n such that (ŵ_1^n, W_2^n, x_1(ŵ_1^n), x_2(W_2^n), Y^n, Z^n) ∈ T_ε^n. Define α ≜ (ŵ_1^n, W_2^n, x_1(ŵ_1^n), x_2(W_2^n), Y^n, Z^n). Then

  P(E3) = Pr{ there is ŵ_1^n ≠ w_1^n : α ∈ T_ε^n }
        ≤ Σ_{ŵ_1^n ≠ w_1^n : (ŵ_1^n, w_2^n, z^n) ∈ T_ε^n} Pr{ α ∈ T_ε^n }.    (58)

The probability term inside the summation in (58) is

  ≤ Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} Pr{ x_1(ŵ_1^n), x_2(w_2^n), y^n | ŵ_1^n, w_2^n, z^n }
  = Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} Pr{ x_1(ŵ_1^n) | ŵ_1^n } Pr{ x_2(w_2^n), y^n | w_2^n, z^n }
  ≤ Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} 2^{−n{H(X_1|W_1) + H(X_2,Y|W_2,Z) − 4ε}}
  ≤ 2^{nH(X_1,X_2,Y|W_1,W_2,Z)} 2^{−n{H(X_1|W_1) + H(X_2,Y|W_2,Z) − 4ε}}.

But from the hypothesis we have

  H(X_1, X_2, Y | W_1, W_2, Z) − H(X_1|W_1) − H(X_2, Y|W_2, Z)
  = H(X_1|W_1) + H(X_2|W_2) + H(Y|X_1, X_2) − H(X_1|W_1) − H(X_2, Y|W_2, Z)
  = H(Y|X_1, X_2) − H(Y|X_2, W_2, Z)
  = H(Y|X_1, X_2, W_2, Z) − H(Y|X_2, W_2, Z)
  = −I(X_1; Y | X_2, W_2, Z).

Hence,

  Pr{ (ŵ_1^n, W_2^n, x_1(ŵ_1^n), x_2(W_2^n), Y^n, Z^n) ∈ T_ε^n } ≤ 2^{−n{I(X_1;Y|X_2,W_2,Z) − 6ε}}.    (59)

Then from (58),

  P(E3) ≤ Σ_{ŵ_1^n ≠ w_1^n : (ŵ_1^n, w_2^n, z^n) ∈ T_ε^n} 2^{−n{I(X_1;Y|X_2,W_2,Z) − 6ε}}
        = |{ ŵ_1^n : (ŵ_1^n, w_2^n, z^n) ∈ T_ε^n }| 2^{−n{I(X_1;Y|X_2,W_2,Z) − 6ε}}
        ≤ |{ ŵ_1^n }| Pr{ (ŵ_1^n, w_2^n, z^n) ∈ T_ε^n } 2^{−n{I(X_1;Y|X_2,W_2,Z) − 6ε}}
        ≤ 2^{n{I(U_1,Z_1;W_1)+δ}} 2^{−n{I(W_1;W_2,Z) − 3ε}} 2^{−n{I(X_1;Y|X_2,W_2,Z) − 6ε}}    (60)
        = 2^{n I(U_1,Z_1;W_1|W_2,Z)} 2^{−n{I(X_1;Y|X_2,W_2,Z) − 9ε − δ}}.

The RHS of the above inequality tends to zero if I(U_1, Z_1; W_1 | W_2, Z) < I(X_1; Y | X_2, W_2, Z). In (60) we have used the fact that

  I(U_1, Z_1; W_1) − I(W_1; W_2, Z) = H(W_1|W_2, Z) − H(W_1|U_1, Z_1)
  = H(W_1|W_2, Z) − H(W_1|U_1, Z_1, W_2, Z) = I(U_1, Z_1; W_1 | W_2, Z).

Similarly, by the symmetry of the problem, we require I(U_2, Z_2; W_2 | W_1, Z) < I(X_2; Y | X_1, W_1, Z).

E4: There exist other codewords ŵ_1^n and ŵ_2^n such that α ≜ (ŵ_1^n, ŵ_2^n, x_1(ŵ_1^n), x_2(ŵ_2^n), y^n, z^n) ∈ T_ε^n. Then

  P(E4) = Pr{ there is (ŵ_1^n, ŵ_2^n) ≠ (w_1^n, w_2^n) : α ∈ T_ε^n }
        ≤ Σ_{(ŵ_1^n, ŵ_2^n) ≠ (w_1^n, w_2^n) : (ŵ_1^n, ŵ_2^n, z^n) ∈ T_ε^n} Pr{ α ∈ T_ε^n }.    (61)

The probability term inside the summation in (61) is

  ≤ Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} Pr{ x_1(ŵ_1^n), x_2(ŵ_2^n), y^n | ŵ_1^n, ŵ_2^n, z^n }
  ≤ Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} Pr{ x_1(ŵ_1^n) | ŵ_1^n } Pr{ x_2(ŵ_2^n) | ŵ_2^n } Pr{ y^n | z^n }
  ≤ Σ_{(x_1(.), x_2(.), y^n) : α ∈ T_ε^n} 2^{−n{H(X_1|W_1) + H(X_2|W_2) + H(Y|Z) − 5ε}}
  ≤ 2^{nH(X_1,X_2,Y|W_1,W_2,Z)} 2^{−n{H(X_1|W_1) + H(X_2|W_2) + H(Y|Z) − 7ε}}.

But from the hypothesis we have

  H(X_1, X_2, Y | W_1, W_2, Z) − H(X_1|W_1) − H(X_2|W_2) − H(Y|Z)
  = H(Y|X_1, X_2) − H(Y|Z)
  = H(Y|X_1, X_2, Z) − H(Y|Z)
  = −I(X_1, X_2; Y | Z).

Hence,

  Pr{ (ŵ_1^n, ŵ_2^n, x_1(ŵ_1^n), x_2(ŵ_2^n), y^n, z^n) ∈ T_ε^n } ≤ 2^{−n{I(X_1,X_2;Y|Z) − 7ε}}.    (62)

Then from (61),

  P(E4) ≤ Σ_{(ŵ_1^n, ŵ_2^n) ≠ (w_1^n, w_2^n) : (ŵ_1^n, ŵ_2^n, z^n) ∈ T_ε^n} 2^{−n{I(X_1,X_2;Y|Z) − 7ε}}
        = |{ (ŵ_1^n, ŵ_2^n) : (ŵ_1^n, ŵ_2^n, z^n) ∈ T_ε^n }| 2^{−n{I(X_1,X_2;Y|Z) − 7ε}}
        ≤ |{ ŵ_1^n }| |{ ŵ_2^n }| Pr{ (ŵ_1^n, ŵ_2^n, z^n) ∈ T_ε^n } 2^{−n{I(X_1,X_2;Y|Z) − 7ε}}
        ≤ 2^{n{I(U_1,Z_1;W_1)+I(U_2,Z_2;W_2)+2δ}} 2^{−n{I(W_1;W_2,Z)+I(W_2;W_1,Z)−I(W_1;W_2|Z) − 4ε}} 2^{−n{I(X_1,X_2;Y|Z) − 7ε}}
        = 2^{n I(U_1,U_2,Z_1,Z_2;W_1,W_2|Z)} 2^{−n{I(X_1,X_2;Y|Z) − 11ε − 2δ}}.

The RHS of the above inequality tends to zero if I(U_1, U_2, Z_1, Z_2; W_1, W_2 | Z) < I(X_1, X_2; Y | Z).

Thus as n → ∞, with probability tending to 1, the decoder finds the correct sequence (W_1^n, W_2^n), which is jointly weakly ε-typical with (U_1^n, U_2^n, Z^n). The fact that (W_1^n, W_2^n) are weakly ε-typical with (U_1^n, U_2^n, Z^n) does not guarantee that f_D^n(W_1^n, W_2^n, Z^n) will satisfy the distortions D_1, D_2.
For this, one needs that (W_1^n, W_2^n) are distortion-ε-weakly typical ([23]) with (U_1^n, U_2^n, Z^n). Let T_{D,ε}^n denote the set of distortion typical sequences ([23]). Then by the strong law of large numbers, P(T_{D,ε}^n | T_ε^n) → 1 as n → ∞. Thus the distortion constraints are also satisfied by the (W_1^n, W_2^n) obtained above with a probability tending to 1 as n → ∞. Therefore, if the distortion measure d is bounded, lim_{n→∞} E[d(U_i^n, Û_i^n)] ≤ D_i + ε, i = 1, 2.

If there exist u_i* such that E[d_i(U_i, u_i*)] < ∞, i = 1, 2, then the result extends to unbounded distortion measures as follows. Whenever the decoded (W_1^n, W_2^n) are not in the distortion typical set, we estimate (Û_1^n, Û_2^n) as (u_1*^n, u_2*^n). Then for i = 1, 2,

  E[d_i(U_i^n, Û_i^n)] ≤ D_i + ε + E[ d(U_i^n, u_i*^n) 1{(T_{D,ε}^n)^c} ].    (63)

Since E[d(U_i^n, u_i*^n)] < ∞ and P[(T_{D,ε}^n)^c] → 0 as n → ∞, the last term of (63) goes to zero as n → ∞.

References

[1] R. Ahlswede. Multiway communication channels. Proc. Second Int. Symp. Inform. Transmission, Armenia, USSR, Hungarian Press, 1971.
[2] R. Ahlswede and T. Han. On source coding with side information via a multiple access channel and related problems in information theory. IEEE Trans. Inform. Theory, 29(3):396-411, May 1983.
[3] I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci. A survey on sensor networks. IEEE Communications Magazine, pages 1-13, Aug. 2002.
[4] A. Liveris, Z. Xiong, and C. Georghiades. Compression of binary sources with side information at the decoder using LDPC codes. IEEE Commn. Lett., 6(10):440-442, 2002.
[5] S. B. Z. Azami, P. Duhamel, and O. Rioul. Combined source-channel coding: Panorama of methods. CNES Workshop on Data Compression, Toulouse, France, Nov 1996.
[6] S. J. Baek, G. Veciana, and X. Su. Minimizing energy consumption in large-scale sensor networks through distributed data compression and hierarchical aggregation. IEEE JSAC, 22(6):1130-1140, Aug. 2004.
[7] R. J. Barron, B. Chen, and G. W. Wornell. The duality between information embedding and source coding with side information and some applications. IEEE Trans. Inform. Theory, 49(5):1159-1180, May 2003.
[8] J. Barros and S. D. Servetto. Reachback capacity with non-interfering nodes. Proc. ISIT, pages 356-361, 2003.
[9] J. Barros and S. D. Servetto. Network information flow with correlated sources. IEEE Trans. Inform. Theory, 52(1):155-170, Jan 2006.
[10] T. Berger. Multiterminal source coding. Lecture notes presented at the 1977 CISM summer school, Udine, Italy, July 1977.
[11] T. Berger. Multiterminal source coding. In Information Theory Approach to Communication, Ed. G. Longo. Springer-Verlag, N.Y., 1977.
[12] T. Berger and R. W. Yeung. Multiterminal source coding with one distortion criterion. IEEE Trans. Inform. Theory, 35(2):228-236, March 1989.
[13] T. Berger, Z. Zhang, and H. Viswanathan. The CEO problem. IEEE Trans. Inform. Theory, 42(3):887-902, May 1996.
[14] E. Biglieri, J. Proakis, and S. Shamai. Fading channels: Information theoretic and communication aspects. IEEE Trans. Inform. Theory, 44(6):2619-2692, Oct. 1998.
[15] L. Breiman and H. Friedman. Estimating optimal transformations for multiple regression and correlation. Journal of the American Statistical Association, 80(391):580-598, 1983.
[16] S. Bross and A. Lapidoth. An improved achievable region for the discrete memoryless two-user multiple-access channel with noiseless feedback. IEEE Trans. Inform. Theory, 51(3):811-833, March 2005.
[17] A. B. Carleial. Multiple access channels with different generalized feedback signals. IEEE Trans. Inform. Theory, IT-28(6):841-850, Nov 1982.
[18] C .Lan, K. N. A.Liv eris, Z.Xiong, and C . Georghiades. Slepain-W olf co ding of m ultiple m -ary sources using LDPC co des. P r o c. DCC’04, Snowbir d, UT , p age 549, 2004. [19] T . P . C oleman, A. H. Lee, M. Medard, and M. Effros. L o w-complexit y approac hes to slepian-w olf near-lossless distr ib uted data compression. IEEE T r ans. Inform. The o ry , 52(8):3 546–356 1, Aug. 2006. [20] T . M. Co ve r. A p ro of of the data compr ession theorem of Slepian and W olf for ergodic sources. IEEE T r ans. Inform. The ory , 21(2):22 6–228, Marc h 1975. [21] T . M. Co v er, A. E. Gamal, and M. Salehi. Multiple access c hannels with arbitrarily correlated sour ces. IEEE T r a ns. Inform. The ory , 26(6):6 48– 657, No v. 1980. [22] T . M. Co v er and C. S. K. Leung. An ac hiev ab le rate region f or the m ultiple-access c hannel with feedb ac k. IEEE T r ans. Inform. The ory , 27(3): 292–298 , Ma y 1981. [23] T . M. Co v er and J. A. Th omas. Elements of Information the o ry . Wiley Series in T elecomm unicatio n, N.Y., 2004. [24] S . C. Drap er and G. W. W ornell. S ide information a w are co din g state- gies for sensor net wo rks. IEEE Journal on Sele cte d Ar e as i n Comm. , 22:1–1 1, Au g 2004 . [25] G. Duec k. A n ote on th e multiple access c hannel with correlated sources. IEEE T r ans. Inform. The o ry , IT -27(2):232 –235, 1981. [26] M. V. Ey u b oglu and G. D. F orney . Lattice and trellis quantiz ation with latica and trellis-b ounded co deb o oks-high rate theory for memoryless sources. IEEE T r ans. Inform. The ory , 39(1):46 –59, 199 3. [27] M. Fleming and M. Effros. On r ate distortio n with mixed typ es of side inform ation. IEEE T r ans. Inform. The ory , 52(4):1698 –1705, Apr il 2006. [28] N. T. Gaarder and J. K. W olf. The capacit y region of a m ultiple access discrete memoryless c hannel can increase with feedback. IEEE T r a ns. Inform. The ory , IT-21, 1975. [29] H. E. Gamal. 
On scaling la ws of dense wireless sensor net w orks: the data gathering c hann el. IE EE T r a ns. Inform. The ory , 51(3 ):1229–1 234, Marc h 2005. [30] M. Gastpar. Multiple access c hannels und er receiv ed-p o we r constraints. Pr o c. IEEE Inform. The ory Workshop , pages 452–457, 2004 . [31] M. Gastpar. Wyner-zi v p roblem with multiple sour ces. IEE E T r ans. Inform. The ory , 50(11):2 762–276 8, Nov. 2004. [32] M. Gastpar. Wyner-zi v p roblem with multiple sour ces. IEE E T r ans. Inform. The ory , 50(11):2 762–276 8, Nov. 2004. [33] M. Gastpar and M. V etterli. Source-c hannel communicatio n in sensor net w orks. Pr o c. IPSN’03 , p ages 162–17 7, 2003. [34] M. Gastpar and M. V etterli. P o w er spatio-t emp oral bandwidth and distortion in large sensor net w orks. IEEE JSA C , 23(4):7 45–754, 2005. [35] S . Gel’fand and M. Pinsker. C o ding for c hannels w ith r andom param- eters. Pr obl. Contr ol and Inform. The ory , 9(1):19– 31, 1980. [36] A. J. Goldsmith and P . P . V araiya . Capacit y of fading c hannels with c hannel side information. IEEE T r ans. Inform. The ory , 43(6):1986– 1992, Nov 1997. [37] D. Gunduz and E. Erkip. T ran s mission of correlated sources o v er m ul- tiuser c hannels with receiv er side information. UCSD IT A Workshop, San Die go, CA , Jan 2007. [38] P . Ishw ar, R. Puri, K. Ramchandran, and S. S . Pradhan. On rate constrained distributed estimation in u n reliable sensor net wo rks. IEEE JSAC , p ages 765– 775, 2005. [39] J . Jaco d and P . Protter. Pr ob ability Essentials . Spr inger, N.Y., 2004. [40] W. Kang an d S. Uluku s. An outer b oun d for mac with correlated sources. Pr o c. 40 th annual c onfer enc e on Informatio n Scienc es and Systems , p ages 240–244 , Marc h 2006. [41] R . Kn opp and P . A. Humblet . I nformation capacit y and p ow er con trol in single-cell multiuser comm unication. Pr o c. Int. Conf. on Communi- c a tion, ICC’95, Se attle,W A , pages 331–335, June 1995. [42] R . Knopp and P . A. Humblet. 
Multiple-acce ssing o v er frequen cy- selectiv e fadin g c hannels c hannels. 6 th IEE E Int. Symp. on Per- sonal Indo or and Mobile R adio Communic at ion, PIMRC’95, T or o nto, Canada , p ages 1326 –1331, sept 1995. [43] G. Kramer. Capacit y results for d iscrete memoryless netw ork. IEEE T r a ns. Inform. The o ry , 49:4–2 1, Jan. 2003. [44] A. Lapidoth and S. Tinguely . Send ing a bi- v a riate Gaussian source o v er a Gaussian MAC. IEE E ISIT 06 , 2006. [45] A. Lapidoth and S . Tinguely . Sending a b iv aria te Gaussian source o ve r a Gaussian MAC with feedb ac k. IEEE ISIT, N ic e, F r anc e , Ju n e 2007 . [46] A. Lapidoth and M. A. Wigger. On Gaussian MA C with imp erfect feedbac k. 24 th IE EE c onvention of Ele ctric al and Ele ctr onics Engine ers in Isr ael (IEEEI 06), Eilat , Nov 2006. [47] H. Liao. Multiple access c hannels. P h.D dissertion, Dept. Ele c . E ngg., Univ of Hawaii, Honolulu , 1972 . [48] N. Liu and S. Ulukus. C apacit y region and optim um p o w er control stategie s for fading Gaussian multi ple access c hannels with common data. IEEE T r ans. Inform. The ory , 54(10):18 15–1826, O ct. 2006. [49] Z . Liu , S . Cheng, A. Live ris, and Z. Xiong. Slepian-W olf co ded nested quan tization (sw c-nq) f or Wyner-Ziv co ding: Pe rforman ce analysis and co de d esign. IE EE T r ans. Inform. The ory , 52:4358–4 379, Oct 2006. [50] N. Merh a v and T. W eissman. Co d ing for th e feedbac k Gel’fand-Pinsk er c hannel and feed f orward Wyn er-Ziv source. IEEE T r an s. Inform. The- ory , 52:4207–4 211, 200 6. [51] J . Muramastu, T. Uyemat su, and T. W ada y ama. Low densit y parit y c hec k matrices for codin g of correlate d sources. IEEE T r ans. Inform. The ory , 51(10):3 645–3654, Oct 2005. [52] A. D. Murugan, P . K. Gopala, and H. El-Gamal. Correlate d sources o v er wireless channels: Co op erativ e source-c hannel co ding. IEEE Jour- nal. on Sel. Ar e as Commun. , 22(6):988–9 98, Aug 2004. [53] L . On g and M. Motani. 
Co ding stategies for m ultiple-acc ess c hannels with feedbac k and co rrelated sources. IE EE T r ans. Inform. The o ry , 53(10 ):3476–3 497, O ct 2007. [54] Y. Oohama. Gauss ian multiterminal source co d ing. IEEE T r ans. In- form. The ory , 43(6):19 12–1923 , No v. 1997. [55] Y. Oohama. The rate distortion fun ction for qu adratic Gaussian C EO problem. IEEE T r ans. Inform. The ory , 44(3):1 057–107 0, May 1998. [56] L . H. Ozaro w. The capacit y of the w hite Gaussian m ultiple access c hannel with feedbac k. IEE E T r ans. Inform. The o ry , 30( 4):623 – 629, 1984. [57] S . Pradhan and K . Ramc handran. Distributed source co ding: Symmet- ric rates and application to sen s or net w orks. Pr o c. DCC’00, Snowbir d , UT , pages 302–311, 2000. [58] S . S. Pradh an, J. Chou, and K. Ramac handran. Dualit y b et w een source co ding and c hannel co din g and its extension to the side information case. IEEE T r ans. Inform. The ory , 49(5):11 81–1203, Ma y 2003. [59] S . S. Pradhan and K. Ramc handr an. Distributed source cod ing u sing syndromes DISC US : Design and construction. IEEE T r ans. Inform. The ory , 49(3):62 6 – 643, Marc h 2003. [60] J . G. Proakis. Digital c ommunic ation . McGra w-Hill In ternational edi- tion, 2001. [61] R . Ra jesh and V. Sharma. Correlated Gaussian sources o v er orthogonal Gaussian channels. submitte d . [62] R . Ra jesh and V. Sharma. A join t s ou r ce-c hannel cod ing scheme for transmission of correlated discrete sour ces o v er a Gaussian multiple access channel. sub mitte d . [63] R . Ra jesh and V. Sh arma. Sour ce c hannel co ding for Gaussian sources o v er a Gaussian multiple access c hann el. Pr o c. 45 A l lerton c onfer enc e on c om puting c ontr ol and c ommunic ation, Montic el lo, IL , 2007 . [64] R . Ra jesh, V. K . V arsheney a, and V. Sharma. Distribu ted join t source- c hannel co ding on a multiple access c hannel with side inf ormation. Sub- mitte d . [65] S . Ra y , M. Medard, M. Effros, and R. Kotter. 
On separation for multiple access channels. Proc. IEEE Inform. Theory Workshop, 2006.
[66] A. Sendonaris, E. Erkip, and B. Aazhang. User cooperation diversity - Part I. IEEE Trans. on Commun., 51(11):1927–1938, Nov. 2003.
[67] S. Servetto. Lattice quantization with side information: Codes, asymptotics and applications in sensor networks. IEEE Trans. Inform. Theory, 53(2):714–731, Feb. 2007.
[68] S. Shamai, S. Verdu, and R. Zamir. Systematic lossy source/channel coding. IEEE Trans. Inform. Theory, 44(2):564–578, March 1998.
[69] D. Slepian and J. K. Wolf. Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory, 19(4):471–480, Jul. 1973.
[70] D. Tse and S. V. Hanly. Multiaccess fading channels - Part I: Polymatroid structure, optimal resource allocation and throughput capacities. IEEE Trans. Inform. Theory, 44(7):2796–2815, Nov. 1998.
[71] V. K. Varsheneya and V. Sharma. Distributed coding for multiple access communication with side information. Proc. IEEE Wireless Communication and Networking Conference (WCNC), April 2006.
[72] V. K. Varsheneya and V. Sharma. Lossy distributed source coding with side information. Proc. National Conference on Communication (NCC), New Delhi, Jan. 2006.
[73] V. K. Varsheneya. Distributed coding for wireless sensor networks. ME thesis, ECE Dept., IISc, Nov. 2005.
[74] V. V. Veeravalli. Decentralized quickest change detection. IEEE Trans. Inform. Theory, 47(4):1657–1665, May 2001.
[75] A. B. Wagner, S. Tavildar, and P. Viswanath. The rate region of the quadratic Gaussian two-terminal source coding problem. Preprint.
[76] A. B. Wagner, S. Tavildar, and P. Viswanath. The rate region of the quadratic Gaussian two-terminal source coding problem. Arxiv (a shorter version is also available in ISIT 2006), 2005.
[77] N. Wernersson, J. Karlsson, and M. Skoglund.
Distributed scalar quantisers for Gaussian channels. ISIT, Nice, France, June 2007.
[78] F. M. J. Willems and E. van der Meulen. Partial feedback for the discrete memoryless multiple access channel. IEEE Trans. Inform. Theory, 29(2):287–290, March 1983.
[79] F. M. J. Willems, E. van der Meulen, and J. P. M. Schalkwijk. Achievable rate region for the multiple access channel with generalized feedback. Proc. Allerton Conference, Monticello, IL, 1983.
[80] W. Wu, S. Vishwanath, and A. Arapostathis. On the capacity of multiple access channels with side information and feedback. Proc. International Symposium on Information Theory, July 2006.
[81] A. Wyner. Recent results in Shannon theory. IEEE Trans. Inform. Theory, 20(1):2–10, 1974.
[82] A. Wyner and J. Ziv. The rate distortion function for source coding with side information at the decoder. IEEE Trans. Inform. Theory, IT-22:1–11, Jan. 1976.
[83] J. J. Xiao and Z. Q. Luo. Multiterminal source channel communication over an orthogonal multiple access channel. IEEE Trans. Inform. Theory, 53(9):3255–3264, Sept. 2007.
[84] Z. Xiong, A. Liveris, and S. Cheng. Distributed source coding for sensor networks. IEEE Signal Processing Magazine, pages 80–94, Sept. 2004.
[85] Y. Yang, S. Cheng, Z. Xiong, and W. Zhao. Wyner-Ziv coding based on TCQ and LDPC codes. Proc. Asilomar Conf. Signals, Systems and Computers, Pacific Grove, CA, pages 825–829, 2003.
[86] Y. Yang, V. Stankovic, Z. Xiong, and W. Zhao. Distributed source coding: Symmetric rates and application to sensor networks. Proc. DCC'04, Snowbird, UT, page 572, 2004.
[87] R. Zamir and T. Berger. Multiterminal source coding with high resolution. IEEE Trans. Inform. Theory, 45(1):106–117, Jan. 1999.
[88] R. Zamir, S. Shamai, and U. Erez. Nested linear/lattice codes for structured multiterminal binning. IEEE Trans. Inform.
Theory, 48(6):1250–1276, June 2002.
[89] W. Zhong and J. Garcia-Frias. LDGM codes for channel coding and joint source-channel coding of correlated sources. EURASIP Journal on Applied Signal Processing, pages 942–953, 2005.