<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://en.zaoniao.it/index.php?action=history&amp;feed=atom&amp;title=Slepian%E2%80%93Wolf_coding</id>
	<title>Slepian–Wolf coding - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://en.zaoniao.it/index.php?action=history&amp;feed=atom&amp;title=Slepian%E2%80%93Wolf_coding"/>
	<link rel="alternate" type="text/html" href="http://en.zaoniao.it/index.php?title=Slepian%E2%80%93Wolf_coding&amp;action=history"/>
	<updated>2026-05-15T13:58:59Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.32.0</generator>
	<entry>
		<id>http://en.zaoniao.it/index.php?title=Slepian%E2%80%93Wolf_coding&amp;diff=6895&amp;oldid=prev</id>
		<title>Admin: Created page with &quot;__NOTOC__ In information theory and communication, the '''Slepian–Wolf coding''', also known as the '''Slepian–Wolf bound''', is a result in distributed source c...&quot;</title>
		<link rel="alternate" type="text/html" href="http://en.zaoniao.it/index.php?title=Slepian%E2%80%93Wolf_coding&amp;diff=6895&amp;oldid=prev"/>
		<updated>2019-07-16T09:16:10Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;__NOTOC__ In &lt;a href=&quot;/index.php?title=Information_theory&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;Information theory (page does not exist)&quot;&gt;information theory&lt;/a&gt; and &lt;a href=&quot;/index.php?title=Communication&amp;amp;action=edit&amp;amp;redlink=1&quot; class=&quot;new&quot; title=&quot;Communication (page does not exist)&quot;&gt;communication&lt;/a&gt;, the &amp;#039;&amp;#039;&amp;#039;Slepian–Wolf coding&amp;#039;&amp;#039;&amp;#039;, also known as the &amp;#039;&amp;#039;&amp;#039;Slepian–Wolf bound&amp;#039;&amp;#039;&amp;#039;, is a result in distributed source c...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;__NOTOC__&lt;br /&gt;
In [[information theory]] and [[communication]], '''Slepian–Wolf coding''', also known as the '''Slepian–Wolf bound''', is a result in [[distributed source coding]] discovered by [[David Slepian]] and [[Jack Wolf]] in 1973. It is a method for theoretically [[Line code|coding]] two [[Lossless compression|losslessly compressed]] correlated [[Communication source|source]]s.&lt;br /&gt;
&lt;br /&gt;
Distributed coding is the coding of two (in this case) or more dependent sources with separate [[encoder]]s and a joint [[Codec|decoder]]. Given two statistically dependent i.i.d. finite-alphabet random [[sequence]]s X and Y, the Slepian–Wolf theorem gives a theoretical bound on the lossless coding rates for distributed coding of the two sources, as shown below:&lt;br /&gt;
: &amp;lt;math&amp;gt;R_X\geq H(X|Y), \,&amp;lt;/math&amp;gt;&lt;br /&gt;
: &amp;lt;math&amp;gt;R_Y\geq H(Y|X), \, &amp;lt;/math&amp;gt;&lt;br /&gt;
: &amp;lt;math&amp;gt;R_X+R_Y\geq H(X,Y). \, &amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If both the encoders and the decoders of the two sources operate independently, the lowest rates achievable for lossless compression are &amp;lt;math&amp;gt;H(X)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;H(Y)&amp;lt;/math&amp;gt; for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; respectively, where &amp;lt;math&amp;gt;H(X)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;H(Y)&amp;lt;/math&amp;gt; are the entropies of &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt;. However, with joint decoding, if a vanishing error probability for long sequences is accepted, the Slepian–Wolf theorem shows that a much better compression rate can be achieved. As long as the total rate of &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; is at least their joint entropy &amp;lt;math&amp;gt;H(X,Y)&amp;lt;/math&amp;gt; and neither source is encoded at a rate below its conditional [[entropy]] given the other source, distributed coding can achieve an arbitrarily small [[Probability of error|error probability]] for long sequences.&lt;br /&gt;
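The rate region above can be checked numerically. The following sketch uses a hypothetical binary example (not from the original text): X is a fair bit and Y equals X flipped with probability 0.1; the function and variable names are illustrative assumptions.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint pmf: X is a fair bit, Y is X flipped with probability 0.1
p = 0.1
joint = {(0, 0): 0.5 * (1 - p), (0, 1): 0.5 * p,
         (1, 0): 0.5 * p,       (1, 1): 0.5 * (1 - p)}

H_XY = entropy(joint.values())
H_X = entropy([joint[(0, 0)] + joint[(0, 1)], joint[(1, 0)] + joint[(1, 1)]])
H_Y = entropy([joint[(0, 0)] + joint[(1, 0)], joint[(0, 1)] + joint[(1, 1)]])

# Chain rule gives the conditional entropies that bound each individual rate
H_X_given_Y = H_XY - H_Y   # lower bound on R_X
H_Y_given_X = H_XY - H_X   # lower bound on R_Y

def in_slepian_wolf_region(R_X, R_Y):
    """Check whether a rate pair satisfies the three Slepian-Wolf bounds."""
    return (R_X >= H_X_given_Y and R_Y >= H_Y_given_X
            and R_X + R_Y >= H_XY)

# Joint decoding admits the corner point (H(X|Y), H(Y)), which separate
# decoding (requiring R_X >= H(X), R_Y >= H(Y)) cannot reach.
print(in_slepian_wolf_region(H_X_given_Y, H_Y))         # True
print(in_slepian_wolf_region(H_X_given_Y - 0.01, H_Y))  # False
```

Here H(X) = H(Y) = 1 bit but H(X|Y) ≈ 0.469 bits, so joint decoding saves roughly half a bit per symbol on X.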
&lt;br /&gt;
A special case of distributed coding is compression with decoder side information, where source &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; is available at the decoder but not accessible at the encoder. This can be treated as the case in which the rate &amp;lt;math&amp;gt;R_Y=H(Y)&amp;lt;/math&amp;gt; has already been used to encode &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt;, while only &amp;lt;math&amp;gt;R_X=H(X|Y)&amp;lt;/math&amp;gt; is used to encode &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;. In other words, two isolated sources can compress data as efficiently as if they were communicating with each other. The whole system operates in an asymmetric way (the compression rates for the two sources are asymmetric).&lt;br /&gt;
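That this asymmetric corner point attains the joint optimum follows directly from the chain rule for entropy, as a one-line check:

```latex
R_X + R_Y \;=\; H(X\mid Y) + H(Y) \;=\; H(X,Y)
```

so the sum-rate bound is met with equality while both individual-rate bounds are satisfied.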
&lt;br /&gt;
This bound has been extended to the case of more than two correlated sources by [[Thomas M. Cover]] in 1975, and similar results were obtained in 1976 by [[Aaron D. Wyner]] and [[Jacob Ziv]] with regard to lossy coding of joint Gaussian sources.&lt;br /&gt;
&lt;br /&gt;
==Source==&lt;br /&gt;
&lt;br /&gt;
[http://wikipedia.org/ http://wikipedia.org/]&lt;br /&gt;
&lt;br /&gt;
[[Category:Error-correcting codes]]&lt;br /&gt;
[[Category:Error-detecting codes]]&lt;br /&gt;
&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
</feed>