LP Decoding Achieves Capacity
Jon Feldman and Cliff Stein, Columbia University
Thank you: Alexander Barg, David Karger, Ralf Koetter, Tal Malkin, Rocco Servedio, Pascal Vontobel, Martin Wainwright, Gilles Zémor
Codes, Noisy Channels, Achieving Capacity
- Encoder → noisy channel → decoder (?)
- Noise: probabilistic (e.g., Gaussian)
- Word Error Rate (WER) = Pr_noise[decoder output ≠ transmitted codeword]
- Channel capacity C: the highest rate r such that some code family and decoder achieve WER → 0
- Shannon: characterized capacity for many channels
- Achieving capacity: for every rate r < C, a code family of rate r and a decoder with WER → 0 as the block length grows
J Feldman, C Stein, LP Decoding Achieves Capacity
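As a concrete instance of the capacity notion above, consider the binary symmetric channel (a standard example chosen here for illustration; it is not worked out on the slide): BSC(p) flips each bit independently with probability p and has capacity C = 1 − H(p), where H is the binary entropy function.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# Rate-1/2 codes can, in principle, tolerate roughly an 11% flip rate:
print(bsc_capacity(0.11))  # about 0.5
```

A completely noisy channel (p = 1/2) has capacity 0, and a noiseless one (p = 0) has capacity 1, matching the intuition behind the definition.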
Achieving Capacity with Poly-time Decoders
- Forney ('66): ML/bounded-distance decoder; concatenated codes (optimally decoded inner codes, Reed-Solomon outer code)
- Barg/Zémor ('02): ML/message-passing decoder; expander codes [SS 96][BZ 01-04][GI 01-04]
- This paper: Linear Programming (LP) decoder, on the same expander codes as Barg/Zémor
- New feature, the Maximum-Likelihood (ML) certificate property: if a codeword is output, it maximizes Pr[correct]
Application to Practical Codes
- Turbo [BGT 93] and low-density parity-check (LDPC) [Gal 63] codes: practical construction/encoding at moderate lengths; perform well (experimentally) under message-passing decoding
- Most successful theory: density evolution [RU, LMSS, RSU, BRU, CFDRU, ..., '99 to present]; non-constructive, assumes local tree structure
- LP decoding bounds: no tree assumption; bounds relevant at finite lengths; performance of message-passing is closely related [FKW 02, Fel 02, KV 02, KV 04a, KV 04b]
Maximum-Likelihood (ML) Decoding
- Memoryless channel: each bit is affected by noise independently (example: Gaussian noise)
- Cost function γ_i: the log-likelihood ratio of bit i, γ_i = log( Pr[y_i | x_i = 0] / Pr[y_i | x_i = 1] )
- If γ_i > 0, then x_i = 0 is more likely; if γ_i < 0, then x_i = 1 is more likely
- ML DECODING: given the corrupt codeword y, find a codeword x such that Σ_i γ_i x_i is minimized
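The LLR cost and the brute-force ML rule above can be written out directly. The BSC instantiation and the tiny repetition code below are illustrative choices of mine, not examples from the slide:

```python
import math

def bsc_llr(y_i: int, p: float) -> float:
    """gamma_i = log(Pr[y_i | x_i = 0] / Pr[y_i | x_i = 1]) on a BSC(p)."""
    return math.log((1 - p) / p) if y_i == 0 else math.log(p / (1 - p))

def ml_decode(codewords, gamma):
    """Brute-force ML decoding: return the codeword minimizing sum_i gamma_i * x_i."""
    return min(codewords, key=lambda x: sum(g * b for g, b in zip(gamma, x)))

# Length-3 repetition code, received word 101 over a BSC with p = 0.1:
code = [(0, 0, 0), (1, 1, 1)]
gamma = [bsc_llr(y, 0.1) for y in (1, 0, 1)]
print(ml_decode(code, gamma))  # (1, 1, 1): the majority wins
```

The enumeration over all codewords is of course exponential in general; the whole point of the LP approach on the next slides is to avoid it.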
LP Decoding [FK 02, Fel 03]
- One LP variable f_i per code bit; relax the convex hull of the codewords to a polytope containing it, whose fractional vertices are "pseudocodewords"
- Algorithm: solve the LP min Σ_i γ_i f_i over the polytope; if the optimum is integral it is a codeword, so output it; otherwise output "error"
- ML certificate property: an integral LP optimum also minimizes the cost over the convex hull, so any codeword output is the ML codeword
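A deliberately crude stand-in for the polytope already exhibits the ML certificate property. The sketch below relaxes all the way to the unit hypercube [0,1]^n (far looser than the relaxation used in the paper, and chosen only because its LP optimum has a closed form): the box optimum is the bitwise hard decision, and since the box contains the convex hull of the code, a hard decision that happens to be a codeword is certified ML.

```python
def lp_decode_box(gamma, is_codeword):
    """Toy LP decoder over the box relaxation [0,1]^n (a superset of any
    codeword polytope).  min sum_i gamma_i * f_i over the box is attained
    at the integral point with f_i = 1 iff gamma_i < 0.  If that point is
    a codeword, it also minimizes cost over the convex hull of the code,
    so it is certified ML; otherwise report a decoding failure."""
    f = tuple(1 if g < 0 else 0 for g in gamma)
    return f if is_codeword(f) else None

even_parity = lambda x: sum(x) % 2 == 0  # the single parity-check code

print(lp_decode_box([1.0, -2.0, -3.0], even_parity))  # (0, 1, 1), certified ML
print(lp_decode_box([1.0, 2.0, -3.0], even_parity))   # None: box optimum is not a codeword
```

The paper's polytope is much tighter than the box, so its LP optimum is integral far more often, but the certificate logic is identical.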
LP Decoding Success Conditions
(Figure: the polytope with the transmitted codeword, other codewords, and a pseudocodeword vertex, under four directions of the noisy objective function.)
- (a) No noise: the transmitted codeword is optimal
- (b) Both ML and LP decoding succeed
- (c) ML succeeds, LP fails: a pseudocodeword beats the transmitted codeword
- (d) Both fail: some other codeword is optimal
Using a Dual Witness to Prove Success [FMSSW 04]
- Assume the all-zeros codeword is transmitted (by polytope symmetry); assume the LP optimum is unique (no problem)
- LP success ⟺ the all-zeros point is the LP optimum ⟺ there is a dual feasible point with value 0
- Take the LP dual and set the dual objective to 0: LP success ⟺ this "dual witness" polytope is non-empty
- Buys analytical slack: with no noise the witness polytope is large; noise shrinks it
- Trade off strength of the result against ease of analysis
- Used to prove results for LDPC codes under an adversarial channel [FMSSW 04]
Tanner Graph Codes
- Bipartite Tanner graph G: variable nodes and check nodes
- A subcode C_j for each check node j
- Overall codeword: a setting of bits at the variable nodes such that, for every check j, the bits in its neighborhood form a codeword of C_j
- LDPC codes: special case with constant degree, each C_j a single parity-check code
- Example: G is (3,6)-regular, each C_j the parity-check code on 6 bits
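The codeword condition can be checked mechanically. The data layout below (each check as a neighborhood/subcode pair) is an implementation choice of mine, not the paper's notation:

```python
import itertools

def is_tanner_codeword(bits, checks):
    """bits:   tuple of 0/1 values, one per variable node.
    checks: list of (neighborhood, subcode) pairs, where neighborhood is a
            tuple of variable indices and subcode is a set of local codewords.
    A word is in the Tanner code iff its restriction to every check's
    neighborhood is a codeword of that check's subcode."""
    return all(tuple(bits[i] for i in nbhd) in subcode for nbhd, subcode in checks)

# LDPC special case: every subcode is a single parity check (even-weight words).
even = {w for w in itertools.product((0, 1), repeat=3) if sum(w) % 2 == 0}
checks = [((0, 1, 2), even), ((1, 2, 3), even)]
print(is_tanner_codeword((1, 1, 0, 1), checks))  # True
print(is_tanner_codeword((1, 0, 0, 1), checks))  # False
```

Swapping richer subcodes in for `even` gives exactly the generalization from LDPC codes to the expander-based Tanner codes used in the paper.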
LP Relaxation for Tanner Codes [FWK 03]
- LP: minimize Σ_i γ_i f_i subject to: for every check node j, the restriction of f to the neighborhood of j lies in ch(C_j), the convex hull of the local codewords of C_j
LP Relaxation for Tanner Codes
- The same LP written explicitly: minimize Σ_i γ_i f_i subject to, for every check j, auxiliary variables α_{j,w} ≥ 0 (one per local codeword w of C_j) with Σ_w α_{j,w} = 1 and f_i = Σ_{w : w_i = 1} α_{j,w} for each bit i in the neighborhood of j
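The explicit constraints can be sanity-checked on a tiny instance. The names (`f`, `alpha`) and the dictionary layout are mine; the fractional point in the example is the kind of non-integral vertex (pseudocodeword) the relaxation admits:

```python
import itertools

def lp_feasible(f, checks, alpha, tol=1e-9):
    """Check the explicit constraints of the relaxation: for each check j,
    alpha[j] is a distribution over the local codewords of C_j, and the bit
    values f must equal the induced marginals f_i = sum_{w: w_i=1} alpha[j][w]."""
    for j, (nbhd, subcode) in enumerate(checks):
        a = alpha[j]
        if any(a.get(w, 0.0) < -tol for w in subcode):
            return False
        if abs(sum(a.values()) - 1.0) > tol:
            return False
        for pos, i in enumerate(nbhd):
            marginal = sum(a.get(w, 0.0) for w in subcode if w[pos] == 1)
            if abs(f[i] - marginal) > tol:
                return False
    return True

# Single parity check on bits (0, 1, 2): local codewords of even weight.
even = [w for w in itertools.product((0, 1), repeat=3) if sum(w) % 2 == 0]
checks = [((0, 1, 2), even)]
# Fractional point (0.5, 1.0, 0.5): an equal mix of (1,1,0) and (0,1,1).
alpha = [{(1, 1, 0): 0.5, (0, 1, 1): 0.5}]
print(lp_feasible((0.5, 1.0, 0.5), checks, alpha))  # True
```

Note that (0.5, 1.0, 0.5) satisfies every local constraint yet is not a codeword, illustrating why the relaxation can fail even when ML decoding would succeed.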
Dual Polytope: Tanner Codes
- Polytope for general Tanner codes: edge weights τ_{i,j} on the Tanner-graph edges
- For all code bits i (left nodes): the incident edge weights sum to at most γ_i
- For all checks j and local codewords w of C_j: the edge weights on the support of w sum to at least 0
Dual Polytope: Tanner Codes
- No noise in the binary symmetric channel: every γ_i = +1
(Figure: Tanner graph with all LLRs +1; the edge weights can be chosen so that every bit and check constraint holds with slack.)
Dual Polytope: Tanner Codes
- A bit of noise: some LLRs flip to γ_i = -1
(Figure: a flipped bit forces negative weight onto its edges, which the neighboring checks must absorb while keeping their local-codeword sums nonnegative.)
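The two pictures above amount to a feasibility check that can be sketched in code. The exact inequalities (signs, strict vs. non-strict) are reconstructed guesses from the slides' description, not taken verbatim from the paper:

```python
def dual_feasible(tau, gamma, checks):
    """Sketch of the dual-witness feasibility check:
      * each bit i: its incident edge weights sum to at most gamma[i]
      * each check j, each nonzero local codeword w of C_j: the weights
        on the support of w sum to at least 0
    tau maps edges (bit index, check index) to weights."""
    for i in range(len(gamma)):
        if sum(t for (vi, _), t in tau.items() if vi == i) > gamma[i]:
            return False
    for j, (nbhd, subcode) in enumerate(checks):
        for w in subcode:
            if any(w) and sum(tau[(nbhd[p], j)] for p in range(len(nbhd)) if w[p]) < 0:
                return False
    return True

# No noise on a BSC (all LLRs +1): the all-zero weighting is trivially feasible.
even = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
checks = [((0, 1, 2), even)]
tau = {(0, 0): 0.0, (1, 0): 0.0, (2, 0): 0.0}
print(dual_feasible(tau, [1, 1, 1], checks))   # True
# One flipped bit (gamma_0 = -1) breaks feasibility for this weighting.
print(dual_feasible(tau, [-1, 1, 1], checks))  # False
```

Finding a weighting that stays feasible despite many flipped bits is exactly what the expansion argument on the next slides accomplishes.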
Using Expansion to Set Edge Weights
- Code built from a Ramanujan (expander) graph [SS 96, BZ 02]
- Top nodes: strong codes; bottom nodes: high-rate codes on the Gilbert-Varshamov (GV) bound
- Initial weighting: one uniform value on all top edges, another on all bottom edges
Using Expansion to Set Edge Weights
(Figure: the error region and its neighborhood N(err) in the Tanner graph.)
- Error region: the checks with violated dual constraints
- Use expansion to spread the excess weight out among neighbors (details interesting but omitted)
- Maintain feasibility of the check constraints throughout
Using Expansion to Set Edge Weights
(Figure: the error region and its neighborhood N(err), after reweighting.)
Thm: For any memoryless symmetric LLR-bounded channel with capacity C, and any rate r < C, there exists a code family of rate r for which the word error rate of LP decoding tends to zero exponentially in the block length.
Future Work: LP Decoder Applications/Extensions
- Turbo codes: a natural flow-like LP [FK 02][Fel 03]
- RA(1/2) codes: WER bounds known [FK 02][EH 03]; can we get stronger WER bounds, or handle other rates (e.g., rate-1/3 RA)?
- Tighter relaxations (lift and project)?
- ML decoding using IP (branch and bound)?
- Deeper connections to message-passing?
- More efficient algorithms to solve the LP?
Major Open Questions
- Achieve capacity with fully polynomial decoding time?
- All previous capacity results use explicit ML decoding of small component codes; this result needs only a representation of the convex hull ch(·) of small component codes
- Achieve near-capacity rates for LDPC codes without the tree assumption
- "Small cycles are bad": fact or fiction?
- Expansion: too limited?