Funding: Supported by the National Key Research and Development Program of China under Grant 2021YFA1000500.
Abstract: This paper addresses the question: how many guesses are needed for maximum-likelihood (ML) performance by guessing decoding of short binary linear block codes, namely guessing random additive noise decoding (GRAND) and guessing codeword decoding (GCD)? We show that the required guesswork depends only weakly on the code structure: codes with the same length n and dimension k and comparable ML performance need a similar number of guesses. For random codes, the GRAND guesswork is on the order of ε_RCU·2^(n−k), where ε_RCU is the random-coding union (RCU) bound, tightening the trivial order 2^(n−k). We derive a universal upper bound on the number of (partial) test error patterns (TEPs), which is then evaluated accurately with a saddlepoint approximation. Analysis and simulations show that, in terms of guesswork, GCD never exceeds and often improves on GRAND, notably at low rates. To further reduce the guesswork, we introduce an ordered-statistics decoder with local constraints (LC-OSD), derived by extending the most reliable basis and incorporating an early stopping rule. To reduce the decoding latency, we propose a parallel LC-OSD that utilizes pre-stored TEPs. Simulations of binary images of Reed-Solomon codes, which approach the RCU bound, confirm our analysis and demonstrate the superiority of soft-weight ordering over Hamming-weight ordering.
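To make the guess counting concrete, the following is a minimal Python sketch of GRAND with soft-weight ordering of test error patterns: the hard-decision word is perturbed by error patterns enumerated in nondecreasing sum of the flipped bits' reliabilities, and decoding stops at the first pattern that satisfies the parity checks; the number of patterns tried is the guesswork. The function name grand_soft, the LLR sign convention, the generic heap-based pattern enumerator, and the max_queries cap are illustrative assumptions and not the paper's specific construction.

```python
import heapq
import numpy as np

def grand_soft(llr, H, max_queries=10**6):
    """Soft-weight-ordered GRAND sketch (assumption: LLR > 0 means bit 0).

    llr : length-n array of channel log-likelihood ratios
    H   : (n-k) x n binary parity-check matrix (0/1 entries)
    Returns (codeword, guesses) on success, or (None, guesses) if the
    query budget is exhausted.
    """
    llr = np.asarray(llr, dtype=float)
    n = llr.size
    y = (llr < 0).astype(int)          # hard-decision word
    rel = np.abs(llr)                  # bit reliabilities
    order = np.argsort(rel)            # least reliable positions first
    r = rel[order]                     # sorted reliabilities

    # Enumerate test error patterns (subsets of the sorted positions)
    # in nondecreasing soft weight using a min-heap.
    heap = [(0.0, ())]                 # (soft weight, tuple of sorted indices)
    guesses = 0
    while heap and guesses < max_queries:
        w, pat = heapq.heappop(heap)
        guesses += 1
        e = np.zeros(n, dtype=int)
        if pat:
            e[order[list(pat)]] = 1    # map back to original bit positions
        c = (y + e) % 2                # candidate word
        if not (H @ c % 2).any():      # parity check: H c^T = 0?
            return c, guesses
        # Successors: append the next sorted index, or replace the last one.
        j = pat[-1] if pat else -1
        if j + 1 < n:
            heapq.heappush(heap, (w + r[j + 1], pat + (j + 1,)))
            if pat:
                heapq.heappush(heap, (w - r[j] + r[j + 1], pat[:-1] + (j + 1,)))
    return None, guesses
```

The returned guess count can be compared across codes with the same length n and dimension k, in the spirit of the abstract's claim that comparable ML performance implies comparable guesswork; replacing the soft weight w with the Hamming weight of the pattern gives the Hamming-weight ordering that the abstract compares against.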