Guesswork Analysis of Guessing Decoding of Short Codes
Authors: Jifan Liang, Yanzhi Chen, Xiao Ma. Journal of Communications and Information Networks, 2025, No. 3, pp. 287-298 (12 pages)
This paper addresses the question: how many guesses are needed to attain maximum-likelihood (ML) performance with guessing decoding of short binary linear block codes, namely guessing random additive noise decoding (GRAND) and guessing codeword decoding (GCD)? We show that the required guesswork depends only weakly on code structure: codes with the same length n and dimension k and comparable ML performance need a similar number of guesses. For random codes, the GRAND guesswork is on the order of ε_RCU · 2^(n−k), where ε_RCU is the random-coding union (RCU) bound, tightening the trivial order 2^(n−k). We derive a universal upper bound on the number of (partial) test error patterns (TEPs), which is then evaluated accurately with a saddlepoint approximation. Analysis and simulations show that, in terms of guesswork, GCD never exceeds and often improves on GRAND, notably at low rates. To further reduce the guesswork, we introduce an ordered-statistics decoder with local constraints (LC-OSD), derived by extending the most reliable basis and incorporating an early stopping rule. To reduce the decoding latency, we propose a parallel LC-OSD that utilizes pre-stored TEPs. Simulations of binary images of Reed-Solomon codes, which approach the RCU bound, confirm our analysis and demonstrate the superiority of soft-weight ordering over Hamming-weight ordering.
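To illustrate the guessing-decoding idea behind GRAND, here is a minimal hard-decision sketch (not the authors' implementation): error patterns are tested in order of increasing Hamming weight, which is the ML order on a binary symmetric channel with crossover probability below 1/2, and the first pattern whose correction satisfies the parity checks is declared the decoded codeword. The function name `grand_decode` and the toy (3,1) repetition code are illustrative choices only.

```python
from itertools import combinations
import numpy as np

def grand_decode(H, y, max_weight=None):
    """Hard-decision GRAND sketch.

    Tests error patterns e in order of increasing Hamming weight
    (ML order on a BSC with p < 1/2) and returns the first y ^ e
    that satisfies all parity checks, along with the guess count.
    """
    n = H.shape[1]
    if max_weight is None:
        max_weight = n
    guesses = 0
    for w in range(max_weight + 1):
        for support in combinations(range(n), w):
            e = np.zeros(n, dtype=np.uint8)
            e[list(support)] = 1
            guesses += 1
            c = y ^ e                      # candidate codeword
            if not (H @ c % 2).any():      # zero syndrome -> valid codeword
                return c, guesses
    return None, guesses

# Toy example: (3,1) repetition code with parity checks x1=x2 and x2=x3.
H = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=np.uint8)
y = np.array([1, 0, 1], dtype=np.uint8)    # 111 sent, middle bit flipped
c_hat, g = grand_decode(H, y)              # c_hat = [1 1 1] after 3 guesses
```

The guess count here grows with the number of parity bits rather than the code dimension, which is why GRAND is attractive for high-rate short codes; the paper's soft-weight ordering replaces the Hamming-weight schedule with one driven by channel reliabilities.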
Keywords: guessing codeword decoding (GCD); guessing random additive noise decoding (GRAND); parallel algorithm; saddlepoint approximation; universal decoding