Funding: Project (No. 60172030) partially supported by the National Natural Science Foundation of China
Abstract: Reversible variable length codes (RVLCs) have received much attention due to their excellent error resilience. In this paper, a novel construction algorithm for symmetrical RVLCs is proposed that is independent of the Huffman code: codeword assignment is based only on the symbol occurrence probabilities. The algorithm offers several advantages over existing symmetrical construction algorithms, including easier realization and better code performance, and it dramatically simplifies the codeword selection mechanism.
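The defining property of an RVLC is that the codeword set is both prefix-free and suffix-free, so decoding can proceed forwards or backwards. A minimal sketch of that check (the symmetric codeword set below is an illustrative example, not the paper's construction):

```python
def is_rvlc(codewords):
    """Check that a codeword set is both prefix-free and suffix-free,
    the defining property of a reversible variable length code."""
    def is_fix_free(words):
        return not any(a != b and b.startswith(a) for a in words for b in words)
    reversed_words = [w[::-1] for w in codewords]
    return is_fix_free(codewords) and is_fix_free(reversed_words)

# A symmetrical (palindromic) set: each word reads the same forwards and
# backwards, so prefix-freeness alone already implies suffix-freeness.
symmetric = ["0", "11", "101", "1001"]
print(is_rvlc(symmetric))           # True
print(is_rvlc(["1", "00", "01"]))   # prefix-free but not suffix-free -> False
```

This is why symmetric construction simplifies codeword selection: only one of the two fix-free conditions has to be enforced explicitly.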
Funding: Supported by the Foundation of the Ministry of Education of China (211CERS10)
Abstract: Most multimedia schemes employ variable-length codes (VLCs), such as Huffman codes, as core components to obtain high compression rates. However, VLC methods are very sensitive to channel noise. The goal of this paper is to salvage as much data from damaged packets as possible for higher audiovisual quality. This paper proposes an integrated joint source-channel decoder (I-JSCD) at the symbol level using a three-dimensional (3-D) trellis representation for first-order Markov sources encoded with a VLC source code and a convolutional channel code. The method combines the source-code and channel-code state spaces and the bit lengths to construct a two-dimensional (2-D) state space, and then develops a 3-D trellis and a maximum a posteriori (MAP) algorithm to estimate the source sequence symbol by symbol. Experimental results demonstrate that the method yields a significant improvement in decoding performance: it salvages at least half (50%) of the data at any channel error rate, and can provide additional error resilience to VLC streams such as image, audio, and video streams over high-error-rate links.
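The channel-noise sensitivity that motivates joint decoding is easy to demonstrate: with a plain prefix code, one flipped bit can desynchronize every subsequent codeword. A toy illustration with a hypothetical three-symbol code (not the paper's source model):

```python
CODE = {"a": "0", "b": "10", "c": "11"}   # a toy prefix (Huffman-like) code

def encode(symbols):
    return "".join(CODE[s] for s in symbols)

def decode(bits):
    """Greedy prefix decoding; trailing undecodable bits are dropped."""
    inv = {v: k for k, v in CODE.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inv:
            out.append(inv[buf])
            buf = ""
    return "".join(out)

clean = encode("abcab")                                   # "01011010"
corrupt = clean[0] + ("1" if clean[1] == "0" else "0") + clean[2:]
print(decode(clean))    # "abcab"
print(decode(corrupt))  # one bit flip garbles all following symbols
```

A joint source-channel decoder exploits residual source statistics (here, symbol transition probabilities) to recover synchronization instead of discarding the rest of the packet.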
Funding: Supported by the National Natural Science Foundation of China (62371465), the Taishan Scholar Project of Shandong Province (ts201511020), and the Chinese National Key Laboratory of Science and Technology on Information System Security (6142111190404).
Abstract: The syndrome a posteriori probability of the log-likelihood ratio of intercepted codewords is used to develop an algorithm that recognizes the code length and generator matrix of the underlying polar code. Based on the encoding structure, three theorems are proved: two relate the length and rate of the polar code, and one relates the frozen-bit positions, information-bit positions, and codewords. With these three theorems, polar codes can be quickly reconstructed. In addition, to detect the dual vectors of codewords, the statistical characteristics of the log-likelihood ratio are analyzed, and the information- and frozen-bit positions are then distinguished based on the minimum-error decision criterion, from which the bit rate is obtained. The correctness of the theorems and the effectiveness of the proposed algorithm are validated through simulations. The proposed algorithm exhibits robustness to noise and a reasonable computational complexity.
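For context, the encoding structure such reconstruction relies on: a length-N = 2^n polar code applies the n-fold Kronecker power of the kernel F = [[1,0],[1,1]] over GF(2) (the bit-reversal permutation is omitted in this sketch, and the frozen positions are simply fixed to 0):

```python
def kron(a, b):
    """Kronecker product of two binary matrices (lists of lists)."""
    return [[x * y for x in ra for y in rb] for ra in a for rb in b]

def polar_transform(n):
    """n-fold Kronecker power of the polar kernel F = [[1, 0], [1, 1]]."""
    f = [[1, 0], [1, 1]]
    g = f
    for _ in range(n - 1):
        g = kron(g, f)
    return g

def polar_encode(u, g):
    """x = u * G over GF(2); frozen positions of u are fixed to 0."""
    n = len(g)
    return [sum(u[i] * g[i][j] for i in range(n)) % 2 for j in range(n)]

g4 = polar_transform(2)                # 4x4 transform for N = 4
print(polar_encode([0, 0, 0, 1], g4))  # picks out the last row: [1, 1, 1, 1]
```

Because every codeword is a GF(2) combination of rows of this highly structured matrix, dual vectors (parity checks) exist whose syndromes over noisy codewords reveal the length and the frozen set, which is what the recognition algorithm exploits.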
Abstract: [Objective] This paper aimed to provide a new method for genetic data clustering by analyzing the clustering effect of a genetic data clustering algorithm based on the minimum coding length. [Method] Genetic data clustering was treated as high-dimensional mixed data clustering. After preprocessing, the dimensionality of the genetic data was reduced by principal component analysis, after which the data presented a Gaussian-like distribution. Data with this distribution can be clustered effectively through lossy data compression, which groups the genes with a simple clustering algorithm that achieves its best result when the total coding length of the encoded clusters reaches its minimum. This algorithm and traditional clustering algorithms were applied to genetic data from yeast and Arabidopsis, and the effectiveness of the algorithm was verified through internal clustering evaluation and functional evaluation. [Result] The clustering effect of the new algorithm was superior to that of traditional clustering algorithms, and it also avoided the problems of subjectively specifying the clustering parameters and of sensitivity to the initial clustering centers. [Conclusion] This study provides a new method for genetic data clustering.
Abstract: In this paper, a statistical recognition method for binary BCH codes is proposed. The method applies to both primitive and non-primitive binary BCH codes. The block length is first recognized based on the cyclic feature, under the condition that the frame length is known. Candidate polynomials that meet the restrictions are then obtained. Among the candidates, the optimal polynomial is selected based on the minimum rule for the weight sum of the syndromes. Finally, the best polynomial is factorized to obtain the recognized generator polynomial. Simulation results show that the method has a strong capability against random bit errors. Moreover, the proposed algorithm is very simple, making it practical for hardware implementation.
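The selection rule, minimizing the summed weight of the syndromes over candidate generator polynomials, can be sketched with GF(2) polynomial arithmetic. This toy example uses the (7,4) cyclic Hamming generator rather than a BCH code, purely to illustrate the criterion:

```python
def poly_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division; polynomials are int bitmasks."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

def syndrome_weight_sum(generator, codewords):
    """Sum of syndrome (remainder) weights over all codewords: the true
    generator yields 0 on error-free data, and the minimum under bit errors."""
    return sum(bin(poly_mod(c, generator)).count("1") for c in codewords)

# Toy data: multiples of g(x) = x^3 + x + 1 (0b1011), the (7,4) Hamming generator.
codewords = [0b1011, 0b10110, 0b100111]
candidates = [0b1011, 0b1101]            # true generator vs. a wrong one
best = min(candidates, key=lambda g: syndrome_weight_sum(g, codewords))
print(bin(best))  # 0b1011: the true generator minimizes the syndrome weights
```

Under random bit errors the true generator no longer gives zero syndromes, but its weight sum stays smallest with high probability, which is what makes the rule statistically robust.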
Funding: Supported by the National Natural Science Foundation of China (No. 90304003, No. 60573112, No. 60272056) and the Foundation Project of China (No. A1320061262).
Abstract: A novel Joint Source and Channel Decoding (JSCD) scheme for Variable Length Codes (VLCs) concatenated with turbo codes, utilizing a new super-trellis decoding algorithm, is presented in this letter. The basic idea of the decoding algorithm is that source a priori information, in the form of bit transition probabilities corresponding to the VLC tree, can be derived directly from sub-state transitions in the new composite-state super-trellis. A Maximum Likelihood (ML) decoding algorithm for VLC sequence estimation based on the proposed super-trellis is also described. Simulation results show that the new iterative decoding scheme obtains an obvious coding gain, especially for Reversible Variable Length Codes (RVLCs), when compared with classical separate turbo decoding and with previous joint decoding that does not consider the source statistics.
Funding: Project supported by the Applied Materials Shanghai Research and Development Foundation (Grant No. 08700741000) and the Foundation of Shanghai Municipal Education Commission (Grant No. 2006AZ068)
Abstract: This paper presents an efficient, power-optimized VLSI architecture of the context-based adaptive variable length code (CAVLC) decoder for the H.264/advanced video coding (AVC) standard. In the proposed design, exploiting the regularity of the codewords, a first-one detector is used to overcome the low efficiency and high power dissipation of the traditional table-searching method. Considering the relevance of the data used in decoding the run_before syntax element, arithmetic operations are combined with a finite state machine (FSM), which achieves higher decoding efficiency. Following the CAVLC decoding flow, clock gating is employed at both the module level and the register level, reducing the overall dynamic power dissipation by 43%. The proposed design can decode every syntax element in one clock cycle. When synthesized under a clock constraint of 100 MHz, the design costs 11,300 gates in a 0.25 μm CMOS technology, which meets the demand of real-time decoding in the H.264/AVC standard.
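The first-one detector idea is the same trick used to parse H.264's Exp-Golomb syntax elements without table lookup: count leading zeros up to the first 1, then read that many suffix bits. A software sketch of ue(v) decoding (illustrative only; the paper applies the detector to the CAVLC tables in hardware):

```python
def decode_ue(bits, pos=0):
    """Decode one unsigned Exp-Golomb value ue(v) from a bit string.

    Returns (value, next_position). The while loop plays the role of a
    hardware first-one detector (leading-zero count)."""
    zeros = 0
    while bits[pos + zeros] == "0":      # first-one detection
        zeros += 1
    suffix = bits[pos + zeros + 1 : pos + zeros + 1 + zeros]
    value = (1 << zeros) - 1 + (int(suffix, 2) if suffix else 0)
    return value, pos + 2 * zeros + 1

print(decode_ue("1"))      # (0, 1)
print(decode_ue("011"))    # (2, 3)
print(decode_ue("00111"))  # (6, 5)
```

In silicon, the leading-zero count comes out of a priority encoder in a single cycle, which is why the detector beats iterative table searching in both latency and power.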
Funding: Supported by the National Key Research and Development Program of China (2018YFA0702503) and the National Natural Science Foundation of China (41674122).
Abstract: Current exploration needs are satisfied by multisource technology, which offers low cost, high efficiency, and high precision. The delay time, which determines the separation effects of the multisource blended data, is one of the most crucial parameters in the acquisition and separation of multisource data. This study uses a deblending method for multisource data based on a periodically varying cosine code and analyses the effects of the two parameters used in this method, namely the period amplitude and the period length, on the separation of the multisource blended data. Meanwhile, the obtained coherence data are used to prove the correlation between the separation of multisource data and the two parameters. Examples of synthetic and field data demonstrate that, from a qualitative perspective, increasing the amplitude of the periodic code improves the separation effect within a reasonable delay time range. When the period length varies in a suitable range, the secondary noise becomes relatively incoherent, resulting in a separation result with a higher signal-to-noise ratio (SNR). From a quantitative perspective, the significance values (Sig.) of the period amplitude and length on the SNRs are less than 0.05, verifying the correlation between the separation of multisource data and the two parameters.
Funding: Sponsored by the ETRI System Semiconductor Industry Promotion Center, Human Resource Development Project for SoC Convergence.
Abstract: This paper proposes an efficient H.264/AVC entropy decoder. It requires no ROM/RAM in the fabrication process, which decreases fabrication cost and increases operation speed. This was achieved by optimizing the lookup tables and internal buffers, which significantly improves area, speed, and power. The proposed entropy decoder also does not exploit an embedded processor for bitstream manipulation, which further improves area, speed, and power. Its gate count and maximum operation frequency are 77,515 gates and 175 MHz in a 0.18 μm fabrication process, respectively. The proposed entropy decoder needs 2,303 cycles on average to decode one macroblock. It can run at 28 MHz to meet the real-time processing requirement for CIF-format video decoding in mobile applications.
Abstract: In this paper, we present a Joint Source-Channel Decoding (JSCD) algorithm for Low-Density Parity Check (LDPC) codes, obtained by modifying the Sum-Product Algorithm (SPA) to account for the source redundancy that results from the neighbouring Huffman-coded bits. Simulations demonstrate that, in the presence of source redundancy, the proposed algorithm gives better performance than the Separate Source and Channel Decoding (SSCD) algorithm.
Funding: Supported by the Fundamental Research Fund for the Central Universities (NS2013016)
Abstract: This paper provides a direct and fast acquisition algorithm for the civilian long length (CL) codes in the L2 civil (L2C) signal. The proposed algorithm simultaneously reduces the number of fast Fourier transform (FFT) correlations through a hyper-code technique and the number of points in every FFT correlation by using an averaging correlation method. To validate the acquisition performance, the paper applies the algorithm to a real L2C signal collected by the global positioning system (GPS) L2C intermediate frequency (IF) signal sampler, SIS100L2C. The acquisition results show that the proposed modified algorithm can acquire the code phase accurately with less calculation, and its acquisition performance is better than that of the single hyper-code method.
Abstract: Prior to hardware implementation, simulation is an important step in the study of systems such as Direct Sequence Code Division Multiple Access (DS-CDMA). A useful technique is presented that allows one to model and simulate Linear Feedback Shift Registers (LFSRs) for CDMA. It uses the Scilab package and its modeling tool for dynamical systems, Xcos. PN generators are designed for quadrature-phase modulation, along with the Gold code generator for the Global Positioning System (GPS). This study gives great flexibility in the design of LFSRs and the analysis of the Maximum Length Sequences (MLS) used by spread-spectrum systems. Interesting results have been obtained, which allow the verification of the generated sequences and their exploitation with signal processing tools.
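A Fibonacci LFSR like those modeled in Xcos can be sketched in a few lines. The 4-bit register below implements the primitive recurrence y[t+4] = y[t+1] XOR y[t] (characteristic polynomial x^4 + x + 1), so its output is a maximum length sequence of period 2^4 - 1 = 15 with the classic balance property of eight 1s per period (the tap positions are this sketch's convention, not the paper's generators):

```python
def lfsr(seed, taps, nbits):
    """Fibonacci LFSR: 'taps' are 0-based state positions XORed into the
    feedback. Yields nbits output bits, taken from the last stage."""
    state = list(seed)
    out = []
    for _ in range(nbits):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]   # shift; feedback enters at stage 0
    return out

seq = lfsr([1, 0, 0, 0], taps=(3, 2), nbits=30)
print(seq[:15] == seq[15:])   # True: the sequence repeats with period 15
print(sum(seq[:15]))          # 8: balance property of m-sequences
```

Gold codes for GPS are then built by XOR-combining two such maximal sequences of the same length with a relative phase offset.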
Funding: Supported by the Reward Fund for Outstanding Young and Middle-Aged Scientists of Shandong Province (Grant No. BS2011DX011) and the Qingdao Postdoctoral Fund (Grant No. 861605040007)
Abstract: Let F_q be a finite field with q = p^m, where p is an odd prime. In this paper, we study repeated-root self-dual negacyclic codes over F_q. The enumeration of such codes is investigated. We obtain all the self-dual negacyclic codes of length 2^a p^r over F_q, a ≥ 1. The construction of self-dual negacyclic codes of length 2^a b p^r over F_q is also provided, where gcd(2, b) = gcd(b, p) = 1 and a ≥ 1.
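For context, the standard algebraic setting behind these results (textbook definitions, not this paper's enumeration): identifying a vector with a polynomial exhibits negacyclic codes as ideals of a quotient ring, and self-duality is duality under the Euclidean inner product.

```latex
% A linear code C \subseteq \mathbb{F}_q^n is negacyclic if
% (c_0, c_1, \dots, c_{n-1}) \in C \implies (-c_{n-1}, c_0, \dots, c_{n-2}) \in C.
% Under c \mapsto c(x) = c_0 + c_1 x + \cdots + c_{n-1} x^{n-1}:
C \ \text{is negacyclic} \iff C \ \text{is an ideal of} \ \mathbb{F}_q[x]/(x^n + 1),
\qquad
C \ \text{is self-dual} \iff C = C^{\perp}.
```

The "repeated-root" case is precisely when gcd(n, q) > 1, so x^n + 1 has repeated irreducible factors, which is why lengths divisible by p are the interesting ones here.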
Funding: Supported in part by the First Batch of Youth Innovation Fund Projects in 2020 (Grant No. 3502Z202006012) and the Experimental Teaching Reform Project of National Huaqiao University (Grant No. SY2019L013).
Abstract: Web pages have many redundancies, especially in dynamic HTML multimedia objects. This paper proposes a novel method that employs the commonly used image elements on web pages. Owing to the variety of image formats, the complexity of image contents, and their position information, secret message bits can be encoded and embedded in these complex redundancies. Together with a specific covering code called average run-length coding, the embedding cost can be kept low, and the resulting capacity outperforms traditional content-based image steganography, which modifies the image data itself and causes real image quality degradation. Our experimental results demonstrate that the proposed method has limited processing latency and high embedding capacity. Moreover, the method has low algorithmic complexity and less image quality distortion compared with existing steganography methods.
Funding: Supported by the Shenzhen Government R&D Project (Grant No. JC200903160361A)
Abstract: Test vector compression is a key technique for reducing IC test time and cost, given the explosion of system-on-chip (SoC) test data in recent years. To effectively reduce the bandwidth requirement between the automatic test equipment (ATE) and the circuit under test (CUT), a novel VSPTIDR (variable shifting prefix-tail identifier reverse) code for test stimulus data compression is designed. The encoding scheme is defined and analyzed in detail, and the decoder is presented and discussed. When the probability of 0 bits in the test set is greater than 0.92, the compression ratio of the VSPTIDR code is better than that of the frequency-directed run-length (FDR) code, as shown by theoretical analysis and experiments. The on-chip area overhead of the VSPTIDR decoder is about 15.75% less than that of the FDR decoder.
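For comparison, the baseline FDR code mentioned above maps a run of r zeros (terminated by a one) to a group-k codeword: group k covers run lengths 2^k - 2 through 2^(k+1) - 3, the prefix is k-1 ones followed by a zero, and the tail is the k-bit offset within the group. A sketch of that encoder (the VSPTIDR scheme itself is not reproduced here):

```python
def fdr_encode_run(r):
    """Encode a run of r zeros (followed by a 1) with the FDR code.
    Group k covers run lengths 2**k - 2 .. 2**(k+1) - 3; the codeword is
    a (k-1)-ones-then-zero prefix plus a k-bit binary tail."""
    k = 1
    while r > 2 ** (k + 1) - 3:
        k += 1
    prefix = "1" * (k - 1) + "0"
    tail = format(r - (2 ** k - 2), "0{}b".format(k))
    return prefix + tail

for r in (0, 1, 2, 5, 6):
    print(r, fdr_encode_run(r))
# 0 -> 00, 1 -> 01, 2 -> 1000, 5 -> 1011, 6 -> 110000
```

Both codewords in group k have length 2k, so the code spends its shortest words on the most frequent (short) runs, which is the property the 0-bit probability threshold in the abstract is measured against.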