Using a quantum computer to simulate fermionic systems requires a fermion-to-qubit transformation. Usually, a lower Pauli weight of the transformation means shallower quantum circuits, so most existing transformations aim for lower Pauli weight. However, in some cases the circuit depth depends not only on the Pauli weight but also on the coefficients of the Hamiltonian terms. To characterize the circuit depth of these algorithms, we propose a new metric, the weighted Pauli weight, which depends on both the Pauli weight and the coefficients of the Hamiltonian terms. To achieve a smaller weighted Pauli weight, we introduce a novel transformation, the Huffman-code-based ternary tree (HTT) transformation, which is built upon the classical Huffman code and tailored to different Hamiltonians. We tested various molecular Hamiltonians, and the results show that the weighted Pauli weight of the HTT transformation is smaller than that of commonly used mappings, while the HTT transformation also maintains a relatively small Pauli weight. The mapping we designed reduces the circuit depth of certain Hamiltonian simulation algorithms, facilitating faster simulation of fermionic systems.
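The abstract does not give the formula for the weighted Pauli weight; a natural reading is a coefficient-weighted sum of the Pauli weights of the Hamiltonian terms, and the Huffman idea is to give terms with the largest coefficient magnitudes the shortest tree paths. A minimal sketch under those assumptions (classical binary Huffman shown for simplicity; the paper's construction uses ternary trees):

```python
import heapq
from itertools import count

def pauli_weight(pauli_string):
    """Number of non-identity factors in a Pauli string such as 'XXIZ'."""
    return sum(1 for p in pauli_string if p != 'I')

def weighted_pauli_weight(terms):
    """Coefficient-weighted sum of Pauli weights over Hamiltonian terms.

    terms: iterable of (pauli_string, coefficient) pairs.
    """
    return sum(abs(c) * pauli_weight(p) for p, c in terms)

def huffman_code_lengths(weights):
    """Classical binary Huffman: larger weights get shorter code lengths.

    Returns one code length per input weight.
    """
    tie = count()  # tie-breaker so heapq never compares the depth dicts
    heap = [(w, next(tie), {i: 0}) for i, w in enumerate(weights)]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {i: depth + 1 for i, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, next(tie), merged))
    _, _, depths = heap[0]
    return [depths[i] for i in range(len(weights))]
```

For example, the terms [('XXIZ', 0.5), ('IIZI', 1.0)] have weighted Pauli weight 0.5·3 + 1.0·1 = 2.5, and coefficient magnitudes [5, 2, 1, 1] receive Huffman code lengths [1, 2, 3, 3], so the dominant term sits closest to the root.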
Aiming at the problem that the bit error rate (BER) of an asymmetrically clipped optical orthogonal frequency division multiplexing (ACO-OFDM) space optical communication system is significantly affected by different turbulence intensities, deep learning is applied to polar code decoding in the ACO-OFDM space optical communication system. Moreover, this system realizes polar code decoding and signal demodulation without frequency conduction, with superior performance and robustness compared with the traditional decoder. Simulations under different turbulence intensities as well as different mapping orders show that a convolutional neural network (CNN) decoder trained under weak, medium, and strong turbulence atmospheric channels achieves a performance improvement of about 10^2 over the conventional decoder at 4-quadrature amplitude modulation (4QAM), while the BERs for 16QAM and 64QAM lie between those of the conventional decoder.
This paper introduces the failure modes that may occur in pressure vessels operating under high-temperature creep conditions. In light of current engineering design practice, it identifies the technical bottleneck in China's current pressure vessel standards system in determining the allowable compressive stress under high-temperature creep conditions. On this basis, ASME Code Case 3029 is introduced, with a brief overview of its scope of application, development history, background, and engineering significance. Taking an actual structure from an engineering design project as an example, the paper describes the procedure and precautions for applying this method. Finally, in view of practical needs in pressure vessel engineering design, prospects are offered for the future development or revision of China's standards system.
Quantum error correction is a technique that enhances a system's ability to combat noise by encoding logical information into additional quantum bits, and it plays a key role in building practical quantum computers. The XZZX surface code, with only one stabilizer generator on each face, demonstrates significant application potential under biased noise. However, the existing minimum weight perfect matching (MWPM) algorithm has high computational complexity and lacks flexibility in large-scale systems. Therefore, this paper proposes a decoding method that combines graph neural networks (GNN) with multi-classifiers: the syndrome is transformed into an undirected graph, and the features are aggregated by convolutional layers, providing a more efficient and accurate decoding strategy. In the experiments, we evaluated the performance of the XZZX code under different biased noise conditions (bias = 1, 20, 200) and different code distances (d = 3, 5, 7, 9, 11). The experimental results show that under low bias noise (bias = 1), the GNN decoder achieves a threshold of 0.18386, an improvement of approximately 19.12% compared to the MWPM decoder. Under high bias noise (bias = 200), the GNN decoder reaches a threshold of 0.40542, improving by approximately 20.76% and overcoming the limitations of the conventional decoder. These results demonstrate that the GNN decoding method exhibits superior performance and has broad application potential in the error correction of the XZZX code.
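The abstract does not detail how a syndrome becomes a graph; in surface-code decoders (MWPM and GNN pipelines alike) a common construction takes the flagged stabilizers as nodes and connects every pair with an edge weighted by lattice (Manhattan) distance. A hypothetical sketch of that construction, not the paper's exact preprocessing:

```python
from itertools import combinations

def syndrome_to_graph(defects):
    """Build an undirected, fully connected graph over syndrome defects.

    defects: list of (row, col) positions of flagged stabilizers.
    Returns (nodes, edges), where edges maps an index pair to the
    Manhattan distance between the two defects on the lattice.
    """
    edges = {}
    for (i, a), (j, b) in combinations(enumerate(defects), 2):
        edges[(i, j)] = abs(a[0] - b[0]) + abs(a[1] - b[1])
    return list(range(len(defects))), edges
```

An MWPM decoder would pair up these nodes to minimize total edge weight; the GNN decoder instead aggregates node features over this graph with convolutional layers and classifies the error.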
Constituted by BCH component codes and their ordered statistics decoding (OSD), the successive cancellation list (SCL) decoding of U-UV structural codes can provide competent error-correction performance in the short-to-medium length regime. However, the list decoding complexity becomes formidable as the decoding output list size increases, which is primarily incurred by the OSD. Addressing this challenge, this paper proposes low-complexity SCL decoding that reduces the complexity of component code decoding and prunes redundant SCL decoding paths. For the former, an efficient skipping rule is introduced for the OSD so that higher-order decoding can be skipped when it cannot provide a more likely codeword candidate. It is further extended to the OSD variant, the box-and-match algorithm (BMA), to facilitate component code decoding. Moreover, by estimating the correlation distance lower bounds (CDLBs) of the component code decoding outputs, a path pruning (PP)-SCL decoding is proposed to further facilitate the decoding of U-UV codes. In particular, its integration with the improved OSD and BMA is discussed. Simulation results show that significant complexity reduction can be achieved. Consequently, U-UV codes can outperform cyclic redundancy check (CRC)-polar codes with a similar decoding complexity.
Space laser communication (SLC) is an emerging technology to support high-throughput data transmission in space networks. In this paper, to guarantee the reliability of high-speed SLC links, we aim at a practical implementation of low-density parity-check (LDPC) decoding on resource-restricted space platforms. In particular, due to supply restrictions and cost issues of high-speed on-board devices such as analog-to-digital converters (ADCs), the input of LDPC decoding is usually constrained to hard-decision channel output. To tackle this challenge, density-evolution-based theoretical analysis is first performed to identify the cause of performance degradation in the conventional binary-initialized iterative decoding (BIID) algorithm. Then, a computation-efficient decoding algorithm named multiary-initialized iterative decoding with early termination (MIID-ET) is proposed, which improves the error-correcting performance and computation efficiency by using a reliability-based initialization method and a threshold-based decoding termination rule. Finally, numerical simulations are conducted on example codes of rates 7/8 and 1/2 to evaluate the performance of different LDPC decoding algorithms, where the proposed MIID-ET outperforms the BIID with a coding gain of 0.38 dB and a variable node calculation saving of 37%. With this advantage, the proposed MIID-ET can notably reduce the LDPC decoder's hardware implementation complexity under the same bit error rate performance, successfully doubling the total throughput to 10 Gbps on a single-chip FPGA.
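The MIID-ET algorithm itself is not specified beyond the abstract. To illustrate what hard-decision LDPC-style decoding with early termination means, here is a generic Gallager bit-flipping decoder on the small (7,4) Hamming parity-check matrix, an illustrative stand-in rather than the paper's rate-7/8 or rate-1/2 codes:

```python
def bit_flip_decode(H, received, max_iters=10):
    """Hard-decision bit-flipping decoding.

    Each iteration flips the bit involved in the most unsatisfied
    parity checks, stopping early once the syndrome is zero
    (a simple form of early termination).
    """
    word = list(received)
    n = len(word)
    for _ in range(max_iters):
        unsatisfied = [sum(h[j] * word[j] for j in range(n)) % 2 for h in H]
        if not any(unsatisfied):
            return word  # all checks satisfied: terminate early
        # count, for each bit, how many unsatisfied checks it touches
        counts = [sum(h[j] for h, u in zip(H, unsatisfied) if u)
                  for j in range(n)]
        word[counts.index(max(counts))] ^= 1
    return word

# (7,4) Hamming code parity-check matrix used for the illustration
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
```

On this toy code the decoder corrects any single bit flip of the all-zero codeword; soft or multiary initialization, as in MIID-ET, improves on this by letting reliability information steer the iterations.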
Viruses circulating in small mammals possess the potential to infect humans. Tree shrews are a group of small mammals widely inhabiting forests and plantations, but studies on viruses in tree shrews are quite limited. Herein, viral metagenomic sequencing was employed to detect the virome in tissue and swab samples from seventy-six tree shrews that we collected in Yunnan Province. As a result, genomic fragments belonging to eighteen viral families were identified, thirteen of which contain mammalian viruses. Through polymerase chain reaction (PCR) and Sanger sequencing, twelve complete genomes were determined, including five parvoviruses, three torque teno viruses (TTVs), two adenoviruses, one pneumovirus, and one hepacivirus, together with three partial genomes, including two hepatitis E viruses and one paramyxovirus. Notably, the three TTVs, named TSTTV-HNU1, TSTTV-HNU2, and TSTTV-HNU3, may compose a new genus within the family Anelloviridae. In addition, TSParvoV-HNU5, one of the tree shrew parvoviruses detected, was likely a recombination of two murine viruses. Divergence time estimation further revealed the potential cross-species transmission history of the tree shrew pneumovirus TSPneV-HNU1. Our study provides a comprehensive exploration of viral diversity in wild tree shrews, significantly enhancing our understanding of their roles as natural virus reservoirs.
Forests play a critical role in mitigating climate change by sequestering carbon, yet their responses to environmental shifts remain complex and multifaceted. This special issue, "Tree Rings, Forest Carbon Sink, and Climate Change," compiles 41 interdisciplinary studies exploring forest-climate interactions through dendrochronological and ecological approaches. It addresses climate reconstruction (e.g., temperature, precipitation, isotopes) using tree-ring proxies, species-specific and age-dependent growth responses to warming and drought, anatomical adaptations, and methodological innovations in isotope analysis and multi-proxy integration. Key findings reveal ENSO/AMO modulation of historical climates, elevation- and latitude-driven variability in tree resilience, contrasting carbon dynamics under stress, and projected habitat shifts for vulnerable species. The issue underscores forests' dual role as climate archives and carbon regulators, offering insights for adaptive management and nature-based climate solutions. Contributions bridge micro-scale physiological processes to macro-scale ecological modeling, advancing sustainable strategies amid global environmental challenges.
To improve the decoding performance of quantum error-correcting codes in asymmetric noise channels, a neural network-based decoding algorithm for bias-tailored quantum codes is proposed. The algorithm consists of a biased noise model, a neural belief propagation decoder, a convolutional optimization layer, and a multi-objective loss function. The biased noise model simulates asymmetric error generation, providing a training dataset for decoding. The neural network, leveraging dynamic weight learning and a multi-objective loss function, mitigates error degeneracy. Additionally, the convolutional optimization layer enhances early-stage convergence efficiency. Numerical results show that for bias-tailored quantum codes, our decoder performs much better than belief propagation (BP) with ordered statistics decoding (BP+OSD). Our decoder achieves an order of magnitude improvement in error suppression compared to higher-order BP+OSD. Furthermore, the decoding threshold of our decoder for surface codes reaches a high threshold of 20%.
Mobile communications are reaching into every aspect of our daily life, necessitating high-efficiency data transmission and support for diverse data types and communication scenarios. Polar codes have emerged as a promising solution due to their outstanding error-correction performance and low complexity. Unequal error protection (UEP) involves nonuniform error safeguarding for distinct data segments, achieving a fine balance between error resilience and resource allocation and ultimately enhancing system performance and efficiency. In this paper, we propose a novel class of UEP rateless polar codes. The codes are designed based on matrix extension of polar codes, and elegant mapping and duplication operations are designed to achieve the UEP property while preserving the overall performance of conventional polar codes. Superior UEP performance is attained without significant modifications to conventional polar codes, making the scheme straightforwardly compatible with existing polar codes. A theoretical analysis is conducted on the block error rate and throughput efficiency performance. To the best of our knowledge, this work provides the first theoretical performance analysis of UEP rateless polar codes. Simulation results show that the proposed codes significantly outperform existing polar coding schemes in both block error rate and throughput efficiency.
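The matrix-extension construction is specific to the paper, but the underlying polar transform it builds on is standard: the generator matrix is the n-fold Kronecker power of the 2x2 kernel F = [[1, 0], [1, 1]] over GF(2). A minimal sketch (the bit-reversal permutation is omitted):

```python
def kron(A, B):
    """Kronecker product of two 0/1 matrices given as lists of lists."""
    return [[a * b for a in row_a for b in row_b]
            for row_a in A for row_b in B]

def polar_transform(u):
    """Encode u (length 2**n) as x = u * G over GF(2), G = F^(kron n)."""
    F = [[1, 0], [1, 1]]
    G = [[1]]
    while len(G) < len(u):
        G = kron(G, F)
    # x[j] = sum_i u[i] * G[i][j] mod 2
    return [sum(ui * gij for ui, gij in zip(u, col)) % 2
            for col in zip(*G)]
```

For N = 4, placing a single 1 on the last input yields the all-ones codeword, since the last row of F⊗F is [1, 1, 1, 1]; frozen-bit selection, and the paper's mapping and duplication operations, sit on top of this transform.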
Soil organic carbon in forests affects nutrient availability, microbial processes, and organic matter inputs. Dominant tree species have increasingly shifted from ectomycorrhizal to arbuscular mycorrhizal associations in subtropical forests. However, the consequences of this shift for soil organic carbon are poorly understood. To address this, a field study was conducted across a natural gradient of arbuscular tree associations to investigate how different mycorrhizal associations affect soil organic carbon quantity, composition, chemical stability, and related soil properties. Soil organic carbon fractions, functional groups, and microbial enzyme activities were analyzed. Results showed that increasing arbuscular mycorrhizal dominance was associated with declines in total soil organic carbon, particularly in recalcitrant and aromatic carbon forms. Ectomycorrhizal-dominated forests exhibited higher nitrogen availability and elevated nitrogen-hydrolyzing enzyme activity, suggesting enhanced nitrogen acquisition strategies that suppress soil organic carbon decomposition and promote carbon retention. These findings indicate that mycorrhizal-mediated shifts in tree composition may significantly alter soil carbon sequestration potential. Incorporating mycorrhizal functional traits into forest management and carbon modeling could improve predictions of soil organic carbon responses under future environmental change.
Allometric equations are fundamental tools in ecological research and forestry management, widely used for estimating above-ground biomass and production and serving as core foundations of dynamic vegetation models. Using global datasets from Tallo (a tree allometry and crown architecture database encompassing thousands of species) and TRY (a plant traits database), we fit Bayesian hierarchical models with three alternative functional forms (power-law, generalized Michaelis-Menten (gMM), and Weibull) to characterize how diameter at breast height (DBH), tree height (H), and crown radius (CR) scale with and without wood density as a species-level predictor. Our analysis revealed that the saturating Weibull function best captured the relationship between tree height and DBH in both functional groups, whereas the CR-DBH relationship was best predicted by a power-law function in angiosperms and by the gMM function in gymnosperms. Although including wood density did not significantly improve predictive performance, it revealed important ecological trade-offs: lighter-wood angiosperms achieve taller mature heights more rapidly, and denser wood promotes wider crown expansion across clades. We also found that accurately estimating DBH required considering both height and crown size, highlighting how these variables together distinguish trees of similar height but differing trunk diameters. Our results emphasize the importance of applying saturating functions for large trees to improve forest biomass estimates and show that wood density, though not always predictive at broad scales, helps illuminate the biomechanical and ecological constraints underlying diverse tree architectures. These findings offer practical pathways for integrating height- and crown-based metrics into existing carbon monitoring programs worldwide.
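The three candidate forms are standard in allometry, though the abstract does not give the paper's exact parameterizations; common versions are shown below with illustrative (not fitted) parameters:

```python
import math

def power_law(dbh, a, b):
    """H = a * DBH**b: keeps growing without bound as diameter increases."""
    return a * dbh ** b

def gmm(dbh, h_max, k, half_sat):
    """Generalized Michaelis-Menten: saturates toward h_max."""
    return h_max * dbh ** k / (half_sat + dbh ** k)

def weibull(dbh, h_max, a, b):
    """Weibull: saturates toward h_max, with an inflection when b > 1."""
    return h_max * (1.0 - math.exp(-a * dbh ** b))
```

With an asymptote of, say, h_max = 35 m, the two saturating forms level off for large-diameter stems while the power law keeps rising, which is why saturating functions avoid over-predicting the height (and hence biomass) of the largest trees.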
Compact size, high brightness, and a wide field of view (FOV) are key requirements for long-wave infrared imagers used in military surveillance or night navigation. However, to meet the imaging requirements of high resolution and wide FOV, infrared optical systems often adopt complex optical lens groups, which increase the size and weight of the optical system. In this paper, a strategy based on wavefront coding (WFC) is proposed to design a compact wide-FOV infrared imager. A cubic phase mask is inserted into the pupil plane of the infrared imager to correct the aberration. Simulation results show that the WFC infrared imager has good imaging quality over a wide FOV of ±16°. In addition, the WFC infrared imager achieves compactness with its 40 mm × 40 mm × 40 mm size, and a fast focal ratio of 1 combined with an entrance pupil diameter of 25 mm ensures brightness. This work is of significance for designing compact wide-FOV infrared imagers.
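A cubic phase mask adds a pupil phase commonly written as φ(x, y) = α(x³ + y³) in normalized pupil coordinates; the abstract does not give the paper's exact mask, so this sketch simply samples that standard form:

```python
def cubic_phase_mask(alpha, n):
    """Sample phi(x, y) = alpha * (x**3 + y**3) on an n x n grid of
    normalized pupil coordinates in [-1, 1] (illustrative parameters)."""
    coords = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]
    return [[alpha * (x ** 3 + y ** 3) for x in coords] for y in coords]
```

The cubic term makes the point-spread function nearly insensitive to defocus across the field, after which a digital restoration step recovers a sharp image; note the mask's antisymmetry, φ(−x, −y) = −φ(x, y).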
Background 3D botanical tree reconstruction from a single image plays a vital role in the field of computer graphics. However, accurately capturing the intricate branching patterns and detailed morphologies of trees remains a challenge. Methods In this study, we proposed a novel approach for single-image tree reconstruction using a conditional generative adversarial network to infer the 3D skeleton of a tree in the form of a 2D skeleton depth map. Based on the 2D skeleton depth map, a corresponding branching structure (3D skeleton) that inherits the tree shape in the input image, together with leaves, can be generated using a procedural modeling technique. Results Experimental results show that the proposed method accurately reconstructs diverse tree structures across species. Both quantitative and qualitative evaluations demonstrate improved skeleton completeness, branching accuracy, and visual realism over baseline methods, while requiring no user input. Conclusions Our approach for generating lifelike 3D tree models from a single image with no user input demonstrates efficient and reliable reconstruction. These results showcase the capability of the proposed model to recreate complex tree architectures while capturing their visual authenticity.
Differential pulse-position modulation (DPPM) can achieve a good compromise between power and bandwidth requirements. However, the output sequence suffers from undetectable insertions and deletions. This paper proposes a successive cancellation (SC) decoding scheme based on the weighted Levenshtein distance (WLD) of polar codes for correcting insertions/deletions in DPPM systems. In this method, the WLD is used to calculate the transfer probabilities recursively to obtain likelihood ratios, and a low-complexity SC decoding method is built according to the error characteristics to match the DPPM system. Additionally, the proposed SC decoding scheme is extended to list decoding, which further improves error-correction performance. Simulation results show that the proposed scheme can effectively correct insertions/deletions in the DPPM system, enhancing its reliability and performance.
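The recursion in the paper operates on transfer probabilities, but the core of a weighted Levenshtein distance is the standard edit-distance dynamic program with separate insertion, deletion, and substitution costs. A generic sketch (the DPPM-specific weights are not given in the abstract):

```python
def weighted_levenshtein(s, t, w_ins=1.0, w_del=1.0, w_sub=1.0):
    """Edit distance with configurable per-operation costs."""
    m, n = len(s), len(t)
    # d[i][j] = minimum cost of editing s[:i] into t[:j]
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * w_del
    for j in range(1, n + 1):
        d[0][j] = j * w_ins
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = d[i - 1][j - 1] + (0.0 if s[i - 1] == t[j - 1] else w_sub)
            d[i][j] = min(d[i - 1][j] + w_del,  # delete s[i-1]
                          d[i][j - 1] + w_ins,  # insert t[j-1]
                          sub)                  # substitute / match
    return d[m][n]
```

Making substitutions expensive, for instance, steers the alignment toward explaining mismatches as insertion/deletion pairs, which is the error pattern that dominates in DPPM output sequences.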
This paper investigates the reliability of marine internal combustion engines using an integrated approach that combines Fault Tree Analysis (FTA) and Bayesian Networks (BN). FTA provides a structured, top-down method for identifying critical failure modes and their root causes, while BN introduces flexibility in probabilistic reasoning, enabling dynamic updates based on new evidence. This dual methodology overcomes the limitations of static FTA models, offering a comprehensive framework for system reliability analysis. Critical failures, including External Leakage (ELU), Failure to Start (FTS), and Overheating (OHE), were identified as key risks. By incorporating redundancy into high-risk components such as pumps and batteries, the likelihood of these failures was significantly reduced. For instance, redundant pumps reduced the probability of ELU by 31.88%, while additional batteries decreased the occurrence of FTS by 36.45%. The results underscore the practical benefits of combining FTA and BN for enhancing system reliability, particularly in maritime applications where operational safety and efficiency are critical. This research provides valuable insights for maintenance planning and highlights the importance of redundancy in critical systems, especially as the industry transitions toward more autonomous vessels.
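The paper's event probabilities are not reproduced here, but the effect of redundancy follows from basic fault-tree gate algebra: independent redundant components combine through an AND gate, so their joint failure probability is the product of the individual ones. A toy sketch with made-up numbers:

```python
def and_gate(*probs):
    """Event occurs only if all inputs fail (e.g., every redundant pump)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Event occurs if any independent input fails."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical pump with a 10% failure probability:
single = 0.10
redundant = and_gate(0.10, 0.10)  # adding a second pump: 0.01
```

A Bayesian network refines this static picture by replacing the fixed probabilities with conditional ones that can be updated as inspection or operating evidence arrives.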
As artificial intelligence (AI) continues to expand exponentially, particularly with the emergence of generative pre-trained transformers (GPT) based on the transformer architecture, data processing has been revolutionized and significant improvements have been enabled in various applications. This document investigates the detection of security vulnerabilities in source code using a range of large language models (LLMs). Our primary objective is to evaluate the effectiveness of Static Application Security Testing (SAST) by applying techniques such as prompt personas, structured outputs, and zero-shot prompting to a selection of LLMs (CodeLlama 7B, DeepSeek Coder 7B, Gemini 1.5 Flash, Gemini 2.0 Flash, Mistral 7B Instruct, Phi-3 Mini 128K Instruct, Qwen 2.5 Coder, StarCoder 27B), in comparison with, and in combination with, Find Security Bugs. The evaluation uses a selected dataset containing vulnerabilities, and the results provide insights for different scenarios according to software criticality (business-critical, non-critical, minimum-effort, best-effort). In detail, the main objectives of this study are to investigate whether large language models outperform or exceed the capabilities of traditional static analysis tools, whether combining LLMs with SAST tools leads to an improvement, and whether local machine learning models on an ordinary computer can produce reliable results. Summarizing the most important conclusions of the research: while results improved with the size of the LLM for business-critical software, the best results were obtained by SAST analysis. This differs in the "non-critical," "best-effort," and "minimum-effort" scenarios, where the combination of LLM (Gemini) + SAST obtained better results.
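The exact prompts used in the study are not given; the techniques it names (prompt persona, structured output, zero-shot) can be sketched as a prompt builder like the following, where the persona text and the JSON schema are illustrative assumptions rather than the paper's templates:

```python
import json

def build_sast_prompt(source_code, language="java"):
    """Zero-shot prompt combining a persona and a structured-output
    instruction for LLM-based vulnerability detection (illustrative)."""
    schema = {
        "vulnerabilities": [
            {"line": "int", "cwe": "string", "severity": "string",
             "explanation": "string"}
        ]
    }
    return (
        "You are a senior application security engineer performing "  # persona
        "static analysis.\n"
        f"Review the following {language} code for security "
        "vulnerabilities.\n"
        "Respond ONLY with JSON matching this schema:\n"  # structured output
        f"{json.dumps(schema)}\n\n"
        f"--- {language} code ---\n{source_code}\n--- end ---"
    )
```

No examples are embedded in the prompt, which is what makes it zero-shot; few-shot variants would append labeled vulnerable/clean snippets before the code under review.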
Funding: supported by the National Key Research and Development Program of China (Grant No. 2024YFB4504101), the National Natural Science Foundation of China (Grant No. 22303022), and the Anhui Province Innovation Plan for Science and Technology (Grant No. 202423r06050002).
Funding: supported by the National Natural Science Foundation of China (No. 12104141).
Funding: supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021MF049), the Joint Fund of the Natural Science Foundation of Shandong Province, China (Grant Nos. ZR2022LLZ012 and ZR2021LLZ001), and the Key Research and Development Program of Shandong Province, China (Grant No. 2023CXGC010901).
Funding: supported by the National Natural Science Foundation of China (NSFC) under project ID 62071498 and the Guangdong National Science Foundation (GDNSF) under project ID 2024A1515010213.
Funding: supported by the National Key R&D Program of China (Grant No. 2022YFA1005000) and the National Natural Science Foundation of China (Grant Nos. 62101308 and 62025110).
Funding: funded by the National Natural Science Foundation of China (No. U2002218), the Science and Technology Innovation Program of Hunan Province (2024RC1028), and Hunan University (No. 521119400156).
Abstract: Viruses circulating in small mammals possess the potential to infect humans. Tree shrews are a group of small mammals inhabiting forests and plantations widely, but studies on viruses in tree shrews are quite limited. Herein, viral metagenomic sequencing was employed to detect the virome in tissue and swab samples from seventy-six tree shrews collected in Yunnan Province. Genomic fragments belonging to eighteen viral families were identified, thirteen of which contain mammalian viruses. Through polymerase chain reaction (PCR) and Sanger sequencing, twelve complete genomes were determined, including five parvoviruses, three torque teno viruses (TTVs), two adenoviruses, one pneumovirus, and one hepacivirus, together with three partial genomes, including two hepatitis E viruses and one paramyxovirus. Notably, the three TTVs, named TSTTV-HNU1, TSTTV-HNU2, and TSTTV-HNU3, may compose a new genus within the family Anelloviridae, and TSParvoV-HNU5, one of the tree shrew parvoviruses detected, was likely a recombinant of two murine viruses. Divergence time estimation further revealed the potential cross-species transmission history of the tree shrew pneumovirus TSPneV-HNU1. Our study provides a comprehensive exploration of viral diversity in wild tree shrews, significantly enhancing our understanding of their role as natural virus reservoirs.
Funding: supported by the Outstanding Action Plan of Chinese Sci-tech Journals (Grant No. OAP-C-077).
Abstract: Forests play a critical role in mitigating climate change by sequestering carbon, yet their responses to environmental shifts remain complex and multifaceted. This special issue, "Tree Rings, Forest Carbon Sink, and Climate Change," compiles 41 interdisciplinary studies exploring forest-climate interactions through dendrochronological and ecological approaches. It addresses climate reconstruction (e.g., temperature, precipitation, isotopes) using tree-ring proxies, species-specific and age-dependent growth responses to warming and drought, anatomical adaptations, and methodological innovations in isotope analysis and multi-proxy integration. Key findings reveal ENSO/AMO modulation of historical climates, elevation- and latitude-driven variability in tree resilience, contrasting carbon dynamics under stress, and projected habitat shifts for vulnerable species. The issue underscores forests' dual role as climate archives and carbon regulators, offering insights for adaptive management and nature-based climate solutions. Contributions bridge micro-scale physiological processes to macro-scale ecological modeling, advancing sustainable strategies amid global environmental challenges.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 62371240, 61802175, 62401266, and 12201300), the National Key R&D Program of China (Grant No. 2022YFB3103800), the Natural Science Foundation of Jiangsu Province (Grant No. BK20241452), the Fundamental Research Funds for the Central Universities (Grant No. 30923011014), and the fund of the Laboratory for Advanced Computing and Intelligence Engineering (Grant No. 2023-LYJJ-01-009).
Abstract: To improve the decoding performance of quantum error-correcting codes in asymmetric noise channels, a neural network-based decoding algorithm for bias-tailored quantum codes is proposed. The algorithm consists of a biased noise model, a neural belief propagation decoder, a convolutional optimization layer, and a multi-objective loss function. The biased noise model simulates asymmetric error generation, providing a training dataset for decoding. The neural network, leveraging dynamic weight learning and the multi-objective loss function, mitigates error degeneracy, while the convolutional optimization layer enhances early-stage convergence efficiency. Numerical results show that for bias-tailored quantum codes, our decoder performs much better than belief propagation with ordered statistics decoding (BP+OSD), achieving an order-of-magnitude improvement in error suppression compared to higher-order BP+OSD. Furthermore, for surface codes our decoder reaches a high decoding threshold of 20%.
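The biased noise model can be sketched as an i.i.d. Pauli channel whose Z-error rate exceeds the X/Y rates. The parameterization below (bias eta = p_Z / (p_X + p_Y) with p_X = p_Y) is a common convention in the bias-tailored-code literature and is an assumption here, not necessarily the paper's exact model:

```python
import random

def sample_biased_pauli(n_qubits, p, eta, rng=None):
    """Draw an i.i.d. Pauli error pattern with total error rate p and
    bias eta = p_Z / (p_X + p_Y), assuming p_X = p_Y (a common, but
    here assumed, convention)."""
    rng = rng or random.Random(0)
    p_x = p_y = p / (2.0 * (1.0 + eta))
    p_z = p * eta / (1.0 + eta)             # p_x + p_y + p_z == p
    pattern = []
    for _ in range(n_qubits):
        r = rng.random()
        if r < p_z:
            pattern.append('Z')
        elif r < p_z + p_x:
            pattern.append('X')
        elif r < p_z + p_x + p_y:
            pattern.append('Y')
        else:
            pattern.append('I')
    return pattern
```

Sampling such patterns and computing their syndromes is one way to build the kind of asymmetric training dataset the abstract describes; at eta = 100, Z errors dominate the drawn patterns.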
Funding: supported by the National Natural Science Foundation of China (No. 62301008), the China Postdoctoral Science Foundation (No. 2022M720272), and the New Cornerstone Science Foundation through the XPLORER PRIZE.
Abstract: Mobile communications are reaching into every aspect of our daily life, necessitating high-efficiency data transmission and support for diverse data types and communication scenarios. Polar codes have emerged as a promising solution due to their outstanding error-correction performance and low complexity. Unequal error protection (UEP) involves non-uniform error safeguarding for distinct data segments, achieving a fine balance between error resilience and resource allocation and ultimately enhancing system performance and efficiency. In this paper, we propose a novel class of UEP rateless polar codes. The codes are designed based on matrix extension of polar codes, and elegant mapping and duplication operations are designed to achieve the UEP property while preserving the overall performance of conventional polar codes. Superior UEP performance is attained without significant modifications to conventional polar codes, making the scheme straightforwardly compatible with existing polar codes. A theoretical analysis is conducted on block error rate and throughput efficiency; to the best of our knowledge, this work provides the first theoretical performance analysis of UEP rateless polar codes. Simulation results show that the proposed codes significantly outperform existing polar coding schemes in both block error rate and throughput efficiency.
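As background to the matrix-extension construction, a conventional polar codeword is obtained by applying the Arikan kernel F = [[1,0],[1,1]] recursively to the input vector. A minimal sketch of this base transform (without bit-reversal permutation, and without the paper's extension, mapping, or duplication steps):

```python
def polar_transform(u):
    """Compute x = u * F^{(tensor n)} over GF(2) for a block length
    that is a power of two (Arikan kernel, no bit reversal)."""
    n = len(u)
    if n == 1:
        return list(u)
    half = n // 2
    top = polar_transform(u[:half])
    bot = polar_transform(u[half:])
    # Upper half carries the XOR of both branches; lower half passes through.
    return [a ^ b for a, b in zip(top, bot)] + bot
```

For example, `polar_transform([0, 0, 0, 1])` yields `[1, 1, 1, 1]`, the all-ones row of the transform matrix. Since F is its own inverse over GF(2), applying the transform twice recovers the input, a property often exploited in encoder/decoder testing.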
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 32471851, 32171759, and 32201533), the Double Thousand Plan of Jiangxi Province (jxsq2023201058), and the Jiangxi Province Ganpo Juncai Support Plan (2024BCE50043).
Abstract: Soil organic carbon in forests affects nutrient availability, microbial processes, and organic matter inputs. Dominant tree species have increasingly shifted from ectomycorrhizal to arbuscular mycorrhizal associations in subtropical forests; however, the consequences of this shift for soil organic carbon are poorly understood. To address this, a field study was conducted across a natural gradient of arbuscular tree associations to investigate how different mycorrhizal associations affect soil organic carbon quantity, composition, chemical stability, and related soil properties. Soil organic carbon fractions, functional groups, and microbial enzyme activities were analyzed. Results showed that increasing arbuscular mycorrhizal dominance was associated with declines in total soil organic carbon, particularly in recalcitrant and aromatic carbon forms. Ectomycorrhizal-dominated forests exhibited higher nitrogen availability and elevated nitrogen-hydrolyzing enzyme activity, suggesting enhanced nitrogen acquisition strategies that suppress soil organic carbon decomposition and promote carbon retention. These findings indicate that mycorrhizal-mediated shifts in tree composition may significantly alter soil carbon sequestration potential. Incorporating mycorrhizal functional traits into forest management and carbon modeling could improve predictions of soil organic carbon responses under future environmental change.
Funding: supported by the Xingdian Talent Support Program of Yunnan Province (E5YNR03B01), the Xishuangbanna State Rainforest Talent Support Program (E4BN041B01), and the CAS President's International Fellowship Initiative (2020FYB0003).
Abstract: Allometric equations are fundamental tools in ecological research and forestry management, widely used for estimating above-ground biomass and production and serving as core foundations of dynamic vegetation models. Using global datasets from Tallo (a tree allometry and crown architecture database encompassing thousands of species) and TRY (a plant traits database), we fit Bayesian hierarchical models with three alternative functional forms (power-law, generalized Michaelis-Menten (gMM), and Weibull) to characterize how diameter at breast height (DBH), tree height (H), and crown radius (CR) scale, with and without wood density as a species-level predictor. Our analysis revealed that the saturating Weibull function best captured the relationship between tree height and DBH in both functional groups, whereas the CR-DBH relationship was best predicted by a power-law function in angiosperms and by the gMM function in gymnosperms. Although including wood density did not significantly improve predictive performance, it revealed important ecological trade-offs: lighter-wood angiosperms achieve taller mature heights more rapidly, and denser wood promotes wider crown expansion across clades. We also found that accurately estimating DBH required considering both height and crown size, highlighting how these variables together distinguish trees of similar height but differing trunk diameters. Our results emphasize the importance of applying saturating functions for large trees to improve forest biomass estimates and show that wood density, though not always predictive at broad scales, helps illuminate the biomechanical and ecological constraints underlying diverse tree architectures. These findings offer practical pathways for integrating height- and crown-based metrics into existing carbon monitoring programs worldwide.
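The three candidate functional forms can be written compactly. The parameterizations below are common conventions and an assumption here (the paper's exact Bayesian hierarchical parameterization may differ); the key contrast is that the power law is unbounded while gMM and Weibull saturate at an asymptote a, which matters for large trees:

```python
import math

def power_law(dbh, a, b):
    """H = a * DBH^b: height grows without bound as DBH increases."""
    return a * dbh ** b

def gen_michaelis_menten(dbh, a, b, c):
    """gMM: H = a * DBH^c / (b + DBH^c), saturating at asymptote a."""
    return a * dbh ** c / (b + dbh ** c)

def weibull(dbh, a, b, c):
    """Weibull: H = a * (1 - exp(-b * DBH^c)), saturating at asymptote a."""
    return a * (1.0 - math.exp(-b * dbh ** c))
```

For very large DBH, the Weibull and gMM predictions approach a (e.g., a maximum height of 35 m), whereas the power law keeps increasing, which is why unbounded forms tend to overestimate the biomass of the largest trees.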
Abstract: Compact size, high brightness, and a wide field of view (FOV) are key requirements for long-wave infrared imagers used in military surveillance or night navigation. However, to meet the imaging requirements of high resolution and wide FOV, infrared optical systems often adopt complex optical lens groups, which increase the size and weight of the optical system. In this paper, a strategy based on wavefront coding (WFC) is proposed to design a compact wide-FOV infrared imager: a cubic phase mask is inserted into the pupil plane of the imager to correct aberrations. Simulated results show that the WFC infrared imager has good imaging quality over a wide FOV of ±16°. In addition, it achieves compactness with a 40 mm × 40 mm × 40 mm size, while a fast focal ratio of 1 combined with an entrance pupil diameter of 25 mm ensures brightness. This work is of significance for designing compact wide-FOV infrared imagers.
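The cubic phase mask at the heart of WFC can be sketched numerically: adding a phase of the form alpha * (x^3 + y^3) across the pupil makes the point spread function (PSF) nearly invariant to defocus and field aberrations, which is then undone by digital restoration. The grid size and alpha below are illustrative assumptions, not the paper's design values:

```python
import numpy as np

def cubic_pupil(n=128, alpha=20.0):
    """Unit-radius circular pupil carrying a cubic phase
    alpha * (x^3 + y^3) in radians (WFC-style mask)."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    aperture = (X**2 + Y**2 <= 1.0).astype(float)
    return aperture * np.exp(1j * alpha * (X**3 + Y**3))

def psf(pupil):
    """Incoherent PSF: squared magnitude of the pupil's Fourier
    transform, normalized to unit total energy."""
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    intensity = np.abs(field) ** 2
    return intensity / intensity.sum()
```

Comparing `psf(cubic_pupil(alpha=20.0))` against the unaberrated case `psf(cubic_pupil(alpha=0.0))` shows the characteristic extended, defocus-tolerant PSF that the restoration filter later sharpens.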
Abstract: Background: 3D botanical tree reconstruction from a single image plays a vital role in computer graphics; however, accurately capturing the intricate branching patterns and detailed morphologies of trees remains a challenge. Methods: In this study, we propose a novel approach for single-image tree reconstruction using a conditional generative adversarial network to infer the 3D skeleton of a tree in the form of a 2D skeleton depth map. Based on the 2D skeleton depth map, a corresponding branching structure (3D skeleton) that inherits the tree shape in the input image, together with leaves, can be generated using a procedural modeling technique. Results: Experimental results show that the proposed method accurately reconstructs diverse tree structures across species. Both quantitative and qualitative evaluations demonstrate improved skeleton completeness, branching accuracy, and visual realism over baseline methods, while requiring no user input. Conclusions: Our approach generates lifelike 3D tree models from a single image with no user input, achieving efficient and reliable reconstruction and recreating complex tree architectures while capturing their visual authenticity.
Funding: supported by the National Natural Science Foundation of China (No. 61801327).
Abstract: Differential pulse-position modulation (DPPM) can achieve a good compromise between power and bandwidth requirements; however, its output sequence suffers from undetectable insertions and deletions. This paper proposes a successive cancellation (SC) decoding scheme for polar codes based on the weighted Levenshtein distance (WLD) to correct insertions/deletions in DPPM systems. In this method, the WLD is used to recursively calculate the transfer probabilities that yield the likelihood ratios, and a low-complexity SC decoding method is built according to the error characteristics to match the DPPM system. Additionally, the proposed SC decoding scheme is extended to list decoding, which further improves error-correction performance. Simulation results show that the proposed scheme can effectively correct insertions/deletions in the DPPM system, enhancing its reliability and performance.
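The weighted Levenshtein distance underlying the scheme is the classic edit-distance dynamic program with per-operation costs. The sketch below uses scalar insertion/deletion/substitution weights; the paper derives its weights from the DPPM error characteristics, so treat these parameters as placeholders:

```python
def weighted_levenshtein(a, b, w_ins=1.0, w_del=1.0, w_sub=1.0):
    """Minimum total cost to turn sequence a into sequence b using
    weighted insertions, deletions, and substitutions."""
    m, n = len(a), len(b)
    # D[i][j] = cost of editing a[:i] into b[:j]
    D = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i * w_del
    for j in range(1, n + 1):
        D[0][j] = j * w_ins
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub_cost = 0.0 if a[i - 1] == b[j - 1] else w_sub
            D[i][j] = min(D[i - 1][j] + w_del,       # delete a[i-1]
                          D[i][j - 1] + w_ins,       # insert b[j-1]
                          D[i - 1][j - 1] + sub_cost)  # (mis)match
    return D[m][n]
```

With unit weights this reduces to the ordinary Levenshtein distance, e.g. `weighted_levenshtein("kitten", "sitting")` is 3.0; lowering `w_del` biases the decoder toward explaining discrepancies as deletions, mirroring how channel statistics would shape the weights.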
Funding: supported by Istanbul Technical University (Project No. 45698) and through the "Young Researchers' Career Development Project - training of doctoral students" of the Croatian Science Foundation.
Abstract: This paper investigates the reliability of marine internal combustion engines using an integrated approach that combines Fault Tree Analysis (FTA) and Bayesian Networks (BN). FTA provides a structured, top-down method for identifying critical failure modes and their root causes, while BN introduces flexibility in probabilistic reasoning, enabling dynamic updates based on new evidence. This dual methodology overcomes the limitations of static FTA models, offering a comprehensive framework for system reliability analysis. Critical failures, including External Leakage (ELU), Failure to Start (FTS), and Overheating (OHE), were identified as key risks. By incorporating redundancy into high-risk components such as pumps and batteries, the likelihood of these failures was significantly reduced: redundant pumps reduced the probability of ELU by 31.88%, while additional batteries decreased the occurrence of FTS by 36.45%. The results underscore the practical benefits of combining FTA and BN for enhancing system reliability, particularly in maritime applications where operational safety and efficiency are critical. This research provides valuable insights for maintenance planning and highlights the importance of redundancy in critical systems, especially as the industry transitions toward more autonomous vessels.
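The redundancy effect can be illustrated with standard fault-tree gate algebra under an independence assumption. The failure probabilities below are made-up placeholders for illustration, not the paper's data (which reports reductions of 31.88% and 36.45% from its BN model):

```python
def and_gate(*probs):
    """Top event occurs only if ALL inputs fail (independent basic events)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(*probs):
    """Top event occurs if ANY input fails (independent basic events)."""
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

# A single pump failing with (hypothetical) probability 0.1 versus two
# redundant pumps in parallel: both must fail for the subsystem to fail.
single_pump = 0.1
redundant_pumps = and_gate(0.1, 0.1)   # parallel redundancy cuts the risk
```

Here redundancy shrinks the subsystem failure probability from 0.1 to 0.01; the BN layer in the paper goes further by updating such probabilities as new evidence about component states arrives.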
Abstract: Artificial intelligence (AI) continues to expand rapidly, particularly with the emergence of generative pre-trained transformers (GPT) based on the transformer architecture, which have revolutionized data processing and enabled significant improvements in various applications. This study investigates security vulnerability detection in source code using a range of large language models (LLMs). Our primary objective is to evaluate the effectiveness of Static Application Security Testing (SAST) complemented by techniques such as prompt personas, structured outputs, and zero-shot prompting. The selected LLMs (CodeLlama 7B, DeepSeek Coder 7B, Gemini 1.5 Flash, Gemini 2.0 Flash, Mistral 7B Instruct, Phi-3 Mini 128K Instruct, Qwen 2.5 Coder, StartCoder 27B) are compared with, and combined with, Find Security Bugs. The evaluation uses a selected dataset containing vulnerabilities, and the results provide insights for different scenarios according to software criticality (business-critical, non-critical, minimum effort, best effort). In detail, the main objectives of this study are to investigate whether large language models outperform traditional static analysis tools, whether combining LLMs with SAST tools leads to an improvement, and whether local machine learning models running on an ordinary computer can produce reliable results. Summarizing the most important conclusions: while results did improve with LLM size for business-critical software, the best results were obtained by SAST analysis alone; this differs in the "Non-Critical," "Best Effort," and "Minimum Effort" scenarios, where the combination of an LLM (Gemini) with SAST obtained better results.
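A persona-plus-structured-output zero-shot prompt of the kind evaluated can be sketched as plain string assembly. The wording, helper name, and JSON schema below are hypothetical illustrations, not the study's actual prompts:

```python
import json

def build_sast_prompt(source_code: str) -> str:
    """Assemble a zero-shot prompt combining a security-auditor persona
    with a structured (JSON) output request. Schema is illustrative."""
    persona = "You are an experienced application security auditor."
    task = (
        "Review the code below for security vulnerabilities. "
        "Reply ONLY with JSON matching this schema: "
        + json.dumps({"findings": [{"line": 0, "cwe": "CWE-000",
                                    "description": "..."}]})
    )
    return f"{persona}\n\n{task}\n\nCode:\n```\n{source_code}\n```"
```

Constraining the reply to a fixed JSON schema is what makes LLM findings mechanically comparable, and combinable, with the structured reports a SAST tool such as Find Security Bugs emits.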