Abstract: The advent of quantum computing poses a significant challenge to traditional cryptographic protocols, particularly those used in Secure Multiparty Computation (MPC), a fundamental cryptographic primitive for privacy-preserving computation. Classical MPC relies on cryptographic techniques such as homomorphic encryption, secret sharing, and oblivious transfer, which may become vulnerable in the post-quantum era due to the computational power of quantum adversaries. This study presents a review of 140 peer-reviewed articles published between 2000 and 2025, drawn from databases including MDPI, IEEE Xplore, Springer, and Elsevier, examining the applications, types, and security issues of quantum computing, together with proposed solutions, across different fields. The review explores the impact of quantum computing on MPC security, assesses emerging quantum-resistant MPC protocols, and examines hybrid classical-quantum approaches aimed at mitigating quantum threats. We analyze the role of Quantum Key Distribution (QKD), post-quantum cryptography (PQC), and quantum homomorphic encryption in securing multiparty computations. Additionally, we discuss the challenges of scalability, computational efficiency, and practical deployment of quantum-secure MPC frameworks in real-world applications such as privacy-preserving AI, secure blockchain transactions, and confidential data analysis. This review provides insights into future research directions and open challenges in ensuring secure, scalable, and quantum-resistant multiparty computation.
Funding: Supported in part by the National Natural Science Foundation of China under Grants 62171235 and 62171237; in part by the Qinglan Project of Jiangsu Province; and in part by the Open Research Foundation of the National Mobile Communications Research Laboratory, Southeast University, under Grant 2023D01.
Abstract: Non-orthogonal multiple access (NOMA) is viewed as a key technique for improving spectrum efficiency and addressing massive connectivity. However, for power-domain NOMA, the required overall transmit power must increase rapidly with the number of users in order to ensure that the signal-to-interference-plus-noise ratio reaches a predefined threshold. In addition, since successive interference cancellation (SIC) is adopted, error propagation becomes more serious as the order of SIC increases. Aiming at minimizing the total transmit power while satisfying each user's service requirement, this paper proposes a novel framework with group-based SIC for the deep integration of power-domain NOMA and multi-antenna technology. Based on the proposed framework, a joint optimization of power control and equalizer design is investigated to minimize transmit power consumption for an uplink multi-antenna NOMA system with error propagation. Based on the relationship between the equalizer and the transmit power coefficients, the original problem is transformed into a transmit power optimization problem, which is further addressed by a parallel iteration algorithm. Simulations show that, in terms of total power consumption, the proposed scheme outperforms conventional OMA and existing cluster-based NOMA schemes.
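As context for the SIC error-propagation issue the abstract raises, here is a minimal two-user power-domain NOMA sketch (a hypothetical BPSK uplink, not the paper's group-based multi-antenna scheme): the weaker user is decodable only after the stronger user's signal is reconstructed and subtracted, so a wrong first decision propagates into the second stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-user uplink: BPSK symbols, user 1 received with more power.
p = np.array([4.0, 1.0])                       # received power per user
s = rng.choice([-1.0, 1.0], size=(2, 1000))    # BPSK symbols for both users
noise = 0.1 * rng.standard_normal(1000)
y = np.sqrt(p[0]) * s[0] + np.sqrt(p[1]) * s[1] + noise

# SIC: decode the strongest user treating the weaker one as noise,
# subtract its reconstructed contribution, then decode the weaker user.
s1_hat = np.sign(y)
y_residual = y - np.sqrt(p[0]) * s1_hat        # error propagation enters here
s2_hat = np.sign(y_residual)

print("user-1 symbol error rate:", np.mean(s1_hat != s[0]))
print("user-2 symbol error rate:", np.mean(s2_hat != s[1]))
```

With a large power gap both stages decode cleanly; shrinking `p[0]` toward `p[1]` makes first-stage errors, and hence propagated second-stage errors, appear.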
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government, Ministry of Science and ICT (MSIT) (RS-2022-00165225).
Abstract: Traditional chaotic maps struggle with narrow chaotic ranges and inefficiencies, limiting their use for lightweight, secure image encryption in resource-constrained Wireless Sensor Networks (WSNs). We propose the SPCM, a novel one-dimensional discontinuous chaotic system integrating polynomial and sine functions, leveraging a piecewise function to achieve a broad chaotic range () and a high Lyapunov exponent (5.04). Validated through nine benchmarks, including standard randomness tests, Diehard tests, and Shannon entropy (3.883), SPCM demonstrates superior randomness and high sensitivity to initial conditions. Applied to image encryption, SPCM achieves 0.152582 s (39% faster than some techniques) and 433.42 KB/s throughput (134% higher than some techniques), setting new benchmarks for chaotic-map-based methods in WSNs. Chaos-based permutation and exclusive-or (XOR) diffusion yield near-zero correlation in encrypted images, ensuring strong resistance to Statistical Attacks (SA) and accurate recovery. SPCM also exhibits a strong avalanche effect (bit difference), making it an efficient, secure solution for WSNs in domains like healthcare and smart cities.
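The SPCM map itself is not specified in this abstract, so the sketch below uses the classic logistic map as a stand-in to illustrate the chaos-based permutation plus XOR-diffusion pipeline described; the map, key value, and structure are illustrative assumptions, not the SPCM design.

```python
import numpy as np

def chaotic_bytes(x0, n, r=3.99):
    """Byte stream from the logistic map, used here as a stand-in for
    SPCM (whose exact piecewise sine-polynomial form is not given above)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)           # chaotic iteration
        out[i] = int(x * 256) & 0xFF    # quantize the state to a byte
    return out

def encrypt(img, key=0.376):
    flat = img.ravel()
    perm = np.argsort(chaotic_bytes(key / 2, flat.size))  # chaos-driven permutation
    ks = chaotic_bytes(key, flat.size)                    # XOR-diffusion keystream
    return (flat[perm] ^ ks).reshape(img.shape)

def decrypt(ct, key=0.376):
    flat = ct.ravel() ^ chaotic_bytes(key, ct.size)       # undo XOR diffusion
    perm = np.argsort(chaotic_bytes(key / 2, ct.size))
    out = np.empty_like(flat)
    out[perm] = flat                                      # invert the permutation
    return out.reshape(ct.shape)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
ct = encrypt(img)
print(np.array_equal(decrypt(ct), img))   # True: exact recovery
```

Both the permutation and the keystream are regenerated from the key alone, so decryption needs no side information, mirroring the permutation-then-diffusion structure the abstract describes.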
Funding: Supported in part by the National Research Foundation of Korea (NRF) (No. RS-2025-00554650) and by the Chung-Ang University research grant in 2024.
Abstract: With the accelerated growth of the Internet of Things (IoT), real-time data processing on edge devices is increasingly important for reducing overhead and enhancing security by keeping sensitive data local. Since these devices often handle personal information under limited resources, cryptographic algorithms must be executed efficiently. Their computational characteristics strongly affect system performance, making it necessary to analyze resource impact and predict usage under diverse configurations. In this paper, we analyze the phase-level resource usage of AES variants, ChaCha20, ECC, and RSA on an edge device and develop a prediction model. We apply these algorithms under varying parallelism levels and execution strategies across the key generation, encryption, and decryption phases. Based on the analysis, we train a unified Random Forest model using execution-context and temporal features, achieving R² values up to 0.994 for power and 0.988 for temperature. Furthermore, the model maintains practical predictive performance even for cryptographic algorithms not included during training, demonstrating its ability to generalize across distinct computational characteristics. Our approach reveals how execution characteristics and resource usage interact, supporting proactive resource planning and efficient deployment of cryptographic workloads on edge devices. Because our approach is grounded in phase-level computational characteristics rather than in any single algorithm, it provides generalizable insights that extend to a broader range of cryptographic algorithms with comparable phase-level execution patterns and to heterogeneous edge architectures.
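A minimal sketch of the kind of unified Random Forest regressor described above, on synthetic data: the feature set and the power target below are invented for illustration, since the paper's actual execution-context and temporal features are not listed in this abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
# Illustrative features standing in for execution context and temporal state:
# algorithm id, phase id, parallelism level, elapsed time within the phase.
X = np.column_stack([
    rng.integers(0, 4, n),      # algorithm (AES / ChaCha20 / ECC / RSA)
    rng.integers(0, 3, n),      # phase (keygen / encrypt / decrypt)
    rng.integers(1, 5, n),      # parallelism level
    rng.uniform(0, 10, n),      # elapsed time in phase (s)
])
# Synthetic power target with a plausible additive structure (made up for the demo).
y = 1.5 * X[:, 2] + 0.3 * X[:, 3] + 0.5 * X[:, 0] + rng.normal(0, 0.2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out R^2:", round(r2_score(y_te, model.predict(X_te)), 3))
```

In the same spirit as the paper's generalization test, one could train on three of the four algorithm ids and evaluate on the held-out one.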
Funding: Supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project No. PNURSP2025R343, Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia; and by the Deanship of Scientific Research at Northern Border University, Arar, Saudi Arabia, through Project No. NBU-FFR-2025-1092-10.
Abstract: As quantum computing continues to advance, traditional cryptographic methods are increasingly challenged, particularly when it comes to securing critical systems like Supervisory Control and Data Acquisition (SCADA) systems. These systems are essential for monitoring and controlling industrial operations, making their security paramount. A key threat arises from Shor's algorithm, a powerful quantum computing tool that can compromise current hash functions, leading to significant concerns about data integrity and confidentiality. To tackle these issues, this article introduces a novel Quantum-Resistant Hash Algorithm (QRHA) known as the Modular Hash Learning Algorithm (MHLA). This algorithm is meticulously crafted to withstand potential quantum attacks by incorporating advanced mathematical and algorithmic techniques, enhancing its overall security framework. Our research examines the effectiveness of MHLA in defending against both traditional and quantum-based threats, with a particular emphasis on its resilience to Shor's algorithm. The findings demonstrate that MHLA significantly enhances the security of SCADA systems in the context of quantum technology. By ensuring that sensitive data remains protected and confidential, MHLA not only fortifies individual systems but also contributes to the broader effort of safeguarding industrial and infrastructure control systems against future quantum threats. Our evaluation demonstrates that MHLA improves security by 38% against quantum-attack simulations compared to traditional hash functions while maintaining a computational efficiency of O(m⋅n⋅k+v+n). The algorithm achieved a 98% success rate in detecting data tampering during integrity testing. These findings underline MHLA's effectiveness in enhancing SCADA system security amidst evolving quantum technologies. This research represents a crucial step toward developing more secure cryptographic systems that can adapt to the rapidly changing technological landscape, ultimately ensuring the reliability and integrity of critical infrastructure in an era where quantum computing poses a growing risk.
Abstract: Cloud environments are essential for modern computing but are increasingly vulnerable to Side-Channel Attacks (SCAs), which exploit indirect information to compromise sensitive data. To address this critical challenge, we propose the SecureCons Framework (SCF), a novel consensus-based cryptographic framework designed to enhance resilience against SCAs in cloud environments. SCF integrates a dual-layer approach combining lightweight cryptographic algorithms with a blockchain-inspired consensus mechanism to secure data exchanges and thwart potential side-channel exploits. The framework includes adaptive anomaly detection models, cryptographic obfuscation techniques, and real-time monitoring to identify and mitigate vulnerabilities proactively. Experimental evaluations demonstrate the framework's robustness, achieving over 95% resilience against advanced SCAs with minimal computational overhead. SCF provides a scalable, secure, and efficient solution, setting a new benchmark for side-channel attack mitigation in cloud ecosystems.
Abstract: Existing lattice-based ring signature schemes suffer from low signing efficiency and excessively large signature and public-key sizes when the ring contains many members. To address this, a traceable ring signature scheme is designed based on zero-knowledge proofs: the E-MLWE (extended module learning with errors) and MSIS (module short integer solution) problems are used to reduce the public-key size, combined with a rejection sampling algorithm and a tracing mechanism. In the signing algorithm, a recursive algorithm compresses the size of the commitment, further reducing the signature size. The scheme is proven in the random oracle model to satisfy linkability, anonymity, and non-frameability. Performance analysis shows that the signature size grows logarithmically with the number of ring members; when the ring is large, the scheme offers clear advantages in public-key storage overhead and signature communication overhead.
Abstract: With the emergence of neural cryptography, a growing number of studies use neural networks to train encryption and decryption algorithms. Adversarial networks can achieve end-to-end, highly secure encryption and decryption, but they suffer from high overhead and low speed. Through a co-optimized design of the compute-unit core, the data storage architecture, and the dataflow behavior, this paper proposes a configurable encryption/decryption hardware design for neural networks. The scheme first applies hardware-friendly optimization to the encryption/decryption model and completes network training and quantization; it then adopts a Winograd+DSP48 convolution acceleration method that reduces the number of required multipliers from 96 to 32; finally, it designs a CPU control and scheduling system architecture that, combined with dynamic control of the accelerator's operating modes, realizes a high-performance configurable encryption/decryption hardware circuit. Experimental results show a maximum operating frequency of 133 MHz, power consumption of 32.4 mW, and throughput of 17.06 GOPs. The encryption/decryption network reaches 100% accuracy while the attacking network's accuracy stays close to 50%, and the hardware circuit is both configurable and highly secure.
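The multiplier saving quoted above comes from Winograd convolution. A minimal F(2,3) instance (the standard algorithm, shown independently of the paper's DSP48 mapping) computes two outputs of a 3-tap filter with four multiplications instead of six:

```python
def winograd_f23(d, g):
    """F(2,3) Winograd: two outputs of a 3-tap FIR with 4 multiplications
    instead of 6 -- the same kind of multiplier saving the accelerator exploits."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # Filter transform (precomputable once per filter, so its cost amortizes).
    G0, G1, G2, G3 = g0, (g0 + g1 + g2) / 2, (g0 - g1 + g2) / 2, g2
    # The four multiplications.
    m1 = (d0 - d2) * G0
    m2 = (d1 + d2) * G1
    m3 = (d2 - d1) * G2
    m4 = (d1 - d3) * G3
    # Output transform: only additions and subtractions.
    return [m1 + m2 + m3, m2 - m3 - m4]

# Matches the direct computation y[i] = sum_k d[i+k] * g[k]:
d, g = [1.0, 2.0, 3.0, 4.0], [0.5, -1.0, 2.0]
direct = [sum(d[i + k] * g[k] for k in range(3)) for i in range(2)]
print(winograd_f23(d, g), direct)
```

Tiling a longer convolution with such blocks is what drives the overall multiplier count down while adding only cheap pre/post additions.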
Funding: Supported by the China National Key Research and Development Program (No. 2023YFC3603104); the National Natural Science Foundation of China (Nos. 82472243 and 82272180); the Fundamental Research Funds for the Central Universities (No. 226-2025-00024); the Huadong Medicine Joint Funds of the Zhejiang Provincial Natural Science Foundation of China (No. LHDMD24H150001); the Key Research & Development Project of Zhejiang Province (No. 2024C03240); a collaborative scientific project co-established by the Science and Technology Department of the National Administration of Traditional Chinese Medicine and the Zhejiang Provincial Administration of Traditional Chinese Medicine (No. GZY-ZJ-KJ-24082); the General Health Science and Technology Program of Zhejiang Province (No. 2024KY1099); the Project of Zhejiang University Longquan Innovation Center (No. ZJDXLQCXZCJBGS2024016); and the Wu Jieping Medical Foundation Special Research Grant (No. 320.6750.2024-23-07).
Abstract: Objective: Sepsis exhibits remarkable heterogeneity in disease progression trajectories, and accurate identification of distinct trajectory-based phenotypes is critical for implementing personalized therapeutic strategies and prognostic assessment. However, trajectory clustering analysis of time-series clinical data poses substantial methodological challenges for researchers. This study provides a comprehensive tutorial framework demonstrating six trajectory modeling approaches integrated with proteomic analysis to guide researchers in identifying sepsis subtypes after laparoscopic surgery. Methods: This study employs simulated longitudinal data from 300 septic patients after laparoscopic surgery to demonstrate six trajectory modeling methods (group-based trajectory modeling, latent growth mixture modeling, latent transition analysis, time-varying effect modeling, K-means for longitudinal data, and agglomerative hierarchical clustering) for identifying associations between predefined Sequential Organ Failure Assessment trajectories and 25 proteomic biomarkers. Clustering performance was evaluated via multiple metrics, and a biomarker discovery pipeline integrating principal component analysis, random forests, feature selection, and receiver operating characteristic analysis was developed. Results: The six methods demonstrated varying performance in identifying trajectory structures, with each approach exhibiting distinct analytical characteristics. The performance metrics revealed differences across methods, which may inform context-specific method selection and interpretation strategies. Conclusion: This study illustrates practical implementations of trajectory modeling approaches under controlled conditions, facilitating informed method selection for clinical researchers. The inclusion of complete R code and integrated proteomics workflows offers a reproducible analytical framework connecting temporal pattern recognition to biomarker discovery. Beyond sepsis, this pipeline-oriented approach may be adapted to diverse clinical scenarios requiring longitudinal disease characterization and precision medicine applications. The comparative analysis reveals that each method has distinct strengths, providing a practical guide for clinical researchers in selecting appropriate methods based on their specific study goals and data characteristics.
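As a minimal illustration of one of the six methods, K-means for longitudinal data, here is a Python sketch on simulated SOFA-like trajectories (the tutorial itself provides R code; the three progression patterns and the noise level below are invented for the demo):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
t = np.arange(7)
# Three made-up SOFA-like 7-day patterns: improving, stable, worsening.
patterns = np.array([8 - 0.8 * t, 6 + 0.0 * t, 5 + 0.9 * t])
true_group = rng.integers(0, 3, 300)                       # hidden phenotype labels
X = patterns[true_group] + rng.normal(0, 0.5, (300, 7))    # simulated trajectories

# K-means on the raw trajectory vectors: the simplest of the six approaches
# (kml-style longitudinal k-means); each centroid is itself an average trajectory.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
ari = adjusted_rand_score(true_group, km.labels_)
print("adjusted Rand index vs. simulated groups:", round(ari, 3))
```

The same external-validity check (adjusted Rand index against the simulated labels) is one way to compare the six methods head to head on simulated data.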
Funding: Supported by the Natural Science Foundation of Beijing, Nos. 7244428 (to WZ) and 7222215 (to JH); the Peking University Medicine Sailing Program for Young Scholars' Scientific and Technological Innovation, No. BMU2023YFJHPY034 (to WZ); the National Natural Science Foundation of China, Nos. 81873784 and 82071426 (to DF) and 81974197 (to JH); the Clinical Cohort Construction Program of Peking University Third Hospital, No. BYSYDL2019002 (to DF); the Beijing Physician-Scientist Training Program, No. BJPSTP-2024-03 (to JH); the China Postdoctoral Science Foundation, Nos. 2022TQ0014 and 2022M720284 (to LX); and the E-Town Cooperation & Development Foundation, No. YCXJ-JZ-2023-017 (to LX).
Abstract: The role of genetics in the development of amyotrophic lateral sclerosis is increasingly recognized. However, there has not yet been a comprehensive analysis of the clinical characteristics and genetics of familial amyotrophic lateral sclerosis in an Asian population. This study aimed to provide an in-depth analysis of the clinical features and genetic spectrum of familial amyotrophic lateral sclerosis over 15 years in a clinic-based cohort of patients from the Chinese mainland. A total of 302 amyotrophic lateral sclerosis families from 28 provinces were enrolled from January 2008 to September 2023. A group-based trajectory model for disease progression, based on amyotrophic lateral sclerosis Functional Rating Scale-Revised (ALSFRS-R) scores, was validated using bootstrap internal validation in patients with familial amyotrophic lateral sclerosis, as well as in patients with sporadic amyotrophic lateral sclerosis (matched at a 1:4 ratio, with replacement). DNA samples from 244 index patients were screened for variants in the pathogenic genes SOD1, FUS, TDP43, and C9ORF72, of which 146 were also subjected to genome-wide next-generation sequencing. Gene-level burden analysis was used to evaluate the distribution of rare variants in the cohort. We found that rapid disease progression was associated with an older age at onset, a shorter diagnostic delay, a lower body mass index, bulbar onset, and ≥1 affected first-degree relative. Certain attributes, such as age at onset and time from onset to diagnosis, had comparable impacts on the clinical progression trajectories of both familial and sporadic amyotrophic lateral sclerosis. Harboring pathogenic/likely pathogenic variants in amyotrophic lateral sclerosis-causative genes reduced the age at onset of familial amyotrophic lateral sclerosis. Among the patients with familial amyotrophic lateral sclerosis, 17.8% possessed ≥2 pathogenic/likely pathogenic variants. Sequencing kernel association test analysis showed that the SOD1 rare-variant burden (P = 1.3 × 10⁻¹⁵) was associated with a significant risk of familial amyotrophic lateral sclerosis. Our findings confirm the clinical features and genetic spectrum of familial amyotrophic lateral sclerosis over 15 years in a clinical cohort from China, contributing to a deeper understanding of genotype-phenotype relationships in this disease. This comprehensive evaluation of specific clinical characteristics, clinical prognosis, and genetic variants of amyotrophic lateral sclerosis, based on detailed clinical and genetic information, may lead to the development of genotype-specific treatment approaches.
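The gene-level burden analysis described above collapses rare variants within a gene into per-sample carrier status and tests for enrichment in cases. The study itself uses a sequencing kernel association test; as a simpler, purely illustrative stand-in for the burden idea, a one-sided Fisher's exact test on carrier counts can be sketched (all counts below are invented for illustration, not taken from the study):

```python
from math import comb

def burden_fisher(case_carriers, n_cases, ctrl_carriers, n_ctrls):
    """One-sided Fisher's exact test: probability of observing at least
    `case_carriers` carriers among cases under the hypergeometric null."""
    row1 = n_cases                         # number of cases
    col1 = case_carriers + ctrl_carriers   # total carriers in the cohort
    n = n_cases + n_ctrls
    denom = comb(n, col1)
    p = 0.0
    for x in range(case_carriers, min(row1, col1) + 1):
        p += comb(row1, x) * comb(n - row1, col1 - x) / denom
    return p

# invented counts: 30/244 index cases carry a rare variant vs 5/1000 controls
p = burden_fisher(30, 244, 5, 1000)
```

A strong case-side enrichment drives the one-sided P-value toward zero, which is the qualitative pattern the SOD1 burden result reflects.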
Abstract: The advent of 5G technology has significantly enhanced the transmission of images over networks, expanding data accessibility and exposure across various applications in digital technology and social media. Consequently, the protection of sensitive data has become increasingly critical. Regardless of the complexity of the encryption algorithm used, a robust and highly secure encryption key is essential, with randomness and key space being crucial factors. This paper proposes a new Robust Deoxyribonucleic Acid (RDNA) nucleotide-based encryption method. The RDNA method leverages the unique properties of DNA nucleotides, including their inherent randomness and extensive key space, to generate a highly secure encryption key. By employing transposition and substitution operations, it ensures significant diffusion and confusion in the encrypted images. Additionally, it utilises a pseudorandom generation technique based on the random sequence of nucleotides in the DNA secret key. The performance of the RDNA method is evaluated through various statistical and visual tests and compared against established encryption methods such as 3DES, AES, and a DNA-based method. Experimental results demonstrate that the RDNA method outperforms its rivals in the literature, achieving superior performance in terms of information entropy, avalanche effect, encryption execution time, and correlation reduction, while maintaining competitive values for NMAE, PSNR, NPCR, and UACI. The high degree of randomness and sensitivity to key changes inherent in the RDNA method offers enhanced security, making it highly resistant to brute-force and differential attacks.
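To make the general DNA-encoding idea concrete, the toy sketch below encodes bytes as 2-bit nucleotides, applies XOR substitution with a key-derived stream, and then a keyed transposition. This is a hypothetical illustration of the substitution-plus-transposition pattern, not the paper's RDNA algorithm, and `random.Random` merely stands in for the nucleotide-based pseudorandom generator (it is not cryptographically secure):

```python
import random

NUC = "ACGT"  # 2-bit code: A=00, C=01, G=10, T=11

def encrypt(data: bytes, key: str) -> str:
    rng = random.Random(key)  # stand-in keystream source (NOT secure)
    # substitution: XOR every byte with a key-derived stream byte
    sub = bytes(b ^ rng.randrange(256) for b in data)
    # encode each byte as four nucleotides
    nucs = [NUC[(b >> s) & 3] for b in sub for s in (6, 4, 2, 0)]
    # transposition: keyed permutation of nucleotide positions
    perm = list(range(len(nucs)))
    rng.shuffle(perm)
    return "".join(nucs[i] for i in perm)

def decrypt(cipher: str, key: str) -> bytes:
    rng = random.Random(key)  # replay the same keystream and permutation
    stream = [rng.randrange(256) for _ in range(len(cipher) // 4)]
    perm = list(range(len(cipher)))
    rng.shuffle(perm)
    nucs = [""] * len(cipher)
    for j, i in enumerate(perm):  # undo the transposition
        nucs[i] = cipher[j]
    # decode nucleotide quartets back to bytes, then undo the XOR
    sub = [
        (NUC.index(nucs[k]) << 6) | (NUC.index(nucs[k + 1]) << 4)
        | (NUC.index(nucs[k + 2]) << 2) | NUC.index(nucs[k + 3])
        for k in range(0, len(nucs), 4)
    ]
    return bytes(b ^ s for b, s in zip(sub, stream))

cipher = encrypt(b"sensor image block", "wsn-key")
```

The permutation breaks positional correlation (the goal of confusion/diffusion), while the XOR stream makes the output depend on the full key.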
Abstract: The advent of quantum computing poses a significant challenge to traditional cryptographic protocols, particularly those used in Secure Multiparty Computation (MPC), a fundamental cryptographic primitive for privacy-preserving computation. Classical MPC relies on cryptographic techniques such as homomorphic encryption, secret sharing, and oblivious transfer, which may become vulnerable in the post-quantum era due to the computational power of quantum adversaries. This study presents a review of 140 peer-reviewed articles published between 2000 and 2025, drawn from databases including MDPI, IEEE Xplore, Springer, and Elsevier, examining the applications, types, and security issues of quantum computing, together with proposed solutions, across different fields. The review explores the impact of quantum computing on MPC security, assesses emerging quantum-resistant MPC protocols, and examines hybrid classical-quantum approaches aimed at mitigating quantum threats. We analyze the role of Quantum Key Distribution (QKD), post-quantum cryptography (PQC), and quantum homomorphic encryption in securing multiparty computations. Additionally, we discuss the challenges of scalability, computational efficiency, and practical deployment of quantum-secure MPC frameworks in real-world applications such as privacy-preserving AI, secure blockchain transactions, and confidential data analysis. This review provides insights into future research directions and open challenges in ensuring secure, scalable, and quantum-resistant multiparty computation.
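Among the classical MPC building blocks the review surveys, secret sharing is the easiest to illustrate. The sketch below shows additive secret sharing over a prime field, a minimal, hypothetical example of the primitive rather than any protocol from the review; the modulus choice is arbitrary:

```python
import secrets

P = 2**127 - 1  # a Mersenne prime modulus; an arbitrary illustrative choice

def share(secret: int, n: int):
    """Split `secret` (0 <= secret < P) into n additive shares mod P."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)  # last share fixes the sum
    return parts

def reconstruct(parts):
    """Recover the secret: only the sum of ALL shares reveals it."""
    return sum(parts) % P

# additive homomorphism: parties can add share-wise without revealing inputs
a, b = share(10, 3), share(32, 3)
summed = [(x + y) % P for x, y in zip(a, b)]
```

Any n-1 shares are uniformly random and reveal nothing about the secret; it is this information-theoretic property, rather than a hardness assumption, that makes share-based MPC a candidate for quantum-resistant designs.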