This study investigates the thermal and statistical properties of the Dirac oscillator within the framework of two prominent formulations of doubly special relativity (DSR): the Amelino-Camelia and Magueijo-Smolin models. DSR extends Einstein's special relativity by introducing an additional invariant scale—the Planck energy—leading to modified energy-momentum relations that encode potential quantum-gravitational effects at ultra-high energies. In this context, we derive the modified Dirac equations for both DSR scenarios and analytically determine the corresponding energy spectra. These spectra are subsequently used to compute the partition function and key thermodynamic quantities, including the specific heat, by employing the Euler-Maclaurin formula to facilitate an efficient approximation of the partition function. The analysis is restricted to the positive-energy sector, enabled by the exact Foldy-Wouthuysen transformation, which effectively decouples positive- and negative-energy states. The findings reveal that Planck-scale deformation parameters induce significant modifications in the energy spectrum and thermodynamic behavior of the Dirac oscillator in each DSR framework, thereby offering valuable insights into possible observable imprints of quantum-gravitational phenomena in relativistic quantum systems.
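The Euler-Maclaurin step described above can be sketched as follows. The level spectrum E_n used here is a hypothetical toy spectrum, not the deformed DSR spectra derived in the paper; it serves only to show how the summation formula (integral plus boundary corrections) replaces the direct sum over levels.

```python
import math

def energy(n, r=0.1):
    # Hypothetical toy spectrum E_n = sqrt(1 + 2*r*n) in units of m*c^2,
    # standing in for the (deformed) Dirac-oscillator spectra of the paper.
    return math.sqrt(1.0 + 2.0 * r * n)

def boltzmann(x, beta):
    # Boltzmann weight, treated as a smooth function of a continuous level index.
    return math.exp(-beta * energy(x))

def partition_direct(beta, n_max=20000):
    # Brute-force sum over positive-energy levels (negligible tail beyond n_max).
    return sum(boltzmann(n, beta) for n in range(n_max))

def partition_euler_maclaurin(beta, n_max=20000, step=0.25, h=1e-5):
    # Euler-Maclaurin: sum_{n>=0} f(n) ~ integral_0^inf f(x) dx
    #                                    + f(0)/2 - f'(0)/12 + ...
    # The integral is evaluated with the trapezoidal rule on [0, n_max].
    m = int(n_max / step)
    integral = 0.0
    for i in range(m):
        a, b = i * step, (i + 1) * step
        integral += 0.5 * step * (boltzmann(a, beta) + boltzmann(b, beta))
    fprime0 = (boltzmann(h, beta) - boltzmann(0.0, beta)) / h
    return integral + 0.5 * boltzmann(0.0, beta) - fprime0 / 12.0
```

Once the partition function is smooth in beta, the specific heat follows from numerical second derivatives of log Z in the usual way.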
As a prototypical high-energy-density reactive material system, metastable intermolecular composites (MICs) have attracted considerable interest owing to their customizable component configurations and interfacial architectures. Nevertheless, their energy release characteristics are fundamentally constrained by the formation of condensed-phase products with elevated boiling points, thereby diminishing their efficacy in applications requiring rapid pressure generation or shock wave propagation. Herein, we demonstrate a molecular-level fluorination approach that enables oxygen substitution by fluorine within bismuth oxide crystalline frameworks, yielding ternary BixOyFz crystals with atomically precise F/O stoichiometric control through systematic solvent polarity engineering. This energetic system, designed through a multilevel regulation strategy, realizes stepwise Al–F and Al–O redox reactions during energy release, with the partitioning between these redox pathways being precisely allocable through hierarchical regulation. Furthermore, the pre-ignition reaction (PIR) between BixOyFz and Al2O3 (the inert passivation shell of Al) weakens the passivation layer, lowering the ignition threshold. The in situ generation of low-boiling-point AlF3 promotes rapid gas expansion, leading to significantly enhanced pressurization rates and deflagration wave velocities under confinement compared to conventional strategies. To evaluate energy output capabilities and validate potential safety-protection applications, the system was used to achieve instantaneous destruction of SD chips, enabling secure data erasure. This work establishes crystalline lattice fluorination as a generalized materials design strategy to transcend intrinsic limitations of MIC systems in component selection and reaction thermodynamics, providing new paradigms for adaptive energetic architectures and transient microelectromechanical applications.
Based on monthly runoff and climate datasets spanning 2000–2024, this study employed Theil–Sen slope estimation, the Mann–Kendall (M–K) trend test, and Pearson and Spearman rank correlation analyses to systematically examine the spatiotemporal patterns of runoff and its climatic driving mechanisms across Tajikistan, providing a scientific basis for sustainable water resource utilization and management in the study area. Results indicated that during 2000–2024, annual runoff in Tajikistan exhibited a statistically non-significant long-term trend (P=0.76), while displaying pronounced seasonal variability and strong spatial heterogeneity. Average spring and summer runoff primarily exhibited slight declining tendencies, while average winter runoff showed pronounced reductions in localized regions, such as the Syr Darya Basin, the Vakhsh River Basin, and the lower reaches of the Zeravshan River Basin. Precipitation emerged as the dominant positive driver of runoff, exhibiting moderate to strong positive correlations across over 78.00% of the country, whereas potential evapotranspiration consistently functioned as a negative driver. Rising temperatures exerted a dual, competing effect on runoff: in high-elevation, glacier-covered regions, rising temperatures temporarily increased runoff by accelerating glacier melt; at the national scale, however, the negative impact of rising temperature on runoff, exerted by enhancing evapotranspiration, played a slightly dominant role. Collectively, these results indicate that the present stability of runoff in Tajikistan depends strongly on the short-term compensatory effects of glacier melt, and that the risk of future runoff decline is likely to intensify as glacier reserves continue to diminish. This study provides critical scientific evidence to inform sustainable water resource management in Tajikistan and underscores the need for glacier conservation and integrated water resource management strategies.
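The two trend diagnostics named above, the Theil–Sen slope and the Mann–Kendall test, can be sketched in a few lines. This is a minimal version without the tie and autocorrelation corrections a production hydrology analysis would use, applied to a made-up series rather than the study's runoff data.

```python
import itertools
import math
import statistics

def theil_sen_slope(y):
    # Median of all pairwise slopes (y_j - y_i)/(j - i): a robust trend
    # estimate that is insensitive to outliers in the series.
    slopes = [(y[j] - y[i]) / (j - i)
              for i, j in itertools.combinations(range(len(y)), 2)]
    return statistics.median(slopes)

def mann_kendall(y):
    # Mann-Kendall trend test: returns (S statistic, two-sided p-value)
    # using the normal approximation, without tie correction.
    n = len(y)
    s = sum((y[j] > y[i]) - (y[j] < y[i])
            for i, j in itertools.combinations(range(n), 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal survival function.
    p = math.erfc(abs(z) / math.sqrt(2.0))
    return s, p
```

A perfectly monotone series yields the maximum S and a p-value far below any usual significance threshold, while a trendless series yields S near zero.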
The cloud-fog computing paradigm has emerged as a novel hybrid computing model that integrates computational resources at both fog nodes and cloud servers to address the challenges posed by dynamic and heterogeneous computing networks. Finding an optimal computational resource for task offloading, and then executing the task efficiently, is critical to achieving a trade-off between energy consumption and transmission delay. In such a network, processing a task at fog nodes reduces transmission delay but increases energy consumption, while routing tasks to the cloud server saves energy at the cost of higher communication delay. Moreover, the order in which offloaded tasks are executed affects the system's efficiency: executing lower-priority tasks before higher-priority jobs can disturb the reliability and stability of the system. Therefore, an efficient strategy for optimal computation offloading and task scheduling is required for operational efficacy. In this paper, we introduce a multi-objective and enhanced version of the Cheetah Optimizer (CO), namely MoECO, to jointly optimize computation offloading and task scheduling in cloud-fog networks so as to minimize two competing objectives, i.e., energy consumption and communication delay. MoECO first assigns tasks to the optimal computational nodes, and the allocated tasks are then scheduled for processing based on task priority. The mathematical modelling of CO needs improvement in computation time and convergence speed; MoECO therefore increases the search capability of agents by controlling the search strategy based on a leader's location. The adaptive step-length operator is adjusted to diversify the solutions and thus improve the exploration phase, i.e., the global search strategy. Consequently, this prevents the algorithm from getting trapped in local optima. Moreover, the interaction factor during the exploitation phase is also adjusted based on the location of the prey instead of the adjacent cheetah, which increases the exploitation capability of agents, i.e., the local search capability. Furthermore, MoECO employs a multi-objective Pareto-optimal front to minimize the designated objectives simultaneously. Comprehensive simulations in MATLAB demonstrate that the proposed algorithm obtains multiple solutions via a Pareto-optimal front and achieves an efficient trade-off between optimization objectives compared to baseline methods.
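The non-dominated filtering behind any Pareto-optimal front, including the one MoECO maintains for energy versus delay, can be sketched directly. The (energy, delay) candidate pairs below are hypothetical, not outputs of MoECO.

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (both objectives are minimized).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep only the non-dominated (energy, delay) pairs.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidate schedules as (energy, delay) pairs.
candidates = [(5.0, 1.0), (3.0, 2.0), (4.0, 1.5), (3.0, 3.0), (6.0, 0.8)]
front = pareto_front(candidates)
```

Here (3.0, 3.0) is dominated by (3.0, 2.0), which is equal in energy but strictly better in delay, so it is dropped; the remaining four points form the trade-off front from which a decision maker picks an operating point.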
The convergence of Software Defined Networking (SDN) and the Internet of Vehicles (IoV) enables a flexible, programmable, and globally visible network control architecture across Road Side Units (RSUs), cloud servers, and automobiles. While this integration enhances scalability and safety, it also invites sophisticated cyberthreats, particularly Distributed Denial of Service (DDoS) attacks. Traditional rule-based anomaly detection methods often struggle to detect modern low-and-slow DDoS patterns, leading to higher false-positive rates. To this end, this study proposes an explainable hybrid framework to detect DDoS attacks in SDN-enabled IoV (SDN-IoV). The hybrid framework utilizes a Residual Network (ResNet) to capture spatial correlations and a Bidirectional Long Short-Term Memory (BiLSTM) network to capture both forward and backward temporal dependencies in high-dimensional input patterns. To ensure transparency and trustworthiness, the model integrates an Explainable AI (XAI) technique, namely SHapley Additive exPlanations (SHAP). SHAP highlights the contribution of each feature to the decision-making process, helping security analysts understand the rationale behind each attack classification decision. The SDN-IoV environment is created in Mininet-WiFi and SUMO, and the hybrid model is trained on the CICDDoS2019 security dataset. The simulation results reveal the efficacy of the proposed model in terms of standard performance metrics compared to similar baseline methods.
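SHAP's per-feature attributions are Shapley values. For a toy linear stand-in model they can be computed exactly by brute-force coalition enumeration and checked against the closed form w_i * (x_i - baseline_i); the model, weights, and inputs below are hypothetical, not the paper's ResNet-BiLSTM.

```python
import itertools
from math import factorial

def shapley_values(f, x, baseline):
    # Exact Shapley values by enumerating all coalitions of features.
    # Features outside a coalition are replaced by their baseline value.
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for coalition in itertools.combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                with_i = [x[j] if (j in coalition or j == i) else baseline[j]
                          for j in range(n)]
                without_i = [x[j] if j in coalition else baseline[j]
                             for j in range(n)]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

# Hypothetical linear "model" standing in for a trained classifier score.
def model(v):
    return 2.0 * v[0] + 3.0 * v[1] - 1.0 * v[2]

phi = shapley_values(model, [1.0, 2.0, 3.0], [0.0, 0.0, 0.0])
```

The attributions sum to f(x) - f(baseline), the efficiency property that makes SHAP outputs directly interpretable to an analyst.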
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, or mined. The development of cloud computing technology provides a rare opportunity to build a logging big-data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when faced with new evaluation objects, and research on integrating distributed storage, processing, and learning functions into a logging big-data private cloud has not yet been carried out. This study therefore establishes a distributed logging big-data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model that integrates physical simulation and data models in a large-scale function space, thus addressing the geo-engineering evaluation problem of geothermal fields. Following the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, sharing of data, software, and results, accuracy, speed, and complexity.
In order to determine the bulk density of refractory raw materials, the so-called water method, following the Archimedes principle, is normally used: the effect of water displacement on the mass of the sample is used to determine the bulk volume of the sample grains. During this test procedure, the surface of the water-infiltrated sample grains must be dried with a wet towel. Experience shows that this drying step is the main root cause of variation in the reproducibility of results, and even in the repeatability of tests. A new spin dryer (centrifuge) was developed and introduced to automate this surface drying step, and is now included as a new method in ISO 8840:2021. The paper discusses the improvement of measurement with the new approach and industrial experiences from two big industrial players in the raw material business.
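The water-method computation itself is simple arithmetic. A sketch under the usual assumptions (constant water density, a saturated-surface-dry weighing after the drying step discussed above), with made-up masses rather than measured ones:

```python
def bulk_density(m_dry, m_immersed, m_ssd, rho_water=0.9982):
    # Water (Archimedes) method for grain bulk density:
    #   m_dry       dry mass of the grains (g)
    #   m_immersed  apparent mass of the water-infiltrated grains under water (g)
    #   m_ssd       mass after surface drying (saturated-surface-dry state) (g)
    #   rho_water   water density at the test temperature (g/cm^3)
    # The bulk volume is the water displaced by the whole grains,
    # open pores included.
    bulk_volume = (m_ssd - m_immersed) / rho_water  # cm^3
    return m_dry / bulk_volume                      # g/cm^3

# Hypothetical sample: 50.0 g dry, 32.0 g immersed, 51.0 g surface-dried.
rho = bulk_density(50.0, 32.0, 51.0)
```

The sensitivity of the result to m_ssd is exactly why the manual towel-drying step dominates the scatter: a few tenths of a gram of residual surface water shift the computed bulk volume directly.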
This study focuses on the real-world context in which artificial intelligence (AI) deeply permeates corporate operations, systematically exploring the core value, challenges, and optimization paths of corporate culture management during intelligent transformation. By analyzing the semantic deviations that arise in digitalizing cultural elements and the potential conflicts between algorithm-based decision-making and humanistic values in human-machine collaboration scenarios, and by combining theories of organizational behavior with the characteristics of AI technology, a series of strategies for enhancing effectiveness is proposed, including dynamic cultural modeling and the embedding of cognitive-collaborative rules. With detailed empirical data and case studies, this research provides a theoretical basis and practical guidance for enterprises to achieve a dynamic balance between technological rationality and humanistic care.
Myocardial infarction (MI) is one of the leading causes of death globally among cardiovascular diseases, necessitating modern and accurate diagnostics of cardiac patients' conditions. Among the available functional diagnostic methods, electrocardiography (ECG) is particularly well known for its ability to detect MI. However, confirming its accuracy—particularly in identifying the localization of myocardial damage—often presents challenges in practice. This study therefore proposes a new approach based on machine learning models for the analysis of 12-lead ECG data to accurately identify the localization of MI. In particular, the learning vector quantization (LVQ) algorithm was applied, considering the contribution of each ECG lead in the 12-channel system, and achieved an accuracy of 87% in localizing damaged myocardium. The developed model was tested on verified data from the PTB database, comprising 445 ECG recordings from both healthy individuals and MI-diagnosed patients. The results demonstrated that the 12-lead ECG system allows for a comprehensive understanding of cardiac activity in myocardial infarction patients, serving as an essential tool for diagnosing myocardial conditions and localizing damage. A comprehensive comparison against CNN, SVM, and logistic regression models was performed to evaluate the proposed LVQ model. The results demonstrate that the LVQ model achieves competitive performance in diagnostic tasks while maintaining computational efficiency, making it suitable for resource-constrained environments. This study also applies a carefully designed data pre-processing flow, including class balancing and noise removal, which improves the reliability and reproducibility of the results. These aspects highlight the potential application of the LVQ model in cardiac diagnostics, opening up prospects for its use alongside more complex neural network architectures.
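The LVQ1 update rule underlying the study's classifier can be sketched on a toy two-class problem. The 2-D features and cluster positions below are hypothetical stand-ins for real ECG-lead features, and this minimal version omits learning-rate decay.

```python
def lvq1_train(samples, labels, prototypes, proto_labels, lr=0.1, epochs=50):
    # LVQ1 training rule: pull the nearest prototype toward a sample of the
    # same class, push it away from a sample of a different class.
    protos = [list(p) for p in prototypes]
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            k = min(range(len(protos)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(protos[i], x)))
            sign = 1.0 if proto_labels[k] == y else -1.0
            protos[k] = [p + sign * lr * (xi - p) for p, xi in zip(protos[k], x)]
    return protos

def lvq_predict(x, protos, proto_labels):
    # Classify by the label of the nearest prototype.
    k = min(range(len(protos)),
            key=lambda i: sum((a - b) ** 2 for a, b in zip(protos[i], x)))
    return proto_labels[k]

# Two hypothetical 2-D feature clusters standing in for ECG-lead features.
samples = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (4.0, 4.0), (4.5, 4.0), (4.0, 4.5)]
labels = [0, 0, 0, 1, 1, 1]
protos = lvq1_train(samples, labels, [(1.0, 1.0), (3.0, 3.0)], [0, 1])
```

Because inference is a nearest-prototype lookup over a handful of codebook vectors, the computational-efficiency claim in the abstract follows directly from the model structure.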
This article aims to enhance seismic hazard assessment methods for Kazakhstan's seismotectonic conditions. It combines probabilistic seismic hazard analysis (PSHA), ground motion simulation, site-specific geological and geotechnical data analysis, and seismic scenario analysis to develop Probabilistic General Seismic Zoning (GSZ) maps for Kazakhstan and Probabilistic Seismic Microzoning maps for Almaty. These maps align with Eurocode 8 principles, incorporating seismic intensity and engineering parameters such as peak ground acceleration (PGA). The new procedure, applied in national projects, has resulted in GSZ maps for the country, seismic microzoning maps for Almaty, and detailed seismic zoning maps for East Kazakhstan. These maps, part of a regulatory document, guide earthquake-resistant design and construction. They offer a comprehensive assessment of seismic hazards, integrating traditional Medvedev-Sponheuer-Karnik (MSK-64) intensity scale points with quantitative parameters such as PGA. This approach promises to advance methods for quantifying seismic hazards in specific regions.
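At its probabilistic core, PSHA reduces to Poisson exceedance arithmetic relating annual rates, design lifetimes, and exceedance probabilities. The sketch below shows the standard relations (e.g., the common 10%-in-50-years design level), independent of the specific Kazakh hazard model.

```python
import math

def poisson_exceedance(annual_rate, years):
    # Probability of at least one exceedance of a ground-motion level in
    # `years`, assuming a Poisson occurrence process with the given mean
    # annual exceedance rate (the standard PSHA assumption).
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_return_period(p, years):
    # Annual exceedance rate whose Poisson probability over `years` is p.
    # Example: p = 0.10, years = 50 gives the familiar ~475-year return period.
    return -math.log(1.0 - p) / years
```

The inverse of the annual rate is the return period quoted on hazard maps; Eurocode 8's reference design level corresponds to the 10%-in-50-years case.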
Robotic manipulators increasingly operate in complex three-dimensional workspaces where accuracy and strict limits on position, velocity, and acceleration must be satisfied. Conventional geometric planners emphasize path smoothness but often ignore dynamic feasibility, motivating control-aware trajectory generation. This study presents a novel model predictive control (MPC) framework for three-dimensional trajectory planning of robotic manipulators that integrates second-order dynamic modeling and multi-objective parameter optimization. Unlike conventional interpolation techniques such as cubic splines, B-splines, and linear interpolation, which neglect physical constraints and system dynamics, the proposed method generates dynamically feasible trajectories by directly optimizing over acceleration inputs while minimizing both tracking error and control effort. A key innovation lies in the use of Pareto front analysis for tuning the prediction horizon and sampling time, enabling a systematic balance between accuracy and motion smoothness. Comparative evaluation using simulated experiments demonstrates that the proposed MPC approach achieves a minimum mean absolute error (MAE) of 0.170 and reduces the maximum acceleration to 0.0217, compared to 0.0385 for classical linear methods. The maximum deviation error was also reduced by approximately 27.4% relative to MPC configurations without tuned parameters. All experiments were conducted in a simulation environment, with computational times per control cycle consistently remaining below 20 milliseconds, indicating practical feasibility for real-time applications. This work advances the state of the art in MPC-based trajectory planning by offering a scalable and interpretable control architecture that meets physical constraints while optimizing motion efficiency, making it suitable for deployment in safety-critical robotic applications.
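The idea of optimizing directly over acceleration inputs for a second-order model can be sketched for one axis as an unconstrained finite-horizon least-squares problem. This is a deliberate simplification of the paper's constrained MPC: no state or input limits, and the horizon, timestep, and effort weight below are hypothetical.

```python
import numpy as np

def plan_accelerations(p0, v0, p_ref, dt=0.1, lam=1e-3):
    # Double-integrator model: p_{k+1} = p_k + v_k*dt + 0.5*a_k*dt^2,
    #                          v_{k+1} = v_k + a_k*dt.
    # Positions over the horizon are affine in the acceleration sequence a:
    #   p_k = p0 + k*dt*v0 + sum_{j<k} (k - j - 0.5) * dt^2 * a_j
    N = len(p_ref)
    Phi = np.zeros((N, N))
    for k in range(1, N + 1):
        for j in range(k):
            Phi[k - 1, j] = (k - j - 0.5) * dt * dt
    free = p0 + dt * v0 * np.arange(1, N + 1)   # response with a = 0
    # Stack tracking-error rows and lambda-weighted control-effort rows,
    # then solve min ||Phi a - (p_ref - free)||^2 + lam * ||a||^2.
    A = np.vstack([Phi, np.sqrt(lam) * np.eye(N)])
    b = np.concatenate([np.asarray(p_ref, dtype=float) - free, np.zeros(N)])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a
```

In a receding-horizon loop only the first planned acceleration would be applied before re-solving; adding box constraints on a and v turns this into the constrained quadratic program an MPC solver handles.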
Chronic suppurative otitis media (CSOM) is a prevalent condition in otolaryngology with significant medical and social implications, including hearing loss and severe intracranial complications. This article discusses the challenges in diagnosing cholesteatoma, a common complication of CSOM, particularly when using computed tomography (CT) and magnetic resonance imaging (MRI). We present three clinical cases in which MRI, particularly in the non-EPI diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) modes, effectively identified the presence and extent of cholesteatoma that CT could not reliably distinguish due to overlapping features with other soft tissue formations. The high sensitivity of MRI highlights its value in both primary diagnosis and assessment of recurrence. Our findings advocate for the incorporation of MRI into the diagnostic protocols for CSOM in the Republic of Kazakhstan, emphasizing the need for reliable epidemiological data to inform future research and prevent potential intracranial complications.
This study explores the cultural and value foundations of the educational goals of higher education in Kazakhstan and China. Drawing on the historical development and cultural traditions of the two countries, it compares the similarities and differences in their educational goals through qualitative literature content analysis. Both countries have taken "modernization and internationalization" as a core direction of higher education development, but China's educational philosophy is rooted in Confucianism and socialist core values, emphasizing the state and collectivism, while Kazakhstan, based on a neoliberal orientation, draws on the European education framework, gradually integrates multicultural concepts, emphasizes national identity, and attaches importance to students' individual development. The study uses Hofstede's cultural dimensions and postcolonial education theory to explore how different nation-building narratives affect the educational goals of higher education. By comparing value systems, institutional logics, and student training models, it helps to understand how the educational systems of "Global South" countries seek a balance between international standards and local cultural identity. It offers inspiration for culturally grounded, mutually informed educational development under global education trends, and provides a comparative education perspective for localizing reform.
Deep learning now underpins many state-of-the-art systems for biomedical image and signal processing, enabling automated lesion detection, physiological monitoring, and therapy planning with accuracy that rivals expert performance. This survey reviews the principal model families (convolutional, recurrent, generative, reinforcement, autoencoder, and transfer-learning approaches), emphasising how their architectural choices map to tasks such as segmentation, classification, reconstruction, and anomaly detection. A dedicated treatment of multimodal fusion networks shows how imaging features can be integrated with genomic profiles and clinical records to yield more robust, context-aware predictions. To support clinical adoption, we outline post-hoc explainability techniques (Grad-CAM, SHAP, LIME) and describe emerging intrinsically interpretable designs that expose decision logic to end users. Regulatory guidance from the U.S. FDA, the European Medicines Agency, and the EU AI Act is summarised, linking transparency and lifecycle-monitoring requirements to concrete development practices. Remaining challenges, such as data imbalance, computational cost, privacy constraints, and cross-domain generalization, are discussed alongside promising solutions such as federated learning, uncertainty quantification, and lightweight 3-D architectures. The article therefore offers researchers, clinicians, and policymakers a concise, practice-oriented roadmap for deploying trustworthy deep-learning systems in healthcare.
BACKGROUND: For over half a century, the administration of maternal corticosteroids before anticipated preterm birth has been regarded as a cornerstone intervention for enhancing neonatal outcomes, particularly in preventing respiratory distress syndrome. Ongoing research on antenatal corticosteroids (ACS) is continuously refining the evidence regarding their efficacy and potential side effects, which may alter the application of this treatment. Recent findings indicate that in resource-limited settings, the effectiveness of ACS is contingent upon meeting specific conditions, including providing adequate medical support for preterm newborns. Future studies are expected to concentrate on developing evidence-based strategies to safely enhance ACS utilization in low- and middle-income countries.

AIM: To analyze the clinical effectiveness of antenatal corticosteroids in improving outcomes for preterm newborns in a tertiary care hospital setting in Kazakhstan, following current World Health Organization guidelines.

METHODS: This study employs a comparative retrospective cohort design to analyze single-center clinical data collected from January 2022 to February 2024. A total of 152 medical records of preterm newborns with gestational ages between 24 and 34 weeks were reviewed, focusing on the completeness of the ACS received. Quantitative variables are presented as means with standard deviations, while frequency analysis of qualitative indicators was performed using Pearson's chi-square (χ²) test and Fisher's exact test. Where statistical significance was identified, pairwise comparisons between the three observation groups were conducted using the Bonferroni correction.

RESULTS: The obtained data indicate that complete antenatal steroid prophylaxis (ASP) improves neonatal outcomes, particularly by reducing the frequency of birth asphyxia (P=0.002), the need for primary resuscitation (P=0.002), the use of nasal continuous positive airway pressure (P=0.022), and the need for surfactant replacement therapy (P=0.038) compared to groups with incomplete or no ASP. Furthermore, complete ASP contributed to a decrease in morbidity among preterm newborns (e.g., respiratory distress syndrome, intrauterine pneumonia, cerebral ischemia, bronchopulmonary dysplasia), improved Apgar scores, and reduced the need for re-intubation and the frequency of mechanical ventilation. However, it was associated with an increased incidence of uterine atony in postpartum women (P=0.0095).

CONCLUSION: In a tertiary hospital setting, ACS therapy for pregnancies between 24 and 34 weeks of gestation at high risk for preterm birth significantly reduces the incidence of neonatal complications and related interventions, contributing to better outcomes for this cohort of children. However, the impact of ACS on maternal outcomes requires further thorough investigation.
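The frequency analysis described in METHODS rests on the 2x2 Pearson chi-square test. A minimal sketch, without the continuity correction and using made-up counts rather than the study's data:

```python
import math

def chi2_2x2(a, b, c, d):
    # Pearson chi-square test (df = 1) on a 2x2 outcome table:
    #              outcome+   outcome-
    #   group 1:      a          b
    #   group 2:      c          d
    n = a + b + c + d
    x2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Survival function of chi-square with 1 degree of freedom:
    # P(X > x2) = erfc(sqrt(x2 / 2)).
    p = math.erfc(math.sqrt(x2 / 2.0))
    return x2, p
```

For small expected cell counts the chi-square approximation breaks down, which is exactly when Fisher's exact test (also used in the study) is preferred.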
A plasma screening model that accounts for electronic exchange-correlation effects and ionic nonideality in dense quantum plasmas is proposed. This model can be used as an input in various plasma interaction models to calculate scattering cross-sections and transport properties. The applicability of the proposed plasma screening model is demonstrated using the example of the temperature relaxation rate in dense hydrogen and warm dense aluminum. Additionally, the conductivity of warm dense aluminum is computed in the regime where collisions are dominated by electron-ion scattering. The results obtained are compared with available theoretical results and simulation data.
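The paper's model is not reproduced here, but the generic structure it refines, a screened Yukawa-type Coulomb interaction with a density- and temperature-dependent screening length, can be sketched with the simplest classical Debye length as a stand-in for the exchange-correlation-corrected length.

```python
import math

# SI constants.
E_CHARGE = 1.602176634e-19   # C
EPS0 = 8.8541878128e-12      # F/m
K_B = 1.380649e-23           # J/K

def debye_length(n_e, T_e):
    # Classical electron Debye screening length (m) for electron density
    # n_e (1/m^3) and temperature T_e (K); the paper's model replaces this
    # with a quantum, exchange-correlation-corrected screening length.
    return math.sqrt(EPS0 * K_B * T_e / (n_e * E_CHARGE ** 2))

def screened_potential(r, z1, z2, lam):
    # Yukawa-type screened Coulomb interaction energy (J) between charges
    # z1*e and z2*e at separation r (m), with screening length lam (m).
    bare = z1 * z2 * E_CHARGE ** 2 / (4.0 * math.pi * EPS0 * r)
    return bare * math.exp(-r / lam)
```

A screened potential of this form is precisely the kind of input a scattering cross-section or temperature-relaxation calculation consumes, which is how the abstract describes the model being used.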
This paper examines the application of the Verkle tree—an efficient data structure that leverages commitments and a novel proof technique—in cryptographic solutions. Unlike traditional Merkle trees, the Verkle tree significantly reduces signature size by utilizing polynomial and vector commitments. Compact proofs also accelerate the verification process, reducing computational overhead, which makes Verkle trees particularly useful. The study proposes a new approach based on a non-positional polynomial notation (NPN) employing the Chinese Remainder Theorem (CRT). The CRT enables efficient data representation and verification by decomposing data into smaller, independent components, simplifying computations, reducing overhead, and enhancing scalability. This technique facilitates parallel data processing, which is especially advantageous in cryptographic applications such as commitment and proof construction in Verkle trees, as well as in systems with constrained computational resources. Theoretical foundations of the approach, its advantages, and practical implementation aspects are explored, including resistance to potential attacks, application domains, and a comparative analysis with existing methods based on well-known parameters and characteristics. An analysis of potential attacks and vulnerabilities, including greatest common divisor (GCD) attacks, approximate multiple attacks (LLL lattice-based), and brute-force search for irreducible polynomials, together with an estimation of their total number, indicates that no vulnerabilities have been identified in the proposed method thus far. Furthermore, the study demonstrates that integrating the CRT with Verkle trees ensures high scalability, making this approach promising for blockchain and other distributed systems requiring compact and efficient proofs.
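The CRT decomposition and reconstruction that the NPN approach builds on can be sketched over integers. This shows only the residue arithmetic that enables independent, parallelizable components, not the paper's polynomial commitment scheme.

```python
from functools import reduce

def crt_decompose(x, moduli):
    # Represent x by its residues modulo pairwise-coprime moduli.
    # Each residue can then be stored or processed independently.
    return [x % m for m in moduli]

def crt_reconstruct(residues, moduli):
    # Chinese Remainder Theorem: rebuild x (mod prod(moduli)) from residues
    # via the standard Garner-style formula x = sum r_i * M_i * (M_i^-1 mod m_i).
    M = reduce(lambda acc, m: acc * m, moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m): modular inverse (3.8+)
    return x % M
```

Reconstruction is exact whenever the value fits below the product of the moduli, which is the constraint any CRT-based encoding must respect when choosing its modulus set.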
Funding: Science Committee of the Ministry of Science and Higher Education of the Republic of Kazakhstan, Program No. BR24992759.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 22305100 and 22405104), the Hubei Provincial International Science and Technology Cooperation Program Project (Grant No. 2023EHA014), the National Foreign Experts Program (Grant Nos. Y20240022 and H20240275), the Hubei Natural Science Foundation (Grant No. 2025AFB460), the Hubei Provincial Department of Education Scientific Research Project (Grant Nos. F2023033 and Q20234414), the Wuhan Natural Science Foundation Exploration Project (Chenguang Program) (Grant No. 2025040601020173), and the Jianghan University Scientific Research Startup Fund (Grant Nos. PBSKL-2022-QD-08 and PBSKL-2024-QD-03).
Abstract: As a prototypical high-energy-density reactive material system, metastable intermolecular composites (MICs) have attracted considerable interest owing to their customizable component configurations and interfacial architectures. Nevertheless, their energy release characteristics are fundamentally constrained by the formation of condensed-phase products with elevated boiling points, which diminishes their efficacy in applications requiring rapid pressure generation or shock wave propagation. Here, we demonstrate a molecular-level fluorination approach that enables oxygen substitution by fluorine within bismuth oxide crystalline frameworks, yielding ternary BixOyFz crystals with atomically precise F/O stoichiometric control through systematic solvent polarity engineering. This energetic system, designed through a multilevel regulation strategy, realizes stepwise Al-F and Al-O redox reactions during energy release, with the partitioning between these redox pathways precisely allocable through hierarchical regulation. Furthermore, the pre-ignition reaction (PIR) between BixOyFz and Al2O3 (the inert passivation shell of Al) weakens the passivation layer, lowering the ignition threshold. The in situ generation of low-boiling-point AlF3 promotes rapid gas expansion, leading to significantly enhanced pressurization rates and deflagration wave velocities under confinement compared with conventional strategies. To evaluate the energy output capability and validate potential safety-protection applications, the system achieved instantaneous destruction of SD chips, enabling secure data erasure. This work establishes crystalline lattice fluorination as a generalized materials design strategy for transcending the intrinsic limitations of MIC systems in component selection and reaction thermodynamics, providing new paradigms for adaptive energetic architectures and transient microelectromechanical applications.
Funding: Funded by the Strategic Priority Research Program of the Chinese Academy of Sciences (XDB0720203) and the National Key Research and Development Program of China (2023YFF0805603).
Abstract: Based on monthly runoff and climate datasets spanning 2000-2024, this study employed the Theil-Sen slope estimator, the Mann-Kendall (M-K) trend test, and Pearson and Spearman rank correlation analyses to systematically examine the spatiotemporal patterns of runoff and its climatic driving mechanisms across Tajikistan, providing a scientific basis for sustainable water resource utilization and management in the study area. Results indicated that during 2000-2024, annual runoff in Tajikistan exhibited a statistically non-significant long-term trend (P=0.76) while displaying pronounced seasonal variability and strong spatial heterogeneity. Spring and summer average runoff showed slight declining tendencies, while winter average runoff declined markedly in localized regions such as the Syr Darya Basin, the Vakhsh River Basin, and the lower reaches of the Zeravshan River Basin. Precipitation emerged as the dominant positive driver of runoff, exhibiting moderate to strong positive correlations across over 78.00% of the country, whereas potential evapotranspiration consistently functioned as a negative driver. Rising temperatures exerted a dual, competing effect on runoff: in high-elevation, glacier-covered regions, they temporarily increased runoff by accelerating glacier melt, while at the national scale their negative impact, exerted by enhancing evapotranspiration, played a slightly dominant role. Collectively, these results indicated that the present stability of runoff in Tajikistan depends strongly on the short-term compensatory effect of glacier melt, and that the risk of future runoff decline is likely to intensify as glacier reserves continue to diminish. This study provides critical scientific evidence to inform sustainable water resource management in Tajikistan and underscores the need for glacier conservation and integrated water resource management strategies.
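Both trend statistics named above are available in SciPy; the sketch below runs them on a synthetic monthly series (invented numbers, not the study's data). Note that the Mann-Kendall trend test is equivalent to computing Kendall's tau of the series against time.

```python
import numpy as np
from scipy import stats

# Synthetic monthly runoff series with a known upward trend (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(120)                                # 10 years of monthly steps
runoff = 50.0 + 0.2 * t + rng.normal(0, 5, t.size)

# Theil-Sen slope: median of all pairwise slopes, robust to outliers
slope, intercept, lo, hi = stats.theilslopes(runoff, t)

# Mann-Kendall trend test == Kendall's tau of the series vs. time
tau, p_value = stats.kendalltau(t, runoff)

print(f"Theil-Sen slope: {slope:.3f} per month, M-K p-value: {p_value:.3g}")
```

The robust slope should recover the injected 0.2-per-month trend, and the M-K p-value flags it as significant, mirroring how the study screens each grid cell or basin series.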
Funding: The authors express their appreciation to the Princess Nourah bint Abdulrahman University Researchers Supporting Project (No. PNURSP2025R384), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: The cloud-fog computing paradigm has emerged as a hybrid computing model that integrates computational resources at both fog nodes and cloud servers to address the challenges posed by dynamic and heterogeneous computing networks. Finding an optimal computational resource for task offloading and then executing efficiently is a critical issue in achieving a trade-off between energy consumption and transmission delay. In this network, a task processed at fog nodes reduces transmission delay but increases energy consumption, while routing tasks to the cloud server saves energy at the cost of higher communication delay. Moreover, the order in which offloaded tasks are executed affects the system's efficiency; for instance, executing lower-priority tasks before higher-priority ones can compromise the reliability and stability of the system. Therefore, an efficient strategy for joint computation offloading and task scheduling is required for operational efficacy. In this paper, we introduce a multi-objective, enhanced version of the Cheetah Optimizer (CO), namely MoECO, to jointly optimize computation offloading and task scheduling in cloud-fog networks so as to minimize two competing objectives: energy consumption and communication delay. MoECO first assigns tasks to the optimal computational nodes, and the allocated tasks are then scheduled for processing based on task priority. The mathematical model of CO needs improvement in computation time and convergence speed; MoECO therefore increases the search capability of agents by controlling the search strategy based on a leader's location. The adaptive step-length operator is adjusted to diversify the solutions, improving the exploration phase (the global search strategy) and preventing the algorithm from getting trapped in local optima. Moreover, the interaction factor during the exploitation phase is adjusted based on the location of the prey instead of the adjacent cheetah, increasing the exploitation (local search) capability of agents. Furthermore, MoECO employs a multi-objective Pareto-optimal front to simultaneously minimize the designated objectives. Comprehensive simulations in MATLAB demonstrate that the proposed algorithm obtains multiple solutions via a Pareto-optimal front and achieves an efficient trade-off between the optimization objectives compared with baseline methods.
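The Pareto-optimal front that MoECO maintains can be extracted with a simple non-dominated filter. The sketch below, with invented (energy, delay) costs for candidate offloading schedules, keeps exactly the solutions that no other solution beats in both objectives.

```python
import numpy as np

def pareto_front(costs):
    """Return indices of non-dominated rows (minimisation of all columns)."""
    n = costs.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # row j dominates row i if j is <= in every objective and < in at least one
        dominated = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if dominated.any():
            keep[i] = False
    return np.flatnonzero(keep)

# Toy (energy, delay) costs for five candidate schedules (illustrative values)
costs = np.array([[3.0, 9.0], [4.0, 4.0], [6.0, 2.0], [5.0, 5.0], [7.0, 7.0]])
front = pareto_front(costs)
```

Here (5,5) and (7,7) are dominated by (4,4), so only the first three schedules survive; a decision-maker then picks one point on the front according to the preferred energy/delay balance.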
Funding: The authors extend their appreciation to the Princess Nourah bint Abdulrahman University Researchers Supporting Project (No. PNURSP2026R760), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia. The authors also extend their appreciation to the Deanship of Research and Graduate Studies at King Khalid University for funding this work through small group research under grant number RGP2/714/46.
Abstract: The convergence of Software Defined Networking (SDN) and the Internet of Vehicles (IoV) enables a flexible, programmable, and globally visible network control architecture across Road Side Units (RSUs), cloud servers, and vehicles. While this integration enhances scalability and safety, it also invites sophisticated cyberthreats, particularly Distributed Denial of Service (DDoS) attacks. Traditional rule-based anomaly detection methods often struggle to detect modern low-and-slow DDoS patterns, leading to higher false positive rates. To this end, this study proposes an explainable hybrid framework to detect DDoS attacks in SDN-enabled IoV (SDN-IoV). The hybrid framework utilizes a Residual Network (ResNet) to capture spatial correlations and a Bidirectional Long Short-Term Memory (BiLSTM) network to capture both forward and backward temporal dependencies in high-dimensional input patterns. To ensure transparency and trustworthiness, the model integrates an Explainable AI (XAI) technique, SHapley Additive exPlanations (SHAP). SHAP highlights the contribution of each feature to the decision-making process, helping security analysts understand the rationale behind each attack classification decision. The SDN-IoV environment is created in Mininet-WiFi and SUMO, and the hybrid model is trained on the CICDDoS2019 security dataset. The simulation results demonstrate the efficacy of the proposed model in terms of standard performance metrics compared with similar baseline methods.
Funding: Supported by Grant PLN2022-14 of the State Key Laboratory of Oil and Gas Reservoir Geology and Exploitation (Southwest Petroleum University).
Abstract: Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when faced with new evaluation objects, and research on integrating distributed storage, processing, and learning functions for logging big data in a private cloud has not yet been carried out. The aim is to establish a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model integrating physical simulation and data models in a large function space, thus resolving the geo-engineering evaluation problem of geothermal fields. Following the research approach of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning and discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing such a platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has clear technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, data, software and results sharing, accuracy, speed, and complexity.
Abstract: To determine the bulk density of refractory raw materials, the so-called water method following the Archimedes principle is normally used, in which the effect of water displacement on the mass of the sample is used to determine the bulk volume of the sample grains. During this test procedure, the surface of the water-infiltrated sample grains must be dried with a wet towel. Experience shows that this drying step is the main root cause of variation in the reproducibility of results, and even in the repeatability of tests. A new spin dryer (centrifuge) was developed and introduced to automate this surface-drying step, and is now included as a new method in ISO 8840:2021. The paper discusses the improvement of the measurement with the new approach and industrial experience from two big industrial players in the raw material business.
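The water method itself reduces to a short calculation once the three weighings are done: the bulk volume is the mass of displaced water divided by the water density, and the bulk density is the dry mass over that volume. The masses and the nominal water density below are illustrative, not values from the paper.

```python
# Bulk density by the Archimedes (water) method. Sample values are invented.
RHO_WATER = 0.9982  # g/cm^3, nominal density of water near 20 degrees C

def bulk_density(m_dry, m_immersed, m_surface_dry):
    """m_dry: dry mass; m_immersed: mass suspended in water;
    m_surface_dry: mass after water infiltration with surface dried (all in g)."""
    bulk_volume = (m_surface_dry - m_immersed) / RHO_WATER  # cm^3 of displaced water
    return m_dry / bulk_volume                              # g/cm^3

rho_b = bulk_density(m_dry=50.0, m_immersed=32.0, m_surface_dry=52.0)
```

The sensitivity of the result to `m_surface_dry` is exactly why the manual towel-drying step dominates the measurement uncertainty, and why automating it with a centrifuge improves reproducibility.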
Abstract: This study focuses on the real-world context in which artificial intelligence (AI) deeply permeates corporate operations, systematically exploring the core value, challenges, and optimization paths of corporate culture management during intelligent transformation. By analyzing the semantic deviations in the digitalization of cultural elements and the potential conflicts between algorithm-based decision-making and humanistic values in human-machine collaboration scenarios, and by combining theories of organizational behavior with the characteristics of AI technology, a series of strategies for enhancing effectiveness is proposed, including dynamic cultural modeling and the embedding of cognitive-collaborative rules. With detailed empirical data and case studies, this research provides a theoretical basis and practical guidance for enterprises to achieve a dynamic balance between technological rationality and humanistic care.
Funding: Funded by the Ministry of Science and Higher Education of the Republic of Kazakhstan, grant numbers AP14969403 and AP23485820.
Abstract: Myocardial infarction (MI) is one of the leading causes of death globally among cardiovascular diseases, necessitating modern and accurate diagnostics of cardiac patient conditions. Among the available functional diagnostic methods, electrocardiography (ECG) is particularly well known for its ability to detect MI. However, confirming its accuracy, particularly in identifying the localization of myocardial damage, often presents challenges in practice. This study therefore proposes a new approach based on machine learning models for the analysis of 12-lead ECG data to accurately identify the localization of MI. In particular, the learning vector quantization (LVQ) algorithm was applied, considering the contribution of each ECG lead in the 12-channel system, and achieved an accuracy of 87% in localizing damaged myocardium. The developed model was tested on verified data from the PTB database, comprising 445 ECG recordings from both healthy individuals and MI-diagnosed patients. The results demonstrated that the 12-lead ECG system allows for a comprehensive understanding of cardiac activity in myocardial infarction patients, serving as an essential tool for diagnosing myocardial conditions and localizing damage. A comprehensive comparison with CNN, SVM, and logistic regression models was performed to evaluate the proposed LVQ model. The results demonstrate that the LVQ model achieves competitive performance in diagnostic tasks while maintaining computational efficiency, making it suitable for resource-constrained environments. This study also applies a carefully designed data pre-processing flow, including class balancing and noise removal, which improves the reliability and reproducibility of the results. These aspects highlight the potential of the LVQ model in cardiac diagnostics, opening up prospects for its use alongside more complex neural network architectures.
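The core LVQ update rule is compact enough to sketch directly. The toy example below runs LVQ1 on two synthetic clusters standing in for "healthy" vs. "MI" feature vectors; it is not the paper's 12-lead ECG pipeline, but it shows the pull-toward/push-away prototype update that makes LVQ cheap at inference time.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20, seed=0):
    """Minimal LVQ1: pull the winning prototype toward a same-class sample,
    push it away from a different-class sample."""
    rng = np.random.default_rng(seed)
    P = prototypes.copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            w = np.argmin(np.linalg.norm(P - X[i], axis=1))  # nearest prototype
            sign = 1.0 if proto_labels[w] == y[i] else -1.0
            P[w] += sign * lr * (X[i] - P[w])
    return P

def lvq_predict(X, P, proto_labels):
    d = np.linalg.norm(P[None, :, :] - X[:, None, :], axis=2)  # sample-prototype distances
    return proto_labels[np.argmin(d, axis=1)]

# Two well-separated toy clusters (class 0 around 0, class 1 around 3)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels = np.array([0, 1])
P = lvq1_train(X, y, prototypes=np.array([[0.5, 0.5], [2.5, 2.5]]), proto_labels=labels)
acc = (lvq_predict(X, P, labels) == y).mean()
```

Because prediction is just a nearest-prototype lookup, the trained model needs only a handful of vectors, which matches the paper's point about suitability for resource-constrained environments.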
Funding: The work was carried out in the framework of earmarked funding "Assessment of seismic hazard of territories of Kazakhstan on modern scientific and methodological basis", programme code number F.0980. Source of funding: Ministry of Science and Higher Education of the Republic of Kazakhstan.
Abstract: This article aims to enhance seismic hazard assessment methods for Kazakhstan's seismotectonic conditions. It combines probabilistic seismic hazard analysis (PSHA), ground motion simulation, site-specific geological and geotechnical data analysis, and seismic scenario analysis to develop Probabilistic General Seismic Zoning (GSZ) maps for Kazakhstan and Probabilistic Seismic Microzoning maps for Almaty. These maps align with Eurocode 8 principles, incorporating seismic intensity and engineering parameters such as peak ground acceleration (PGA). The new procedure, applied in national projects, has resulted in GSZ maps for the country, seismic microzoning maps for Almaty, and detailed seismic zoning maps for East Kazakhstan. These maps, part of a regulatory document, guide earthquake-resistant design and construction. They offer a comprehensive assessment of seismic hazards, integrating traditional Medvedev-Sponheuer-Karnik (MSK-64) intensity scale points with quantitative parameters such as peak ground acceleration. This approach promises to advance methods for quantifying seismic hazards in specific regions.
Funding: Funded by the research project "BR24992947—Development of Robots, Scientific, Technical, and Software for Flexible Robotization and Industrial Automation (RPA) in Automotive Industrial Enterprises in Kazakhstan Using Artificial Intelligence".
Abstract: Robotic manipulators increasingly operate in complex three-dimensional workspaces where accuracy and strict limits on position, velocity, and acceleration must be satisfied. Conventional geometric planners emphasize path smoothness but often ignore dynamic feasibility, motivating control-aware trajectory generation. This study presents a model predictive control (MPC) framework for three-dimensional trajectory planning of robotic manipulators that integrates second-order dynamic modeling and multi-objective parameter optimization. Unlike conventional interpolation techniques such as cubic splines, B-splines, and linear interpolation, which neglect physical constraints and system dynamics, the proposed method generates dynamically feasible trajectories by directly optimizing over acceleration inputs while minimizing both tracking error and control effort. A key innovation lies in the use of Pareto front analysis for tuning the prediction horizon and sampling time, enabling a systematic balance between accuracy and motion smoothness. Comparative evaluation in simulated experiments demonstrates that the proposed MPC approach achieves a minimum mean absolute error (MAE) of 0.170 and reduces the maximum acceleration to 0.0217, compared with 0.0385 for classical linear methods. The maximum deviation error was also reduced by approximately 27.4% relative to MPC configurations without tuned parameters. All experiments were conducted in a simulation environment, with computational times per control cycle consistently remaining below 20 milliseconds, indicating practical feasibility for real-time applications. This work advances the state of the art in MPC-based trajectory planning by offering a scalable and interpretable control architecture that meets physical constraints while optimizing motion efficiency, making it suitable for deployment in safety-critical robotic applications.
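The idea of optimizing directly over acceleration inputs while trading off tracking error against control effort can be sketched for a one-axis double integrator. This is an illustrative unconstrained version solved in closed form with least squares; the paper's controller additionally enforces position, velocity, and acceleration limits and tunes the horizon and sampling time via Pareto analysis.

```python
import numpy as np

def mpc_step(p0, v0, ref, dt=0.1, rho=0.05):
    """One receding-horizon step: pick accelerations a_0..a_{N-1} minimising
    ||position - reference||^2 + rho * ||acceleration||^2 for a double integrator."""
    N = len(ref)
    # Zero-order hold: position after k steps gets a_j * (k - j - 0.5) * dt^2 from input j < k
    G = np.zeros((N, N))
    for k in range(1, N + 1):
        for j in range(k):
            G[k - 1, j] = (k - j - 0.5) * dt**2
    free = p0 + v0 * dt * np.arange(1, N + 1)      # input-free position prediction
    A = np.vstack([G, np.sqrt(rho) * np.eye(N)])   # stacked tracking + effort terms
    b = np.concatenate([ref - free, np.zeros(N)])
    a = np.linalg.lstsq(A, b, rcond=None)[0]
    return a[0]  # apply only the first acceleration, then re-plan

# Starting at rest at 0 with a constant reference of 1, the first move accelerates forward
a0 = mpc_step(p0=0.0, v0=0.0, ref=np.ones(20))
```

Raising `rho` smooths the commanded accelerations at the cost of slower tracking, which is precisely the accuracy-versus-smoothness trade-off the paper resolves by Pareto-front tuning.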
Abstract: Chronic suppurative otitis media (CSOM) is a prevalent condition in otolaryngology with significant medical and social implications, including hearing loss and severe intracranial complications. This article discusses the challenges in diagnosing cholesteatoma, a common complication of CSOM, particularly when using computed tomography (CT) and magnetic resonance imaging (MRI). We present three clinical cases in which MRI, particularly in the non-EPI diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) modes, effectively identified the presence and extent of cholesteatoma that CT could not reliably distinguish owing to overlapping features with other soft tissue formations. The high sensitivity of MRI highlights its value in both primary diagnosis and the assessment of recurrence. Our findings advocate for the incorporation of MRI into the diagnostic protocols for CSOM in the Republic of Kazakhstan, emphasizing the need for reliable epidemiological data to inform future research and prevent potential intracranial complications.
Abstract: This study explores the cultural and value foundations of the educational goals of higher education in Kazakhstan and China. Drawing on the historical development and cultural traditions of the two countries, it compares the similarities and differences of their educational goals through qualitative literature content analysis. Both countries have taken "modernization and internationalization" as a core direction of higher education development, but China's educational philosophy is rooted in Confucianism and socialist core values, emphasizing the state and collectivism, while Kazakhstan, with a neoliberal orientation, draws on the European education framework, gradually integrates multicultural concepts, emphasizes national identity, and attaches importance to students' individual development. This study uses Hofstede's cultural dimensions and postcolonial education theory to explore how different nation-building narratives affect the educational goals of higher education. By comparing value systems, institutional logics, and student training models, it helps to clarify how the educational systems of "Global South" countries seek a balance between international standards and local cultural identity. It offers insight for culture-based educational development and mutual learning under global education trends, and provides a comparative education perspective for localizing reform.
Funding: Supported by the Science Committee of the Ministry of Higher Education and Science of the Republic of Kazakhstan within the framework of grant AP23489899, "Applying Deep Learning and Neuroimaging Methods for Brain Stroke Diagnosis".
Abstract: Deep learning now underpins many state-of-the-art systems for biomedical image and signal processing, enabling automated lesion detection, physiological monitoring, and therapy planning with accuracy that rivals expert performance. This survey reviews the principal model families (convolutional, recurrent, generative, reinforcement, autoencoder, and transfer-learning approaches), emphasising how their architectural choices map to tasks such as segmentation, classification, reconstruction, and anomaly detection. A dedicated treatment of multimodal fusion networks shows how imaging features can be integrated with genomic profiles and clinical records to yield more robust, context-aware predictions. To support clinical adoption, we outline post-hoc explainability techniques (Grad-CAM, SHAP, LIME) and describe emerging intrinsically interpretable designs that expose decision logic to end users. Regulatory guidance from the U.S. FDA, the European Medicines Agency, and the EU AI Act is summarised, linking transparency and lifecycle-monitoring requirements to concrete development practices. Remaining challenges, such as data imbalance, computational cost, privacy constraints, and cross-domain generalization, are discussed alongside promising solutions such as federated learning, uncertainty quantification, and lightweight 3D architectures. The article thus offers researchers, clinicians, and policymakers a concise, practice-oriented roadmap for deploying trustworthy deep-learning systems in healthcare.
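SHAP approximates Shapley values; for a tiny model they can be computed exactly by enumerating coalitions, which makes the attribution concept concrete. The three-feature scoring function and zero baseline below are invented for illustration; absent features are simply replaced by the baseline.

```python
from itertools import combinations
from math import factorial

def model(x):
    """Toy scoring function with a linear part and one interaction (illustrative)."""
    return 2.0 * x[0] + 1.0 * x[1] + 0.5 * x[0] * x[2]

def shapley(x, n=3, baseline=0.0):
    """Exact Shapley values: weighted marginal contribution over all coalitions."""
    phi = [0.0] * n
    players = range(n)
    for i in players:
        for r in range(n):
            for S in combinations([j for j in players if j != i], r):
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                with_i = [x[j] if j in S or j == i else baseline for j in players]
                without = [x[j] if j in S else baseline for j in players]
                phi[i] += weight * (model(with_i) - model(without))
    return phi

phi = shapley([1.0, 1.0, 1.0])
# Efficiency property: the attributions sum to model(x) - model(baseline)
```

For this model the interaction term is split evenly between features 0 and 2, so the attributions are 2.25, 1.0, and 0.25, summing to the prediction of 3.5; SHAP's sampling tricks exist because this exact enumeration is exponential in the number of features.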
Funding: Supported by the Non-profit Joint Stock Company "S.D. Asfendiyarov Kazakh National Medical University", Almaty, Kazakhstan.
Abstract: BACKGROUND: For over half a century, the administration of maternal corticosteroids before anticipated preterm birth has been regarded as a cornerstone intervention for enhancing neonatal outcomes, particularly in preventing respiratory distress syndrome. Ongoing research on antenatal corticosteroids (ACS) is continuously refining the evidence regarding their efficacy and potential side effects, which may alter the application of this treatment. Recent findings indicate that in resource-limited settings, the effectiveness of ACS is contingent upon meeting specific conditions, including adequate medical support for preterm newborns. Future studies are expected to concentrate on developing evidence-based strategies to safely enhance ACS utilization in low- and middle-income countries. AIM: To analyze the clinical effectiveness of antenatal corticosteroids in improving outcomes for preterm newborns in a tertiary care hospital setting in Kazakhstan, following current World Health Organization guidelines. METHODS: This study employs a comparative retrospective cohort design to analyze single-center clinical data collected from January 2022 to February 2024. A total of 152 medical records of preterm newborns with gestational ages between 24 and 34 weeks were reviewed, focusing on the completeness of the ACS course received. Quantitative variables are presented as means with standard deviations, while frequency analysis of qualitative indicators was performed using Pearson's chi-square (χ²) test and Fisher's exact test. Where statistical significance was identified, pairwise comparisons between the three observation groups were conducted using the Bonferroni correction. RESULTS: The data indicate that complete antenatal steroid prophylaxis (ASP) improves neonatal outcomes, particularly by reducing the frequency of birth asphyxia (P=0.002), the need for primary resuscitation (P=0.002), the use of nasal continuous positive airway pressure (P=0.022), and the need for surfactant replacement therapy (P=0.038), compared with groups receiving incomplete or no ASP. Furthermore, complete ASP contributed to decreased morbidity among preterm newborns (e.g., respiratory distress syndrome, intrauterine pneumonia, cerebral ischemia, and bronchopulmonary dysplasia), improved Apgar scores, and a reduced need for re-intubation and frequency of mechanical ventilation. However, it was associated with an increased incidence of uterine atony in postpartum women (P=0.0095). CONCLUSION: In a tertiary hospital setting, ACS therapy for pregnancies between 24 and 34 weeks of gestation at high risk for preterm birth significantly reduces the incidence of neonatal complications and related interventions, contributing to better outcomes for this cohort of children. However, the impact of ACS on maternal outcomes requires further thorough investigation.
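The frequency comparisons described in METHODS can be reproduced in miniature with SciPy's chi-square and Fisher exact tests on a 2x2 table. The counts below are invented for illustration (rows: complete vs. no ASP; columns: asphyxia yes/no), not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 contingency table (not from the study)
table = np.array([[8, 68],    # complete ASP: asphyxia / no asphyxia
                  [24, 52]])  # no ASP:       asphyxia / no asphyxia

chi2, p_chi2, dof, expected = chi2_contingency(table)  # Yates-corrected by default for 2x2
odds_ratio, p_fisher = fisher_exact(table)             # exact test, preferred for small cells

print(f"chi2 p={p_chi2:.4f}, Fisher p={p_fisher:.4f}, OR={odds_ratio:.2f}")
```

An odds ratio below 1 here indicates lower asphyxia odds under complete ASP; Fisher's exact test is the fallback when expected cell counts are too small for the chi-square approximation, which is why studies like this report both.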
Funding: Funded by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan, Grant No. AP19678033, "The study of the transport and optical properties of hydrogen at high pressure".
Abstract: A plasma screening model that accounts for electronic exchange-correlation effects and ionic non-ideality in dense quantum plasmas is proposed. This model can be used as an input to various plasma interaction models to calculate scattering cross-sections and transport properties. The applicability of the proposed screening model is demonstrated using the example of the temperature relaxation rate in dense hydrogen and warm dense aluminum. Additionally, the conductivity of warm dense aluminum is computed in the regime where collisions are dominated by electron-ion scattering. The results are compared with available theoretical results and simulation data.
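As a point of reference for what a screening model supplies, the classical electron Debye length, the weak-coupling baseline that quantum exchange-correlation models generalise, is a one-line formula. The density and temperature below are illustrative, not parameters from the paper.

```python
import numpy as np

# Classical Debye screening length, lambda_D = sqrt(eps0 * kB * T / (n_e * e^2)), SI units.
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
KB   = 1.380649e-23       # Boltzmann constant, J/K
E    = 1.602176634e-19    # elementary charge, C

def debye_length(n_e, T):
    """n_e: electron number density (m^-3), T: temperature (K)."""
    return np.sqrt(EPS0 * KB * T / (n_e * E**2))

# Illustrative dense-plasma conditions
lam = debye_length(n_e=1e28, T=1e5)  # on the order of angstroms
```

In dense quantum plasmas the screened potential deviates from this Debye form, which is exactly the gap the proposed exchange-correlation-aware model addresses before feeding cross-sections into transport calculations.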
Funding: Funded by the Ministry of Science and Higher Education of Kazakhstan and carried out within the framework of project AP23488112, "Development and study of a quantum-resistant digital signature scheme based on a Verkle tree", at the Institute of Information and Computational Technologies.
Abstract: This paper examines the application of the Verkle tree, an efficient data structure that leverages commitments and a novel proof technique in cryptographic solutions. Unlike traditional Merkle trees, the Verkle tree significantly reduces signature size by utilizing polynomial and vector commitments. Compact proofs also accelerate the verification process, reducing computational overhead, which makes Verkle trees particularly useful. The study proposes a new approach based on a non-positional polynomial notation (NPN) employing the Chinese Remainder Theorem (CRT). The CRT enables efficient data representation and verification by decomposing data into smaller, independent components, simplifying computations, reducing overhead, and enhancing scalability. This technique facilitates parallel data processing, which is especially advantageous in cryptographic applications such as commitment and proof construction in Verkle trees, as well as in systems with constrained computational resources. The theoretical foundations of the approach, its advantages, and practical implementation aspects are explored, including resistance to potential attacks, application domains, and a comparative analysis with existing methods based on well-known parameters and characteristics. An analysis of potential attacks and vulnerabilities, including greatest common divisor (GCD) attacks, approximate multiple attacks (LLL lattice-based), and brute-force search for irreducible polynomials together with an estimation of their total number, indicates that no vulnerabilities have been identified in the proposed method thus far. Furthermore, the study demonstrates that integrating the CRT with Verkle trees ensures high scalability, making this approach promising for blockchain systems and other distributed systems requiring compact and efficient proofs.
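The CRT decomposition-and-reconstruction step is easy to make concrete. The paper works with polynomial moduli (the NPN setting); the integer version below shows the same mechanism of splitting a value into independent residues, which can be processed in parallel, and recombining them.

```python
from math import prod

def crt_reconstruct(residues, moduli):
    """Recover x mod prod(moduli) from residues x mod m_i (m_i pairwise coprime)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(Mi, -1, m) is the modular inverse (Python 3.8+)
    return x % M

moduli = [5, 7, 9, 11]                      # pairwise coprime, product 3465
value = 2025
residues = [value % m for m in moduli]       # small independent components
assert crt_reconstruct(residues, moduli) == value
```

Each residue can be committed to and verified independently, which is the property the paper exploits for parallel commitment and proof construction in the Verkle tree.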