Climate change affects the distribution and persistence of species. However, forecasting species' responses to these changes requires long-term data series that are often lacking in ecological studies. We used 15 years of small mammal trapping data collected between 1978 and 2015 in 3 areas at Doñana National Park (southwest Spain) to (i) describe changes in species composition and (ii) test the association between local climate conditions and the size of small mammal populations. Overall, 5 species were captured: wood mouse Apodemus sylvaticus, Algerian mouse Mus spretus, greater white-toothed shrew Crocidura russula, garden dormouse Eliomys quercinus, and black rat Rattus rattus. The temporal pattern in the proportion of captures of each species suggests that small mammal diversity declined with time. Although the larger species (e.g., E. quercinus), better adapted to colder climates, have disappeared from our trapping records, M. spretus, a small species inhabiting southwest Europe and the Mediterranean coast of Africa, is currently almost the only trapped species. We used 2-level hierarchical models to separate changes in abundance from changes in probability of capture using records of A. sylvaticus in all 3 areas and of M. spretus in 1. We found that heavy rainfall and low temperatures were positively related to the abundance of A. sylvaticus, and that the number of extremely hot days was negatively related to the abundance of M. spretus. Although other mechanisms are likely to be involved, our findings support the importance of climate for the distribution and persistence of these species and raise conservation concerns about potential cascading effects in the Doñana ecosystem.
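As an illustration of the 2-level hierarchical approach described above, the sketch below fits a basic N-mixture model in which site abundance follows a Poisson distribution driven by one climate covariate and repeated counts follow a binomial capture process. The Poisson-binomial structure, variable names, and toy data are assumptions for demonstration, not the authors' exact specification.

```python
# Minimal N-mixture sketch: abundance (Poisson, climate-driven) and detection
# (binomial) are estimated jointly by marginalizing over the unobserved true
# abundance N at each site.
import numpy as np
from scipy import stats, optimize

def neg_log_lik(params, counts, climate, K=100):
    b0, b1, logit_p = params
    p = 1.0 / (1.0 + np.exp(-logit_p))        # capture probability
    lam = np.exp(b0 + b1 * climate)           # expected abundance per site
    N = np.arange(K + 1)                      # candidate true abundances
    ll = 0.0
    for i, y in enumerate(counts):            # y: repeat counts at site i
        prior = stats.poisson.pmf(N, lam[i])                      # P(N | lambda_i)
        obs = np.prod(stats.binom.pmf(y[:, None], N[None, :], p), axis=0)
        ll += np.log(np.sum(prior * obs) + 1e-300)
    return -ll

# toy data: 3 sites x 4 trapping occasions, one climate covariate per site
counts = np.array([[5, 7, 6, 4], [2, 1, 3, 2], [9, 8, 10, 7]])
climate = np.array([0.2, -1.0, 1.1])
fit = optimize.minimize(neg_log_lik, x0=[1.0, 0.0, 0.0], args=(counts, climate))
print(fit.x)  # estimates of [b0, b1, logit(capture probability)]
```

Marginalizing over the unobserved abundance N is what lets a model of this kind separate changes in abundance from changes in capture probability.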
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
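To make the array-versus-bridge trade-off concrete, here is a toy sketch using pandas as a stand-in for the warehouse; the table and column names are invented for illustration and do not reflect the paper's schema.

```python
# Two ways to model a many-to-many link between earthquake facts and affected
# regions: a bridge table versus an array-valued column on the fact table.
import pandas as pd

facts = pd.DataFrame({"quake_id": [1, 2], "magnitude": [6.1, 4.8]})

# Bridge-table layout: one row per (fact, dimension) pair.
bridge = pd.DataFrame({"quake_id": [1, 1, 2], "region_id": [10, 11, 10]})

# Array-based layout: the region keys are stored directly on the fact row.
facts_array = facts.assign(region_ids=[[10, 11], [10]])

# Fact-centric query ("regions hit by quake 1"): the array layout avoids a join.
via_bridge = bridge.loc[bridge.quake_id == 1, "region_id"].tolist()
via_array = facts_array.loc[facts_array.quake_id == 1, "region_ids"].iloc[0]
assert via_bridge == via_array == [10, 11]

# Dimension-centric query ("quakes affecting region 10") is more natural on the
# bridge table; the array layout must be exploded first.
print(bridge.loc[bridge.region_id == 10, "quake_id"].tolist())
print(facts_array.explode("region_ids").query("region_ids == 10").quake_id.tolist())
```

The hybrid schema in the paper keeps both representations so that each query family can use the cheaper one.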
Investigations into the long-term creep behavior of Beishan granite in uniaxial compression were conducted. Four levels of axial stress (60, 70, 87, and 95 MPa) were applied to rock specimens. Contrasting with earlier research, the long-term creep data in this work present a substantial advancement in the time dimension. Except for the sample subjected to 60 MPa axial loading, which did not fail after a loading duration of 1650 d, the specimens under the other three stresses all failed after sustained constant loading durations of 1204, 1023, and 839 d, respectively. A lower envelope of driving stress ratio for crystalline rocks was obtained, tending towards approximately 0.45 over an infinite time scale. According to the experimental results, as axial stress increases, both the axial strain accumulated in the transient creep process and the strain rate associated with steady-state creep deformation increase exponentially; however, the share of steady-state creep strain remains nearly constant at about 82.53%. A novel damage-based creep model was put forward. It provides an enhanced depiction of the comprehensive creep process in rocks, notably improving the accuracy in forecasting the accelerated creep phase, which significantly impacts the long-term stability of engineering structures.
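The exponential stress dependence reported above can be illustrated with a simple curve fit of the form a*exp(b*sigma); the strain and strain-rate values below are made-up placeholders, not the paper's measurements, and the fitted law is only a sketch of the relationship, not the proposed damage-based creep model.

```python
# Fit transient creep strain and steady-state creep rate against axial stress
# with a simple exponential law; all numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def exp_law(sigma, a, b):
    return a * np.exp(b * sigma)

sigma = np.array([60.0, 70.0, 87.0, 95.0])                    # axial stress, MPa
eps_transient = np.array([0.8e-3, 1.1e-3, 1.9e-3, 2.6e-3])    # illustrative strains
rate_steady = np.array([2e-7, 4e-7, 1.2e-6, 2.5e-6])          # illustrative rates, 1/d

(a1, b1), _ = curve_fit(exp_law, sigma, eps_transient, p0=(1e-4, 0.02))
(a2, b2), _ = curve_fit(exp_law, sigma, rate_steady, p0=(1e-8, 0.05))
print(f"transient strain  ~ {a1:.2e}*exp({b1:.3f}*sigma)")
print(f"steady-state rate ~ {a2:.2e}*exp({b2:.3f}*sigma)")
```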
While oceanic and coastal acidification has gained increased attention, long-term pH trends and their drivers in large freshwater systems remain poorly understood. The Laurentian Great Lakes are the world's largest freshwater system, and in many ways resemble marine ecosystems. However, unlike the open ocean and coastal waters where pH has declined due to rising atmospheric CO₂, no significant pH trends have been observed in the Laurentian Great Lakes, despite significant ecosystem changes driven partly by the invasion of dreissenid mussels. This study examined 41 years of field observations from Lake Michigan to investigate the long-term carbonate chemistry dynamics. Observational results revealed substantial declines in both total alkalinity (TA) and dissolved inorganic carbon (DIC) over the four decades. Mussel shell calcification emerged as the primary mechanism behind these declines, accounting for 97% and 47% of the observed changes in TA and DIC, respectively, lowering water column pH by 0.24 units. Elevated carbon accumulation in soft mussel tissues, coupled with long-term changes in the air-water pCO₂ gradient during summer, significantly contributed to long-term DIC variations, explaining 18% and 28% of the lake-wide DIC loss. These two mechanisms also resulted in an overall pH increase of 0.09 and 0.12 units, largely offsetting the calcification-driven pH decrease. These findings bridge a gap in acidification research for large freshwater systems and provide valuable insights for comprehensive lake-wide management strategies.
Accurately assessing the relationship between tree growth and climatic factors is of great importance in dendrochronology. This study evaluated the consistency between alternative climate datasets (including station and gridded data) and actual climate data (fixed-point observations near the sampling sites) in northeastern China's warm temperate zone, and analyzed differences in their correlations with the tree-ring width index. The results were: (1) Gridded temperature data, as well as precipitation and relative humidity data from the Huailai meteorological station, were more consistent with the actual climate data; in contrast, gridded soil moisture content data showed significant discrepancies. (2) Horizontal distance had a greater impact on the representativeness of actual climate conditions than vertical elevation differences. (3) Differences in consistency between alternative and actual climate data also affected their correlations with tree-ring width indices. In some growing season months, correlation coefficients differed significantly, in both magnitude and sign, from those based on actual data. The selection of different alternative climate datasets can lead to biased results in assessing forest responses to climate change, which is detrimental to the management of forest ecosystems in harsh environments. Therefore, the scientific and rational selection of alternative climate data is essential for dendroecological and climatological research.
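A minimal sketch of the dataset-comparison step: correlate the same ring-width index with an on-site ("actual") series and with an alternative series, then compare the coefficients; all series below are synthetic placeholders.

```python
# Compare correlations of a tree-ring width index with two climate datasets.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
ring_width = rng.normal(1.0, 0.2, 40)                     # 40-year ring-width index
actual_temp = ring_width * 0.5 + rng.normal(0, 0.1, 40)   # on-site observations
gridded_temp = actual_temp + rng.normal(0, 0.15, 40)      # alternative dataset

r_actual, p_actual = pearsonr(ring_width, actual_temp)
r_grid, p_grid = pearsonr(ring_width, gridded_temp)
print(f"actual data:  r={r_actual:.2f} (p={p_actual:.3f})")
print(f"gridded data: r={r_grid:.2f} (p={p_grid:.3f})")
# A large gap between the two r values, or a sign flip, is the kind of bias the
# study warns about when substituting climate datasets.
```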
Objective: To investigate the long-term prognosis and postoperative cosmetic outcomes of breast-conserving surgery combined with sentinel lymph node biopsy in patients with early-stage breast cancer, providing a reference for the selection of clinical treatment plans. Methods: A retrospective analysis was conducted on the clinical data of 68 patients with early-stage breast cancer admitted from January 2022 to December 2025. Based on the surgical approach, patients were divided into an observation group (breast-conserving surgery + sentinel lymph node biopsy) and a control group (other surgical methods such as modified radical mastectomy/total mastectomy). Clinical and pathological characteristics, incidence of postoperative complications, follow-up prognosis, and satisfaction with cosmetic outcomes were compared between the two groups. Results: Among the 68 patients, 41 were in the observation group and 27 in the control group. The average age of patients in the observation group was (54.32±8.15) years, while that in the control group was (62.45±9.76) years. The average tumor size in the observation group was (1.86±0.72) cm, compared to (3.21±1.45) cm in the control group. The incidence of postoperative complications in the observation group was 9.76%, significantly lower than that in the control group at 33.33% (P<0.05). The 6-month disease-free survival rate was 95.12% in the observation group and 88.89% in the control group, with no statistically significant difference between the two groups (P>0.05). The excellent and good rate of cosmetic outcomes in the observation group was 87.80%, significantly higher than that in the control group at 29.63% (P<0.05). Conclusion: Breast-conserving surgery combined with sentinel lymph node biopsy for early-stage breast cancer can achieve long-term prognostic outcomes comparable to those of traditional radical surgery, with the advantages of fewer postoperative complications and superior cosmetic results. This approach is worthy of clinical promotion and application, particularly for early-stage breast cancer patients who wish to preserve breast morphology.
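For readers wanting to reproduce the complication-rate comparison, the sketch below applies a chi-square test and Fisher's exact test to the counts implied by 9.76% of 41 and 33.33% of 27 patients; the choice of test is an assumption, as the abstract does not state which was used.

```python
# Complication counts implied by the reported rates: 4/41 vs 9/27.
from scipy.stats import chi2_contingency, fisher_exact

table = [[4, 41 - 4],    # observation group: complications, no complications
         [9, 27 - 9]]    # control group
chi2, p_chi, _, _ = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)
print(f"chi-square p={p_chi:.3f}, Fisher exact p={p_fisher:.3f}")
# Both p-values are expected to fall below 0.05, consistent with the abstract.
```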
Photoacoustic computed tomography is a novel imaging technique that combines high absorption contrast and deep tissue penetration capability, enabling comprehensive three-dimensional imaging of biological targets. However, the increasing demand for higher resolution and real-time imaging results in significant data volume, limiting the data storage, transmission, and processing efficiency of the system. Therefore, there is an urgent need for an effective method to compress the raw data without compromising image quality. This paper presents a photoacoustic computed tomography 3D data compression method and system based on a Wavelet-Transformer. The method is built on a cooperative compression framework that integrates wavelet hard coding with deep learning-based soft decoding. It combines the multiscale analysis capability of wavelet transforms with the global feature modeling advantage of Transformers, achieving high-quality data compression and reconstruction. Experimental results using k-Wave simulation suggest that the proposed compression system has advantages under extreme compression conditions, achieving a raw data compression ratio of up to 1:40. Furthermore, a three-dimensional data compression experiment using in vivo mouse data demonstrated that the maximum peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) values of reconstructed images reached 38.60 and 0.9583, effectively overcoming detail loss and artifacts introduced by raw data compression. All the results suggest that the proposed system can significantly reduce storage requirements and hardware cost, enhancing computational efficiency and image quality. These advantages support the development of photoacoustic computed tomography toward higher efficiency, real-time performance, and intelligent functionality.
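A hedged sketch of the wavelet "hard coding" stage alone, using PyWavelets to decompose a synthetic A-line, keep only the largest coefficients, and measure the resulting compression ratio; the wavelet family, keep ratio, and test signal are assumptions, and the learned Transformer-based soft decoder is not reproduced here.

```python
# Wavelet hard-thresholding compression of a single raw channel (illustrative).
import numpy as np
import pywt

rng = np.random.default_rng(1)
signal = rng.normal(0, 1, 1024) + np.sin(np.linspace(0, 40 * np.pi, 1024))

coeffs = pywt.wavedec(signal, "db4", level=5)          # multiscale decomposition
flat, slices = pywt.coeffs_to_array(coeffs)

keep = 0.05                                            # keep the top 5% of coefficients
threshold = np.quantile(np.abs(flat), 1 - keep)
compressed = np.where(np.abs(flat) >= threshold, flat, 0.0)   # hard thresholding

recon = pywt.waverec(pywt.array_to_coeffs(compressed, slices, output_format="wavedec"), "db4")
ratio = flat.size / np.count_nonzero(compressed)
rmse = np.sqrt(np.mean((signal - recon[: signal.size]) ** 2))
print(f"compression ratio ~ 1:{ratio:.0f}, reconstruction RMSE = {rmse:.3f}")
```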
AIM: To investigate the long-term outcomes in acute primary angle closure (APAC) patients treated with lens extraction (LE) surgery and to identify risk factors for glaucomatous optic neuropathy (GON). METHODS: In this longitudinal observational study, detailed medical histories of APAC patients and comprehensive ophthalmic examinations at final follow-up were collected. Logistic regression analysis was performed to identify predictors of blindness. Univariate and multivariate linear regression analyses were conducted to determine risk factors associated with visual outcomes. RESULTS: This study included 39 affected eyes of 31 subjects (26 females) with an average age of 74.1±8.0 y. At 6.7±4.2 y after the APAC attack, 2 (5.7%) eyes had best-corrected visual acuity (VA) worse than 3/60. Advanced glaucomatous visual field loss was observed in 15 (39.5%) affected eyes and 5 (25.0%) fellow eyes. Nine affected eyes (23.7%) had GON, and 11 (28.9%) were blind. Six (15.4%) affected eyes and 2 (9.1%) fellow eyes had suspicious progression. A significantly higher blindness rate was observed in factory workers compared to office workers. Logistic regression identified that worse VA at attack (OR 10.568, 95%CI 1.288-86.695; P=0.028) and worse early postoperative VA (OR 13.214, 95%CI 1.157-150.881; P=0.038) were risk factors for blindness. Multivariate regression showed that longer duration of elevated intraocular pressure (P=0.004) and worse early postoperative VA (P=0.009) were associated with worse visual outcomes. CONCLUSION: Despite LE surgery, some APAC patients experience continued deterioration of visual function. Lifelong monitoring is necessary. Target pressure and progression rates should be re-evaluated during follow-up.
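A generic sketch of how odds ratios like those reported above are obtained: fit a logistic regression for blindness against a predictor and report exp(coefficient) with its 95% confidence interval; the data here are synthetic, not the study cohort.

```python
# Logistic regression with odds ratio and 95% CI via statsmodels (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
va_at_attack = rng.normal(1.0, 0.5, n)    # worse VA = larger logMAR value (assumed)
blind = (rng.random(n) < 1 / (1 + np.exp(-(va_at_attack - 1.2) * 2))).astype(int)

X = sm.add_constant(pd.DataFrame({"va_at_attack": va_at_attack}))
fit = sm.Logit(blind, X).fit(disp=0)

or_est = np.exp(fit.params["va_at_attack"])
ci_low, ci_high = np.exp(fit.conf_int().loc["va_at_attack"])
print(f"OR {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), P={fit.pvalues['va_at_attack']:.3f}")
```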
Taking the rural low-income population of Zhejiang Province as its subject, this paper examines how to build a sustainable income-growth mechanism and identify feasible implementation paths within the context of the common prosperity strategy. The research identifies key obstacles to income expansion, including an undiversified industrial structure, insufficient human capital, and a lack of robust social protection. These call for systemic solutions featuring institutional innovation, resource consolidation, and capability enhancement. Building on Zhejiang's experience as a common prosperity demonstration zone, the article constructs an integrated framework centered on four pillars: industrial empowerment, education upgrading, social security reinforcement, and digital coordination. It further offers concrete policy proposals involving the cultivation of localized industries, vocational skill training, enhanced safety nets, and the adoption of digital tools. The study thus offers both theoretical insights and practical paradigms for tackling the challenge of raising incomes in low-income rural areas.
Amid the increasing demand for data sharing, the need for flexible, secure, and auditable access control mechanisms has garnered significant attention in the academic community. However, blockchain-based ciphertext-policy attribute-based encryption (CP-ABE) schemes still face cumbersome ciphertext re-encryption and insufficient oversight when handling dynamic attribute changes and cross-chain collaboration. To address these issues, we propose a dynamic-permission attribute-based encryption scheme for multi-chain collaboration. This scheme incorporates a multi-authority architecture for distributed attribute management and integrates an attribute revocation and granting mechanism that eliminates the need for ciphertext re-encryption, effectively reducing both computational and communication overhead. It leverages the InterPlanetary File System (IPFS) for off-chain data storage and constructs a cross-chain regulatory framework, comprising a Hyperledger Fabric business chain and a FISCO BCOS regulatory chain, to record changes in decryption privileges and access behaviors in an auditable manner. Security analysis shows that the scheme achieves selective indistinguishability under chosen-plaintext attack (sIND-CPA) under the decisional q-Parallel Bilinear Diffie-Hellman Exponent assumption (q-PBDHE). In the performance and experimental evaluations, we compared the proposed scheme with several advanced schemes. The results show that, while preserving security, the proposed scheme achieves higher encryption/decryption efficiency and lower storage overhead for ciphertexts and keys.
With the popularization of new technologies, telephone fraud has become the main means of stealing money and personal identity information. Taking inspiration from the website authentication mechanism, we propose an end-to-end data modem scheme that transmits the caller's digital certificates through a voice channel for the recipient to verify the caller's identity. Encoding useful information through voice channels is very difficult without the assistance of telecommunications providers. For example, speech activity detection may quickly classify encoded signals as non-speech signals and reject input waveforms. To address this issue, we propose a novel modulation method based on linear frequency modulation that encodes 3 bits per symbol by varying its frequency, shape, and phase, alongside a lightweight MobileNetV3-Small-based demodulator for efficient and accurate signal decoding on resource-constrained devices. This method leverages the unique characteristics of linear frequency modulation signals, making them more easily transmitted and decoded in speech channels. To ensure reliable data delivery over unstable voice links, we further introduce a robust framing scheme with delimiter-based synchronization, a sample-level position remedying algorithm, and a feedback-driven retransmission mechanism. We have validated the feasibility and performance of our system through expanded real-world evaluations, demonstrating that it outperforms existing advanced methods in terms of robustness and data transfer rate. This technology establishes the foundational infrastructure for reliable certificate delivery over voice channels, which is crucial for achieving strong caller authentication and preventing telephone fraud at its root cause.
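One plausible way an LFM symbol could carry 3 bits, sketched with SciPy's chirp generator: one bit selects the frequency band, one the chirp direction ("shape"), and one the initial phase. This mapping, the sample rate, and the symbol length are assumptions for illustration, not the paper's actual codebook.

```python
# Generate one linear-frequency-modulation symbol encoding 3 bits.
import numpy as np
from scipy.signal import chirp

FS = 8000            # voice-channel sample rate, Hz (assumed)
SYMBOL_LEN = 0.02    # 20 ms per symbol (assumed)

def lfm_symbol(bits):
    """bits: tuple of 3 ints in {0,1} -> one LFM symbol as float samples."""
    band_bit, dir_bit, phase_bit = bits
    f_lo, f_hi = (500, 1200) if band_bit == 0 else (1500, 2200)   # frequency bit
    f0, f1 = (f_lo, f_hi) if dir_bit == 0 else (f_hi, f_lo)       # up/down chirp bit
    phi = 0 if phase_bit == 0 else 180                            # phase bit, degrees
    t = np.arange(int(FS * SYMBOL_LEN)) / FS
    return chirp(t, f0=f0, t1=SYMBOL_LEN, f1=f1, method="linear", phi=phi)

symbol = lfm_symbol((1, 0, 1))   # encode the 3-bit message 0b101
print(symbol.shape)              # (160,) samples: 20 ms at 8 kHz
```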
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms (Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship) under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the obtained results show that our proposed model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component in the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area under the curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
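A hedged PyTorch sketch of the three-part composite loss described above; the exact weighting, the noise model, and the variance target are assumptions chosen for illustration, not the authors' formulation.

```python
# Composite imputation loss: masked MSE + noise-aware term + variance penalty.
import torch

def composite_loss(x_hat, x_true, miss_mask, x_noisy_hat, lam_noise=0.1, lam_var=0.01):
    """x_hat: reconstruction of the clean input; x_noisy_hat: reconstruction of a
    noise-corrupted copy; miss_mask: 1 where the entry was (simulated as) missing."""
    # (i) masked MSE: penalize error only on the missing entries
    masked_mse = ((x_hat - x_true) ** 2 * miss_mask).sum() / miss_mask.sum().clamp(min=1)
    # (ii) noise-aware term: clean and corrupted inputs should reconstruct alike
    noise_reg = ((x_hat - x_noisy_hat) ** 2).mean()
    # (iii) variance penalty: discourage collapsed, near-constant reconstructions
    var_pen = (x_hat.var(dim=0) - x_true.var(dim=0)).abs().mean()
    return masked_mse + lam_noise * noise_reg + lam_var * var_pen

# toy usage with random tensors standing in for an autoencoder's outputs
x_true = torch.randn(32, 10)
mask = (torch.rand(32, 10) < 0.3).float()
loss = composite_loss(x_true + 0.1 * torch.randn(32, 10), x_true, mask,
                      x_true + 0.2 * torch.randn(32, 10))
print(loss.item())
```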
Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed-Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
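The light-client intuition behind DAS can be sketched with the textbook sampling bound: if an adversary must withhold at least a fraction f of the erasure-coded symbols to make a block unrecoverable, a client sampling s symbols uniformly at random misses every withheld position with probability at most (1 - f)^s. This is the generic argument only, not ISTIRDA's tighter ISTIR-specific analysis.

```python
# Back-of-the-envelope confidence that random sampling detects unavailability.
def das_confidence(withheld_fraction: float, samples: int) -> float:
    """Probability that at least one sampled symbol hits a withheld position."""
    return 1.0 - (1.0 - withheld_fraction) ** samples

# e.g. with a rate-1/2 Reed-Solomon code, at least 50% of symbols must be
# withheld; 30 random samples already give roughly 1 - 2**-30 confidence.
for s in (10, 20, 30):
    print(s, f"{das_confidence(0.5, s):.10f}")
```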
With the accelerating aging process of China's population, the demand for community elderly care services has shown diversified and personalized characteristics. However, problems such as insufficient total care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core advantages including data collection, analysis and mining, and accurate prediction, provides a new solution for the allocation of community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimal allocation strategies such as building a big data analysis platform and accurately grasping the elderly's care needs, striving to provide operable path references for the construction of community elderly care service systems, promoting the early realization of the elderly care service goal of "adequate support and proper care for the elderly", and boosting the high-quality development of China's elderly care service industry.
Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during the data collection process, resulting in temporal misalignment or displacement. Due to these factors, the node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and subsequently degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. Firstly, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Secondly, a multi-channel graph encoding layer is used to dynamically optimize the structural properties of the multivariate coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates anomalies caused by temporal dependency misalignment.
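As a simple stand-in for the dynamic threshold selection step, the sketch below flags an anomaly when the score exceeds a rolling mean plus k standard deviations; the real model's thresholding rule may differ, and the window length and k value are assumptions.

```python
# Adaptive (rolling) threshold over a stream of anomaly scores.
import numpy as np

def dynamic_threshold(scores, window=50, k=3.0):
    scores = np.asarray(scores, dtype=float)
    flags = np.zeros(scores.size, dtype=bool)
    for t in range(window, scores.size):
        hist = scores[t - window:t]
        flags[t] = scores[t] > hist.mean() + k * hist.std()
    return flags

rng = np.random.default_rng(3)
scores = rng.normal(0, 1, 500)
scores[400] += 8.0                                # inject one obvious anomaly
print(np.where(dynamic_threshold(scores))[0])     # expected to include index 400
```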
As an important resource in data links, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially considering the tremendous increase in the number of nodes and diverse communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly address differentiated access control in data links. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound of the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets satisfying the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, which indicates a significant improvement in anti-interference and anti-eavesdropping capabilities.
Data center industries have been facing huge energy challenges due to escalating power consumption and associated carbon emissions. In the context of carbon neutrality, the integration of data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore data centers' operational flexibility. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center micro-grids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, this paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, this paper proposes a two-stage, multi-time-scale optimization scheduling framework that jointly coordinates computing workload time-shifting in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable energy utilization, improves the operational economy of the data center micro-grid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
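A first-order sketch of a time-varying room temperature model coupled to refrigeration power; the capacitance, thermal resistance, COP, and control rule below are placeholder assumptions, and the paper's actual model may include additional terms.

```python
# Explicit-Euler simulation of computer-room temperature driven by IT load,
# refrigeration power, and heat exchange with outdoor air (toy parameters).
import numpy as np

C = 1.0e8      # room thermal capacitance, J/K (assumed)
R = 0.005      # thermal resistance to outdoors, K/W (assumed)
COP = 3.5      # refrigeration coefficient of performance (assumed)
DT = 300.0     # time step, s (5 min)

def step_temperature(T_room, T_out, p_it, p_cool):
    """One step: IT load heats the room; cooling and the envelope remove heat."""
    heat_in = p_it                      # W, essentially all IT power becomes heat
    heat_removed = COP * p_cool         # W, cooling effect of refrigeration power
    heat_leak = (T_room - T_out) / R    # W, exchange with outdoor air
    return T_room + DT * (heat_in - heat_removed - heat_leak) / C

# simulate 24 h with constant IT load and a crude intra-day cooling rule
T, T_out, p_it = 24.0, 30.0, 400e3
for _ in range(int(24 * 3600 / DT)):
    p_cool = 150e3 if T > 25.0 else 60e3
    T = step_temperature(T, T_out, p_it, p_cool)
print(f"end-of-day room temperature ~ {T:.1f} degC")
```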
Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class IDS using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized and model-ready inputs. Critical dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was recorded with an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model demonstrated a clear improvement in detecting the attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
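A condensed sketch of the balancing and stacking stage (HHO feature selection omitted): SMOTE oversamples minority attack classes on the training split only, and a stacked ensemble of the three named base learners is then fit. The library choices (scikit-learn, imbalanced-learn, xgboost) and the synthetic data are assumptions for illustration.

```python
# SMOTE balancing followed by a stacked XGBoost/SVM/RF ensemble (toy data).
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                           n_informative=8, weights=[0.8, 0.15, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # balance training set only

stack = StackingClassifier(
    estimators=[("xgb", XGBClassifier(n_estimators=100)),
                ("svm", SVC(probability=True)),
                ("rf", RandomForestClassifier(n_estimators=100))],
    final_estimator=LogisticRegression(max_iter=1000))
stack.fit(X_bal, y_bal)
print(f"held-out accuracy: {stack.score(X_te, y_te):.3f}")
```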
Long-term manure application has the potential to alleviate soil acidification and increase carbon sequestration and nutrient availability, thus improving cropland fertility. However, the mechanisms behind greenhouse gas N₂O emissions from acidic soil mediated by long-term manure application remain poorly understood. Herein, we investigated N₂O emission and its linkage with gross N mineralization and nitrification rates, as well as nitrifying and denitrifying microbes, in an acidic upland soil subjected to 36 years of fertilization treatments, including an unfertilized control (CK), inorganic fertilizer (F), a 2× rate of inorganic fertilizer (2F), manure (M), and the combination of inorganic fertilizer and manure (FM). Compared to the CK treatment (1.34 μg N kg⁻¹ d⁻¹), fertilization strongly increased N₂O emissions by 34-fold on average, with more pronounced increases in the manure-amended treatments (10.6-169 μg N kg⁻¹ d⁻¹) than in the inorganic fertilizer treatments (3.26-5.51 μg N kg⁻¹ d⁻¹). The manure amendment-stimulated N₂O emissions were highly associated with increased soil pH, mean weight diameter of soil aggregates, substrate availability (e.g., particulate organic carbon, NO₃⁻, and available phosphorus), gross N mineralization rates, denitrifier abundances, and the (nirK+nirS)/nosZ ratio. These findings suggest that the increased N₂O emissions primarily resulted from alleviated acidification, increased substrate availability, and improved soil structure, thus enhancing microbial N mineralization and favoring N₂O-producing denitrifiers over N₂O consumers. Moreover, ammonia-oxidizing bacteria (AOB) rather than ammonia-oxidizing archaea (AOA) positively correlated with soil NO₃⁻ concentration and N₂O emissions, indicating that nitrification indirectly contributed to N₂O production by supplying NO₃⁻ for denitrification. Collectively, manure amendment potentially stimulates N₂O emissions, primarily as a result of alleviated soil acidification and increased substrate availability, thus enhancing N mineralization and denitrifier-mediated N₂O production. Our findings suggest that consideration should be given to the greenhouse gas budgets of agricultural ecosystems when applying manure to manage the pH and fertility of acidic soils.
To ensure the safe and stable operation of rotating machinery, intelligent fault diagnosis methods hold significant research value. However, existing diagnostic approaches largely rely on manual feature extraction and expert experience, which limits their adaptability under variable operating conditions and strong noise environments, severely affecting the generalization capability of diagnostic models. To address this issue, this study proposes a multimodal fusion fault diagnosis framework based on Mel-spectrograms and automated machine learning (AutoML). The framework first extracts fault-sensitive Mel time-frequency features from acoustic signals and fuses them with statistical features of vibration signals to construct complementary fault representations. On this basis, automated machine learning techniques are introduced to enable end-to-end diagnostic workflow construction and optimal model configuration acquisition. Finally, diagnostic decisions are achieved by automatically integrating the predictions of multiple high-performance base models. Experimental results on a centrifugal pump vibration and acoustic dataset demonstrate that the proposed framework achieves high diagnostic accuracy under noise-free conditions and maintains strong robustness under noisy interference, validating its efficiency, scalability, and practical value for rotating machinery fault diagnosis.
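A hedged sketch of the multimodal feature-fusion step: log-Mel statistics from the acoustic channel are concatenated with simple time-domain statistics of the vibration channel. The sample rate, Mel settings, and choice of statistics are assumptions, and the AutoML model search itself is not shown.

```python
# Fuse acoustic log-Mel features with vibration statistics (toy signals).
import numpy as np
import librosa

SR = 16000
rng = np.random.default_rng(4)
acoustic = rng.normal(0, 1, SR)        # 1 s of microphone signal (placeholder)
vibration = rng.normal(0, 1, SR)       # 1 s of accelerometer signal (placeholder)

# acoustic branch: log-Mel spectrogram summarized per Mel band
mel = librosa.feature.melspectrogram(y=acoustic, sr=SR, n_mels=64)
log_mel = librosa.power_to_db(mel)
acoustic_feats = np.concatenate([log_mel.mean(axis=1), log_mel.std(axis=1)])

# vibration branch: a few common time-domain statistics
rms = np.sqrt(np.mean(vibration ** 2))
kurtosis = np.mean((vibration - vibration.mean()) ** 4) / vibration.std() ** 4
crest = np.max(np.abs(vibration)) / rms
vibration_feats = np.array([rms, kurtosis, crest, vibration.std()])

fused = np.concatenate([acoustic_feats, vibration_feats])   # input to the model search
print(fused.shape)   # (132,) = 64 Mel means + 64 Mel stds + 4 vibration statistics
```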
Funding: Supported in part by the Research Fund of the Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education (EBME25-F-08).
Abstract: Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling a small number of random codeword symbols. Built on ISTIR, an improved Reed–Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. The paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing the bandwidth and latency pressure on lightweight nodes.
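The core intuition behind sampling-based availability certification, independent of the ISTIRDA construction itself, is that each random symbol query a light client makes shrinks the probability that a withholding adversary goes undetected. A small back-of-the-envelope helper, assuming an adversary must hide at least a fixed fraction of the erasure-coded data to block reconstruction:

```python
import math

def samples_for_confidence(target_conf, withheld_fraction=0.5):
    """Number of independent random symbol samples needed so that, if at least
    `withheld_fraction` of the extended codeword is withheld, the chance that
    every sample still succeeds is at most 1 - target_conf.

    P(all k samples hit available symbols) <= (1 - withheld_fraction) ** k
    """
    return math.ceil(math.log(1 - target_conf) / math.log(1 - withheld_fraction))

print(samples_for_confidence(0.99))        # 7 samples for 99% confidence at 50% withholding
print(samples_for_confidence(0.999999))    # about 20 samples for a one-in-a-million failure chance
```

ISTIRDA's contribution lies in making each such query cheap to open and verify; the arithmetic above only shows why a handful of queries already yields high confidence.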
Abstract: With the accelerating aging of China's population, the demand for community elderly care services has become increasingly diversified and personalized. However, problems such as an insufficient total supply of care service resources, uneven distribution, and pronounced supply-demand contradictions have seriously affected service quality. Big data technology, with core advantages in data collection, analysis and mining, and accurate prediction, provides a new solution for the allocation of community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimized allocation strategies such as building a big data analysis platform and accurately identifying the care needs of the elderly, aiming to provide actionable references for the construction of community elderly care service systems, to promote the early realization of the elderly care service goal of "adequate support and proper care for the elderly", and to boost the high-quality development of China's elderly care service industry.
Funding: Supported by the Natural Science Foundation of Qinghai Province (2025-ZJ-994M), the Scientific Research Innovation Capability Support Project for Young Faculty (SRICSPYF-BS2025007), and the National Natural Science Foundation of China (62566050).
Abstract: Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during collection, resulting in temporal misalignment or displacement. As a consequence, node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and in turn degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. First, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Second, a multi-channel graph encoding layer dynamically optimizes the structural properties of the multivariate coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates anomalies caused by temporal dependency misalignment.
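The abstract does not spell out the recommendation strategy, so the following is only a plausible stand-in: each variable "recommends" the few other variables most strongly correlated with it, and the union of these recommendations forms the coupling network that the graph encoder then refines.

```python
import numpy as np

def recommend_coupled_pairs(X, top_k=3):
    """Build a recommendation-style adjacency matrix: each variable is linked
    to the top_k variables with the strongest absolute correlation to it.

    X: array of shape (time_steps, n_vars) holding the multivariate series.
    Returns a symmetric 0/1 adjacency matrix for the coupling network.
    """
    n_vars = X.shape[1]
    corr = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(corr, 0.0)                      # ignore self-coupling
    adj = np.zeros((n_vars, n_vars), dtype=int)
    for i in range(n_vars):
        neighbours = np.argsort(corr[i])[-top_k:]    # strongest couplings for variable i
        adj[i, neighbours] = 1
    return np.maximum(adj, adj.T)                    # symmetrise the network
```

Any lag-aware similarity measure could replace plain correlation here; the point is only that the graph is seeded from pairwise coupling strength rather than learned from scratch.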
Funding: Supported by the National Science Foundation of China (No. 62171387), the Science and Technology Program of Sichuan Province (No. 2024NSFSC0468), and the China Postdoctoral Science Foundation (No. 2019M663475).
Abstract: As an important resource in the data link, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially given the tremendous increase in the number of nodes and their diverse communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly address differentiated access control in the data link. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound on the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets meeting the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, indicating a significant improvement in anti-interference and anti-eavesdropping capabilities.
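The two properties the scheme depends on, conflict-freeness and slot utilization, are easy to check mechanically for any candidate sequence set. The sketch below assumes a sequence is simply the list of slot indices a node uses frame by frame; the toy cyclic-shift example is illustrative and is not one of the optimal constructions from the paper.

```python
import numpy as np

def conflict_free(sequences):
    """True if, in every frame (column), no two nodes select the same slot."""
    seq = np.asarray(sequences)              # shape: (n_nodes, n_frames)
    return all(len(set(frame)) == len(frame) for frame in seq.T)

def slot_utilization(sequences, n_slots):
    """Average fraction of available slots actually occupied per frame."""
    seq = np.asarray(sequences)
    used = [len(set(frame)) for frame in seq.T]
    return float(np.mean(used)) / n_slots

# hypothetical toy set: 4 nodes hopping over 4 slots via cyclic shifts
toy = [[(i + t) % 4 for t in range(8)] for i in range(4)]
assert conflict_free(toy)
print(slot_utilization(toy, n_slots=4))      # 1.0 -> every slot used in every frame
```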
Funding: Supported by the Science and Technology Standard Project of Guangdong Electric Power Design Institute (ER11301W, ER11811W).
Abstract: Data center industries have been facing huge energy challenges due to escalating power consumption and the associated carbon emissions. In the context of carbon neutrality, the integration of data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore the operational flexibility of data centers. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center micro-grids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, the paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, the paper proposes a two-stage, multi-time-scale optimization scheduling framework that jointly coordinates computing workload time-shifting in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable energy utilization, improves the operational economy of the data center micro-grid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
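A temperature model coupled to refrigeration power is often written as a first-order thermal balance; the sketch below assumes that simple form, and every parameter value (thermal capacitance, envelope resistance, COP, outdoor temperature) is illustrative rather than taken from the paper.

```python
def room_temperature_step(T_prev, P_it, P_cool, dt=300.0,
                          C_th=2.0e7, R_th=0.002, T_out=30.0, cop=3.0):
    """One time step of an assumed first-order computer-room temperature model.

    The room's thermal mass C_th (J/K) is heated by IT power P_it (W), cooled by
    refrigeration power P_cool (W) scaled by an assumed COP, and exchanges heat
    with the outdoors through thermal resistance R_th (K/W) over a dt-second step.
    """
    heat_in = P_it                          # heat dissipated by the servers
    heat_out = cop * P_cool                 # heat removed by the refrigeration system
    leakage = (T_out - T_prev) / R_th       # heat flow through the building envelope
    dT = (heat_in - heat_out + leakage) * dt / C_th
    return T_prev + dT

# e.g. 500 kW of IT load and 180 kW of cooling power over one 5-minute step
print(room_temperature_step(T_prev=24.0, P_it=5.0e5, P_cool=1.8e5))
```

Within an intra-day dispatch, constraints on this recursion (temperature bounds and cooling power limits) are what turn refrigeration into a controllable flexible resource.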
Funding: Funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R104), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class intrusion detection using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized and model-ready inputs. Dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was recorded with an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
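The preprocessing, resampling, and stacking stages map directly onto standard scikit-learn, imbalanced-learn, and XGBoost components. The sketch below assumes HHO feature selection has already been applied to X_train, that labels are integer-encoded, and that the meta-learner and hyperparameters (not specified in the abstract) are placeholders.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import RobustScaler, QuantileTransformer
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from imblearn.over_sampling import SMOTE
from xgboost import XGBClassifier

def build_and_fit(X_train, y_train):
    """Fit the assumed preprocess -> SMOTE -> stacked-ensemble chain."""
    # robust scaling followed by a quantile transform to tame outliers and skew
    preprocess = make_pipeline(RobustScaler(),
                               QuantileTransformer(output_distribution="normal"))
    X_scaled = preprocess.fit_transform(X_train)

    # balance under-represented attack classes with synthetic samples
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_scaled, y_train)

    # stacked ensemble with XGBoost, SVM, and Random Forest base learners;
    # the logistic-regression meta-learner is an assumption
    stack = StackingClassifier(
        estimators=[("xgb", XGBClassifier(eval_metric="mlogloss")),
                    ("svm", SVC(probability=True)),
                    ("rf", RandomForestClassifier(n_estimators=200))],
        final_estimator=LogisticRegression(max_iter=1000))
    stack.fit(X_bal, y_bal)
    return preprocess, stack
```

Note that SMOTE is applied only to the training split, as in the study, so evaluation data remain untouched by synthetic augmentation.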
Funding: Financially supported by the National Science & Technology Fundamental Resources Investigation Project of China (2021FY100501) and the Youth Innovation of the Chinese Academy of Agricultural Sciences (Y2023QC16).
Abstract: Long-term manure application has the potential to alleviate soil acidification and increase carbon sequestration and nutrient availability, thus improving cropland fertility. However, the mechanisms behind greenhouse gas N₂O emissions from acidic soil mediated by long-term manure application remain poorly understood. Herein, we investigated N₂O emission and its linkage with gross N mineralization and nitrification rates, as well as nitrifying and denitrifying microbes, in an acidic upland soil subjected to 36-year fertilization treatments, including an unfertilized control (CK), inorganic fertilizer (F), 2× rate of inorganic fertilizer (2F), manure (M), and the combination of inorganic fertilizer and manure (FM) treatments. Compared to the CK treatment (1.34 μg N kg⁻¹ d⁻¹), fertilization strongly increased N₂O emissions by 34-fold on average, with more pronounced increases in the manure-amended treatments (10.6-169 μg N kg⁻¹ d⁻¹) than in the inorganic fertilizer treatments (3.26-5.51 μg N kg⁻¹ d⁻¹). The manure amendment-stimulated N₂O emissions were highly associated with increased soil pH, mean weight diameter of soil aggregates, substrate availability (e.g., particulate organic carbon, NO₃⁻ and available phosphorus), gross N mineralization rates, denitrifier abundances and the (nirK+nirS)/nosZ ratio. These findings suggest that the increased N₂O emissions primarily resulted from alleviated acidification, increased substrate availability and improved soil structure, which enhanced microbial N mineralization and favored N₂O-producing denitrifiers over N₂O consumers. Moreover, ammonia-oxidizing bacteria (AOB) rather than ammonia-oxidizing archaea (AOA) correlated positively with soil NO₃⁻ concentration and N₂O emissions, indicating that nitrification indirectly contributed to N₂O production by supplying NO₃⁻ for denitrification. Collectively, manure amendment potentially stimulates N₂O emissions, primarily as a result of alleviated soil acidification and increased substrate availability, which enhance N mineralization and denitrifier-mediated N₂O production. Our findings suggest that the greenhouse gas budgets of agricultural ecosystems should be considered when applying manure to manage the pH and fertility of acidic soils.
Funding: Supported in part by the National Natural Science Foundation of China under Grants 52475102 and 52205101, in part by the Guangdong Basic and Applied Basic Research Foundation under Grant 2023A1515240021, in part by the Young Talent Support Project of Guangzhou Association for Science and Technology (QT-2024-28), and in part by the Youth Development Initiative of Guangdong Association for Science and Technology (SKXRC2025254).
Abstract: To ensure the safe and stable operation of rotating machinery, intelligent fault diagnosis methods hold significant research value. However, existing diagnostic approaches rely heavily on manual feature extraction and expert experience, which limits their adaptability under variable operating conditions and strong noise and severely affects the generalization capability of diagnostic models. To address this issue, this study proposes a multimodal fusion fault diagnosis framework based on Mel spectrograms and automated machine learning (AutoML). The framework first extracts fault-sensitive Mel time-frequency features from acoustic signals and fuses them with statistical features of vibration signals to construct complementary fault representations. On this basis, automated machine learning techniques are introduced to enable end-to-end construction of the diagnostic workflow and acquisition of the optimal model configuration. Finally, diagnostic decisions are made by automatically integrating the predictions of multiple high-performance base models. Experimental results on a centrifugal pump vibration and acoustic dataset demonstrate that the proposed framework achieves high diagnostic accuracy under noise-free conditions and maintains strong robustness under noisy interference, validating its efficiency, scalability, and practical value for rotating machinery fault diagnosis.
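The feature-level fusion step can be sketched with librosa for the acoustic branch and plain NumPy statistics for the vibration branch. The particular Mel configuration, the choice of time-domain statistics, and the simple concatenation fusion are assumptions; the abstract does not specify them.

```python
import numpy as np
import librosa

def acoustic_mel_features(audio, sr, n_mels=64):
    """Summarize an acoustic signal by per-band mean and std of its log-Mel spectrogram."""
    mel = librosa.feature.melspectrogram(y=audio, sr=sr, n_mels=n_mels)
    log_mel = librosa.power_to_db(mel)
    return np.concatenate([log_mel.mean(axis=1), log_mel.std(axis=1)])

def vibration_stat_features(vib):
    """Common time-domain statistics of a vibration signal."""
    rms = np.sqrt(np.mean(vib ** 2))
    peak = np.max(np.abs(vib))
    kurtosis = np.mean((vib - vib.mean()) ** 4) / (vib.std() ** 4 + 1e-12)
    crest = peak / (rms + 1e-12)
    return np.array([rms, peak, kurtosis, crest])

def fused_features(audio, sr, vib):
    """Feature-level fusion of the two modalities; the resulting vector would be
    handed to an AutoML framework that selects and ensembles the base models."""
    return np.concatenate([acoustic_mel_features(audio, sr),
                           vibration_stat_features(vib)])
```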