In the realm of subsurface flow simulations, deep-learning-based surrogate models have emerged as a promising alternative to traditional simulation methods, especially in addressing complex optimization problems. However, a significant challenge lies in the necessity of numerous high-fidelity training simulations to construct these deep-learning models, which limits their application to field-scale problems. To overcome this limitation, we introduce a training procedure that leverages transfer learning with multi-fidelity training data to construct surrogate models efficiently. The procedure begins with the pre-training of the surrogate model using a relatively large amount of data that can be efficiently generated from upscaled coarse-scale models. Subsequently, the model parameters are fine-tuned with a much smaller set of high-fidelity simulation data. For the cases considered in this study, this method leads to about a 75% reduction in total computational cost compared with the traditional training approach, without any sacrifice of prediction accuracy. In addition, a dedicated well-control embedding model is introduced into the traditional U-Net architecture to improve the surrogate model's prediction accuracy, which is shown to be particularly effective when dealing with large-scale reservoir models under time-varying well-control parameters. Comprehensive results and analyses are presented for the prediction of well rates and the pressure and saturation states of a 3D synthetic reservoir system. Finally, the proposed procedure is applied to a field-scale production optimization problem. The trained surrogate model is shown to provide excellent generalization capabilities during the optimization process, in which the final optimized net present value is much higher than the values spanned by the training data.
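As a rough illustration of the two-stage training idea described above, the following PyTorch sketch pre-trains a stand-in surrogate on abundant coarse-scale data and then fine-tunes it on a small high-fidelity set. The toy model, synthetic loaders, epoch counts, and learning rates are all illustrative assumptions, not the paper's implementation.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for the U-Net surrogate; the real architecture maps
# well controls to pressure/saturation fields.
surrogate = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 16))

def make_loader(n):  # synthetic (controls, states) pairs, for illustration only
    x = torch.randn(n, 4)
    return DataLoader(TensorDataset(x, torch.randn(n, 16)), batch_size=32)

def train(model, loader, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for controls, states in loader:
            opt.zero_grad()
            torch.nn.functional.mse_loss(model(controls), states).backward()
            opt.step()

# Stage 1: pre-train on many cheap coarse-scale simulations.
train(surrogate, make_loader(2000), epochs=20, lr=1e-3)
# Stage 2: fine-tune on a small high-fidelity set, with a lower learning
# rate to preserve the pre-trained features.
train(surrogate, make_loader(100), epochs=20, lr=1e-4)
```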
Hydraulic fracturing technology has achieved remarkable results in improving the production of tight gas reservoirs, but its effectiveness is governed by the joint action of multiple complex factors. Traditional analysis methods have limitations in dealing with these complex and interrelated factors, and it is difficult for them to fully reveal the actual contribution of each factor to production. Machine-learning-based methods explore the complex mapping relationships within large amounts of data to provide data-driven insights into the key factors driving production. In this study, a data-driven PCA-RF-VIM (Principal Component Analysis-Random Forest-Variable Importance Measures) approach to analyzing the importance of features is proposed to identify the key factors driving post-fracturing production. Four types of parameters, including log parameters, geological and reservoir physical parameters, hydraulic fracturing design parameters, and reservoir stimulation parameters, were input into the PCA-RF-VIM model. The model was trained using 6-fold cross-validation and grid search, and the relative importance ranking of each factor was finally obtained. To verify the validity of the PCA-RF-VIM model, a consolidated model that combines three other independent data-driven methods (the Pearson correlation coefficient, the RF feature-importance analysis method, and the XGBoost feature-importance analysis method) is applied for comparison with the PCA-RF-VIM model. A comparison of the two models shows that they contain almost the same parameters in the top ten, with only minor differences in one parameter. In combination with the reservoir characteristics, the reasonableness of the PCA-RF-VIM model is verified, and the importance ranking of the parameters by this method is more consistent with the reservoir characteristics of the study area. Ultimately, ten parameters are selected as the controlling factors with the potential to influence post-fracturing gas production, as the combined importance of these top ten parameters in driving natural gas production is 91.95%. Identifying these ten controlling factors provides engineers with new insight into reservoir selection for fracturing stimulation and fracturing parameter optimization to improve fracturing efficiency and productivity.
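The random-forest importance-ranking step with 6-fold cross-validated grid search can be sketched with scikit-learn as below. The PCA de-correlation stage and the mapping of importances back to the original fracturing parameters follow the paper's pipeline and are omitted here; the synthetic data and parameter grid are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                     # stand-in for the four parameter groups
y = X[:, 0] * 2 + X[:, 3] + rng.normal(size=200)   # synthetic production response

search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [4, 8, None]},
    cv=6,                                          # 6-fold cross-validation, as in the study
)
search.fit(X, y)
# Rank features by the fitted forest's variable importance measures.
ranking = np.argsort(search.best_estimator_.feature_importances_)[::-1]
print("features ranked by importance:", ranking)
```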
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big-data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
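A minimal, purely illustrative Python sketch of the schema trade-off described above (not the warehouse's actual implementation): a bridge table of (fact, dimension) pairs versus an array of dimension keys stored directly on each fact row.

```python
# Two ways to model a many-to-many link between earthquake facts and,
# say, affected regions. All identifiers and values are hypothetical.
facts = [
    {"fact_id": 1, "magnitude": 6.1, "region_ids": [10, 11]},  # array-based row
    {"fact_id": 2, "magnitude": 4.8, "region_ids": [11]},
]
bridge = [(1, 10), (1, 11), (2, 11)]  # conventional bridge-table rows

# Fact-centric query ("regions of fact 1"): one row lookup with the array,
# versus a join-like filter over the bridge table.
print(next(f["region_ids"] for f in facts if f["fact_id"] == 1))
print([r for f, r in bridge if f == 1])

# Dimension-centric query ("facts touching region 11"): the bridge table
# answers directly, while the array design must scan every fact's array --
# the trade-off the proposed hybrid schema is designed to balance.
print([f for f, r in bridge if r == 11])
print([f["fact_id"] for f in facts if 11 in f["region_ids"]])
```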
Current gas well decline analysis under boundary-dominated flow (BDF) is largely based on the Arps empirical hyperbolic decline model and the analytical type-curve tools associated with pseudo-functions. Due to the nonlinear flow behavior of natural gas, these analysis methods generally require iterative calculations. In this study, the dimensionless gas rate (qg/qgi) is introduced, and an explicit method to determine the average reservoir pressure and the original gas in place (OGIP) for a volumetric gas reservoir is proposed. We show that the dimensionless gas rate in BDF is a function only of the gas PVT parameters and the reservoir pressure. Step-by-step analysis procedures are presented that enable explicit and straightforward estimation of average reservoir pressure and OGIP by straight-line analysis. Compared with current techniques, this methodology avoids the iterative calculation of pseudo-time and pseudo-pressure functions, lowers the multiplicity of type-curve analysis, and is applicable in different production situations (constant/variable gas flow rate, constant/variable bottom-hole pressure), with a broad range of applications and ease of use. Reservoir numerical simulation and field examples are thoroughly discussed to highlight the capabilities of the proposed approach.
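The straight-line idea can be illustrated with the classical volumetric material-balance relation p/Z = (p_i/Z_i)(1 − G_p/G), whose x-intercept gives OGIP. The paper's explicit method works with the dimensionless rate qg/qgi instead, so the sketch below is only a hedged analogue on synthetic numbers.

```python
# Classical volumetric material-balance straight line:
#   p/Z = (p_i/Z_i) * (1 - Gp/G),
# so a linear fit of p/Z against Gp crosses zero at Gp = G (the OGIP).
import numpy as np

G_true = 50.0                              # OGIP, Bscf (synthetic)
Gp = np.linspace(0.0, 20.0, 10)            # cumulative production, Bscf
p_over_z = 5000.0 * (1.0 - Gp / G_true)    # p/Z data with p_i/Z_i = 5000 psia

slope, intercept = np.polyfit(Gp, p_over_z, 1)
print("OGIP estimate (Bscf):", -intercept / slope)  # x-intercept of the line
```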
The fracture volume changes gradually with the depletion of fracture pressure during the production process. However, few flowback models are available so far that can estimate the fracture volume loss using pressure-transient and rate-transient data. The initial flowback involves producing back the fracturing fluid after hydraulic fracturing, while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of child wells. The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after the production and preload processes. Such a comparison is useful for evaluating well performance and optimizing fracturing operations. We construct rate-normalized pressure (RNP) versus material balance time (MBT) diagnostic plots using both initial and second flowback data (FB1 and FB2, respectively) of six multi-fractured horizontal wells completed in the Niobrara and Codell formations in the DJ Basin. In general, the slope of the RNP plot during the FB2 period is higher than that during the FB1 period, indicating a potential loss of fracture volume from the FB1 to the FB2 period. We estimate the changes in effective fracture volume (V_ef) by analyzing the changes in the RNP slope and total compressibility between these two flowback periods. V_ef during FB2 is in general 3%-45% lower than that during FB1. We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index (CDI), hydrocarbon-drive index (HDI), and water-drive index (WDI). The dominant drive mechanism during both flowback periods is compaction drive, but its contribution is reduced by 16% in the FB2 period. This drop is generally compensated by a relatively higher HDI during this period. The loss of effective fracture volume might be attributed to the pressure depletion in fractures, which occurs during the production period and can extend over 800 days.
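A minimal sketch of the diagnostic quantities used in this analysis, on synthetic numbers: RNP = (p_i − p_wf)/q, MBT = Q/q, and a straight-line fit whose slope is compared between the two flowback periods.

```python
import numpy as np

p_i = 6000.0                                           # initial pressure, psia (synthetic)
q = np.array([900.0, 700.0, 560.0, 470.0, 400.0])      # flowback rate, bbl/d
p_wf = np.array([5400.0, 5100.0, 4870.0, 4690.0, 4540.0])  # flowing pressure, psia
Q = np.cumsum(q)                                       # crude cumulative volume, bbl

rnp = (p_i - p_wf) / q       # rate-normalized pressure
mbt = Q / q                  # material balance time
slope = np.polyfit(mbt, rnp, 1)[0]
print("RNP-vs-MBT slope:", slope)  # a steeper slope suggests a smaller effective fracture volume
```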
Viral infections play a crucial role in marine biogeochemical cycles by regulating bacterial mortality and mediating nutrient and carbon fluxes. However, despite their ecological significance, existing climate change models generally fail to incorporate virus-mediated ecological processes due to the currently limited understanding of marine viral dynamics under global warming. While numerous studies have explored the effect of warming on viral decay and production, how temperature regulates the total abundance of marine viruses remains unclear. In this study, we conducted year-round measurements of viral production and decay rates in Qingdao's coastal waters, with additional experimental warming treatments. The results showed that under in-situ temperature, the viral decay and production rates displayed distinct seasonal variations. With the exception of summer, elevated temperature stimulated both the viral decay rate and the production rate, and further increased the net viral production rate. In summer, however, the net viral production rate turned negative, implying divergent temperature thresholds for viral decay and viral production under warming. Our study deepens the understanding of the effect of global warming on marine viruses and provides scientific data for climate change models.
Accurately assessing the relationship between tree growth and climatic factors is of great importance in dendrochronology. This study evaluated the consistency between alternative climate datasets (including station and gridded data) and actual climate data (fixed-point observations near the sampling sites) in northeastern China's warm temperate zone and analyzed differences in their correlations with the tree-ring width index. The results were: (1) Gridded temperature data, as well as precipitation and relative humidity data from the Huailai meteorological station, were more consistent with the actual climate data; in contrast, gridded soil moisture content data showed significant discrepancies. (2) Horizontal distance had a greater impact on the representativeness of actual climate conditions than vertical elevation differences. (3) Differences in consistency between alternative and actual climate data also affected their correlations with tree-ring width indices. In some growing-season months, correlation coefficients differed significantly, in both magnitude and sign, from those based on actual data. The selection of different alternative climate datasets can lead to biased results in assessing forest responses to climate change, which is detrimental to the management of forest ecosystems in harsh environments. Therefore, the scientific and rational selection of alternative climate data is essential for dendroecological and climatological research.
Photoacoustic computed tomography is a novel imaging technique that combines high absorption contrast and deep tissue penetration capability, enabling comprehensive three-dimensional imaging of biological targets. However, the increasing demand for higher resolution and real-time imaging results in significant data volume, limiting the data storage, transmission, and processing efficiency of the system. Therefore, there is an urgent need for an effective method to compress the raw data without compromising image quality. This paper presents a photoacoustic computed tomography 3D data compression method and system based on a Wavelet-Transformer. The method is built on a cooperative compression framework that integrates wavelet hard coding with deep-learning-based soft decoding. It combines the multiscale analysis capability of wavelet transforms with the global feature-modeling advantage of Transformers, achieving high-quality data compression and reconstruction. Experimental results using k-Wave simulation suggest that the proposed compression system has advantages under extreme compression conditions, achieving a raw data compression ratio of up to 1:40. Furthermore, a three-dimensional data compression experiment using an in vivo mouse demonstrated that the maximum peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) values of reconstructed images reached 38.60 and 0.9583, effectively overcoming the detail loss and artifacts introduced by raw data compression. All the results suggest that the proposed system can significantly reduce storage requirements and hardware cost, enhancing computational efficiency and image quality. These advantages support the development of photoacoustic computed tomography toward higher efficiency, real-time performance, and intelligent functionality.
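The wavelet hard-coding half of such a pipeline can be sketched with PyWavelets as below: decompose a synthetic A-line, hard-threshold small coefficients, and reconstruct. The learned Transformer soft decoder that restores lost detail is the paper's contribution and is not reproduced here; the signal, wavelet choice, and threshold are assumptions.

```python
import numpy as np
import pywt

t = np.linspace(0.0, 1.0, 1024)
signal = np.sin(40 * np.pi * t) * np.exp(-4 * t)   # synthetic photoacoustic A-line

coeffs = pywt.wavedec(signal, "db4", level=4)      # multiscale decomposition
thr = 0.1 * max(np.abs(c).max() for c in coeffs)   # crude global threshold
sparse = [pywt.threshold(c, thr, mode="hard") for c in coeffs]  # hard coding

kept = sum(int(np.count_nonzero(c)) for c in sparse)
total = sum(c.size for c in sparse)
print("kept coefficients: %d / %d" % (kept, total))  # crude compression ratio
recon = pywt.waverec(sparse, "db4")                  # lossy reconstruction
```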
When the operating temperature of a solid oxide electrolysis cell (SOEC) is lower than the outlet temperature of a nuclear reactor, the reactor can be directly coupled with the SOEC as a high-temperature heat source. However, the key to the efficiency and return on investment of this hybrid energy system lies in the expected lifetime of the SOEC. This study assessed the long-term stability of Ni-YSZ|YSZ|GDC|LSC fuel-electrode-supported cells during electrolysis at 650℃ with a current density of −0.5 A cm^(−2) over 1818 h. The average voltage degradation rate of 2.63% kh^(−1) unfolded in two phases: an initial rapid decay (90 to 1120 h at 3.58% kh^(−1)) and a stable decay (1120 to 1818 h at 2.14% kh^(−1)), supporting the feasibility of coupling SOECs with nuclear reactors at 650℃. Post-electrolysis characterization after 1818 h revealed nickel particle formation associated with Ni(OH)_x diffusion and re-deposition, alongside a strontium-containing layer causing interface cracking. Although EDS showed minimal strontium segregation, XPS data indicated surface segregation of Sr. This study provides crucial insights into prolonged SOEC operation, highlighting both its potential and its challenges.
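The quoted phase-wise rates are percentage voltage change per 1000 h relative to each phase's starting voltage. The sketch below shows the arithmetic; the voltage values are hypothetical placeholders chosen only to reproduce rates of the quoted magnitude, not measurements from the paper.

```python
def degradation_rate(v_start, v_end, t_start_h, t_end_h):
    """Voltage degradation in % per kh: fractional change relative to the
    phase's starting voltage, per hour, scaled by 1000 h and by 100 (%)."""
    return (v_end - v_start) / v_start / (t_end_h - t_start_h) * 1e5

# Hypothetical voltages illustrating the two decay phases reported above.
print(degradation_rate(1.300, 1.348, 90, 1120))    # ~3.6 % kh^-1, rapid phase
print(degradation_rate(1.348, 1.368, 1120, 1818))  # ~2.1 % kh^-1, stable phase
```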
Amid the increasing demand for data sharing, the need for flexible, secure, and auditable access control mechanisms has garnered significant attention in the academic community. However, blockchain-based ciphertext-policy attribute-based encryption (CP-ABE) schemes still face cumbersome ciphertext re-encryption and insufficient oversight when handling dynamic attribute changes and cross-chain collaboration. To address these issues, we propose a dynamic-permission attribute-encryption scheme for multi-chain collaboration. This scheme incorporates a multi-authority architecture for distributed attribute management and integrates an attribute revocation and granting mechanism that eliminates the need for ciphertext re-encryption, effectively reducing both computational and communication overhead. It leverages the InterPlanetary File System (IPFS) for off-chain data storage and constructs a cross-chain regulatory framework, comprising a Hyperledger Fabric business chain and a FISCO BCOS regulatory chain, to record changes in decryption privileges and access behaviors in an auditable manner. Security analysis shows that the scheme achieves selective indistinguishability under chosen-plaintext attack (sIND-CPA) under the decisional q-Parallel Bilinear Diffie-Hellman Exponent assumption (q-PBDHE). In the performance and experimental evaluations, we compared the proposed scheme with several advanced schemes. The results show that, while preserving security, the proposed scheme achieves higher encryption/decryption efficiency and lower storage overhead for ciphertexts and keys.
With the popularization of new technologies, telephone fraud has become the main means of stealing money and personal identity information. Taking inspiration from the website authentication mechanism, we propose an end-to-end data modem scheme that transmits the caller's digital certificates through a voice channel for the recipient to verify the caller's identity. Encoding useful information through voice channels is very difficult without the assistance of telecommunications providers. For example, speech activity detection may quickly classify encoded signals as non-speech signals and reject the input waveforms. To address this issue, we propose a novel modulation method based on linear frequency modulation that encodes 3 bits per symbol by varying its frequency, shape, and phase, alongside a lightweight MobileNetV3-Small-based demodulator for efficient and accurate signal decoding on resource-constrained devices. This method leverages the unique characteristics of linear frequency modulation signals, making them more easily transmitted and decoded in speech channels. To ensure reliable data delivery over unstable voice links, we further introduce a robust framing scheme with delimiter-based synchronization, a sample-level position remedying algorithm, and a feedback-driven retransmission mechanism. We have validated the feasibility and performance of our system through expanded real-world evaluations, demonstrating that it outperforms existing advanced methods in terms of robustness and data transfer rate. This technology establishes the foundational infrastructure for reliable certificate delivery over voice channels, which is crucial for achieving strong caller authentication and preventing telephone fraud at its root cause.
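One way to realize 3 bits per linear-chirp symbol is to map one bit each to sweep direction (shape), frequency band, and initial phase, as in the hedged SciPy sketch below. The specific bands, symbol length, and bit-to-parameter mapping here are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.signal import chirp

FS = 8000                        # telephone-band sample rate, Hz
t = np.arange(0, 0.02, 1 / FS)   # 20 ms symbol

def lfm_symbol(bits):
    """Encode 3 bits in one linear-frequency-modulation symbol."""
    b_dir, b_band, b_phase = bits
    f_lo, f_hi = (500, 1500) if b_band == 0 else (1500, 2500)   # band bit
    f0, f1 = (f_lo, f_hi) if b_dir == 0 else (f_hi, f_lo)       # up/down sweep
    phi = 0.0 if b_phase == 0 else 180.0                        # phase bit, deg
    return chirp(t, f0=f0, t1=t[-1], f1=f1, method="linear", phi=phi)

# Two symbols carrying the bit triplets (0,0,1) and (1,1,0).
waveform = np.concatenate([lfm_symbol(b) for b in [(0, 0, 1), (1, 1, 0)]])
```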
High-quality silage is a cornerstone of sustainable livestock development and animal food production. As the core fermentation bacteria of silage, Lactobacillus directly regulates silage fermentation by producing lactic acid, enzymes, and other bioactive molecules. However, traditional screening methods for functional strains are labor-intensive and time-consuming. Recent advances in synthetic biology, particularly the development of CRISPR-Cas genome editing technology, offer a revolutionary approach to designing Lactobacillus strains with customized traits. This review systematically examines the importance of silage in sustainable agricultural development and the limitations of current silage preparation and promotion. It also discusses the application of strain engineering approaches in optimizing the phenotypic performance of Lactobacillus for better silage. Building on this, we review the research progress of CRISPR-Cas9 gene editing in Lactobacillus and discuss how to leverage its high efficiency and precision to optimize strain traits for improved silage quality and functionality. CRISPR-Cas9 toolkits are expected to enable directed evolution of strain performance, ultimately yielding next-generation silage microbial inoculants with multiple functions, adaptability to multiple substrates, and eco-friendly characteristics. The use of such innovative biotechnologies would facilitate resource-efficient utilization and promote animal performance and health for sustainable development in livestock production.
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where missing data often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms, namely Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship, under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that the proposed model consistently outperforms baseline methods, including traditional and deep-learning-based techniques. An ablation study reveals the additive value of each component in the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area under the curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
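A hedged PyTorch sketch of such a three-part composite loss is given below; the weights and the exact forms of the noise-aware and variance terms are illustrative assumptions rather than the paper's definitions.

```python
import torch

def composite_loss(model, x, mask, lam_noise=0.1, lam_var=0.01):
    # mask == 1 marks (artificially) missing entries the model must recover.
    recon = model(x * (1 - mask))
    # (i) guided, masked MSE restricted to the missing entries
    mse = ((recon - x) ** 2 * mask).sum() / mask.sum().clamp(min=1)
    # (ii) noise-aware term: reconstruction should be insensitive to corruption
    noisy = x * (1 - mask) + 0.05 * torch.randn_like(x)
    noise_reg = ((model(noisy) - recon) ** 2).mean()
    # (iii) variance penalty: keep per-feature variance close to the data's
    var_pen = (recon.var(dim=0) - x.var(dim=0)).abs().mean()
    return mse + lam_noise * noise_reg + lam_var * var_pen

# Toy usage with a stand-in autoencoder and a synthetic 30% missingness mask.
model = torch.nn.Sequential(torch.nn.Linear(8, 16), torch.nn.ReLU(), torch.nn.Linear(16, 8))
x = torch.randn(32, 8)
mask = (torch.rand(32, 8) < 0.3).float()
print(composite_loss(model, x, mask))
```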
Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed-Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code-rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
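The statistical core of any DAS scheme is that confidence grows exponentially in the number of samples: if a fraction f of the erasure-coded symbols is withheld, s independent random samples all succeed with probability (1 − f)^s. The sketch below computes the sample count needed for a target confidence; the numbers are illustrative, not ISTIRDA's parameters.

```python
import math

def samples_needed(withheld_fraction, target_confidence):
    """Smallest s with 1 - (1 - f)^s >= target confidence."""
    return math.ceil(math.log(1 - target_confidence)
                     / math.log(1 - withheld_fraction))

# If half the coded symbols must be withheld to hide the block, ~20 random
# samples already give a light client ~1 - 2^-20 confidence in availability.
print(samples_needed(0.5, 0.999999))
```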
With the accelerating aging of China's population, the demand for community elderly care services has shown diversified and personalized characteristics. However, problems such as insufficient total care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core advantages including data collection, analysis and mining, and accurate prediction, provides a new solution for the allocation of community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimal allocation strategies such as building a big data analysis platform and accurately grasping the elderly's care needs, striving to provide operable path references for the construction of community elderly care service systems, promoting the early realization of the elderly care service goal of “adequate support and proper care for the elderly”, and boosting the high-quality development of China's elderly care service industry.
Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during the data collection process, resulting in temporal misalignment or displacement. Due to these factors, the node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and subsequently degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. Firstly, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Secondly, a multi-channel graph encoding layer is used to dynamically optimize the structural properties of the multivariate coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates anomalies caused by temporal dependency misalignment.
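One simple way to realize a recommendation-style selection of strongly coupled variable pairs (an illustrative assumption, not necessarily the paper's exact strategy) is to rank absolute pairwise correlations and keep the top-k pairs as edges of the coupling network:

```python
import numpy as np

rng = np.random.default_rng(1)
series = rng.normal(size=(500, 6))                        # 6 variables, 500 steps
series[:, 1] = series[:, 0] + 0.1 * rng.normal(size=500)  # couple variables 0 and 1

corr = np.abs(np.corrcoef(series.T))   # absolute pairwise correlation matrix
np.fill_diagonal(corr, 0.0)            # ignore self-coupling
i, j = np.triu_indices_from(corr, k=1)
order = np.argsort(corr[i, j])[::-1]
edges = list(zip(i[order[:5]], j[order[:5]]))  # top-5 coupled pairs as edges
print(edges)
```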
At present, the global tea industry is in a stage of transformation towards intelligent development. Although traditional machine learning methods have achieved good results in the production and processing supervision of flower and fruit tea, it is difficult to improve supervision efficiency due to the limitations of manually extracting features. The automatic feature learning capability of convolutional neural networks (CNNs) overcomes this limitation and opens up a new perspective for the intelligent development of the flower and fruit tea industry. This article reviews the latest progress on the application advantages of CNNs in the flower and fruit tea industry. A systematic review and meta-analysis were conducted on applying CNNs in pest control, harvesting, and processing of flower and fruit tea raw materials (teas, flowers, fruits). Finally, an outlook is given on the relevant advanced progress and prospects. Compared with traditional machine learning methods, CNNs have significant advantages in supervising flower and fruit tea production and processing. This review is expected to provide new insights into the application of intelligent technology in the tea industry.
As an important resource in a data link, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially considering the tremendous increase in the number of nodes and the diversity of communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly address differentiated access control in the data link. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound on the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets satisfying the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, which indicates a significant improvement in anti-interference and anti-eavesdropping capabilities.
Data center industries have been facing huge energy challenges due to escalating power consumption and associated carbon emissions. In the context of carbon neutrality, the integration of data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore data centers' operational flexibility. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center micro-grids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, this paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, this paper proposes a two-stage, multi-time-scale optimization scheduling framework that jointly coordinates computing workload time-shifting in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable energy utilization, improves the operational economy of the data center micro-grid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
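A toy stand-in for the day-ahead stage is sketched below: shifting a fixed budget of delay-tolerant compute energy across hours to minimize grid import, posed as a small linear program. All quantities and constraints are illustrative assumptions, far simpler than the paper's two-stage framework.

```python
import numpy as np
from scipy.optimize import linprog

T = 6
renewable = np.array([2.0, 5.0, 8.0, 7.0, 3.0, 1.0])  # MWh available per hour
base = np.full(T, 3.0)                                # inflexible IT + cooling load
E, cap = 6.0, 3.0                                     # shiftable energy, per-hour cap

# Variables z = [x_0..x_{T-1}, g_0..g_{T-1}]: shifted load x, grid import g.
c = np.concatenate([np.zeros(T), np.ones(T)])         # minimize total grid import
A_ub = np.hstack([np.eye(T), -np.eye(T)])             # x_t - g_t <= renewable - base
b_ub = renewable - base
A_eq = np.concatenate([np.ones(T), np.zeros(T)])[None, :]  # all shiftable energy placed
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[E],
              bounds=[(0, cap)] * T + [(0, None)] * T)
print("hourly schedule:", np.round(res.x[:T], 2))     # load lands in renewable-rich hours
```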
Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class intrusion detection using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized and model-ready inputs. Critical dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was recorded with an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model demonstrated a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
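The named preprocessing, resampling, and stacking components can be wired together as in the hedged scikit-learn sketch below (HHO feature selection is omitted; a metaheuristic loop would sit where the comment indicates). The synthetic dataset and hyperparameters are assumptions; requires scikit-learn, imbalanced-learn, and xgboost.

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import QuantileTransformer, RobustScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Synthetic imbalanced stand-in for an intrusion dataset.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1])

# RobustScaler handles outliers; QuantileTransformer reshapes skewed features.
X = make_pipeline(RobustScaler(), QuantileTransformer()).fit_transform(X)
# ... an HHO-style metaheuristic would select a feature subset here ...

# SMOTE synthetically augments the underrepresented attack class.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

# Stacked ensemble with XGBoost, SVM, and RF as base learners.
stack = StackingClassifier(
    estimators=[("xgb", XGBClassifier()), ("svm", SVC(probability=True)),
                ("rf", RandomForestClassifier())],
    final_estimator=LogisticRegression(),
)
stack.fit(X_bal, y_bal)
```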
基金funding support from the National Natural Science Foundation of China(No.52204065,No.ZX20230398)supported by a grant from the Human Resources Development Program(No.20216110100070)of the Korea Institute of Energy Technology Evaluation and Planning(KETEP)。
文摘In the realm of subsurface flow simulations,deep-learning-based surrogate models have emerged as a promising alternative to traditional simulation methods,especially in addressing complex optimization problems.However,a significant challenge lies in the necessity of numerous high-fidelity training simulations to construct these deep-learning models,which limits their application to field-scale problems.To overcome this limitation,we introduce a training procedure that leverages transfer learning with multi-fidelity training data to construct surrogate models efficiently.The procedure begins with the pre-training of the surrogate model using a relatively larger amount of data that can be efficiently generated from upscaled coarse-scale models.Subsequently,the model parameters are finetuned with a much smaller set of high-fidelity simulation data.For the cases considered in this study,this method leads to about a 75%reduction in total computational cost,in comparison with the traditional training approach,without any sacrifice of prediction accuracy.In addition,a dedicated well-control embedding model is introduced to the traditional U-Net architecture to improve the surrogate model's prediction accuracy,which is shown to be particularly effective when dealing with large-scale reservoir models under time-varying well control parameters.Comprehensive results and analyses are presented for the prediction of well rates,pressure and saturation states of a 3D synthetic reservoir system.Finally,the proposed procedure is applied to a field-scale production optimization problem.The trained surrogate model is shown to provide excellent generalization capabilities during the optimization process,in which the final optimized net-present-value is much higher than those from the training data ranges.
基金funded by the Key Research and Development Program of Shaanxi,China(No.2024GX-YBXM-503)the National Natural Science Foundation of China(No.51974254)。
文摘Hydraulic fracturing technology has achieved remarkable results in improving the production of tight gas reservoirs,but its effectiveness is under the joint action of multiple factors of complexity.Traditional analysis methods have limitations in dealing with these complex and interrelated factors,and it is difficult to fully reveal the actual contribution of each factor to the production.Machine learning-based methods explore the complex mapping relationships between large amounts of data to provide datadriven insights into the key factors driving production.In this study,a data-driven PCA-RF-VIM(Principal Component Analysis-Random Forest-Variable Importance Measures)approach of analyzing the importance of features is proposed to identify the key factors driving post-fracturing production.Four types of parameters,including log parameters,geological and reservoir physical parameters,hydraulic fracturing design parameters,and reservoir stimulation parameters,were inputted into the PCA-RF-VIM model.The model was trained using 6-fold cross-validation and grid search,and the relative importance ranking of each factor was finally obtained.In order to verify the validity of the PCA-RF-VIM model,a consolidation model that uses three other independent data-driven methods(Pearson correlation coefficient,RF feature significance analysis method,and XGboost feature significance analysis method)are applied to compare with the PCA-RF-VIM model.A comparison the two models shows that they contain almost the same parameters in the top ten,with only minor differences in one parameter.In combination with the reservoir characteristics,the reasonableness of the PCA-RF-VIM model is verified,and the importance ranking of the parameters by this method is more consistent with the reservoir characteristics of the study area.Ultimately,the ten parameters are selected as the controlling factors that have the potential to influence post-fracturing gas production,as the combined importance of these top ten parameters is 91.95%on driving natural gas production.Analyzing and obtaining these ten controlling factors provides engineers with a new insight into the reservoir selection for fracturing stimulation and fracturing parameter optimization to improve fracturing efficiency and productivity.
文摘Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation.Modern seismological research produces vast volumes of heterogeneous data from seismic networks,satellite observations,and geospatial repositories,creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making.Data warehousing technologies provide a robust foundation for this purpose;however,existing earthquake-oriented data warehouses remain limited,often relying on simplified schemas,domain-specific analytics,or cataloguing efforts.This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity.The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables.A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance,while the bridge-table schema remains advantageous for dimension-centric queries.To reconcile these trade-offs,a hybrid schema is proposed that retains both representations,ensuring balanced efficiency across heterogeneous workloads.The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity,improve query performance,and support multidimensional visualization.In doing so,it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience,risk mitigation,and emergency management.
基金supported by the Young Elite Scientist Sponsorship Program by Beijing Association for Science and Technology,China(No.BYESS2023262)the Science Foundation of China University of Petroleum(Beijing),China(No.2462022BJRC004).
文摘Current gas well decline analysis under boundary-dominated flow(BDF)is largely based on the Arps'empirical hyperbolic decline model and the analytical type curve tools associated with pseudo-functions.Due to the nonlinear flow behavior of natural gas,these analysis methods generally require iterative calculations.In this study,the dimensionless gas rate(qg/qgi)is introduced,and an explicit method to determine the average reservoir pressure and the original gas in place(OGIP)for a volumetric gas reservoir is proposed.We show that the dimensionless gas rate in the BDF is only the function of the gas PVT parameters and reservoir pressure.Step-by-step analysis procedures are presented that enable explicit and straightforward estimation of average reservoir pressure and OGIP by straight-line analysis.Compared with current techniques,this methodology avoids the iterative calculation of pseudo-time and pseudo-pressure functions,lowers the multiplicity of type curve analysis,and is applicable in different production situations(constant/variable gas flow rate,constant/variable bottom-hole pressure)with a broad range of applications and ease of use.Reservoir numerical simulation and field examples are thoroughly discussed to highlight the capabilities of the proposed approach.
文摘The fracture volume is gradually changed with the depletion of fracture pressure during the production process.However,there are few flowback models available so far that can estimate the fracture volume loss using pressure transient and rate transient data.The initial flowback involves producing back the fracturing fuid after hydraulic fracturing,while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of child wells.The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after production and preload processes.Such a comparison is useful for evaluating well performance and optimizing frac-turing operations.We construct rate-normalized pressure(RNP)versus material balance time(MBT)diagnostic plots using both initial and second flowback data(FB;and FBs,respectively)of six multi-fractured horizontal wells completed in Niobrara and Codell formations in DJ Basin.In general,the slope of RNP plot during the FB,period is higher than that during the FB;period,indicating a potential loss of fracture volume from the FB;to the FB,period.We estimate the changes in effective fracture volume(Ver)by analyzing the changes in the RNP slope and total compressibility between these two flowback periods.Ver during FB,is in general 3%-45%lower than that during FB:.We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index(CDI),hydrocarbon-drive index(HDI),and water-drive index(WDI).The dominant drive mechanism during both flowback periods is CDI,but its contribution is reduced by 16%in the FB,period.This drop is generally compensated by a relatively higher HDI during this period.The loss of effective fracture volume might be attributed to the pressure depletion in fractures,which occurs during the production period and can extend 800 days.
基金supported by the National Natural Science Foundation of China(No.42276108)the Young Scientists Fund of Shandong Provincial Natural Science Foundation(No.ZR2022QD052)。
文摘Viral infections play a crucial role in marine biogeochemical cycles,by regulating bacterial mortality and mediating nutrient and carbon fluxes.However,despite of their ecological significance,existing climate change models generally fail to incorporate virus-mediated ecological processes due to the current limited understanding of marine viral dynamics under global warming.While numerous studies have explored the effect of warming for viral decay and production,how temperature regulates the total abundance of marine viruses remains unclear.In this study,we conducted year-round measurements of viral production and decay rates in Qingdao's coastal waters,with additional experimental warming treatments.The result showed that under in-situ temperature,the viral decay and production rate displayed distinct seasonal variations.With the exception of summer,elevated temperature stimulated both viral decay rate and production rate,and further improved the net viral production rate.While in summer,the net viral production rate turned negative,implying divergent threshold viral decay and viral production rate on warming.Our study deepens the understanding of the effect of global warming on marine viruses and provides scientific data for climate change models.
基金supported by the International Partnership program of the Chinese Academy of Sciences(170GJHZ2023074GC)National Natural Science Foundation of China(42425706 and 42488201)+1 种基金National Key Research and Development Program of China(2024YFF0807902)Beijing Natural Science Foundation(8242041),and China Postdoctoral Science Foundation(2025M770353).
文摘Accurately assessing the relationship between tree growth and climatic factors is of great importance in dendrochronology.This study evaluated the consistency between alternative climate datasets(including station and gridded data)and actual climate data(fixed-point observations near the sampling sites),in northeastern China’s warm temperate zone and analyzed differences in their correlations with tree-ring width index.The results were:(1)Gridded temperature data,as well as precipitation and relative humidity data from the Huailai meteorological station,was more consistent with the actual climate data;in contrast,gridded soil moisture content data showed significant discrepancies.(2)Horizontal distance had a greater impact on the representativeness of actual climate conditions than vertical elevation differences.(3)Differences in consistency between alternative and actual climate data also affected their correlations with tree-ring width indices.In some growing season months,correlation coefficients,both in magnitude and sign,differed significantly from those based on actual data.The selection of different alternative climate datasets can lead to biased results in assessing forest responses to climate change,which is detrimental to the management of forest ecosystems in harsh environments.Therefore,the scientific and rational selection of alternative climate data is essential for dendroecological and climatological research.
基金supported by the National Key R&D Program of China[Grant No.2023YFF0713600]the National Natural Science Foundation of China[Grant No.62275062]+3 种基金Project of Shandong Innovation and Startup Community of High-end Medical Apparatus and Instruments[Grant No.2023-SGTTXM-002 and 2024-SGTTXM-005]the Shandong Province Technology Innovation Guidance Plan(Central Leading Local Science and Technology Development Fund)[Grant No.YDZX2023115]the Taishan Scholar Special Funding Project of Shandong Provincethe Shandong Laboratory of Advanced Biomaterials and Medical Devices in Weihai[Grant No.ZL202402].
文摘Photoacoustic-computed tomography is a novel imaging technique that combines high absorption contrast and deep tissue penetration capability,enabling comprehensive three-dimensional imaging of biological targets.However,the increasing demand for higher resolution and real-time imaging results in significant data volume,limiting data storage,transmission and processing efficiency of system.Therefore,there is an urgent need for an effective method to compress the raw data without compromising image quality.This paper presents a photoacoustic-computed tomography 3D data compression method and system based on Wavelet-Transformer.This method is based on the cooperative compression framework that integrates wavelet hard coding with deep learning-based soft decoding.It combines the multiscale analysis capability of wavelet transforms with the global feature modeling advantage of Transformers,achieving high-quality data compression and reconstruction.Experimental results using k-wave simulation suggest that the proposed compression system has advantages under extreme compression conditions,achieving a raw data compression ratio of up to 1:40.Furthermore,three-dimensional data compression experiment using in vivo mouse demonstrated that the maximum peak signal-to-noise ratio(PSNR)and structural similarity index(SSIM)values of reconstructed images reached 38.60 and 0.9583,effectively overcoming detail loss and artifacts introduced by raw data compression.All the results suggest that the proposed system can significantly reduce storage requirements and hardware cost,enhancing computational efficiency and image quality.These advantages support the development of photoacoustic-computed tomography toward higher efficiency,real-time performance and intelligent functionality.
基金supported by the Strategic Priority Research Program of the Chinese Academy of Sciences(No.XDA0400000),the Youth Innovation Promotion Association of the Chinese Academy of Sciences(No.2021253)+1 种基金the Major Science and Technology Projects of China National Offshore Oil Corporation Limited during the 14th Five Year Plan(No.KJGG-2022-12-CCUS-030500)the Photon Science Center for Carbon Neutrality of Chinese Academy of Science.
文摘When the operating temperature of a solid oxide electrolysis cell(SOEC)is lower than the outlet temperature of a nuclear reactor,the reactor can be directly coupled with the SOEC as a high-temperature heat source.However,the key to the efficiency and return on investment of this hybrid energy system lies in the expected lifetime of the SOEC.This study assessed Ni-YSZ|YSZ|GDC|LSC fuel electrode support cells’long-term stability during electrolysis at 650℃with a current density of−0.5Acm^(−2)over 1818 h.The average voltage degradation rate of 2.63%kh^(−1)unfolded in two phases:an initial rapid decay(90 to 1120 h at 3.58%kh^(−1))and a stable decay(1120 to 1818 h at 2.14%kh^(−1)),emphasizing SOECs’probability coupling with nuclear reactors at 650℃.Post-1818-hour electrolysis revealed nickel particle formation associated with Ni(OH)_(x)diffusion and re-deposition,alongside a strontium-containing layer causing interface cracking.Despite minimal strontium segregation in the EDS,XPS data indicated surface segregation of Sr.This study provides crucial insights into prolonged SOEC operation,highlighting both its potential and challenges.
文摘Amid the increasing demand for data sharing,the need for flexible,secure,and auditable access control mechanisms has garnered significant attention in the academic community.However,blockchain-based ciphertextpolicy attribute-based encryption(CP-ABE)schemes still face cumbersome ciphertext re-encryption and insufficient oversight when handling dynamic attribute changes and cross-chain collaboration.To address these issues,we propose a dynamic permission attribute-encryption scheme for multi-chain collaboration.This scheme incorporates a multiauthority architecture for distributed attribute management and integrates an attribute revocation and granting mechanism that eliminates the need for ciphertext re-encryption,effectively reducing both computational and communication overhead.It leverages the InterPlanetary File System(IPFS)for off-chain data storage and constructs a cross-chain regulatory framework—comprising a Hyperledger Fabric business chain and a FISCO BCOS regulatory chain—to record changes in decryption privileges and access behaviors in an auditable manner.Security analysis shows selective indistinguishability under chosen-plaintext attack(sIND-CPA)security under the decisional q-Parallel Bilinear Diffie-Hellman Exponent Assumption(q-PBDHE).In the performance and experimental evaluations,we compared the proposed scheme with several advanced schemes.The results show that,while preserving security,the proposed scheme achieves higher encryption/decryption efficiency and lower storage overhead for ciphertexts and keys.
文摘With the popularization of new technologies,telephone fraud has become the main means of stealing money and personal identity information.Taking inspiration from the website authentication mechanism,we propose an end-to-end datamodem scheme that transmits the caller’s digital certificates through a voice channel for the recipient to verify the caller’s identity.Encoding useful information through voice channels is very difficult without the assistance of telecommunications providers.For example,speech activity detection may quickly classify encoded signals as nonspeech signals and reject input waveforms.To address this issue,we propose a novel modulation method based on linear frequency modulation that encodes 3 bits per symbol by varying its frequency,shape,and phase,alongside a lightweightMobileNetV3-Small-based demodulator for efficient and accurate signal decoding on resource-constrained devices.This method leverages the unique characteristics of linear frequency modulation signals,making them more easily transmitted and decoded in speech channels.To ensure reliable data delivery over unstable voice links,we further introduce a robust framing scheme with delimiter-based synchronization,a sample-level position remedying algorithm,and a feedback-driven retransmission mechanism.We have validated the feasibility and performance of our system through expanded real-world evaluations,demonstrating that it outperforms existing advanced methods in terms of robustness and data transfer rate.This technology establishes the foundational infrastructure for reliable certificate delivery over voice channels,which is crucial for achieving strong caller authentication and preventing telephone fraud at its root cause.
基金supported by the National Nature Science Foundation of China(No.U20A2002)。
文摘High-quality silage is the cornerstone to sustainable livestock development and animal food production.As the core fermentation bacteria of silage,Lactobacillus directly regulates silage fermentation by producing lactic acid,enzymes,and other bioactive molecules.However,traditional screening methods for functional strains are labor-intensive and time-consuming.Recent advances in synthetic biology,particularly the development of CRISPR-Cas genome editing technology,offer a revolutionary approach to designing Lactobacillus strains with customized traits.This review systematically reviewed the importance of silage in sustainable agricultural development and the limitations of current silage preparation and promotion.It also discussed the application of strain engineering approaches in optimizing the phenotypic performance of Lactobacillus for better silage.Building on this,we reviewed the research progress of CRISPR-Cas9 gene editing in Lactobacillus and discussed how to leverage its high efficiency and precision to optimize the strain's traits for improved silage quality and functionality.CRISPR-Cas9 toolkits are expected to achieve directed evolution of strain performance,ultimately yielding next-generation silage microbial inoculants with multiple functions,adaptability to multiple substrates,and eco-friendly characteristics.The use of such innovative biotechnologies would facilitate resource-efficient utilization,promote animal performance and health for sustainable development in livestock production.
Abstract: Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms, namely Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship, under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that the proposed model consistently outperforms baseline methods, including traditional and deep-learning-based techniques. An ablation study reveals the additive value of each component of the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area under the curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
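A minimal PyTorch sketch of such a composite loss is given below, assuming a denoising-style training setup in which missingness is simulated so ground truth is available at masked entries. The weighting coefficients, the noise level, and the exact forms of the noise-aware and variance terms are assumptions; only the three-part structure follows the abstract.

    import torch

    def composite_loss(model, x, miss_mask, lam_noise=0.1, lam_var=0.01, sigma=0.05):
        # miss_mask is 1 at (simulated) missing entries, 0 at observed ones.
        x_in = x * (1 - miss_mask)               # zero-fill missing entries
        x_hat = model(x_in)
        # (i) guided, masked MSE focused on the missing entries.
        mse = ((x_hat - x) ** 2 * miss_mask).sum() / miss_mask.sum().clamp(min=1)
        # (ii) noise-aware regularization: the reconstruction should be
        # stable when observed inputs are corrupted with Gaussian noise.
        x_noisy = x_in + sigma * torch.randn_like(x_in)
        noise_reg = ((model(x_noisy) - x_hat) ** 2).mean()
        # (iii) variance penalty: keep per-feature reconstruction variance
        # close to that of the data (expressive yet stable).
        var_pen = (x_hat.var(dim=0) - x.var(dim=0)).abs().mean()
        return mse + lam_noise * noise_reg + lam_var * var_pen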
Funding: Supported in part by the Research Fund of the Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education (EBME25-F-08).
Abstract: Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling a small number of random codeword symbols. Built on ISTIR, an improved Reed–Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code-rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
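The intuition behind certifying availability from a few random samples can be shown with a short Python calculation: if an adversary withholds at least half of the extended codeword, each successful sample multiplies the chance of being fooled by at most one half. The function below is the generic DAS sampling bound, not ISTIRDA's specific analysis.

    import math

    def samples_needed(confidence, withheld_fraction=0.5):
        # If at least `withheld_fraction` of the extended codeword is
        # withheld, each uniform sample succeeds with probability at most
        # (1 - withheld_fraction); s consecutive successes then occur with
        # probability at most (1 - withheld_fraction)**s.
        return math.ceil(math.log(1 - confidence) / math.log(1 - withheld_fraction))

    print(samples_needed(0.999999))  # -> 20 samples for one-in-a-million failure odds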
Abstract: With the accelerating aging of China's population, demand for community elderly care services has become diversified and personalized. However, problems such as an insufficient total supply of care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core strengths in data collection, analysis and mining, and accurate prediction, provides a new solution for allocating community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimization strategies such as building a big data analysis platform and accurately identifying the elderly's care needs, aiming to provide actionable references for the construction of community elderly care service systems, promote early realization of the elderly care goal of "adequate support and proper care for the elderly", and boost the high-quality development of China's elderly care service industry.
Funding: Supported by the Natural Science Foundation of Qinghai Province (2025-ZJ-994M), the Scientific Research Innovation Capability Support Project for Young Faculty (SRICSPYF-BS2025007), and the National Natural Science Foundation of China (62566050).
Abstract: Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during collection, resulting in temporal misalignment or displacement. Because of these factors, node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and subsequently degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. First, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Second, a multi-channel graph encoding layer dynamically optimizes the structural properties of the multivariate coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates the detection errors caused by temporal dependency misalignment.
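As one plausible reading of the dynamic threshold selection step, the Python sketch below flags a point as anomalous when its anomaly score exceeds a rolling mean by a multiple of the rolling standard deviation. The windowed statistic and the multiplier k are illustrative assumptions; the paper's actual algorithm may differ.

    import numpy as np

    def dynamic_threshold_flags(scores, window=100, k=3.0):
        # Flag scores[t] as anomalous when it exceeds the mean of the
        # previous `window` scores by more than k standard deviations.
        flags = np.zeros(len(scores), dtype=bool)
        for t in range(1, len(scores)):
            hist = scores[max(0, t - window):t]
            mu, sd = hist.mean(), hist.std() + 1e-8
            flags[t] = scores[t] > mu + k * sd
        return flags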
Funding: Supported by the National Key Research and Development Program of China (2022YFD2101102) and the Fujian Provincial Natural Science Foundation Project (2024J01407).
Abstract: At present, the global tea industry is in a stage of transformation towards intelligent development. Although traditional machine learning methods have achieved good results in the production and processing supervision of flower and fruit tea, it is difficult to improve supervision efficiency due to the limitations of manually extracted features. The automatic feature learning capability of convolutional neural networks (CNNs) overcomes this limitation and opens up a new perspective for the intelligent development of the flower and fruit tea industry. This article reviews the latest progress in, and the application advantages of, CNNs in the flower and fruit tea industry. A systematic review and meta-analysis were conducted on applying CNNs to pest control, harvesting, and processing of flower and fruit tea raw materials (teas, flowers, fruits). Finally, an outlook on the relevant advances and prospects is provided. Compared with traditional machine learning methods, CNNs have significant advantages in supervising flower and fruit tea production and processing. This review is expected to provide new insights into the application of intelligent technology in the tea industry.
Funding: Supported by the National Natural Science Foundation of China (No. 62171387), the Science and Technology Program of Sichuan Province (No. 2024NSFSC0468), and the China Postdoctoral Science Foundation (No. 2019M663475).
Abstract: As an important resource in data links, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially given the tremendous increase in the number of nodes and diverse communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly support differentiated access control in data links. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound on the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets meeting the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, indicating a significant improvement in anti-interference and anti-eavesdropping capabilities.
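Although the paper's constructions are not detailed in the abstract, the flavor of conflict-controlled slot sequences can be conveyed with a classic affine example over a prime field: nodes that share a slope never collide, while nodes with different slopes collide in exactly one slot per frame. The Python sketch below is a textbook illustration, not the proposed hierarchical construction.

    P = 31  # assumed prime frame length (number of slots per frame)

    def slot(a, b, t):
        # Slot occupied at time t by the node assigned parameters (a, b).
        return (a * t + b) % P

    # Nodes sharing slope `a` but with different offsets `b` never collide.
    assert all(slot(3, 1, t) != slot(3, 7, t) for t in range(P))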
Funding: Supported by the Science and Technology Standard Project of Guangdong Electric Power Design Institute (ER11301W, ER11811W).
Abstract: Data center industries have been facing huge energy challenges due to escalating power consumption and the associated carbon emissions. In the context of carbon neutrality, integrating data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore their operational flexibility. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center microgrids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, this paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, this paper proposes a two-stage, multi-time-scale optimization scheduling framework that jointly coordinates computing workload time-shifting in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable energy utilization, improves the operational economy of the data center microgrid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
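A minimal Python sketch of such a coupled temperature model, with entirely illustrative coefficients, is a discrete-time heat balance: IT power adds heat, refrigeration removes heat in proportion to its coefficient of performance (COP), and the building envelope exchanges heat with the outdoor air.

    def step_room_temperature(T, p_it, p_cool, t_out,
                              dt=300.0, C=5.0e7, cop=3.5, k_env=50.0):
        # One discrete-time update of room temperature; all coefficients
        # (thermal capacitance C, COP, envelope conductance) are assumptions.
        heat_in = p_it                   # IT power dissipated as heat (W)
        heat_out = cop * p_cool          # heat removed by refrigeration (W)
        leakage = k_env * (t_out - T)    # envelope heat exchange (W)
        return T + dt / C * (heat_in - heat_out + leakage)

    # Example: 500 kW IT load, 120 kW cooling power, 30 degC outdoors,
    # 22 degC room -> about 22.48 degC after one 5-minute step.
    print(step_room_temperature(22.0, 5e5, 1.2e5, 30.0))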
Funding: Funded by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R104), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class intrusion detection using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with data preprocessing that incorporates both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized, model-ready inputs. Dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was recorded on UNSW-NB15, with an accuracy of 99.44%, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
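A hedged Python sketch of the preprocessing, balancing, and stacking portion of such a pipeline is shown below, assuming scikit-learn, imbalanced-learn, and xgboost. The logistic-regression meta-learner, the default hyperparameters, and the placement of SMOTE inside an imblearn Pipeline (so oversampling happens only during fit) are assumptions; the HHO feature-selection step is omitted.

    from imblearn.over_sampling import SMOTE
    from imblearn.pipeline import Pipeline
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import QuantileTransformer, RobustScaler
    from sklearn.svm import SVC
    from xgboost import XGBClassifier

    # Stacked ensemble of XGBoost, SVM, and RF base learners; the
    # logistic-regression meta-learner is an illustrative assumption.
    stack = StackingClassifier(
        estimators=[
            ("xgb", XGBClassifier()),
            ("svm", SVC(probability=True)),
            ("rf", RandomForestClassifier()),
        ],
        final_estimator=LogisticRegression(max_iter=1000),
    )

    # imblearn's Pipeline applies SMOTE only when fitting, never at predict time.
    pipe = Pipeline([
        ("robust", RobustScaler()),
        ("quantile", QuantileTransformer(output_distribution="normal")),
        ("smote", SMOTE()),
        ("model", stack),
    ])
    # Usage: pipe.fit(X_train, y_train); y_pred = pipe.predict(X_test)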