Journal Articles
401,534 articles found
1. A small step towards the epistemic decentralization of science: A dataset of journals and publications indexed in African Journals Online
Authors: Patricia Alonso-Álvarez. 《Journal of Data and Information Science》, 2025, Issue 4, pp. 104-121 (18 pages)
Purpose: This paper examines African Journals Online (AJOL) as a bibliometric resource, providing a structured dataset of journal and publication metadata. In addition, it integrates AJOL data with OpenAlex to enhance metadata coverage and improve interoperability with other bibliometric sources. Design/methodology/approach: The journal list and publications indexed in AJOL were retrieved using web scraping techniques. This paper details the database construction process, highlighting its strengths and limitations, and presents a descriptive analysis of AJOL's indexed journals and publications. Findings: The publication analysis demonstrates steady growth in the number of publications over time but reveals significant disparities in their distribution across African countries. This paper presents an example of integrating both sources using author country data from OpenAlex. The analysis of author contributions reveals that African journals serve as both regional and international venues, confirming that they play a dual role in fostering both regional and global research engagement. Research limitations: While AJOL contains relevant information for identifying and providing insights about African publications and journals, its metadata are limited; therefore, the kinds of analysis that can be performed with the database presented here are also limited. The integration with OpenAlex aims to overcome some of these limitations. Finally, although some automatic citation procedures have been performed, the metadata have not been manually curated; therefore, any errors or inaccuracies present in AJOL may be reproduced in this database. Practical implications: The database introduced in this article contributes to the accessibility of African scholarly publications by providing structured, accessible metadata derived from AJOL. It facilitates bibliometric analyses that are more representative of African research activities. This contribution complements ongoing efforts to develop alternative data sources and infrastructure that better reflect the diversity of global knowledge production. Originality/value: This paper presents a novel database for bibliometric analysis and offers a detailed report of the retrieval and construction procedures. The inclusion of matched data with OpenAlex further enhances the database's utility. By showcasing AJOL's potential, this study contributes to the broader goal of fostering inclusivity and improving the representation of African research in global bibliometric analyses.
Keywords: decentralization of science; African Journals Online; African science; data paper
2. On the Riemann-Hilbert problem for the reverse space-time nonlocal Hirota equation with step-like initial data
Authors: Bei-Bei Hu, Ling Zhang, Zu-Yi Shen, Ji Lin. 《Communications in Theoretical Physics》, 2025, Issue 2, pp. 30-38 (9 pages)
In this paper, we use the Riemann-Hilbert (RH) method to investigate the Cauchy problem of the reverse space-time nonlocal Hirota equation with step-like initial data: q(z,0)=o(1) as z→-∞ and q(z,0)=δ+o(1) as z→∞, where δ is an arbitrary positive constant. We show that the solution of the Cauchy problem can be determined by the solution of the corresponding matrix RH problem established on the plane of the complex spectral parameter λ. As an example, we construct an exact solution of the reverse space-time nonlocal Hirota equation in a special case via this RH problem.
Keywords: nonlocal Hirota equation; Cauchy problem; Riemann-Hilbert problem; step-like initial data
3. Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda. 《Computers, Materials & Continua》, 2026, Issue 3, pp. 1963-1988 (26 pages)
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse; data analysis; big data; decision systems; seismology; data visualization
4. Rheological behaviors of step ladder-structured nitrocellulose in solution and gelatinization process
Authors: Yu Luan, Jiayi Du, Teng Ren, Chengkai Pu, Zhenggang Xiao. 《Defence Technology (防务技术)》, 2026, Issue 2, pp. 110-124 (15 pages)
Step ladder-structured nitrocellulose (LNC) is a novel energetic binder prepared by chemically modifying nitrocellulose (NC) with the introduction of flexible polyethylene glycol (PEG-400) chain segments, giving it a regular structure and good bonding performance. The step ladder structure addresses critical limitations of NC-based propellants, including low-temperature brittleness and high sensitivity, while enhancing process safety. Although the structural, thermal, and other properties of LNC have been investigated in our previous research, there is a lack of systematic studies on its rheological properties during solution and gelatinization. The relationship between the structural features and rheological properties of LNC is a key factor in guiding its gelatinization and improving the properties of LNC-based propellants. Steady-state rheology flow experiments revealed that LNC exhibited shear thinning in different solutions, which decreased with increasing concentration. It has desirable solubility and dispersion in N,N-dimethylformamide (DMF) solvent; the effect of solvents on the entanglement or orientation of LNC molecular chains may be reduced. These results can be quantitatively described using the Herschel-Bulkley model. Dynamic viscoelastic studies identified a critical concentration-frequency point of 2.5 rad/s; this frequency is a turning point in how concentration affects the loss factor (tan δ). For gelatinized systems, increasing the solvent content reduces the temperature sensitivity of the gelatinized materials. The viscosity-temperature correlation based on the Arrhenius equation allowed optimization of the solvent content through the derived equilibrium relationship. These structure-rheology-performance relationships establish basic guidelines for the precision gelatinization of LNC-based propellants, provide theoretical support for the replacement of conventional NC by LNC, and guide the gelatinization process to improve the performance of gun propellants.
Keywords: step ladder-structured nitrocellulose; rheological properties; gelatinization
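The Herschel-Bulkley model mentioned in the abstract above relates shear stress to shear rate as tau = tau0 + K * rate**n, with n < 1 indicating shear thinning. As a hedged illustration only (the synthetic flow-curve values below are assumptions, not the paper's measurements), such a model can be fitted like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def herschel_bulkley(shear_rate, tau0, K, n):
    """Shear stress = yield stress tau0 + consistency K * shear_rate**n."""
    return tau0 + K * shear_rate ** n

# Synthetic flow-curve data for a shear-thinning fluid (illustrative values only).
rate = np.linspace(0.1, 100.0, 50)               # shear rate, 1/s
stress = herschel_bulkley(rate, 5.0, 2.0, 0.6)   # shear stress, Pa

# Fit the three model parameters; bounds keep them physically non-negative.
params, _ = curve_fit(herschel_bulkley, rate, stress,
                      p0=[1.0, 1.0, 0.5], bounds=(0.0, np.inf))
tau0_fit, K_fit, n_fit = params
# A fitted n below 1 confirms shear-thinning behavior, as reported for LNC solutions.
```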
5. Effectiveness of a stepped self-care program for stroke survivors: A quasi-experimental study
Authors: Zihao Ruan, Dan Wang, Wenna Wang, Yongxia Mei, Hui Wang, Suyan Chen, Qiushi Zhang, Zhenxiang Zhang. 《International Journal of Nursing Sciences》, 2026, Issue 1, pp. 45-52, I0004 (9 pages)
Objectives: This study aimed to evaluate the effectiveness of a stepped self-care program on the self-care, self-efficacy, and quality of life of stroke survivors. Methods: This quasi-experimental study allocated 110 stroke survivors from two neurology wards into an intervention group (n=55), who received the stepped self-care program, and a control group (n=55), who received usual care, from June to December 2023. The Self-Care of Stroke Inventory, Stroke Self-Efficacy Questionnaire, and the short version of the Stroke Specific Quality of Life Scale were administered at baseline (T0), immediately post-intervention (T1), and at 1-month (T2) and 3-month (T3) follow-ups. Data were analyzed using repeated measures analyses of variance and generalized estimating equations. Results: A total of 48 participants in the intervention group and 50 in the control group completed the study. No statistically significant differences were observed at T0 in any of the measured indicators (all P>0.05). The study showed significant group, time, and group×time interaction effects across the assessed outcomes (all P<0.05). Between-group comparisons at T1, T2, and T3 indicated that the intervention group had significantly higher scores in self-care maintenance, self-care monitoring, self-care management, self-efficacy, and quality of life than the control group (all P<0.001). Conclusions: The stepped self-care program significantly improved self-care behaviors, self-efficacy, and quality of life among stroke survivors. These findings support the broader implementation of this approach in post-discharge home self-care.
Keywords: quality of life; self-care; self-efficacy; stepped care program; stroke
6. Combining different climate datasets better reflects the response of warm-temperate forests to climate: a case study from Mt. Dongling, Beijing
Authors: Shengjie Wang, Haiyang Liu, Shuai Yuan, Chenxi Xu. 《Journal of Forestry Research》, 2026, Issue 2, pp. 131-143 (13 pages)
Accurately assessing the relationship between tree growth and climatic factors is of great importance in dendrochronology. This study evaluated the consistency between alternative climate datasets (including station and gridded data) and actual climate data (fixed-point observations near the sampling sites) in northeastern China's warm temperate zone and analyzed differences in their correlations with the tree-ring width index. The results were: (1) gridded temperature data, as well as precipitation and relative humidity data from the Huailai meteorological station, were more consistent with the actual climate data; in contrast, gridded soil moisture content data showed significant discrepancies. (2) Horizontal distance had a greater impact on the representativeness of actual climate conditions than vertical elevation differences. (3) Differences in consistency between alternative and actual climate data also affected their correlations with tree-ring width indices; in some growing season months, correlation coefficients differed significantly, in both magnitude and sign, from those based on actual data. The selection of different alternative climate datasets can lead to biased assessments of forest responses to climate change, which is detrimental to the management of forest ecosystems in harsh environments. Therefore, the scientific and rational selection of alternative climate data is essential for dendroecological and climatological research.
Keywords: climate data representativeness; alternative climate data selection; response differences; deciduous broad-leaf forest; warm temperate zone
7. Photoacoustic-computed tomography 3D data compression method and system based on Wavelet-Transformer
Authors: Jialin Li, Tingting Li, Yiming Ma, Yi Shen, Mingjian Sun. 《Journal of Innovative Optical Health Sciences》, 2026, Issue 1, pp. 110-125 (16 pages)
Photoacoustic-computed tomography is a novel imaging technique that combines high absorption contrast and deep tissue penetration capability, enabling comprehensive three-dimensional imaging of biological targets. However, the increasing demand for higher resolution and real-time imaging results in significant data volume, limiting the data storage, transmission, and processing efficiency of the system. Therefore, there is an urgent need for an effective method to compress the raw data without compromising image quality. This paper presents a photoacoustic-computed tomography 3D data compression method and system based on a Wavelet-Transformer. The method rests on a cooperative compression framework that integrates wavelet hard coding with deep learning-based soft decoding, combining the multiscale analysis capability of wavelet transforms with the global feature modeling advantage of Transformers to achieve high-quality data compression and reconstruction. Experimental results using k-wave simulation suggest that the proposed compression system has advantages under extreme compression conditions, achieving a raw data compression ratio of up to 1:40. Furthermore, a three-dimensional data compression experiment using an in vivo mouse demonstrated that the maximum peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) values of reconstructed images reached 38.60 and 0.9583, effectively overcoming the detail loss and artifacts introduced by raw data compression. All the results suggest that the proposed system can significantly reduce storage requirements and hardware cost, enhancing computational efficiency and image quality. These advantages support the development of photoacoustic-computed tomography toward higher efficiency, real-time performance, and intelligent functionality.
Keywords: photoacoustic-computed tomography; data compression; Transformer
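The "wavelet hard coding" stage described above pairs a sparsifying transform with hard thresholding of small coefficients. The following numpy sketch of a single-level Haar transform plus hard thresholding is only a toy stand-in for the paper's pipeline (the choice of wavelet, level count, and keep-ratio are assumptions, and the Transformer-based soft decoder is not reproduced here):

```python
import numpy as np

def haar_analysis(x):
    """One level of the orthonormal 1D Haar transform: approximation + detail."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_synthesis(a, d):
    """Inverse of haar_analysis (perfect reconstruction)."""
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def hard_threshold(coeffs, keep=0.25):
    """Zero all but the largest `keep` fraction of coefficients (by magnitude)."""
    k = max(1, int(keep * coeffs.size))
    thresh = np.sort(np.abs(coeffs))[-k]
    out = coeffs.copy()
    out[np.abs(out) < thresh] = 0.0
    return out

signal = np.repeat([0.0, 1.0, -0.5, 2.0], 16)  # piecewise-constant test signal, 64 samples
a, d = haar_analysis(signal)                   # detail coefficients d are exactly zero here
exact = haar_synthesis(a, d)                   # lossless round trip without thresholding
a_kept = hard_threshold(a, keep=0.25)          # lossy step: keep 8 of 32 coefficients
```

In the paper's system, a learned decoder then compensates for the detail lost in this hard-coding step.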
8. Toward Secure and Auditable Data Sharing: A Cross-Chain CP-ABE Framework
Authors: Ye Tian, Zhuokun Fan, Yifeng Zhang. 《Computers, Materials & Continua》, 2026, Issue 4, pp. 1509-1529 (21 pages)
Amid the increasing demand for data sharing, the need for flexible, secure, and auditable access control mechanisms has garnered significant attention in the academic community. However, blockchain-based ciphertext-policy attribute-based encryption (CP-ABE) schemes still face cumbersome ciphertext re-encryption and insufficient oversight when handling dynamic attribute changes and cross-chain collaboration. To address these issues, we propose a dynamic-permission attribute-encryption scheme for multi-chain collaboration. This scheme incorporates a multi-authority architecture for distributed attribute management and integrates an attribute revocation and granting mechanism that eliminates the need for ciphertext re-encryption, effectively reducing both computational and communication overhead. It leverages the InterPlanetary File System (IPFS) for off-chain data storage and constructs a cross-chain regulatory framework, comprising a Hyperledger Fabric business chain and a FISCO BCOS regulatory chain, to record changes in decryption privileges and access behaviors in an auditable manner. Security analysis shows selective indistinguishability under chosen-plaintext attack (sIND-CPA) under the decisional q-Parallel Bilinear Diffie-Hellman Exponent (q-PBDHE) assumption. In performance and experimental evaluations, we compared the proposed scheme with several advanced schemes. The results show that, while preserving security, the proposed scheme achieves higher encryption/decryption efficiency and lower storage overhead for ciphertexts and keys.
Keywords: data sharing; blockchain; attribute-based encryption; dynamic permissions
9. Design, Realization, and Evaluation of Faster End-to-End Data Transmission over Voice Channels
Authors: Jian Huang, Mingwei Li, Yulong Tian, Yi Yao, Hao Han. 《Computers, Materials & Continua》, 2026, Issue 4, pp. 1650-1675 (26 pages)
With the popularization of new technologies, telephone fraud has become a primary means of stealing money and personal identity information. Taking inspiration from website authentication mechanisms, we propose an end-to-end data modem scheme that transmits the caller's digital certificates through a voice channel for the recipient to verify the caller's identity. Encoding useful information through voice channels is very difficult without the assistance of telecommunications providers; for example, speech activity detection may quickly classify encoded signals as non-speech and reject the input waveform. To address this issue, we propose a novel modulation method based on linear frequency modulation that encodes 3 bits per symbol by varying its frequency, shape, and phase, alongside a lightweight MobileNetV3-Small-based demodulator for efficient and accurate signal decoding on resource-constrained devices. This method leverages the unique characteristics of linear frequency modulation signals, making them more easily transmitted and decoded in speech channels. To ensure reliable data delivery over unstable voice links, we further introduce a robust framing scheme with delimiter-based synchronization, a sample-level position remedying algorithm, and a feedback-driven retransmission mechanism. We have validated the feasibility and performance of our system through expanded real-world evaluations, demonstrating that it outperforms existing advanced methods in terms of robustness and data transfer rate. This technology establishes the foundational infrastructure for reliable certificate delivery over voice channels, which is crucial for achieving strong caller authentication and preventing telephone fraud at its root cause.
Keywords: deep learning; modulation; chirp; data over voice
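The 3-bits-per-symbol idea in the abstract (varying a linear chirp's frequency, shape, and phase) can be sketched with scipy's chirp generator. All numeric parameters below (sample rate, symbol duration, frequency bands) and the exact bit-to-parameter mapping are illustrative assumptions, not the paper's actual modem settings:

```python
import numpy as np
from scipy.signal import chirp

FS = 8000        # assumed sample rate (telephone voice band)
SYM_DUR = 0.02   # assumed symbol duration, seconds

def lfm_symbol(b_freq, b_dir, b_phase):
    """Encode 3 bits in one linear-frequency-modulated symbol:
    b_freq selects the band, b_dir the sweep direction ("shape"),
    b_phase the initial phase."""
    t = np.linspace(0.0, SYM_DUR, int(FS * SYM_DUR), endpoint=False)
    f_lo, f_hi = (500, 1500) if b_freq == 0 else (1500, 2500)
    f0, f1 = (f_lo, f_hi) if b_dir == 0 else (f_hi, f_lo)  # up- vs down-chirp
    phi = 0.0 if b_phase == 0 else 180.0                   # initial phase, degrees
    return chirp(t, f0=f0, t1=SYM_DUR, f1=f1, method='linear', phi=phi)

sym = lfm_symbol(1, 0, 1)  # one 3-bit symbol: 160 samples at 8 kHz
```

A demodulator (a CNN in the paper) would classify each received symbol back into its 3-bit value.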
10. A Composite Loss-Based Autoencoder for Accurate and Scalable Missing Data Imputation
Authors: Thierry Mugenzi, Cahit Perkgoz. 《Computers, Materials & Continua》, 2026, Issue 1, pp. 1985-2005 (21 pages)
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where it often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms (Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship) under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that our proposed model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component in the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area-under-the-curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
Keywords: missing data imputation; autoencoder; deep learning; missing mechanisms
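The three loss components named in the abstract can be sketched directly. The weighting coefficients and the exact regularizer forms below are assumptions for illustration; the paper's actual formulation may differ:

```python
import numpy as np

def composite_loss(x_true, x_recon, miss_mask, x_recon_noisy,
                   lam_noise=0.1, lam_var=0.01):
    """Composite imputation loss sketch:
    (i)   masked MSE focused on originally-missing entries (miss_mask == 1),
    (ii)  noise-aware term: reconstruction should be stable when the input
          is corrupted (x_recon_noisy is the reconstruction of a noisy copy),
    (iii) variance penalty: reconstruction variance should track the data's.
    Weights lam_noise / lam_var are illustrative, not the paper's values."""
    n_miss = max(int(miss_mask.sum()), 1)
    masked_mse = float(np.sum(miss_mask * (x_true - x_recon) ** 2)) / n_miss
    noise_term = float(np.mean((x_recon - x_recon_noisy) ** 2))
    var_penalty = float((np.var(x_true) - np.var(x_recon)) ** 2)
    return masked_mse + lam_noise * noise_term + lam_var * var_penalty

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
mask = (rng.random((8, 4)) < 0.3).astype(float)  # 1 where a value was missing
loss_perfect = composite_loss(x, x, mask, x)     # 0.0 for a perfect, stable reconstruction
```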
11. ISTIRDA: An Efficient Data Availability Sampling Scheme for Lightweight Nodes in Blockchain
Authors: Jiaxi Wang, Wenbo Sun, Ziyuan Zhou, Shihua Wu, Jiang Xu, Shan Ji. 《China Communications》... 《Computers, Materials & Continua》, 2026, Issue 4, pp. 685-700 (16 pages)
Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed-Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
Keywords: blockchain scalability; data availability sampling; lightweight nodes
12. Research on the Optimal Allocation of Community Elderly Care Service Resources Based on Big Data Technology
Authors: Shuying Li. 《Journal of Clinical and Nursing Research》, 2026, Issue 1, pp. 241-246 (6 pages)
With the accelerating aging of China's population, the demand for community elderly care services has shown diversified and personalized characteristics. However, problems such as an insufficient total volume of care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core advantages including data collection, analysis and mining, and accurate prediction, provides a new solution for the allocation of community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimization strategies such as building a big data analysis platform and accurately grasping the elderly's care needs, striving to provide operable path references for the construction of community elderly care service systems, promoting early realization of the elderly care service goal of "adequate support and proper care for the elderly", and boosting the high-quality development of China's elderly care service industry.
Keywords: big data technology; community; elderly care; service resources
13. Multivariate Data Anomaly Detection Based on Graph Structure Learning
Authors: Haoxiang Wen, Zhaoyang Wang, Zhonglin Ye, Haixing Zhao, Maosong Sun. 《Computer Modeling in Engineering & Sciences》, 2026, Issue 1, pp. 1174-1206 (33 pages)
Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during collection, resulting in temporal misalignment or displacement. Due to these factors, node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and subsequently degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. Firstly, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Secondly, a multi-channel graph encoding layer dynamically optimizes the structural properties of the multivariate coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates anomalies caused by temporal dependency misalignment.
Keywords: multivariate data anomaly detection; graph structure learning; coupled network
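The final unsupervised step above, dynamic threshold selection over per-timestep anomaly scores, is commonly implemented as a moving statistic. This numpy sketch (the window size and deviation multiplier are assumptions, not the paper's algorithm) flags points whose score exceeds a rolling mean-plus-k-sigma bound:

```python
import numpy as np

def dynamic_threshold_flags(scores, window=20, k=3.0):
    """Flag scores[i] as anomalous when it exceeds mean + k*std of the
    preceding `window` scores. `window` and `k` are illustrative choices."""
    flags = np.zeros(scores.size, dtype=bool)
    for i in range(window, scores.size):
        hist = scores[i - window:i]
        flags[i] = scores[i] > hist.mean() + k * hist.std()
    return flags

rng = np.random.default_rng(1)
scores = rng.normal(0.0, 1.0, 200)   # synthetic anomaly scores
scores[150] += 10.0                  # inject one obvious anomaly
flags = dynamic_threshold_flags(scores)
```

Because the threshold is recomputed from recent history, it adapts to slow drifts in the score distribution rather than relying on a single global cutoff.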
14. Constructions of Control Sequence Set for Hierarchical Access in Data Link Network
Authors: Niu Xianhua, Ma Jiabei, Zhou Enzhi, Wang Yaoxuan, Zeng Bosen, Li Zhiping. 《China Communications》, 2026, Issue 1, pp. 67-80 (14 pages)
As an important resource in a data link, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially considering the tremendous increase in the number of nodes and diverse communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly address differentiated access control in the data link. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound on the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets satisfying the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, indicating a significant improvement in anti-interference and anti-eavesdropping capabilities.
Keywords: control sequence; data link; hierarchical access control; theoretical bound
15. Multi-Time Scale Optimization Scheduling of Data Center Considering Workload Shift and Refrigeration Regulation
Authors: Luyao Liu, Xiao Liao, Yiqian Li, Shaofeng Zhang. 《Energy Engineering》, 2026, Issue 2, pp. 451-486 (36 pages)
Data center industries have been facing huge energy challenges due to escalating power consumption and associated carbon emissions. In the context of carbon neutrality, the integration of data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore data centers' operational flexibility. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center micro-grids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, this paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, this paper proposes a two-stage, multi-time scale optimization scheduling framework that jointly coordinates computing workload time-shifts in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable energy utilization, improves the operational economy of the data center micro-grid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
Keywords: data center; renewable energy; load shift; multi-time scale optimization
16. Motion characteristics of a flexible self-propelled slender particle in a backward-facing step flow
Authors: Yeyu Chen, Zhenyu Ouyang, Zhaowu Lin, Jianzhong Lin. 《Applied Mathematics and Mechanics (English Edition)》, 2026, Issue 2, pp. 401-422 (22 pages)
This study investigates the motion behavior of a slender flexible particle in a backward-facing step(BFS)flow using the direct-forcing fictitious domain method,with a particular focus on the trapping phenomena near th... This study investigates the motion behavior of a slender flexible particle in a backward-facing step(BFS)flow using the direct-forcing fictitious domain method,with a particular focus on the trapping phenomena near the separation vortex region.Three distinct motion modes are identified:periodic rotation or oscillation within the vortex(trapping),downstream transport(escape),and transition state exhibiting unstable trapping.A dynamic balance among inward migration,centrifugal effects,wall interactions,and elastic forces enables the particle to achieve stable orbital motion within two distinct limit cycles.The topology of these orbits is governed by parameters,including the aspect ratio,structural flexibility,deformation intensity,and fluid inertia,all of which are characterized by the Reynolds number(Re).Specifically,fluid inertia plays a dominant role in facilitating particle trapping.At a fixed Re,a particle with a smaller aspect ratio tends to migrate inward and become trapped,whereas one with a larger aspect ratio is more likely to escape.Structural flexibility,especially when enhanced by confinement near the wall,leads to elastic deformation that induces secondary vortices and a weak flipping motion.The deformation intensityαsignificantly influences the lateral migration of the slender particle after the initial release;a largerαcauses it to drift toward the channel centerline,increasing the probability of escape.These findings provide a theoretical foundation for optimizing the transport and capture of slender soft swimmers in complex flow environments. 展开更多
Keywords: flexible slender particle; self-propelled; backward-facing step (BFS) flow; direct-forcing fictitious domain method
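In simulations of this kind, the flexible slender particle is often discretized as a bead-spring chain whose internal elastic forces resist stretching and bending. A minimal sketch of such forces follows; the stiffness values and the simplified single-term bending force are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def elastic_forces(x, ks=100.0, kb=10.0, l0=0.1):
    """Elastic forces on a bead-spring chain modeling a flexible
    slender particle (stiffnesses ks, kb are illustrative values).

    x : (N, 2) array of bead positions in 2D.
    """
    f = np.zeros_like(x)
    # Stretching: Hookean tension along each segment between neighbors
    d = x[1:] - x[:-1]
    L = np.linalg.norm(d, axis=1, keepdims=True)
    tension = ks * (L - l0) * d / L
    f[:-1] += tension            # pulls each bead toward its next neighbor
    f[1:] -= tension
    # Bending (simplified, central-bead term only): restoring force
    # proportional to the discrete Laplacian, zero for a straight chain
    f[1:-1] += kb * (x[:-2] - 2.0 * x[1:-1] + x[2:])
    return f
```

In a full fluid-structure simulation, forces like these would be balanced against the hydrodynamic force from the fictitious-domain solver at every time step.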
Advances in Machine Learning for Explainable Intrusion Detection Using Imbalance Datasets in Cybersecurity with Harris Hawks Optimization
17
Authors: Amjad Rehman, Tanzila Saba, Mona M. Jamjoom, Shaha Al-Otaibi, Muhammad I. Khan 《Computers, Materials & Continua》 2026, Issue 1, pp. 1804-1818 (15 pages)
Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class IDS using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized and model-ready inputs. Critical dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was recorded with an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
Keywords: intrusion detection; XAI; machine learning; ensemble method; cybersecurity; imbalanced data
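The SMOTE step mentioned above synthesizes minority-class samples by interpolating between a real sample and one of its nearest minority-class neighbors. A didactic sketch of that core idea (not the reference imbalanced-learn implementation, and with brute-force neighbor search for brevity):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    between a randomly chosen minority sample and one of its k
    nearest minority neighbors (the core idea of SMOTE)."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    k = min(k, n - 1)
    # pairwise distances within the minority class (brute force)
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nbrs = np.argsort(d, axis=1)[:, :k]     # k nearest neighbors per sample
    out = np.empty((n_new, X_min.shape[1]))
    for j in range(n_new):
        i = rng.integers(n)                 # random minority sample
        nb = nbrs[i, rng.integers(k)]       # one of its neighbors
        lam = rng.random()                  # interpolation factor in [0, 1)
        out[j] = X_min[i] + lam * (X_min[nb] - X_min[i])
    return out
```

Because every synthetic point lies on a segment between two real minority samples, the oversampled set stays inside the minority class's convex hull.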
Automated Machine Learning for Fault Diagnosis Using Multimodal Mel-Spectrogram and Vibration Data
18
Authors: Zehao Li, Xuting Zhang, Hongqi Lin, Wu Qin, Junyu Qi, Zhuyun Chen, Qiang Liu 《Computer Modeling in Engineering & Sciences》 2026, Issue 2, pp. 471-498 (28 pages)
To ensure the safe and stable operation of rotating machinery, intelligent fault diagnosis methods hold significant research value. However, existing diagnostic approaches largely rely on manual feature extraction and expert experience, which limits their adaptability under variable operating conditions and strong noise environments, severely affecting the generalization capability of diagnostic models. To address this issue, this study proposes a multimodal fusion fault diagnosis framework based on Mel-spectrograms and automated machine learning (AutoML). The framework first extracts fault-sensitive Mel time-frequency features from acoustic signals and fuses them with statistical features of vibration signals to construct complementary fault representations. On this basis, automated machine learning techniques are introduced to enable end-to-end diagnostic workflow construction and optimal model configuration acquisition. Finally, diagnostic decisions are achieved by automatically integrating the predictions of multiple high-performance base models. Experimental results on a centrifugal pump vibration and acoustic dataset demonstrate that the proposed framework achieves high diagnostic accuracy under noise-free conditions and maintains strong robustness under noisy interference, validating its efficiency, scalability, and practical value for rotating machinery fault diagnosis.
Keywords: automated machine learning; mechanical fault diagnosis; feature engineering; multimodal data
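Mel time-frequency features come from passing an FFT power spectrum through a bank of triangular filters spaced evenly on the mel scale; concatenated with vibration statistics, they form a multimodal feature vector of the kind described above. A simplified sketch (filter count, FFT size, sampling rate, and the chosen vibration statistics are all illustrative assumptions):

```python
import numpy as np

def mel_filterbank(n_mels=8, n_fft=256, sr=1000.0):
    """Triangular mel filterbank: a simplified version of the standard
    construction that maps an FFT power spectrum to mel bands."""
    hz_to_mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    mel_to_hz = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    # band edges equally spaced on the mel scale, from 0 Hz to Nyquist
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        lo, mid, hi = bins[i], bins[i + 1], bins[i + 2]
        for b in range(lo, mid):          # rising slope of triangle i
            fb[i, b] = (b - lo) / max(mid - lo, 1)
        for b in range(mid, hi):          # falling slope of triangle i
            fb[i, b] = (hi - b) / max(hi - mid, 1)
    return fb

def fuse_features(power_spectrum, vibration):
    """Concatenate log-mel energies with simple vibration statistics,
    mirroring the acoustic + vibration fusion idea."""
    mel = np.log(mel_filterbank() @ power_spectrum + 1e-10)
    stats = np.array([vibration.mean(), vibration.std(),
                      np.abs(vibration).max()])
    return np.concatenate([mel, stats])
```

Libraries such as librosa implement the full mel-spectrogram pipeline; the point here is only the shape of the fused representation.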
Enhanced Capacity Reversible Data Hiding Based on Pixel Value Ordering in Triple Stego Images
19
Authors: Kim Sao Nguyen, Ngoc Dung Bui 《Computers, Materials & Continua》 2026, Issue 1, pp. 1571-1586 (16 pages)
Reversible data hiding (RDH) enables secret data embedding while preserving complete cover image recovery, making it crucial for applications requiring image integrity. The pixel value ordering (PVO) technique used in multi-stego images provides good image quality but often results in low embedding capacity. To address these challenges, this paper proposes a high-capacity RDH scheme based on PVO that generates three stego images from a single cover image. The cover image is partitioned into non-overlapping blocks with pixels sorted in ascending order. Four secret bits are embedded into each block's maximum pixel value, while three additional bits are embedded into the second-largest value when the pixel difference exceeds a predefined threshold. A similar embedding strategy is also applied to the minimum side of the block, including the second-smallest pixel value. This design enables each block to embed up to 14 bits of secret data. Experimental results demonstrate that the proposed method achieves significantly higher embedding capacity and improved visual quality compared to existing triple-stego RDH approaches, advancing the field of reversible steganography.
Keywords: RDH; reversible data hiding; PVO; RDH based on three stego images
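The scheme builds on the classic PVO idea: sort a block's pixels, predict the largest from the second largest, and embed into the prediction error reversibly. A sketch of the basic one-bit PVO embed/extract cycle (the paper's triple-image, multi-bit variant extends this principle and is not reproduced here):

```python
import numpy as np

def pvo_embed_block(block, bit):
    """Embed one bit into the largest pixel of a block via classic PVO.
    Blocks whose max/second-max difference is not 1 either carry no bit
    (difference 0) or are shifted by 1 to keep extraction unambiguous."""
    flat = block.flatten().astype(int)
    order = np.argsort(flat, kind="stable")   # ascending, stable tie-break
    i_max, i_2nd = order[-1], order[-2]
    d = flat[i_max] - flat[i_2nd]             # prediction error
    if d == 1:                                # embeddable position
        flat[i_max] += bit                    # d stays 1 (bit 0) or becomes 2 (bit 1)
    elif d > 1:                               # shift so stego d > 2 means "shifted"
        flat[i_max] += 1
    return flat.reshape(block.shape)

def pvo_extract_block(stego):
    """Recover the embedded bit (or None) and the original block."""
    flat = stego.flatten().astype(int)
    order = np.argsort(flat, kind="stable")
    i_max, i_2nd = order[-1], order[-2]
    d = flat[i_max] - flat[i_2nd]
    bit = None
    if d == 1:
        bit = 0
    elif d == 2:
        bit = 1
        flat[i_max] -= 1                      # undo embedding
    elif d > 2:
        flat[i_max] -= 1                      # undo shift
    return bit, flat.reshape(stego.shape)
```

Reversibility holds because the modified maximum stays the block's maximum at the same position, so the extractor always locates the pixel that was changed.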
PEMFC Performance Degradation Prediction Based on CNN-BiLSTM with Data Augmentation by an Improved GAN
20
Authors: Xiaolu Wang, Haoyu Sun, Aiguo Wang, Xin Xia 《Energy Engineering》 2026, Issue 2, pp. 417-435 (19 pages)
To address the issues of insufficient and imbalanced data samples in proton exchange membrane fuel cell (PEMFC) performance degradation prediction, this study proposes a data augmentation-based model to predict PEMFC performance degradation. Firstly, an improved generative adversarial network (IGAN) with an adaptive gradient penalty coefficient is proposed to address the problems of excessively fast gradient descent and insufficient diversity of generated samples. Then, the IGAN is used to generate data with a distribution analogous to real data, thereby mitigating the insufficiency and imbalance of the original PEMFC samples and providing the prediction model with training data rich in feature information. Finally, a convolutional neural network-bidirectional long short-term memory (CNN-BiLSTM) model is adopted to predict PEMFC performance degradation. Experimental results show that the data generated by the proposed IGAN exhibits higher quality than that generated by the original GAN, and can fully characterize and enrich the original data's features. Using the augmented data, the prediction accuracy of the CNN-BiLSTM model is significantly improved, rendering it applicable to tasks of predicting PEMFC performance degradation.
Keywords: PEMFC; performance degradation prediction; data augmentation; improved generative adversarial network
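Independently of the GAN details, degradation prediction with a sequence model such as CNN-BiLSTM is typically framed by slicing the (augmented) time series into fixed-length input windows paired with a future target value. A minimal sketch of that supervised framing (window length and horizon are illustrative choices):

```python
import numpy as np

def make_windows(series, win=16, horizon=1):
    """Build (input window, future target) pairs from a degradation
    time series: X[t] holds `win` consecutive samples, y[t] the value
    `horizon` steps after the window ends."""
    X, y = [], []
    for t in range(len(series) - win - horizon + 1):
        X.append(series[t:t + win])
        y.append(series[t + win + horizon - 1])
    return np.array(X), np.array(y)
```

The resulting X would feed the convolutional front end, while y provides the regression target for the BiLSTM's output head.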