Journal Articles
362,200 articles found
Evaluation of the efficiency of the data.table and dplyr packages for data manipulation (Cited: 3)
1
Authors: 刘红伟, 邓晓伟, 段同庆, 李长平, 马骏 《中国卫生统计》 CSCD, PKU Core, 2020, Issue 1, pp. 141-144 (4 pages)
Objective: To compare, through empirical tests, the computational efficiency of the widely used data.table and dplyr packages for data manipulation in R, and to give R users guidance on choosing a package for efficient data processing. Methods: Data of different sample sizes were simulated, and the speed of data.table, dplyr, and base R functions was compared on five tasks: selecting rows and columns, sorting, grouped computation, adding/updating columns, and merging. Results: data.table was clearly faster for row selection of the form DT[x==.], updating, sorting, and inner joins; for row selection of the form DT[x<.], grouped computation, left joins, and adding columns it showed no clear difference from dplyr; for column selection, base R functions were fastest and data.table performed worst. Conclusion: data.table is more efficient overall than dplyr. For datasets at the GB scale or larger, data.table is recommended; below that, either package is suitable.
Keywords: R language, data.table, dplyr
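The study's method is to simulate datasets of varying size and time the same operation in each tool. As a rough illustration of that benchmarking design, here is a minimal Python standard-library analogue (not the authors' R code; the function names and sizes are hypothetical):

```python
import random
import timeit

def make_data(n, groups=10):
    """Simulate n rows of (group, value), as the paper simulates datasets of varying size."""
    return [(random.randrange(groups), random.random()) for _ in range(n)]

def group_sum(rows):
    """Grouped computation: sum of values per group (one of the five compared operations)."""
    acc = {}
    for g, v in rows:
        acc[g] = acc.get(g, 0.0) + v
    return acc

def benchmark(sizes=(1_000, 10_000)):
    """Time the operation at each sample size, mirroring the paper's design."""
    results = {}
    for n in sizes:
        rows = make_data(n)
        results[n] = timeit.timeit(lambda: group_sum(rows), number=5)
    return results
```

In the paper, the timed operation would be swapped among data.table, dplyr, and base R implementations while the data generator stays fixed.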
Construction of medical scientific data repositories based on the global research data repository registry Re3data.org (Cited: 9)
2
Authors: 吴思竹, 李赞梅, 崔佳伟, 修晓蕾, 钱庆 《中华医学图书情报杂志》 CAS, 2018, Issue 9, pp. 20-31 (12 pages)
Objective: To summarize the current state, experience, and shortcomings of scientific data repository construction in the medical field worldwide, as a reference for building medical scientific data repositories in China. Methods: Statistical analysis, co-occurrence analysis, and social network analysis, combined with visualizations, were used to examine the distribution and construction of the 637 medical scientific data repositories indexed by Re3data.org in 2018. Results: The United States, the United Kingdom, and international organizations contributed most to collaborative repository construction; the US participated in building 50% of the repositories and independently built 241. Most repositories hold 3-6 data types, with scientific and statistical data the most common; data topics are diverse, and 64.36% of repositories provide data quality control. Seven software platforms, including DSpace, are used for repository construction, and data management involves 5 kinds of persistent identifiers and 16 metadata schemes. Conclusion: Europe and the US dominate data repository construction, and multi-party collaboration is needed to promote open data sharing. Open-source technology lowers the barrier to building repositories, standards support repository operation and management, tiered data sharing provides interface support, and policies and licenses can safeguard the rights of all parties.
Keywords: medical data, data repository, open data, data sharing
Detection and analysis of aquatic drugs based on RDS and Data.V technologies and the HACCP system
3
Authors: 马天雨, 封腾望, 王洋洋 《乡村科技》 2019, Issue 1, pp. 125-126 (2 pages)
China's fish-drug market currently has many problems: some drugs do not match their labels and quality varies widely, so fish farmers often lack reliable information when choosing drugs, cannot control diseases effectively, and suffer greater losses. Using cloud databases and related information and detection technologies to analyze the drugs on the market can screen out high-quality drugs with high active-ingredient content and enable targeted treatment. On this basis, this paper explores combining the RDS cloud database and Data.V data visualization with the HACCP inspection system to effectively prevent aquatic animal diseases.
Keywords: RDS database, Data.V technology, HACCP inspection system, aquatic drugs
User-oriented services of US government open data and their implications: the case of Data.gov (Cited: 29)
4
Authors: 汪庆怡, 高洁 《情报杂志》 CSSCI, PKU Core, 2016, Issue 7, pp. 145-150 (6 pages)
[Purpose/Significance] With the arrival of the big data era, governments rich in data are moving from information disclosure to open data. Following this trend, governments opening their data should meet the demands of the Web 2.0 era: user-centered, driven by public needs, and service-oriented. This paper studies the user-oriented services of the US government's open data website and, considering China's situation, draws implications for opening Chinese government data. [Method/Process] Among national practices, the US national open data platform Data.gov offers a strong case for study. Based on the site's sections, its user-facing services are described and analyzed on four levels: data provision, data retrieval, data use, and communication and interaction with users. [Result/Conclusion] The implications for China are: build a one-stop open government data platform; open government data progressively based on user needs; put users at the center and improve user experience; encourage user participation to realize the value of the data; and include open data in government performance evaluation.
Keywords: open government data, open data, e-government, big data
Metadata framework construction and visualization for COVID-19 scientific datasets: the case of Re3data.org (Cited: 2)
5
Authors: 顾子慧, 刘桂锋, 刘琼 《情报科学》 CSSCI, PKU Core, 2023, Issue 4, pp. 117-126 (10 pages)
[Purpose/Significance] During the COVID-19 pandemic, scientific data provided crucial evidence and support for analyzing, controlling, and managing the epidemic. To maximize the value of COVID-19 scientific data, a metadata framework for COVID-19 scientific datasets needs to be constructed. [Method/Process] Taking the COVID-19 scientific datasets in Re3data.org as an example, the paper collects and organizes the datasets' metadata, constructs a metadata framework, builds an ontology of the datasets with Protégé, and stores the resulting knowledge graph in the Neo4j graph database. [Result/Conclusion] The results show that linking and fusing the metadata of COVID-19 scientific datasets in Re3data.org turns the metadata into diverse forms of data storage and presentation. [Novelty/Limitations] The work builds a knowledge graph of COVID-19 scientific datasets, supports querying, retrieval, and reasoning over entities and their relations within the graph, and creates fine-grained links between the attributes and entities of each part of the dataset ontology. Future work should focus on linking and fusing dataset metadata across platforms.
Keywords: scientific data, metadata, ontology, data science, COVID-19
Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
6
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda 《Computers, Materials & Continua》 2026, Issue 3, pp. 1963-1988 (26 pages)
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse, data analysis, big data, decision systems, seismology, data visualization
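The array-based versus bridge-table trade-off the abstract describes can be sketched with plain Python data structures. This is an illustrative analogue, not the paper's warehouse schema; all table and field names here are hypothetical:

```python
# Two ways to model a many-to-many link between a fact (an earthquake)
# and a dimension (affected regions).

# Bridge-table design: facts and dimensions joined through a pair table.
facts = {1: {"magnitude": 6.1}, 2: {"magnitude": 4.8}}
regions = {10: "coastal", 20: "inland"}
bridge = [(1, 10), (1, 20), (2, 20)]  # (fact_id, region_id) rows

def regions_of_fact_bridge(fact_id):
    """Fact-centric query via the bridge: scan the pair table."""
    return [regions[r] for f, r in bridge if f == fact_id]

# Array-based design: each fact row stores its dimension keys directly,
# so fact-centric queries need no intermediate join.
facts_array = {
    1: {"magnitude": 6.1, "region_ids": [10, 20]},
    2: {"magnitude": 4.8, "region_ids": [20]},
}

def regions_of_fact_array(fact_id):
    """Fact-centric query on the array design: one row lookup."""
    return [regions[r] for r in facts_array[fact_id]["region_ids"]]

def facts_of_region_array(region_id):
    """Dimension-centric query on the array design must scan all facts,
    which is why the paper finds bridge tables better in this direction."""
    return [f for f, row in facts_array.items() if region_id in row["region_ids"]]
```

The hybrid schema the paper proposes keeps both representations so each query direction can use the cheaper path.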
Combining different climate datasets better reflects the response of warm-temperate forests to climate: a case study from Mt. Dongling, Beijing
7
Authors: Shengjie Wang, Haiyang Liu, Shuai Yuan, Chenxi Xu 《Journal of Forestry Research》 2026, Issue 2, pp. 131-143 (13 pages)
Accurately assessing the relationship between tree growth and climatic factors is of great importance in dendrochronology. This study evaluated the consistency between alternative climate datasets (including station and gridded data) and actual climate data (fixed-point observations near the sampling sites) in northeastern China's warm temperate zone, and analyzed differences in their correlations with the tree-ring width index. The results were: (1) Gridded temperature data, as well as precipitation and relative humidity data from the Huailai meteorological station, were more consistent with the actual climate data; in contrast, gridded soil moisture content data showed significant discrepancies. (2) Horizontal distance had a greater impact on the representativeness of actual climate conditions than vertical elevation differences. (3) Differences in consistency between alternative and actual climate data also affected their correlations with tree-ring width indices. In some growing-season months, correlation coefficients differed significantly, in both magnitude and sign, from those based on actual data. The choice of alternative climate dataset can bias assessments of forest responses to climate change, which is detrimental to the management of forest ecosystems in harsh environments. Therefore, scientific and rational selection of alternative climate data is essential for dendroecological and climatological research.
Keywords: climate data representativeness, alternative climate data selection, response differences, deciduous broad-leaf forest, warm temperate zone
Advances in Machine Learning for Explainable Intrusion Detection Using Imbalance Datasets in Cybersecurity with Harris Hawks Optimization
8
Authors: Amjad Rehman, Tanzila Saba, Mona M. Jamjoom, Shaha Al-Otaibi, Muhammad I. Khan 《Computers, Materials & Continua》 2026, Issue 1, pp. 1804-1818 (15 pages)
Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class IDS using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized and model-ready inputs. Critical dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model showed a clear improvement in detecting attacks. The model was tested on four datasets to show the effectiveness of the proposed approach, and an ablation study was performed to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes it suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
Keywords: intrusion detection, XAI, machine learning, ensemble method, cybersecurity, imbalanced data
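The abstract applies SMOTE to augment underrepresented attack classes. A minimal standard-library sketch of the core SMOTE idea (interpolating between a minority sample and a nearby minority neighbor) looks like this; it is a simplified stand-in, not the study's implementation or the imbalanced-learn API:

```python
import math
import random

def nearest_neighbor(x, others):
    """Index of the closest minority sample to x (Euclidean distance)."""
    return min(range(len(others)), key=lambda i: math.dist(x, others[i]))

def smote_like(minority, n_new, seed=0):
    """Generate n_new synthetic minority samples by interpolating each
    chosen sample toward its nearest minority-class neighbor -- the core
    idea of SMOTE, simplified (real SMOTE samples among k neighbors)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.randrange(len(minority))
        x = minority[i]
        rest = minority[:i] + minority[i + 1:]
        nb = rest[nearest_neighbor(x, rest)]
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + lam * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Each synthetic point lies on the segment between two real minority samples, which is what makes the augmentation plausible rather than random noise.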
Toward Secure and Auditable Data Sharing:A Cross-Chain CP-ABE Framework
9
Authors: Ye Tian, Zhuokun Fan, Yifeng Zhang 《Computers, Materials & Continua》 2026, Issue 4, pp. 1509-1529 (21 pages)
Amid the increasing demand for data sharing, the need for flexible, secure, and auditable access control mechanisms has garnered significant attention in the academic community. However, blockchain-based ciphertext-policy attribute-based encryption (CP-ABE) schemes still face cumbersome ciphertext re-encryption and insufficient oversight when handling dynamic attribute changes and cross-chain collaboration. To address these issues, we propose a dynamic-permission attribute-encryption scheme for multi-chain collaboration. This scheme incorporates a multi-authority architecture for distributed attribute management and integrates an attribute revocation and granting mechanism that eliminates the need for ciphertext re-encryption, effectively reducing both computational and communication overhead. It leverages the InterPlanetary File System (IPFS) for off-chain data storage and constructs a cross-chain regulatory framework, comprising a Hyperledger Fabric business chain and a FISCO BCOS regulatory chain, to record changes in decryption privileges and access behaviors in an auditable manner. Security analysis shows selective indistinguishability under chosen-plaintext attack (sIND-CPA) under the decisional q-Parallel Bilinear Diffie-Hellman Exponent assumption (q-PBDHE). In performance and experimental evaluations, we compared the proposed scheme with several advanced schemes. The results show that, while preserving security, the proposed scheme achieves higher encryption/decryption efficiency and lower storage overhead for ciphertexts and keys.
Keywords: data sharing, blockchain, attribute-based encryption, dynamic permissions
Design,Realization,and Evaluation of Faster End-to-End Data Transmission over Voice Channels
10
Authors: Jian Huang, Mingwei Li, Yulong Tian, Yi Yao, Hao Han 《Computers, Materials & Continua》 2026, Issue 4, pp. 1650-1675 (26 pages)
With the popularization of new technologies, telephone fraud has become the main means of stealing money and personal identity information. Taking inspiration from the website authentication mechanism, we propose an end-to-end data modem scheme that transmits the caller's digital certificates through a voice channel for the recipient to verify the caller's identity. Encoding useful information through voice channels is very difficult without the assistance of telecommunications providers. For example, speech activity detection may quickly classify encoded signals as non-speech signals and reject input waveforms. To address this issue, we propose a novel modulation method based on linear frequency modulation that encodes 3 bits per symbol by varying its frequency, shape, and phase, alongside a lightweight MobileNetV3-Small-based demodulator for efficient and accurate signal decoding on resource-constrained devices. This method leverages the unique characteristics of linear frequency modulation signals, making them more easily transmitted and decoded in speech channels. To ensure reliable data delivery over unstable voice links, we further introduce a robust framing scheme with delimiter-based synchronization, a sample-level position remedying algorithm, and a feedback-driven retransmission mechanism. We have validated the feasibility and performance of our system through expanded real-world evaluations, demonstrating that it outperforms existing advanced methods in terms of robustness and data transfer rate. This technology establishes the foundational infrastructure for reliable certificate delivery over voice channels, which is crucial for achieving strong caller authentication and preventing telephone fraud at its root cause.
Keywords: deep learning, modulation, chirp, data over voice
A Composite Loss-Based Autoencoder for Accurate and Scalable Missing Data Imputation
11
Authors: Thierry Mugenzi, Cahit Perkgoz 《Computers, Materials & Continua》 2026, Issue 1, pp. 1985-2005 (21 pages)
Missing data presents a crucial challenge in data analysis, especially in high-dimensional datasets, where missing data often leads to biased conclusions and degraded model performance. In this study, we present a novel autoencoder-based imputation framework that integrates a composite loss function to enhance robustness and precision. The proposed loss combines (i) a guided, masked mean squared error focusing on missing entries; (ii) a noise-aware regularization term to improve resilience against data corruption; and (iii) a variance penalty to encourage expressive yet stable reconstructions. We evaluate the proposed model across four missingness mechanisms (Missing Completely at Random, Missing at Random, Missing Not at Random, and Missing Not at Random with quantile censorship) under systematically varied feature counts, sample sizes, and missingness ratios ranging from 5% to 60%. Four publicly available real-world datasets (Stroke Prediction, Pima Indians Diabetes, Cardiovascular Disease, and Framingham Heart Study) were used, and the results show that the proposed model consistently outperforms baseline methods, including traditional and deep learning-based techniques. An ablation study reveals the additive value of each component of the loss function. Additionally, we assessed the downstream utility of imputed data through classification tasks, where datasets imputed by the proposed method yielded the highest receiver operating characteristic area under the curve scores across all scenarios. The model demonstrates strong scalability and robustness, improving performance with larger datasets and higher feature counts. These results underscore the capacity of the proposed method to produce not only numerically accurate but also semantically useful imputations, making it a promising solution for robust data recovery in clinical applications.
Keywords: missing data imputation, autoencoder, deep learning, missing mechanisms
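The three loss terms the abstract names (masked MSE on missing entries, noise-aware regularization, variance penalty) can be sketched in plain Python. The weights and exact term forms below are assumptions for illustration, not the paper's definitions:

```python
def masked_mse(pred, target, mask):
    """Guided, masked MSE over positions flagged as originally missing (mask=1)."""
    terms = [(p - t) ** 2 for p, t, m in zip(pred, target, mask) if m]
    return sum(terms) / len(terms) if terms else 0.0

def variance(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

def composite_loss(pred, target, mask, pred_noisy,
                   alpha=0.1, beta=0.01, target_var=1.0):
    """Composite loss in the spirit of the paper's three terms:
    (i) guided masked MSE on missing entries;
    (ii) a noise-aware term penalizing sensitivity to corrupted input
         (distance between clean and noisy reconstructions);
    (iii) a variance penalty discouraging collapsed, low-variance output.
    alpha, beta, and target_var are illustrative assumptions."""
    recon = masked_mse(pred, target, mask)
    noise = sum((p - q) ** 2 for p, q in zip(pred, pred_noisy)) / len(pred)
    var_pen = (variance(pred) - target_var) ** 2
    return recon + alpha * noise + beta * var_pen
```

In the actual framework these would be tensor operations inside the autoencoder's training loop; the point here is only how the three terms combine.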
Photoacoustic-computed tomography 3D data compression method and system based on Wavelet-Transformer
12
Authors: Jialin Li, Tingting Li, Yiming Ma, Yi Shen, Mingjian Sun 《Journal of Innovative Optical Health Sciences》 2026, Issue 1, pp. 110-125 (16 pages)
Photoacoustic-computed tomography is a novel imaging technique that combines high absorption contrast and deep tissue penetration, enabling comprehensive three-dimensional imaging of biological targets. However, the increasing demand for higher resolution and real-time imaging results in significant data volume, limiting the system's data storage, transmission, and processing efficiency. Therefore, there is an urgent need for an effective method to compress the raw data without compromising image quality. This paper presents a photoacoustic-computed tomography 3D data compression method and system based on a Wavelet-Transformer. The method is built on a cooperative compression framework that integrates wavelet hard coding with deep learning-based soft decoding, combining the multiscale analysis capability of wavelet transforms with the global feature modeling of Transformers to achieve high-quality data compression and reconstruction. Experimental results using k-Wave simulation suggest that the proposed compression system has advantages under extreme compression conditions, achieving a raw data compression ratio of up to 1:40. Furthermore, a three-dimensional data compression experiment using an in vivo mouse demonstrated that the maximum peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) of reconstructed images reached 38.60 and 0.9583, effectively overcoming detail loss and artifacts introduced by raw data compression. All results suggest that the proposed system can significantly reduce storage requirements and hardware cost, enhancing computational efficiency and image quality. These advantages support the development of photoacoustic-computed tomography toward higher efficiency, real-time performance, and intelligent functionality.
Keywords: photoacoustic-computed tomography, data compression, Transformer
Research on the Optimal Allocation of Community Elderly Care Service Resources Based on Big Data Technology
13
Authors: Shuying Li 《Journal of Clinical and Nursing Research》 2026, Issue 1, pp. 241-246 (6 pages)
With the accelerating aging of China's population, demand for community elderly care services has become diversified and personalized. However, problems such as an insufficient total supply of care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core strengths in data collection, analysis and mining, and accurate prediction, provides a new solution for allocating community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimization strategies such as building a big data analysis platform and accurately identifying the elderly's care needs, aiming to provide an operable path for the construction of community elderly care service systems, promote the goal of adequate support and proper care for the elderly, and boost the high-quality development of China's elderly care industry.
Keywords: big data technology, community, elderly care, service resources
Multivariate Data Anomaly Detection Based on Graph Structure Learning
14
Authors: Haoxiang Wen, Zhaoyang Wang, Zhonglin Ye, Haixing Zhao, Maosong Sun 《Computer Modeling in Engineering & Sciences》 2026, Issue 1, pp. 1174-1206 (33 pages)
Multivariate anomaly detection plays a critical role in maintaining the stable operation of information systems. However, in existing research, multivariate data are often influenced by various factors during collection, resulting in temporal misalignment or displacement. Due to these factors, node representations carry substantial noise, which reduces the adaptability of the multivariate coupled network structure and subsequently degrades anomaly detection performance. Accordingly, this study proposes a novel multivariate anomaly detection model grounded in graph structure learning. Firstly, a recommendation strategy is employed to identify strongly coupled variable pairs, which are then used to construct a recommendation-driven multivariate coupling network. Secondly, a multi-channel graph encoding layer dynamically optimizes the structural properties of the coupling network, while a multi-head attention mechanism enhances the spatial characteristics of the multivariate data. Finally, unsupervised anomaly detection is conducted using a dynamic threshold selection algorithm. Experimental results demonstrate that effectively integrating the structural and spatial features of multivariate data significantly mitigates anomalies caused by temporal dependency misalignment.
Keywords: multivariate data, anomaly detection, graph structure learning, coupled network
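The abstract's final stage is unsupervised detection via a "dynamic threshold selection algorithm", which it does not specify. A common generic scheme (flagging a score that exceeds a moving mean plus k standard deviations) can serve as a hedged stand-in; the window size and k below are assumptions:

```python
import math

def dynamic_threshold_flags(scores, window=5, k=3.0):
    """Flag scores[i] as anomalous when it exceeds mean + k*std of the
    previous `window` scores. A generic dynamic-threshold scheme; the
    paper's actual selection algorithm is not given in the abstract."""
    flags = []
    for i, s in enumerate(scores):
        hist = scores[max(0, i - window):i]
        if len(hist) < 2:          # not enough history to set a threshold
            flags.append(False)
            continue
        mu = sum(hist) / len(hist)
        sd = math.sqrt(sum((h - mu) ** 2 for h in hist) / len(hist))
        flags.append(s > mu + k * sd)
    return flags
```

Because the threshold is recomputed from recent history, it adapts to slow drift in the anomaly scores instead of using one fixed cutoff.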
Constructions of Control Sequence Set for Hierarchical Access in Data Link Network
15
Authors: Niu Xianhua, Ma Jiabei, Zhou Enzhi, Wang Yaoxuan, Zeng Bosen, Li Zhiping 《China Communications》 2026, Issue 1, pp. 67-80 (14 pages)
As an important resource in a data link, time slots should be strategically allocated to enhance transmission efficiency and resist eavesdropping, especially given the tremendous increase in the number of nodes and diverse communication needs. It is crucial to design control sequences with robust randomness and conflict-freeness to properly address differentiated access control in the data link. In this paper, we propose a hierarchical access control scheme based on control sequences to achieve high utilization of time slots and differentiated access control. A theoretical bound on the hierarchical control sequence set is derived to characterize the constraints on the parameters of the sequence set. Moreover, two classes of optimal hierarchical control sequence sets satisfying the theoretical bound are constructed, both of which enable the scheme to achieve maximum utilization of time slots. Compared with the fixed time slot allocation scheme, our scheme reduces the symbol error rate by up to 9%, indicating a significant improvement in anti-interference and anti-eavesdropping capabilities.
Keywords: control sequence, data link, hierarchical access control, theoretical bound
Multi-Time Scale Optimization Scheduling of Data Center Considering Workload Shift and Refrigeration Regulation
16
Authors: Luyao Liu, Xiao Liao, Yiqian Li, Shaofeng Zhang 《Energy Engineering》 2026, Issue 2, pp. 451-486 (36 pages)
Data center industries have been facing huge energy challenges due to escalating power consumption and associated carbon emissions. In the context of carbon neutrality, integrating data centers with renewable energy has become a prevailing trend. To advance renewable energy integration in data centers, it is imperative to thoroughly explore data centers' operational flexibility. Computing workloads and refrigeration systems are recognized as two promising flexible resources for power regulation within data center micro-grids. This paper identifies and categorizes delay-tolerant computing workloads into three types (long-running non-interruptible, long-running interruptible, and short-running) and develops mathematical time-shifting models for each. Additionally, this paper examines the thermal dynamics of the computer room and derives a time-varying temperature model coupled to refrigeration power. Building on these models, this paper proposes a two-stage, multi-time scale optimization scheduling framework that jointly coordinates computing workload time-shifting in day-ahead scheduling and refrigeration power control in intra-day dispatch to mitigate renewable variability. A case study demonstrates that the framework effectively enhances renewable-energy utilization, improves the operational economy of the data center microgrid, and mitigates the impact of renewable power uncertainty. The results highlight the potential of coordinated computing workload and thermal system flexibility to support greener, more cost-effective data center operation.
Keywords: data center, renewable energy, load shift, multi-time scale optimization
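The day-ahead stage shifts delay-tolerant workloads toward hours with renewable surplus. A greedy toy version of that idea can be sketched in Python; this is a simplified stand-in for the paper's optimization model (which distinguishes three workload types and handles interruption), and all numbers are hypothetical:

```python
def shift_workloads(renewable, base_load, jobs):
    """Greedy day-ahead placement: assign each delay-tolerant job (a power
    demand in kW) to the hour with the largest remaining renewable surplus.
    renewable and base_load are per-hour kW profiles of equal length."""
    surplus = [r - b for r, b in zip(renewable, base_load)]
    schedule = {h: [] for h in range(len(surplus))}
    for job in sorted(jobs, reverse=True):  # place the biggest jobs first
        h = max(range(len(surplus)), key=lambda i: surplus[i])
        schedule[h].append(job)
        surplus[h] -= job                   # consume that hour's surplus
    return schedule
```

Each placement reduces the chosen hour's surplus, so later jobs spread to other hours, which is the flattening effect the paper's time-shifting models aim for.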
Graph-Based Unified Settlement Framework for Complex Electricity Markets:Data Integration and Automated Refund Clearing
17
Authors: Xiaozhe Guo, Suyan Long, Ziyu Yue, Yifan Wang, Guanting Yin, Yuyang Wang, Zhaoyuan Wu 《Energy Engineering》 2026, Issue 1, pp. 56-90 (35 pages)
The increasing complexity of China's electricity market creates substantial challenges for settlement automation, data consistency, and operational scalability. Existing provincial settlement systems are fragmented, lack a unified data structure, and depend heavily on manual intervention to process high-frequency and retroactive transactions. To address these limitations, a graph-based unified settlement framework is proposed to enhance automation, flexibility, and adaptability in electricity market settlements. A flexible attribute-graph model is employed to represent heterogeneous multi-market data, enabling standardized integration, rapid querying, and seamless adaptation to evolving business requirements. An extensible operator library is designed to support configurable settlement rules, and a suite of modular tools, including dataset generation, formula configuration, billing templates, and task scheduling, facilitates end-to-end automated settlement processing. A robust refund-clearing mechanism is further incorporated, utilizing sandbox execution, data-version snapshots, dynamic lineage tracing, and real-time change-capture technologies to enable rapid and accurate recalculation under dynamic policy and data revisions. Case studies based on real-world data from regional Chinese markets validate the effectiveness of the proposed approach, demonstrating marked improvements in computational efficiency, system robustness, and automation. Moreover, enhanced settlement accuracy and high temporal granularity improve price-signal fidelity, promote cost-reflective tariffs, and incentivize energy-efficient and demand-responsive behavior among market participants. The method not only supports equitable and transparent market operations but also provides a generalizable, scalable foundation for modern electricity settlement platforms in increasingly complex and dynamic market environments.
Keywords: electricity market, market settlement, data model, graph database, market refund clearing
Impact of Data Processing Techniques on AI Models for Attack-Based Imbalanced and Encrypted Traffic within IoT Environments
18
Authors: Yeasul Kim, Chaeeun Won, Hwankuk Kim 《Computers, Materials & Continua》 2026, Issue 1, pp. 247-274 (28 pages)
With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning a range from non-encrypted to fully encrypted ones. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted-traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate class imbalance, eight data sampling techniques were applied. Their effectiveness was then comparatively analyzed using two ensemble models and three deep learning (DL) models from various perspectives. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score for encrypted traffic was approximately 0.98, about 4.3% higher than for unencrypted traffic (approximately 0.94). Analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that dataset quality and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, recall on the UNSW-NB15 (encrypted) dataset improved by up to 23.0%, and on the CICIoT-2023 (encrypted) dataset by 20.26%, a similar level of improvement. Notably, in CICIoT-2023, the F1-score and receiver operating characteristic area under the curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments, although the extent of improvement may vary with data quality, model architecture, and sampling strategy.
Keywords: encrypted traffic, attack detection, data sampling technique, AI-based detection, IoT environment
Drive-by spatial offset detection for high-speed railway bridges based on fusion analysis of multi-source data from comprehensive inspection train
19
Authors: Chuang Wang, Jiawang Zhan, Nan Zhang, Yujie Wang, Xinxiang Xu, Zhihang Wang, Zhen Ni 《Railway Engineering Science》 2026, Issue 1, pp. 128-148 (21 pages)
The spatial offset of a bridge has a significant impact on the safety, comfort, and durability of high-speed railway (HSR) operations, so it is crucial to rapidly and effectively detect spatial offset in operational HSR bridges. Drive-by monitoring of bridge uneven settlement demonstrates significant potential due to its practicality, cost-effectiveness, and efficiency. However, existing drive-by methods for detecting bridge offset have limitations such as reliance on a single data source, low detection accuracy, and the inability to identify lateral deformations of bridges. This paper proposes a novel drive-by inspection method for the spatial offset of HSR bridges based on multi-source data fusion from a comprehensive inspection train. Firstly, dung beetle optimizer-variational mode decomposition was employed to adaptively decompose non-stationary dynamic signals and explore the hidden temporal relationships in the data. Subsequently, a long short-term memory neural network was developed to fuse features of the multi-source signals and accurately predict spatial settlement of HSR bridges. A dataset of track irregularities and CRH380A high-speed train responses was generated using a 3D train-track-bridge interaction model, and the accuracy and effectiveness of the proposed hybrid deep learning model were numerically validated. Finally, the reliability of the proposed drive-by inspection method was further validated by analyzing actual measurement data from a comprehensive inspection train. The findings indicate that the proposed approach enables rapid and accurate detection of spatial offset in HSR bridges, supporting their long-term operational safety.
Keywords: high-speed railway bridge, drive-by inspection, spatial offset, multi-source data fusion, deep learning
Data Processing Solutions on Low Signal-to-noise Data in Loess Plateau Area: A Case Study in Ordos Basin, China
20
Authors: GAO Rongtao, CHENG Yun, TANG Ziqi, LIU Zhao 《CT理论与应用研究(中英文)》 2026, Issue 1, pp. 154-162 (9 pages)
While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging[1-3]. Persistent obstacles, such as complex 3D survey planning, low signal-to-noise ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy, have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavioral patterns of nodal seismometer-based high-density seismic acquisition in the loess plateau. Tailored advances in waveform enhancement and depth velocity modeling methodologies have been engineered. Field validations confirm that the optimized workflow delivers marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization.
Keywords: loess plateau, acquisition, low signal-to-noise ratio, data processing, depth modeling