Journal Articles
415,165 articles found
1. Design of Virtual Driving Test Environment for Collecting and Validating Bad Weather SiLS Data Based on Multi-Source Images Using DCU with V2X-Car Edge Cloud
Authors: Sun Park, JongWon Kim. Computers, Materials & Continua, 2026, Issue 3, pp. 448-467 (20 pages)
In real-world autonomous driving tests, unexpected events such as pedestrians or wild animals suddenly entering the driving path can occur. Conducting actual test drives under various weather conditions may also lead to dangerous situations. Furthermore, autonomous vehicles may operate abnormally in bad weather due to limitations of their sensors and GPS. Driving simulators, which replicate driving conditions nearly identical to those in the real world, can drastically reduce the time and cost required for market entry validation; consequently, they have become widely used. In this paper, we design a virtual driving test environment capable of collecting and verifying SiLS data under adverse weather conditions using multi-source images. The proposed method generates a virtual testing environment that incorporates various events, including weather, time of day, and moving objects, that cannot be easily verified in real-world autonomous driving tests. By setting up scenario-based virtual environment events, multi-source image analysis and verification using real-world DCUs (Data Concentrator Units) with V2X-Car edge cloud can effectively address risk factors that may arise in real-world situations. We tested and validated the proposed method with scenarios employing V2X communication and multi-source image analysis.
Keywords: Virtual driving test, DCU, bad weather, SiLS, autonomous environment, V2X-Car edge cloud
2. Design of a Private Cloud Platform for Distributed Logging Big Data Based on a Unified Learning Model of Physics and Data (Cited: 1)
Authors: Cheng Xi, Fu Haicheng, Tursyngazy Mahabbat. Applied Geophysics, 2025, Issue 2, pp. 499-510, 560 (13 pages)
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed and mined. The development of cloud computing technology provides a rare development opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges in the face of new evaluation objects. Research on integrating distributed storage, processing and learning functions for logging big data in a private cloud has not been carried out yet. This study establishes a distributed logging big-data private cloud platform centered on a unified learning model, which achieves the distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via the unified logging learning model integrating physical simulation and data models in a large-scale functional space, thus resolving the geo-engineering evaluation problem of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundations of the unified learning model, cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, data security, etc. are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery method, data, software and results sharing, accuracy, speed and complexity.
Keywords: Unified logging learning model, logging big data, private cloud, machine learning
3. Multi-sensor missile-borne LiDAR point cloud data augmentation based on Monte Carlo distortion simulation (Cited: 1)
Authors: Luda Zhao, Yihua Hu, Fei Han, Zhenglei Dou, Shanshan Li, Yan Zhang, Qilong Wu. CAAI Transactions on Intelligence Technology, 2025, Issue 1, pp. 300-316 (17 pages)
Large-scale point cloud datasets form the basis for training various deep learning networks and achieving high-quality network processing tasks. Due to the diversity and robustness constraints of the data, data augmentation (DA) methods are utilised to expand dataset diversity and scale. However, due to the complex and distinct characteristics of LiDAR point cloud data from different platforms (such as missile-borne and vehicular LiDAR data), directly applying traditional 2D visual domain DA methods to 3D data can lead to networks trained in this way failing to robustly achieve the corresponding tasks. To address this issue, the present study explores DA for missile-borne LiDAR point clouds using a Monte Carlo (MC) simulation method that closely resembles practical application. Firstly, a model of the multi-sensor imaging system is established, taking into account the joint errors arising from the platform itself and the relative motion during the imaging process. A distortion simulation method based on MC simulation for augmenting missile-borne LiDAR point cloud data is proposed, underpinned by an analysis of combined errors between different modal sensors, achieving high-quality augmentation of point cloud data. The effectiveness of the proposed method in addressing imaging system errors and distortion simulation is validated using the imaging scene dataset constructed in this paper. Comparative experiments between the proposed point cloud DA algorithm and the current state-of-the-art algorithms in point cloud detection and single object tracking tasks demonstrate that the proposed method can improve the performance of networks trained on unaugmented datasets by over 17.3% and 17.9%, respectively, surpassing the SOTA performance of current point cloud DA algorithms.
Keywords: data augmentation, LiDAR, missile-borne imaging, Monte Carlo simulation, point cloud
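The pose-perturbation idea in this abstract (sampling platform and motion errors, then applying them as rigid distortions of the cloud) can be sketched generically. The error magnitudes below are illustrative placeholders, not the paper's calibrated multi-sensor error model:

```python
import numpy as np

def mc_distort(points, n_samples=4, sigma_rot=0.01, sigma_trans=0.05, seed=0):
    """Monte Carlo pose-distortion augmentation: draw small random rotations
    (radians) and translations (metres) and apply each rigid perturbation to
    the whole cloud. Magnitudes here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_samples):
        ax, ay, az = rng.normal(0.0, sigma_rot, 3)  # small Euler angles
        Rx = np.array([[1, 0, 0], [0, np.cos(ax), -np.sin(ax)], [0, np.sin(ax), np.cos(ax)]])
        Ry = np.array([[np.cos(ay), 0, np.sin(ay)], [0, 1, 0], [-np.sin(ay), 0, np.cos(ay)]])
        Rz = np.array([[np.cos(az), -np.sin(az), 0], [np.sin(az), np.cos(az), 0], [0, 0, 1]])
        t = rng.normal(0.0, sigma_trans, 3)
        out.append(points @ (Rz @ Ry @ Rx).T + t)  # rigidly distorted copy
    return out

cloud = np.random.default_rng(1).uniform(-1, 1, (100, 3))
augmented = mc_distort(cloud)
```

Because each sample is a rigid transform, inter-point distances are preserved exactly, so labels attached to the geometry (e.g. bounding boxes, after the same transform) remain valid.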
4. A New Encryption Mechanism Supporting the Update of Encrypted Data for Secure and Efficient Collaboration in the Cloud Environment
Authors: Chanhyeong Cho, Byeori Kim, Haehyun Cho, Taek-Young Youn. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, Issue 1, pp. 813-834 (22 pages)
With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The encryption method proposed in this paper addresses the inefficiencies of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. By utilizing this information, the method accurately identifies the modified portions and applies algorithms to selectively re-encrypt only those sections. This approach significantly enhances the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate this approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that this method provides robust protection against both passive and active attacks.
Keywords: cloud collaboration, mode of operation, data update, efficiency
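The core idea of re-encrypting only modified units can be illustrated with a counter-style construction, where each block's keystream depends only on the key and the block index, so one block can be replaced without touching the others. This is a toy sketch using a hash-based keystream, not the paper's bundle scheme and not real AES-CTR; note that reusing a counter's keystream for an updated block leaks the XOR of old and new plaintext, which real schemes avoid by refreshing nonces per update:

```python
import hashlib

BLOCK = 32  # bytes per block (sha256 digest size)

def keystream(key: bytes, index: int) -> bytes:
    # Toy CTR-style keystream: one hash output per block counter.
    # Illustrative only; a real system would use AES-CTR with fresh nonces.
    return hashlib.sha256(key + index.to_bytes(8, "big")).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(key: bytes, plaintext: bytes) -> list:
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    return [xor(b, keystream(key, i)) for i, b in enumerate(blocks)]

def update_block(key: bytes, ct: list, index: int, new_plain: bytes) -> None:
    # Only the modified block is re-encrypted; other blocks are untouched.
    ct[index] = xor(new_plain, keystream(key, index))

def decrypt(key: bytes, ct: list) -> bytes:
    return b"".join(xor(b, keystream(key, i)) for i, b in enumerate(ct))

key = b"demo-key"
ct = encrypt(key, b"A" * 32 + b"B" * 32 + b"C" * 32)
update_block(key, ct, 1, b"X" * 32)   # O(1) update, independent of total size
```

The update cost is constant per modified block, which is the efficiency property the abstract contrasts against whole-dataset re-encryption.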
5. Research on Airborne Point Cloud Data Registration Using Urban Buildings as an Example
Authors: Yajun Fan, Yujun Shi, Chengjie Su, Kai Wang. Journal of World Architecture, 2025, Issue 4, pp. 35-42 (8 pages)
Airborne LiDAR (Light Detection and Ranging) is an evolving high-tech active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, this technology can further enrich and extract spatial geographic information. In practice, however, due to the limited operating range of airborne LiDAR and the large area of the task, it is necessary to perform registration and stitching on point clouds of adjacent flight strips. By eliminating gross errors, the systematic errors in the data can be effectively reduced. Thus, this paper conducts research on point cloud registration methods in urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. Meanwhile, an improved post-ICP (Iterative Closest Point) point cloud registration method is proposed in this study to achieve accurate registration and efficient stitching of point clouds, which can provide potential technical support for applications in related fields.
Keywords: Airborne LiDAR, Point cloud registration, Point cloud data processing, Systematic error
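For readers unfamiliar with the ICP baseline the paper improves on, a minimal point-to-point ICP can be sketched as follows: match each source point to its nearest destination point, solve the least-squares rigid transform (Kabsch/SVD), apply it, and repeat. This is the generic algorithm, not the paper's improved post-ICP variant:

```python
import numpy as np

def best_rigid(A, B):
    """Least-squares rotation R and translation t mapping A onto B (Kabsch)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

def icp(src, dst, iters=30):
    """Brute-force point-to-point ICP (O(n*m) matching; fine for a sketch)."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        idx = np.argmin(((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        R, t = best_rigid(cur, dst[idx])        # fit transform to matches
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, cur

# recover a small known offset between two copies of one "flight strip"
rng = np.random.default_rng(0)
pts = np.asarray(rng.uniform(-1, 1, (200, 3)))
th = 0.05
Rz = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
moved = pts @ Rz.T + np.array([0.05, -0.03, 0.02])
R, t, aligned = icp(pts, moved)
```

Real strip-adjustment pipelines replace the brute-force matching with a k-d tree and add outlier rejection; the structure of the loop is the same.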
6. Swarm learning anomaly detection framework for cloud data center using multi-channel BiWGAN-GTN and CEEMDAN
Authors: Lun Tang, Yuchen Zhao, Chengcheng Xue, Zhiwei Jiang, Wei Zou, Yanping Liang, Qianbin Chen. Digital Communications and Networks, 2025, Issue 6, pp. 1883-1896 (14 pages)
Anomaly detection is an important task for maintaining the performance of cloud data centers. Traditional anomaly detection primarily examines individual Virtual Machine (VM) behavior, neglecting the impact of interactions among multiple VMs on Key Performance Indicator (KPI) data, e.g., memory utilization. Furthermore, the non-stationarity, high complexity, and uncertain periodicity of KPI data in VMs also bring difficulties to deep learning-based anomaly detection tasks. To address these challenges, this paper proposes MCBiWGAN-GTN, a multi-channel semi-supervised time series anomaly detection algorithm based on the Bidirectional Wasserstein Generative Adversarial Network with Graph-Time Network (BiWGAN-GTN) and the Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN). (a) The BiWGAN-GTN algorithm is proposed to extract spatiotemporal information from data. (b) The loss function of BiWGAN-GTN is redesigned to solve the abnormal data intrusion problem during the training process. (c) MCBiWGAN-GTN is designed to reduce data complexity through CEEMDAN for time series decomposition and utilizes BiWGAN-GTN to train different components. (d) To adapt the proposed algorithm for the entire cloud data center, a cloud data center anomaly detection framework based on Swarm Learning (SL) is designed. The evaluation results on a real-world cloud data center dataset show that MCBiWGAN-GTN outperforms the baseline, with an F1-score of 0.96, an accuracy of 0.935, a precision of 0.954, a recall of 0.967, and an FPR of 0.203. The experiments also verify the stability of MCBiWGAN-GTN, the impact of parameter configurations, and the effectiveness of the proposed SL framework.
Keywords: Cloud data center, Anomaly detection, BiWGAN-GTN, Time series decomposition, Swarm learning
7. Development of a Private Cloud Platform for Distributed Logging Big Data and Its Application to Geo-Engineering Evaluation of Geothermal Fields
Authors: Cheng Xi, Fu Hai-cheng, He Jun. Applied Geophysics, 2025, Issue 4, pp. 1205-1219, 1497 (16 pages)
The development of machine learning and deep learning algorithms, as well as the improvement of hardware computing power, provides a rare opportunity for a logging big data private cloud. With the deepening of exploration and development and the requirements of low-carbon development, the focus of the oil and gas industry is gradually shifting to the exploration and development of renewable energy sources such as deep sea, deep earth and geothermal energy. The traditional petrophysical evaluation and interpretation model has encountered great challenges in the face of new evaluation objects. This study establishes a distributed logging big data private cloud platform with a unified learning model as the key, which realizes the distributed storage and processing of logging big data and enables the learning of brand-new knowledge patterns from multi-attribute data in the large function space of the unified logging learning model integrating expert knowledge and the data model, so as to solve the problem of geo-engineering evaluation of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundations of the unified learning model, cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, data security, etc. are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. New knowledge of geothermal evaluation is found in a large function space and applied to geo-engineering evaluation of geothermal fields. The examples show its good application in the selection of logging series in geothermal fields, quality control of logging data, identification of complex lithology in geothermal fields, evaluation of reservoir fluids, checking of associated helium, evaluation of cementing quality, evaluation of well-side fractures, and evaluation of geothermal water recharge under the remote logging module of the cloud platform. The first and second cementing surfaces of cemented wells in geothermal fields were evaluated, as well as the development of well-side distal fractures and their extension orientation. Because well-side fracture communication forms a good fluid pathway, and the thermal-storage fissure system has a large flow rate and long flow diameter, the results are conducive to the design of the geothermal water recharge program.
Keywords: logging big data, private cloud, machine learning, remote operation, geo-engineering evaluation of geothermal fields, geothermal water recharge
8. Construction of an Intelligent Early Warning System for a Cloud-Based Laboratory Data Platform under the Medical Consortium Model
Authors: Jibiao Zhou. Journal of Electronic Research and Application, 2025, Issue 4, pp. 268-275 (8 pages)
With the continuous advancement of the tiered diagnosis and treatment system, the medical consortium model has gained increasing attention as an important approach to promoting the vertical integration of healthcare resources. Within this context, laboratory data, as a key component of healthcare information systems, urgently requires efficient sharing and intelligent analysis. This paper designs and constructs an intelligent early warning system for laboratory data based on a cloud platform tailored to the medical consortium model. Through standardized data formats and unified access interfaces, the system enables the integration and cleaning of laboratory data across multiple healthcare institutions. By combining medical rule sets with machine learning models, the system achieves graded alerts and rapid responses to abnormal key indicators and potential outbreaks of infectious diseases. Practical deployment results demonstrate that the system significantly improves the utilization efficiency of laboratory data, strengthens public health event monitoring, and optimizes inter-institutional collaboration. The paper also discusses challenges encountered during system implementation, such as inconsistent data standards, security and compliance concerns, and model interpretability, and proposes corresponding optimization strategies. These findings provide a reference for the broader application of intelligent medical early warning systems.
Keywords: Medical consortium, Laboratory data, Cloud platform, Intelligent early warning, Data standardization, Machine learning
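The graded-alert pattern the abstract describes (hard medical rule thresholds first, with an ML anomaly score refining borderline cases) can be sketched as below. The indicator names, reference ranges and score cutoff are illustrative placeholders, not clinical values and not the paper's rule set:

```python
def grade_alert(indicator: str, value: float, anomaly_score: float) -> str:
    """Toy graded-alert logic: rule thresholds take priority; an ML anomaly
    score (0..1) escalates borderline abnormal results. All ranges below
    are made-up placeholders for illustration."""
    critical = {"potassium": (2.5, 6.5), "glucose": (2.2, 22.0)}  # hard limits
    normal   = {"potassium": (3.5, 5.3), "glucose": (3.9, 6.1)}   # reference range
    lo_c, hi_c = critical[indicator]
    lo_n, hi_n = normal[indicator]
    if value < lo_c or value > hi_c:
        return "red"                                   # critical: escalate now
    if value < lo_n or value > hi_n:
        return "orange" if anomaly_score > 0.8 else "yellow"
    return "green"
```

Keeping the hard rules ahead of the model guarantees that critical values always alert even if the learned model scores them low, which matters for safety and for model-interpretability review.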
9. Automatic identification of discontinuities and refined modeling of rock blocks from 3D point cloud data of rock surfaces (Cited: 1)
Authors: Yaopeng Ji, Shengyuan Song, Jianping Chen, Jingyu Xue, Jianhua Yan, Yansong Zhang, Di Sun, Qing Wang. Journal of Rock Mechanics and Geotechnical Engineering, 2025, Issue 5, pp. 3093-3106 (14 pages)
The spatial distribution of discontinuities and the size of rock blocks are key indicators for rock mass quality evaluation and rockfall risk assessment. Traditional manual measurement is often dangerous or unreachable at some high-steep rock slopes. In contrast, unmanned aerial vehicle (UAV) photogrammetry is not limited by terrain conditions and can efficiently collect high-precision three-dimensional (3D) point clouds of rock masses through all-round and multi-angle photography for rock mass characterization. In this paper, a new method based on a 3D point cloud is proposed for discontinuity identification and refined rock block modeling. The method consists of four steps: (1) establish a point cloud spatial topology, and calculate the point cloud normal vectors and average point spacing based on several machine learning algorithms; (2) extract discontinuities using the density-based spatial clustering of applications with noise (DBSCAN) algorithm and fit the discontinuity plane by combining principal component analysis (PCA) with the natural breaks (NB) method; (3) propose a method of inserting points in the line segment to generate an embedded discontinuity point cloud; and (4) adopt a Poisson reconstruction method for refined rock block modeling. The proposed method was applied to an outcrop of an ultra-high steep rock slope and compared with the results of previous studies and manual surveys. The results show that the method can eliminate the influence of discontinuity undulations on the orientation measurement and describe the local concave-convex characteristics in the modeling of rock blocks. The calculation results are accurate and reliable, which can meet the practical requirements of engineering.
Keywords: Three-dimensional (3D) point cloud, Rock mass, Automatic identification, Refined modeling, Unmanned aerial vehicle (UAV)
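Step (2)'s PCA plane fit can be illustrated in a few lines: for a clustered discontinuity patch, the eigenvector of the point covariance matrix with the smallest eigenvalue is the plane normal, from which the dip angle follows. This is the standard PCA fit only, without the paper's DBSCAN clustering or natural-breaks refinement:

```python
import numpy as np

def fit_plane_pca(points):
    """Fit a plane to a 3D point cluster via PCA: the covariance eigenvector
    with the smallest eigenvalue is the plane normal. Returns the upward
    unit normal and the dip angle in degrees."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues ascending
    normal = eigvecs[:, 0]                      # smallest-variance direction
    if normal[2] < 0:                           # orient upward for dip reading
        normal = -normal
    dip = np.degrees(np.arccos(normal[2]))      # angle of plane from horizontal
    return normal, dip

# synthetic patch on the plane z = 0.5 * x (dip = arctan(0.5))
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, (50, 2))
patch = np.column_stack([xy[:, 0], xy[:, 1], 0.5 * xy[:, 0]])
normal, dip = fit_plane_pca(patch)
```

Fitting the centroid-referenced covariance rather than individual triangles is what makes the estimate insensitive to small-scale surface undulation, the robustness property the abstract highlights.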
10. Adaptive Attribute-Based Honey Encryption: A Novel Solution for Cloud Data Security
Authors: Reshma Siyal, Muhammad Asim, Long Jun, Mohammed Elaffendi, Sundas Iftikhar, Rana Alnashwan, Samia Allaoua Chelloug. Computers, Materials & Continua, 2025, Issue 2, pp. 2637-2664 (28 pages)
A basic procedure for transforming readable data into encoded forms is encryption, which ensures security when the right decryption keys are used. Hadoop is susceptible to possible cyber-attacks because it lacks built-in security measures, even though it can effectively handle and store enormous datasets using the Hadoop Distributed File System (HDFS). The increasing number of data breaches emphasizes how urgently creative encryption techniques are needed in cloud-based big data settings. This paper presents Adaptive Attribute-Based Honey Encryption (AABHE), a state-of-the-art technique that combines honey encryption with Ciphertext-Policy Attribute-Based Encryption (CP-ABE) to provide improved data security. Even if intercepted, AABHE makes sure that sensitive data cannot be accessed by unauthorized parties. With a focus on protecting huge files in HDFS, the suggested approach achieves 98% security robustness and 95% encryption efficiency, outperforming other encryption methods including Ciphertext-Policy Attribute-Based Encryption (CP-ABE), Key-Policy Attribute-Based Encryption (KP-ABE), and Advanced Encryption Standard combined with Attribute-Based Encryption (AES+ABE). By fixing Hadoop's security flaws, AABHE fortifies its protections against data breaches and enhances Hadoop's dependability as a platform for processing and storing massive amounts of data.
Keywords: Cybersecurity, data security, cloud storage, Hadoop, encryption and decryption, privacy protection, attribute-based honey encryption
11. Enhancing Healthcare Data Privacy in Cloud IoT Networks Using Anomaly Detection and Optimization with Explainable AI (ExAI)
Authors: Jitendra Kumar Samriya, Virendra Singh, Gourav Bathla, Meena Malik, Varsha Arya, Wadee Alhalabi, Brij B. Gupta. Computers, Materials & Continua, 2025, Issue 8, pp. 3893-3910 (18 pages)
The integration of the Internet of Things (IoT) into healthcare systems improves patient care, boosts operational efficiency, and contributes to cost-effective healthcare delivery. However, overcoming several associated challenges, such as data security, interoperability, and ethical concerns, is crucial to realizing the full potential of IoT in healthcare. Real-time anomaly detection plays a key role in protecting patient data and maintaining device integrity amidst the additional security risks posed by interconnected systems. In this context, this paper presents a novel method for healthcare data privacy analysis. The technique is based on the identification of anomalies in cloud-based IoT networks, and it is optimized using explainable artificial intelligence. For anomaly detection, the Radial Boltzmann Gaussian Temporal Fuzzy Network (RBGTFN) is used in the privacy analysis of healthcare data. Remora Colony Swarm Optimization is then used to optimize the network. The model's performance in identifying anomalies across a variety of healthcare data is evaluated in an experimental study measuring accuracy, precision, latency, Quality of Service (QoS), and scalability. The proposed model achieved a remarkable 95% precision, 93% latency, 89% quality of service, 98% detection accuracy, and 96% scalability.
Keywords: Healthcare data privacy analysis, anomaly detection, cloud IoT network, explainable artificial intelligence, temporal fuzzy network
12. Dynamic Multi-Objective Gannet Optimization (DMGO): An Adaptive Algorithm for Efficient Data Replication in Cloud Systems
Authors: P. William, Ved Prakash Mishra, Osamah Ibrahim Khalaf, Arvind Mukundan, Yogeesh N, Riya Karmakar. Computers, Materials & Continua, 2025, Issue 9, pp. 5133-5156 (24 pages)
Cloud computing has become an essential technology for the management and processing of large datasets, offering scalability, high availability, and fault tolerance. However, optimizing data replication across multiple data centers poses a significant challenge, especially when balancing opposing goals such as latency, storage costs, energy consumption, and network efficiency. This study introduces a novel dynamic optimization algorithm called Dynamic Multi-Objective Gannet Optimization (DMGO), designed to enhance data replication efficiency in cloud environments. Unlike traditional static replication systems, DMGO adapts dynamically to variations in network conditions, system demand, and resource availability. The approach utilizes multi-objective optimization techniques to efficiently balance data access latency, storage efficiency, and operational costs. DMGO continuously evaluates data center performance and adjusts replication algorithms in real time to guarantee optimal system efficiency. Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms, achieving faster data access, lower storage overhead, reduced energy consumption, and improved scalability. The proposed methodology offers a robust and adaptable solution for modern cloud systems, ensuring efficient resource consumption while maintaining high performance.
Keywords: cloud computing, data replication, dynamic optimization, multi-objective optimization, gannet optimization algorithm, adaptive algorithms, resource efficiency, scalability, latency reduction, energy-efficient computing
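The multi-objective balancing the abstract describes rests on Pareto dominance: a replication plan is worth keeping only if no other plan is at least as good on every objective and strictly better on one. The sketch below shows that filtering step on hypothetical (latency, storage, energy) triples; it is generic multi-objective bookkeeping, not the DMGO algorithm itself:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective (all minimised)
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the non-dominated candidate replication plans."""
    return [c for c in candidates
            if not any(dominates(o["cost"], c["cost"])
                       for o in candidates if o is not c)]

# hypothetical plans scored as (latency ms, storage GB, energy kWh)
plans = [
    {"plan": "3 replicas, 1 region",  "cost": (40.0, 300.0, 1.2)},
    {"plan": "3 replicas, 2 regions", "cost": (25.0, 300.0, 1.5)},
    {"plan": "5 replicas, 3 regions", "cost": (15.0, 500.0, 2.4)},
    {"plan": "2 replicas, 1 region",  "cost": (45.0, 320.0, 1.3)},  # dominated
]
front = pareto_front(plans)
```

A dynamic optimizer in the DMGO style would re-score the candidates as network conditions change and re-run this filter, then pick from the front according to current priorities.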
13. Accuracy assessment of cloud removal methods for Moderate-resolution Imaging Spectroradiometer (MODIS) snow data in the Tianshan Mountains, China
Authors: WANG Qingxue, MA Yonggang, XU Zhonglin, LI Junli. Journal of Arid Land, 2025, Issue 4, pp. 457-480 (24 pages)
Snow cover plays a critical role in global climate regulation and hydrological processes. Accurate monitoring is essential for understanding snow distribution patterns, managing water resources, and assessing the impacts of climate change. Remote sensing has become a vital tool for snow monitoring, with the widely used Moderate-resolution Imaging Spectroradiometer (MODIS) snow products from the Terra and Aqua satellites. However, cloud cover often interferes with snow detection, making cloud removal techniques crucial for reliable snow product generation. This study evaluated the accuracy of four MODIS snow cover datasets generated through different cloud removal algorithms. Using real-time field camera observations from four stations in the Tianshan Mountains, China, this study assessed the performance of these datasets during three distinct snow periods: the snow accumulation period (September-November), the snowmelt period (March-June), and the stable snow period (December-February of the following year). The findings showed that cloud-free snow products generated using the Hidden Markov Random Field (HMRF) algorithm consistently outperformed the others, particularly under cloud cover, while cloud-free snow products using near-day synthesis and the spatiotemporal adaptive fusion method with error correction (STAR) demonstrated varying performance depending on terrain complexity and cloud conditions. This study highlighted the importance of considering terrain features, land cover types, and snow dynamics when selecting cloud removal methods, particularly in areas with rapid snow accumulation and melting. The results suggested that future research should focus on improving cloud removal algorithms through the integration of machine learning, multi-source data fusion, and advanced remote sensing technologies. By expanding validation efforts and refining cloud removal strategies, more accurate and reliable snow products can be developed, contributing to enhanced snow monitoring and better management of water resources in alpine and arid areas.
Keywords: real-time camera, cloud removal algorithm, snow cover, Moderate-resolution Imaging Spectroradiometer (MODIS) snow data, snow monitoring
14. An Efficient and Secure Data Audit Scheme for Cloud-Based EHRs with Recoverable and Batch Auditing
Authors: Yuanhang Zhang, Xu An Wang, Weiwei Jiang, Mingyu Zhou, Xiaoxuan Xu, Hao Liu. Computers, Materials & Continua, 2025, Issue 4, pp. 1533-1553 (21 pages)
Cloud storage, a core component of cloud computing, plays a vital role in the storage and management of data. Electronic Health Records (EHRs), which document users' health information, are typically stored on cloud servers. However, users' sensitive data would then become unregulated. In the event of data loss, cloud storage providers might conceal the fact that data has been compromised to protect their reputation and mitigate losses. Ensuring the integrity of data stored in the cloud remains a pressing issue that urgently needs to be addressed. In this paper, we propose a data auditing scheme for cloud-based EHRs that incorporates recoverability and batch auditing, alongside a thorough security and performance evaluation. Our scheme builds upon the indistinguishability-based privacy-preserving auditing approach proposed by Zhou et al. We identify that this scheme is insecure and vulnerable to forgery attacks on data storage proofs. To address these vulnerabilities, we enhanced the auditing process using masking techniques and designed new algorithms to strengthen security. We also provide formal proof of the security of the signature algorithm and the auditing scheme. Furthermore, our results show that our scheme effectively protects user privacy and is resilient against malicious attacks. Experimental results indicate that our scheme is not only secure and efficient but also supports batch auditing of cloud data. Specifically, when auditing 10,000 users, batch auditing reduces computational overhead by 101 s compared to normal auditing.
Keywords: Security, cloud computing, cloud storage, recoverable, batch auditing
15. Energy Efficient VM Selection Using CSOA-VM Model in Cloud Data Centers
Authors: Mandeep Singh Devgan, Tajinder Kumar, Purushottam Sharma, Xiaochun Cheng, Shashi Bhushan, Vishal Garg. CAAI Transactions on Intelligence Technology, 2025, Issue 4, pp. 1217-1234 (18 pages)
Cloud data centres have evolved with an energy management problem due to the constant increase in size, complexity and enormous consumption of energy. Energy management is a challenging issue that is critical in cloud data centres and an important research concern for many researchers. In this paper, we propose a cuckoo search (CS)-based optimisation technique for virtual machine (VM) selection and a novel placement algorithm considering different constraints. An energy consumption model and a simulation model have been implemented for the efficient selection of VMs. The proposed CSOA-VM model not only lessens violations at the service level agreement (SLA) level but also minimises VM migrations. The proposed model also saves energy; the performance analysis shows that the energy consumption obtained is 1.35 kWh, the SLA violation is 9.2 and the number of VM migrations is about 268. Thus, there is an improvement in energy consumption of about 1.8% and a 2.1% improvement (reduction) in SLA violations in comparison to existing techniques.
Keywords: cloud computing, cloud datacenter, energy consumption, VM selection
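The cuckoo search metaheuristic named in this abstract can be sketched as follows: candidate solutions ("nests") are perturbed by Lévy flights around the current best, and a fraction of the worst nests is abandoned each generation. This is a bare-bones generic CS skeleton under invented parameters; the cost function (distance of host utilisations from a 70% target) is a stand-in, not the paper's CSOA-VM energy model.

```python
import math
import random

def levy_step(beta=1.5):
    # Mantegna's algorithm for a heavy-tailed Levy step length.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(cost, dim, n=15, pa=0.25, iters=200, lo=0.0, hi=1.0):
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(nests, key=cost)
    for _ in range(iters):
        for k in range(n):
            # New candidate via a Levy flight around the current best.
            cand = [min(hi, max(lo, x + 0.01 * levy_step() * (x - b)))
                    for x, b in zip(nests[k], best)]
            j = random.randrange(n)
            if cost(cand) < cost(nests[j]):
                nests[j] = cand
        # Abandon a fraction pa of the worst nests (host eggs discovered).
        nests.sort(key=cost)
        for k in range(int(n * (1 - pa)), n):
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
        best = min(nests + [best], key=cost)
    return best, cost(best)

random.seed(42)
# Toy "energy" cost: squared distance of two host utilisations from 0.7.
best, best_cost = cuckoo_search(lambda x: sum((u - 0.7) ** 2 for u in x), dim=2)
```

In a VM-selection setting the cost function would instead encode the energy consumption model and SLA-violation penalties that the paper describes.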
GF-4 high-resolution texture and FY-4A multispectral data fusion: Two case studies for enhancing early convective cloud detection
16
Authors: Yang Gao, Xin Wang, Jun Yang. 《Atmospheric and Oceanic Science Letters》, 2025, No. 4, pp. 21-26 (6 pages)
Early detection of convective clouds is vital for minimizing hazardous impacts. Forecasting convective initiation (CI) using current multispectral geostationary meteorological satellites is often challenged by high false-alarm rates and missed detections caused by limited resolution. In contrast, high-resolution earth observation satellites offer more detailed texture information, improving early detection capabilities. The authors propose a novel methodology that integrates the advanced features of China’s latest-generation satellites, Gaofen-4 (GF-4) and Fengyun-4A (FY-4A). This fusion method retains GF-4’s high-resolution details and FY-4A’s multispectral information. Two cases from different observational scenarios and weather conditions under GF-4’s staring mode were carried out to compare CI forecast results based on fused data with those based solely on FY-4A data. The fused data demonstrated superior performance in detecting smaller-scale convective clouds, enabling earlier forecasting with a lead time of 15–30 minutes, and more accurate location identification. Integrating high-resolution earth observation satellites into early convective cloud detection provides valuable insights for forecasters and decision-makers, particularly given the current resolution limitations of geostationary meteorological satellites.
Keywords: GaoFen-4, Fengyun-4A, fusion, texture, convective cloud
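The idea of retaining high-resolution texture while keeping multispectral radiometry can be sketched with a classic high-pass-filter (HPF) fusion: upsample the coarse multispectral band to the fine grid, then inject the detail (original minus blur) of the high-resolution band. This is a generic textbook technique, not the paper's actual GF-4/FY-4A fusion method, and the tiny arrays below are invented.

```python
def box_blur(img, k=1):
    # (2k+1) x (2k+1) mean filter with edge clipping.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[y2][x2]
                    for y2 in range(max(0, y - k), min(h, y + k + 1))
                    for x2 in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def upsample(img, factor):
    # Nearest-neighbour upsampling of the coarse band to the fine grid.
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def hpf_fuse(hires, lowres_ms, factor, gain=1.0):
    # Fused pixel = upsampled multispectral value + high-frequency detail
    # extracted from the high-resolution texture band.
    ms_up = upsample(lowres_ms, factor)
    blur = box_blur(hires)
    return [[ms_up[y][x] + gain * (hires[y][x] - blur[y][x])
             for x in range(len(hires[0]))]
            for y in range(len(hires))]

hires = [[0, 0, 9, 9],   # fine-texture band with a sharp vertical edge
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
lowres = [[2, 8],        # coarse multispectral band at half resolution
          [2, 8]]
fused = hpf_fuse(hires, lowres, factor=2)
```

Flat regions keep the multispectral value unchanged, while the edge column inherits the fine-scale gradient, which is exactly the property that helps pick out small convective cells.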
Observational Study of Cloud Microphysical Characteristics of Tropical Cyclones in Different Environmental Fields Using Multi-Source Satellite Data
17
Authors: WANG Rui, DUAN Yi-hong, FENG Jia-ning. 《Journal of Tropical Meteorology》, 2025, No. 2, pp. 151-164 (14 pages)
In this study, a variety of high-resolution satellite data were used to analyze the similarities and differences in the horizontal and vertical cloud microphysical characteristics of 11 tropical cyclones (TCs) in three different ocean basins. The results show that for the 11 TCs, regardless of the season in which they were generated, when they reached or approached Category 4 their melting layers were all vertically distributed at a height of about 5 km. The peak ice water contents in the vertical direction of the 11 TCs all reached or approached about 2000 g cm^(–3). The total attenuated scattering coefficient at 532 nm, TAB-532, can successfully characterize the distribution of areas with high ice water content when its vertical distribution is concentrated near 0.1 km^(–1) sr^(–1), possibly because the diameter distribution of the corresponding range of aerosol particles had a more favorable effect on the formation of ice nuclei, indicating that aerosols had a significant impact on ice-phase processes and characteristics. Moreover, analysis of the horizontal distributions of cloud water path (CWP) and ice water path (IWP) shows that when the sea surface temperature was relatively high and the vertical wind shear was relatively small, the CWP and IWP could reach relatively high values, which also demonstrates the importance of environmental field factors in influencing TC cloud microphysical characteristics.
Keywords: tropical cyclone, cloud microphysical characteristics, aerosol, SST, vertical wind shear
An Improved Blockchain-Based Cloud Auditing Scheme Using Dynamic Aggregate Signatures
18
Authors: Haibo Lei, Xu An Wang, Wenhao Liu, Lingling Wu, Chao Zhang, Weiwei Jiang, Xiao Zou. 《Computers, Materials & Continua》, 2026, No. 2, pp. 1599-1629 (31 pages)
With the rapid expansion of the Internet of Things (IoT), user data has experienced exponential growth, leading to increasing concerns about the security and integrity of data stored in the cloud. Traditional schemes relying on untrusted third-party auditors suffer from both security and efficiency issues, while existing decentralized blockchain-based auditing solutions still face shortcomings in correctness and security. This paper proposes an improved blockchain-based cloud auditing scheme with the following core contributions: identifying critical logical contradictions in the original scheme, thereby establishing the foundation for the correctness of cloud auditing; designing an enhanced mechanism that integrates multiple hashing with dynamic aggregate signatures, binding encrypted blocks through bilinear pairings and BLS signatures, and improving the scheme by setting parameters based on the Computational Diffie-Hellman (CDH) problem, significantly strengthening data integrity protection and anti-forgery capabilities; and introducing a random challenge mechanism and a dynamic parameter adjustment strategy, effectively resisting attacks such as forgery, tampering, and deletion, significantly improving the detection probability of malicious Cloud Service Providers (CSPs), and significantly reducing the proof generation overhead for CSPs while maintaining the same computational cost for Data Owners. Theoretical analysis and performance evaluation experiments demonstrate that the proposed scheme achieves significant improvements in both security and efficiency. Finally, the paper explores potential applications of the enhanced security scheme in fields such as healthcare, drone swarms, and government office attendance systems, providing an effective approach for building secure, efficient, and decentralized cloud auditing systems.
Keywords: cloud auditing, cloud storage, blockchain, data integrity, BLS signatures
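The "detection probability of malicious CSPs" that the random challenge mechanism improves follows a standard hypergeometric sampling argument from the provable-data-possession literature: a random challenge of c blocks misses t corrupted blocks among n with probability C(n-t, c)/C(n, c). The computation below uses illustrative figures, not numbers from the paper.

```python
from math import comb

def detection_probability(n_blocks: int, n_tampered: int, n_challenged: int) -> float:
    # Probability that a uniformly chosen challenge of n_challenged blocks
    # hits at least one of n_tampered corrupted blocks among n_blocks.
    if n_challenged > n_blocks - n_tampered:
        return 1.0  # every possible challenge must hit a tampered block
    miss = comb(n_blocks - n_tampered, n_challenged) / comb(n_blocks, n_challenged)
    return 1.0 - miss

def blocks_needed(n_blocks: int, n_tampered: int, target: float = 0.99) -> int:
    # Smallest challenge size reaching the target detection probability.
    c = 0
    while detection_probability(n_blocks, n_tampered, c) < target:
        c += 1
    return c

# With 1% of 10,000 blocks corrupted, a few hundred challenged blocks
# already detect tampering with roughly 99% probability.
p = detection_probability(10_000, 100, 460)
c99 = blocks_needed(10_000, 100, 0.99)
```

This is why random challenges keep auditing cheap: detection confidence depends on the challenge size and corruption fraction, not on the total number of stored blocks.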
Point Cloud Method for Detecting Suspended Pipelines Using Multi-Beam Water Column Data
19
Authors: YAN Zhenyu, ZHOU Tian, ZHU Jianjun, LI Tie, DU Weidong, ZHANG Baihan. 《Journal of Ocean University of China》, 2025, No. 6, pp. 1683-1691 (9 pages)
In the task of inspecting underwater suspended pipelines, multi-beam sonar (MBS) can provide two-dimensional water column images (WCIs). However, systematic interferences (e.g., sidelobe effects) may induce misdetection in WCIs. To address this issue and improve detection accuracy, we developed a density-based clustering method for three-dimensional water column point clouds. During the processing of WCIs, sidelobe effects are mitigated using a bilateral filter and a brightness transformation. The cross-sectional point cloud of the pipeline is then extracted using the Canny operator. In the detection phase, the target is identified using density-based spatial clustering of applications with noise (DBSCAN). However, the selection of appropriate DBSCAN parameters is obscured by the uneven distribution of the water column point cloud. To overcome this, we propose an improved DBSCAN based on a parameter interval estimation method (PIE-DBSCAN). First, kernel density estimation (KDE) is used to determine the candidate interval of parameters, after which the exact cluster number is determined via density peak clustering (DPC). Finally, the optimal parameters are selected by comparing mean silhouette coefficients. To validate the performance of PIE-DBSCAN, we collected water column point clouds from an anechoic tank and the South China Sea. PIE-DBSCAN successfully detected both the target points of the suspended pipeline and non-target points on the seafloor surface. Compared to the K-Means and Mean-Shift algorithms, PIE-DBSCAN demonstrates superior clustering performance and shows feasibility in practical applications.
Keywords: multi-beam sonar, water column image, water column point cloud, density-based spatial clustering of applications with noise (DBSCAN), suspended pipeline detection
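The DBSCAN core that PIE-DBSCAN builds on can be sketched in a few dozen lines. The extra machinery of PIE-DBSCAN (KDE-based parameter intervals, DPC cluster counting, silhouette-based selection) is not reproduced here; eps/min_pts and the toy 2-D points are chosen by hand for illustration.

```python
def dbscan(points, eps, min_pts):
    # Minimal DBSCAN on 2-D points; label -1 marks noise.
    def neighbours(i):
        px, py = points[i]
        return [j for j, (qx, qy) in enumerate(points)
                if (px - qx) ** 2 + (py - qy) ** 2 <= eps ** 2]

    labels = [None] * len(points)  # None = unvisited
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisionally noise (may become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reached from a core: border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs_j = neighbours(j)
            if len(nbrs_j) >= min_pts:  # j is itself a core point: expand
                queue.extend(nbrs_j)
    return labels

# Two dense "pipeline cross-section" blobs plus one isolated noise point.
points = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1),
          (5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1),
          (10.0, 10.0)]
labels = dbscan(points, eps=0.5, min_pts=3)
```

The sensitivity of the output to eps and min_pts on unevenly distributed points is exactly what motivates the paper's automated parameter interval estimation.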
Design of a Multi-Level Component Platform for Industrial Software Based on the Spring Cloud Microservice Architecture
20
Author: Zhang Jian. 《自动化与仪器仪表》 (Automation & Instrumentation), 2026, No. 1, pp. 131-134, 139 (5 pages)
To address the problem that industrial software component platforms currently in use are prone to conflicts when many users operate them concurrently, resulting in low platform throughput, this paper designs a multi-level component platform for industrial software based on the Spring Cloud microservice architecture. Relying on Spring Cloud, the industrial software system is decomposed into multiple independent microservice modules. For each microservice module, a component selection model is built that accounts for compatibility and reusability. A particle swarm optimization algorithm is introduced to solve the model and generate the optimal component selection decision. A code and logic generation module based on entity models is then established to combine the multi-level components and complete the industrial software development. The results show that under 1000 concurrent threads the platform's throughput reached 580.1 s, far exceeding the expected requirement, demonstrating that the platform performs well in application.
Keywords: Spring Cloud microservice architecture, industrial software, component selection, compatibility, reusability, code
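The particle-swarm step of the component selection model can be sketched as a generic PSO over per-component selection weights. The (compatibility, reusability) scores and the component-count penalty below are invented stand-ins for the paper's model, and this is a textbook PSO, not the authors' exact formulation.

```python
import random

random.seed(7)

# Hypothetical per-component (compatibility, reusability) scores.
SCORES = [(0.9, 0.8), (0.7, 0.9), (0.2, 0.3), (0.8, 0.8), (0.3, 0.2)]

def cost(x):
    # x[i] in [0, 1] is the selection weight of component i: reward high
    # combined scores, penalise selecting many components.
    gain = sum(w * (c + r) for w, (c, r) in zip(x, SCORES))
    return -gain + 1.0 * sum(x)

def pso(dim, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    xs = [[random.random() for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for k in range(n):
            for d in range(dim):
                # Velocity update: inertia + cognitive + social terms.
                vs[k][d] = (w * vs[k][d]
                            + c1 * random.random() * (pbest[k][d] - xs[k][d])
                            + c2 * random.random() * (gbest[d] - xs[k][d]))
                xs[k][d] = min(1.0, max(0.0, xs[k][d] + vs[k][d]))
            if cost(xs[k]) < cost(pbest[k]):
                pbest[k] = xs[k][:]
        gbest = min(pbest + [gbest], key=cost)
    return gbest

best = pso(dim=len(SCORES))
selection = [round(v) for v in best]  # threshold weights into a 0/1 choice
```

With this cost shape the swarm is pushed toward a corner of the unit hypercube, i.e. a concrete pick-or-skip decision per component.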