The interactions between clouds and aerosols represent one of the largest uncertainties in assessing the Earth's radiation budget, highlighting the importance of research on the transition zone (TZ) within the cloud-aerosol continuum. This study assesses the global distribution of TZ conditions, analyzes their optical characteristics, and determines the cloud or aerosol types most commonly associated with them, using the cloud-aerosol discrimination (CAD) score of the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument on the CALIPSO satellite. The CAD score classifies clouds and aerosols by the probability density functions of attenuated backscatter, total color ratio, volume depolarization ratio, altitude, and latitude. After applying several filters to avoid artifacts, the TZ was identified as those atmospheric layers that cannot be clearly classified as clouds or aerosols: layers within the no-confidence range (NCR) of the CAD score, and cirrus fringes. The optical characteristics of NCR layers exhibit two main clusters: Cluster 1, with properties between high-altitude ice clouds and aerosols (e.g., wispy cloud fragments), and Cluster 2, with properties between water clouds and aerosols at lower altitudes (e.g., large hydrated aerosols). Our results highlight the ubiquity of TZ conditions, which appear in 9.5% of all profiles and comprise 6.4% of the detected layers. Cluster 1 and cirrus-fringe layers predominate near the ITCZ and in mid-latitudes, whereas Cluster 2 layers are more frequent over the oceans along the central West African and East Asian coasts, where elevated smoke and dusty marine aerosols are common.
With the rapid expansion of the Internet of Things (IoT), user data has experienced exponential growth, leading to increasing concerns about the security and integrity of data stored in the cloud. Traditional schemes relying on untrusted third-party auditors suffer from both security and efficiency issues, while existing decentralized blockchain-based auditing solutions still face shortcomings in correctness and security. This paper proposes an improved blockchain-based cloud auditing scheme with the following core contributions: identifying critical logical contradictions in the original scheme, thereby establishing the foundation for the correctness of cloud auditing; designing an enhanced mechanism that integrates multiple hashing with dynamic aggregate signatures, binding encrypted blocks through bilinear pairings and BLS signatures, and improving the scheme by setting parameters based on the Computational Diffie-Hellman (CDH) problem, significantly strengthening data integrity protection and anti-forgery capabilities; and introducing a random challenge mechanism and dynamic parameter adjustment strategy, effectively resisting attacks such as forgery, tampering, and deletion, improving the detection probability of malicious Cloud Service Providers (CSPs), and reducing the proof generation overhead for CSPs while maintaining the same computational cost for Data Owners. Theoretical analysis and performance evaluation experiments demonstrate that the proposed scheme achieves significant improvements in both security and efficiency. Finally, the paper explores potential applications of the enhanced security scheme in fields such as healthcare, drone swarms, and government office attendance systems, providing an effective approach for building secure, efficient, and decentralized cloud auditing systems.
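The random challenge mechanism described above can be pictured with a minimal, pairing-free sketch: the Data Owner keeps per-block hash tags, the auditor challenges randomly chosen block indices, and the CSP must answer with proofs that match. This is a simplified stand-in, not the paper's construction — the actual scheme binds blocks with BLS signatures and bilinear pairings under the CDH assumption; the names `DataOwner`, `CloudServer`, and `audit` are illustrative.

```python
import hashlib
import secrets

def tag(index: int, block: bytes) -> str:
    # Per-block tag binding position and content (the real scheme uses BLS signatures).
    return hashlib.sha256(index.to_bytes(4, "big") + block).hexdigest()

class DataOwner:
    def __init__(self, blocks):
        # The owner retains only small tags, not the blocks themselves.
        self.tags = [tag(i, b) for i, b in enumerate(blocks)]

class CloudServer:
    def __init__(self, blocks):
        self.blocks = list(blocks)

    def prove(self, index: int) -> str:
        # The server recomputes the tag from its stored copy of the block.
        return tag(index, self.blocks[index])

def audit(owner: DataOwner, server: CloudServer, indices=None, n_challenges=3) -> bool:
    # Challenge random block indices; any mismatch exposes tampering or deletion.
    if indices is None:
        indices = [secrets.randbelow(len(owner.tags)) for _ in range(n_challenges)]
    return all(server.prove(i) == owner.tags[i] for i in indices)
```

Challenging c of n blocks per audit catches a server that corrupted one block with probability roughly 1 − (1 − 1/n)^c, which is why randomized challenges raise the detection probability against malicious CSPs.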
Clouds play an important role in global atmospheric energy and water vapor budgets, and low cloud simulations suffer from large biases in many atmospheric general circulation models. In this study, cloud microphysical processes such as raindrop evaporation and cloud water accretion in a double-moment six-class cloud microphysics scheme were revised to enhance the simulation of low clouds using the Global-Regional Integrated Forecast System (GRIST) model. Validation of the revised scheme using a single-column version of GRIST demonstrated a reasonable reduction in liquid water biases. The revised parameterization simulated medium- and low-level cloud fractions that were in better agreement with the observations than the original scheme. Long-term global simulations indicate mitigation of the originally overestimated low-level cloud fraction and cloud-water mixing ratio in mid- to high-latitude regions, primarily owing to enhanced accretion processes and weakened raindrop evaporation. The reduced low clouds with the revised scheme showed better consistency with satellite observations, particularly at mid- and high-latitudes. Further improvements can be observed in the simulated cloud shortwave radiative forcing and vertical distribution of total cloud cover. Annual precipitation in mid-latitude regions has also improved, particularly over the oceans, with significantly increased large-scale and decreased convective precipitation.
In recent years, fog computing has become an important environment for dealing with the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes in cloud computing. With the large amount of data and user requests, achieving the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy consumption across nodes in fog computing when users execute tasks through the least-cost paths. Task scheduling is developed using a modified artificial ecosystem optimization (AEO) combined with swarm operators from the Salp Swarm Algorithm (SSA), in order to competitively optimize their capabilities during the exploitation phase of the optimal search process. The proposed strategy, the Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), attempts to find the most suitable solution to the optimization problem that combines cost and energy for multi-objective task scheduling. The knapsack problem is also incorporated to improve both cost and energy in the iFogSim implementation. A comparison was made between the proposed strategy and other strategies in terms of time, cost, energy, and productivity. Experimental results showed that the proposed strategy improved energy consumption, cost, and time over the other algorithms. Simulation results demonstrate that the proposed algorithm improves average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
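A core ingredient of any such scheduler is the objective that scores a candidate task-to-node assignment on both cost and energy. As a hedged sketch (the rates, weights, and function names below are invented for illustration, not taken from the paper), a weighted-sum fitness over execution time, cost rate, and energy rate might look like:

```python
def fitness(assignment, task_mi, node_mips, cost_rate, energy_rate,
            w_cost=0.5, w_energy=0.5):
    """Weighted-sum score of a task-to-node assignment (lower is better).

    task_mi[t]     -- workload of task t in million instructions
    node_mips[n]   -- processing speed of node n
    cost_rate[n]   -- monetary cost per second on node n
    energy_rate[n] -- energy drawn per second on node n
    """
    total_cost = 0.0
    total_energy = 0.0
    for t, n in enumerate(assignment):
        runtime = task_mi[t] / node_mips[n]
        total_cost += runtime * cost_rate[n]
        total_energy += runtime * energy_rate[n]
    return w_cost * total_cost + w_energy * total_energy
```

A metaheuristic such as EAEOSSA would repeatedly perturb `assignment` during its exploration and exploitation phases and keep candidates with lower fitness.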
Task scheduling in cloud computing is a multi-objective optimization problem, often involving conflicting objectives such as minimizing execution time, reducing operational cost, and maximizing resource utilization. However, traditional approaches frequently rely on single-objective optimization methods, which are insufficient for capturing the complexity of such problems. To address this limitation, we introduce MDMOSA (Multi-objective Dwarf Mongoose Optimization with Simulated Annealing), a hybrid algorithm that integrates multi-objective optimization for efficient task scheduling in Infrastructure-as-a-Service (IaaS) cloud environments. MDMOSA harmonizes the exploration capabilities of the biologically inspired Dwarf Mongoose Optimization (DMO) with the exploitation strengths of Simulated Annealing (SA), achieving a balanced search process. The algorithm aims to optimize task allocation by reducing makespan and financial cost while improving system resource utilization. We evaluate MDMOSA through extensive simulations using the real-world Google Cloud Jobs (GoCJ) dataset within the CloudSim environment. Comparative analysis against benchmark algorithms such as SMOACO, MOTSGWO, and MFPAGWO reveals that MDMOSA consistently achieves superior performance in terms of scheduling efficiency, cost-effectiveness, and scalability. These results confirm the potential of MDMOSA as a robust and adaptable solution for resource scheduling in dynamic and heterogeneous cloud computing infrastructures.
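The Simulated Annealing half of such a hybrid is easy to sketch in isolation. The toy below anneals a task-to-VM assignment toward lower makespan; the task lengths, VM speeds, and cooling schedule are illustrative assumptions, and the full MDMOSA additionally folds in cost and utilization objectives plus the DMO exploration phase.

```python
import math
import random

def makespan(assignment, task_len, vm_speed):
    # Completion time of the most loaded VM.
    load = [0.0] * len(vm_speed)
    for t, vm in enumerate(assignment):
        load[vm] += task_len[t] / vm_speed[vm]
    return max(load)

def sa_schedule(task_len, vm_speed, iters=2000, t0=10.0, alpha=0.995, seed=1):
    rng = random.Random(seed)
    n_vms = len(vm_speed)
    cur = [rng.randrange(n_vms) for _ in task_len]
    cur_val = makespan(cur, task_len, vm_speed)
    best, best_val = cur[:], cur_val
    temp = t0
    for _ in range(iters):
        cand = cur[:]
        cand[rng.randrange(len(cand))] = rng.randrange(n_vms)  # move one task
        cand_val = makespan(cand, task_len, vm_speed)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cand_val < cur_val or rng.random() < math.exp((cur_val - cand_val) / max(temp, 1e-9)):
            cur, cur_val = cand, cand_val
        if cur_val < best_val:
            best, best_val = cur[:], cur_val
        temp *= alpha  # geometric cooling
    return best, best_val
```

The cooling schedule is the usual exploration-to-exploitation dial: high temperature tolerates worse moves early, low temperature locks in refinements late.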
The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-Based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance, supporting only exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. Formal security analysis proves that the FCS-ABMSE scheme meets both indistinguishability of ciphertext under chosen keyword attacks (IND-CKA) and indistinguishability of ciphertext under chosen plaintext attacks (IND-CPA). In addition, we construct an enhanced variant based on type-3 pairings. Results demonstrate that the proposed scheme outperforms existing ABSE approaches in terms of functionality, computational cost, and communication cost.
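The error tolerance that fuzzy multi-keyword search provides can be pictured with a plaintext bigram-similarity sketch. Real fuzzy searchable encryption schemes perform this comparison over encrypted vectors (often via locality-sensitive hashing) rather than in the clear, and the 0.4 threshold below is an arbitrary illustrative choice, not a parameter from the paper.

```python
def bigrams(word: str) -> set:
    # Pad with sentinels so single-character edits still share most bigrams.
    w = f"#{word.lower()}#"
    return {w[i:i + 2] for i in range(len(w) - 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 1.0

def fuzzy_match(query: str, doc_keywords, threshold: float = 0.4) -> bool:
    # A misspelled query still matches if its bigram set is close enough
    # to some document keyword's bigram set.
    qb = bigrams(query)
    return any(jaccard(qb, bigrams(k)) >= threshold for k in doc_keywords)
```

For example, the transposition "netwrok" still shares enough bigrams with "network" to match, while an unrelated query does not — the plaintext analogue of fuzzy search "without explicit keyword fields".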
Recently, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still affect the model’s ability to generalize effectively. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) during the client local training phase, we propose a Multi-Receptive Field Fusion Classification Model (MRFCM), which captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification; (2) in the server aggregation phase, we introduce a Federated Dynamic Aggregation Selection Strategy (FDASS), which employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate our framework using the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance the field of point cloud classification in a secure environment.
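The parameter-exchange step that replaces raw data transmission is, in its plainest form, federated averaging: the server combines client model parameters weighted by local dataset size. The sketch below shows only that baseline step on flat parameter vectors; FDASS extends it with skip-aggregation and model-reallocation decisions.

```python
def fedavg(client_params, client_sizes):
    """Weighted average of per-client parameter vectors (plain FedAvg)."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    agg = [0.0] * dim
    for params, n in zip(client_params, client_sizes):
        weight = n / total  # clients with more data pull the average harder
        for i, p in enumerate(params):
            agg[i] += weight * p
    return agg
```

Because only `client_params` (and sizes) cross the network, the raw point clouds never leave the clients — which is precisely the privacy argument made above.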
This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the track and intensity of Typhoon Kompasu in 2021 is examined. Additionally, the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu’s track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that these two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC’s warm-core structure.
The continuous improvement of solar thermal technologies is essential to meet the growing demand for sustainable heat generation and to support global decarbonization efforts. This study presents the design, implementation, and validation of a real-time monitoring framework based on the Internet of Things (IoT) and cloud computing to enhance the thermal performance of evacuated tube solar water heaters (ETSWHs). A commercial system and a custom-built prototype were instrumented with Industry 4.0 technologies, including platinum resistance temperature detectors (PT100), solar irradiance and wind speed sensors, a programmable logic controller (PLC), a SCADA interface, and a cloud-connected IoT gateway. Data were processed locally and transmitted to cloud storage for continuous analysis and visualization via a mobile application. Experimental results demonstrated the prototype’s superior thermal energy storage capacity (47.4 vs. 36.2 MJ for the commercial system, a 31% improvement), achieved through the novel integration of Industry 4.0 architecture with an optimized collector design. This improvement is attributed to optimized geometric design parameters, including a reduced tilt angle, increased inter-tube spacing, and the incorporation of an aluminum reflective surface. These modifications collectively enhanced solar heat absorption and reduced optical losses. The framework effectively identified thermal stratification, monitored environmental effects on heat transfer, and enabled real-time system diagnostics. By integrating automation, IoT, and cloud computing, the proposed architecture establishes a scalable and replicable model for the intelligent management of solar thermal systems, facilitating predictive maintenance and future integration with artificial intelligence for performance forecasting. This work provides a practical, data-driven approach to digitizing and optimizing heat transfer systems, promoting more efficient and sustainable solar thermal energy applications.
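The quoted 31% gain follows directly from the two storage figures: relative improvement is (prototype − commercial) / commercial.

```python
prototype_mj = 47.4   # prototype thermal energy storage, MJ
commercial_mj = 36.2  # commercial system, MJ
improvement_pct = (prototype_mj - commercial_mj) / commercial_mj * 100
print(f"{improvement_pct:.1f}%")  # ≈ 30.9%, reported as 31%
```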
Human Resource (HR) operations increasingly rely on cloud-based platforms that provide hiring, payroll, employee management, and compliance services. These systems, typically built on multi-tenant microservice architectures, offer scalability and efficiency but also expand the attack surface for adversaries. Ransomware has emerged as a leading threat in this domain, capable of halting workflows and exposing sensitive employee records. Traditional defenses such as static hardening and signature-based detection often fail to address the dynamic requirements of HR Software as a Service (SaaS), where continuous availability, uninterrupted sessions, and privacy compliance are critical. This paper presents a Moving Target Defense (MTD) framework for HR SaaS that combines container mutation, IP hopping, and node reassignment to randomize the attack surface without pausing services. The framework runs on Kubernetes and uses a KL-divergence-based anomaly detector that monitors HR access logs across five modules (onboarding, employee records, leave, payroll, and exit). In simulation with realistic HR traffic, the approach reaches 96.9% average detection accuracy with AUC 0.94-0.98, cuts mean time to containment to 91.4 s, and lowers the ransomware encryption rate to 13.2%. Measured overheads for CPU, memory, and per-mutation latency remain modest. Compared with prior MTD and non-MTD baselines, the design provides stronger containment without service interruption and aligns with zero-trust and compliance goals. Its modular implementation and control-plane orchestration support stepwise, enterprise-scale deployment in HR SaaS environments.
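The KL-divergence detector can be sketched as follows: learn a baseline distribution of accesses across the five HR modules, then flag a traffic window whose distribution diverges from the baseline past a threshold. The counts and the 0.2 threshold below are invented for illustration, not taken from the paper.

```python
import math

def kl_divergence(p, q, eps=1e-9):
    # D_KL(p || q); eps avoids log-of-zero for empty bins.
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def normalize(counts):
    s = sum(counts)
    return [c / s for c in counts]

# Baseline access counts over the five HR modules:
# onboarding, employee records, leave, payroll, exit (illustrative numbers).
baseline = normalize([120, 300, 80, 150, 50])

def is_anomalous(window_counts, threshold=0.2):
    # A window whose module-access mix drifts far from baseline raises an alarm.
    return kl_divergence(normalize(window_counts), baseline) > threshold
```

A ransomware run that suddenly hammers one module (e.g., mass reads of exit records) skews the window distribution and pushes the divergence past the threshold, triggering containment.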
In real-world autonomous driving tests, unexpected events such as pedestrians or wild animals suddenly entering the driving path can occur. Conducting actual test drives under various weather conditions may also lead to dangerous situations. Furthermore, autonomous vehicles may operate abnormally in bad weather due to limitations of their sensors and GPS. Driving simulators, which replicate driving conditions nearly identical to those in the real world, can drastically reduce the time and cost required for market entry validation; consequently, they have become widely used. In this paper, we design a virtual driving test environment capable of collecting and verifying SiLS data under adverse weather conditions using multi-source images. The proposed method generates a virtual testing environment that incorporates various events, including weather, time of day, and moving objects, that cannot be easily verified in real-world autonomous driving tests. By setting up scenario-based virtual environment events, multi-source image analysis and verification using real-world DCUs (Data Concentrator Units) with a V2X-Car edge cloud can effectively address risk factors that may arise in real-world situations. We tested and validated the proposed method with scenarios employing V2X communication and multi-source image analysis.
3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method integrating improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces data while preserving multi-scale target topology. The segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, resulting in 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretraining segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. This proposed method, combined with a previously developed machine learning model that links rock bolt load and the deformation field of its bearing plate, provides a robust technique for simultaneously measuring the load of multiple rock bolts in a single laser scan.
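As a rough illustration of the kind of point-cloud reduction that precedes segmentation, the sketch below performs plain voxel-grid averaging. This is a generic stand-in, not the paper's coverage-voted strategy, which additionally votes to preserve the topology of small multi-scale targets.

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Replace all points falling in the same voxel with their centroid.

    points     -- iterable of (x, y, z) tuples
    voxel_size -- edge length of the cubic voxel grid
    """
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)  # which voxel the point falls in
        buckets[key].append(p)
    # One representative point (centroid) per occupied voxel.
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in buckets.values()]
```

Uniform voxel averaging is what can erase thin structures such as bearing plates — the motivation for a smarter, coverage-aware reduction in the paper.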
This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. Additionally, it deeply analyzes real-life case studies illustrating successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to businesses to protect themselves from cloud data breaches and providing insights into selecting a suitable cloud services provider from an information security perspective.
Accurate descriptions of cloud droplet spectra from aerosol activation to vapor condensation using microphysical parameterization schemes are crucial for numerical simulations of precipitation and climate change in weather forecasting and climate prediction models. Hence, the latest activation and triple-moment condensation schemes were combined to simulate and analyze the evolution characteristics of a cloud droplet spectrum from activation to condensation and compared with a high-resolution Lagrangian bin model and the current double-moment condensation schemes, in which the spectral shape parameter is fixed or diagnosed by an empirical formula. The results demonstrate that the latest schemes effectively capture the evolution characteristics of the cloud droplet spectrum during activation and condensation, in line with the performance of the bin model. The simulation of the latest activation and condensation schemes in a parcel model shows that the cloud droplet spectrum gradually widens and exhibits a multimodal distribution during the activation process, accompanied by a decrease in the spectral shape and slope parameters over time. Conversely, during the condensation process, the cloud droplet spectrum gradually narrows, resulting in increases in the spectral shape and slope parameters. However, the double-moment schemes fail to accurately replicate the evolution of the cloud droplet spectrum and its multimodal distribution characteristics. Furthermore, the latest schemes were coupled into a 1.5D cumulus model, and an observation case was simulated. The simulations confirm that the cloud droplet spectrum appears wider at the supersaturated cloud base and cloud top due to activation, while it becomes narrower at the middle altitudes of the cloud due to condensation growth.
The cloud phase composition of cold clouds in the Antarctic atmosphere is explored using data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instruments for the period 2000-2006. We used the averaged fraction of liquid-phase clouds out of the total cloud amount at the cloud tops, since the value is comparable in the two measurements. MODIS data for the winter months (June, July, and August) reveal that the liquid cloud fraction out of the total cloud amount significantly decreases with decreasing cloud-top temperature below 0°C. In addition, the CALIOP vertical profiles show that below the ice clouds, low-lying liquid clouds are distributed over ~20% of the area. With increasing latitude, the liquid cloud fraction decreases as a function of the local temperature. The MODIS-observed relation between the cloud-top liquid fraction and cloud-top temperature is then applied to evaluate the cloud phase parameterization in climate models, in which condensed cloud water is repartitioned between liquid water and ice on the basis of the grid point temperature. It is found that models assuming overly high cut-offs (−40°C) for the separation of ice clouds from mixed-phase clouds may significantly underestimate the liquid cloud fraction in the winter Antarctic atmosphere. Correction of the bias in the liquid cloud fraction would serve to reduce the large uncertainty in cloud radiative effects.
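The temperature-based repartitioning the models use can be sketched as a linear ramp between an all-liquid and an all-ice cut-off; the evaluation above concerns where the lower cut-off should sit, since placing it as warm as −40°C underestimates the winter Antarctic liquid fraction. The linear form is a common simple parameterization, not the exact formula of any particular model.

```python
def liquid_fraction(temp_c: float, t_all_ice: float = -40.0,
                    t_all_liquid: float = 0.0) -> float:
    """Fraction of condensed cloud water assumed liquid at grid temperature temp_c (deg C).

    Linear ramp between the two cut-offs: all ice at or below t_all_ice,
    all liquid at or above t_all_liquid.
    """
    if temp_c >= t_all_liquid:
        return 1.0
    if temp_c <= t_all_ice:
        return 0.0
    return (temp_c - t_all_ice) / (t_all_liquid - t_all_ice)
```

Raising `t_all_ice` toward 0°C steepens the ramp and strips liquid from cold clouds — the bias mechanism identified in the comparison with MODIS and CALIOP.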
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model was proposed. This model first used mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm was established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Funding (cloud-aerosol transition zone study): project NUBOLOSYTI (PID2023149972NB-100) of the Spanish Ministry of Science and Innovation (MICINN); supported by an IFUdG 2022 fellowship.
Funding (blockchain-based cloud auditing scheme): National Natural Science Foundation of China (New Design and Analysis of Fully Homomorphic Signatures, Grant No. 62172436).
Funding (GRIST low-cloud microphysics study): National Natural Science Foundation of China (42375153, 42105153, 42205157); Development of Science and Technology at Chinese Academy of Meteorological Sciences (2023KJ038).
Funding: Supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (grant number IMSIU-DDRSP2503).
Abstract: In recent years, fog computing has become an important environment for the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes. With large amounts of data and user requests, reaching the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy across fog nodes while users execute tasks along least-cost paths. Task scheduling is built on a modified Artificial Ecosystem Optimization (AEO) combined with swarm operators from the Salp Swarm Algorithm (SSA) to strengthen the exploitation phase of the optimal search process. The proposed strategy, Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), seeks the most suitable solution to the optimization problem that combines cost and energy in multi-objective task scheduling. A knapsack formulation is also added to improve both cost and energy in the iFogSim implementation. The proposed strategy was compared with other strategies in terms of time, cost, energy, and productivity. Experimental results show that it improves energy consumption, cost, and time over the other algorithms in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
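A weighted cost-energy fitness of the kind such multi-objective schedulers minimize can be sketched as follows. The matrices and weights are hypothetical placeholders, not EAEOSSA's actual formulation.

```python
def fitness(assignment, exec_cost, energy, w_cost=0.5, w_energy=0.5):
    """Score a task->node assignment by a weighted sum of normalized cost
    and energy; lower is better. exec_cost[t][n] and energy[t][n] give
    the hypothetical cost/energy of running task t on node n; assignment[t]
    is the node chosen for task t."""
    total_cost = sum(exec_cost[t][n] for t, n in enumerate(assignment))
    total_energy = sum(energy[t][n] for t, n in enumerate(assignment))
    # Normalize by the worst case so both objectives land in [0, 1].
    max_cost = sum(max(row) for row in exec_cost)
    max_energy = sum(max(row) for row in energy)
    return w_cost * total_cost / max_cost + w_energy * total_energy / max_energy
```

A metaheuristic such as AEO/SSA would then search the assignment space for low values of this score.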
Abstract: Task scheduling in cloud computing is a multi-objective optimization problem, often involving conflicting objectives such as minimizing execution time, reducing operational cost, and maximizing resource utilization. Traditional approaches, however, frequently rely on single-objective optimization methods that cannot capture the complexity of such problems. To address this limitation, we introduce MDMOSA (Multi-objective Dwarf Mongoose Optimization with Simulated Annealing), a hybrid algorithm for efficient multi-objective task scheduling in Infrastructure-as-a-Service (IaaS) cloud environments. MDMOSA harmonizes the exploration capabilities of the biologically inspired Dwarf Mongoose Optimization (DMO) with the exploitation strengths of Simulated Annealing (SA), achieving a balanced search process. The algorithm optimizes task allocation by reducing makespan and financial cost while improving system resource utilization. We evaluate MDMOSA through extensive simulations on the real-world Google Cloud Jobs (GoCJ) dataset within the CloudSim environment. Comparative analysis against benchmark algorithms such as SMOACO, MOTSGWO, and MFPAGWO shows that MDMOSA consistently achieves superior scheduling efficiency, cost-effectiveness, and scalability. These results confirm the potential of MDMOSA as a robust and adaptable solution for resource scheduling in dynamic and heterogeneous cloud computing infrastructures.
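The SA exploitation step that such hybrids borrow rests on the standard Metropolis acceptance rule; this sketch shows that rule in isolation, with the scoring function (e.g. a weighted combination of makespan and cost) left abstract.

```python
import math
import random

def sa_accept(current_score, candidate_score, temperature, rng=random.random):
    """Metropolis acceptance rule used in a simulated-annealing phase:
    always accept an improvement; accept a worse candidate with
    probability exp(-delta / temperature), so the search can escape
    local optima while the temperature is high."""
    if candidate_score <= current_score:  # lower score (e.g. makespan) is better
        return True
    delta = candidate_score - current_score
    return rng() < math.exp(-delta / temperature)
```

As the temperature is cooled over iterations, the rule degenerates into greedy hill-climbing, which is the exploitation behavior the hybrid relies on late in the search.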
Abstract: The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance in exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. Formal security analysis proves that the FCS-ABMSE scheme achieves both indistinguishability of ciphertext under chosen keyword attacks (IND-CKA) and indistinguishability of ciphertext under chosen plaintext attacks (IND-CPA). In addition, we construct an enhanced variant based on type-3 pairings. Results demonstrate that the proposed scheme outperforms existing ABSE approaches in functionality, computational cost, and communication cost.
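The error tolerance the scheme targets can be illustrated in plaintext with a Levenshtein edit-distance check. This is only the intuition behind fuzzy keyword matching; the actual construction works over ciphertexts without revealing keywords, and its matching mechanism differs.

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming over one row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def fuzzy_match(query: str, keyword: str, tolerance: int = 1) -> bool:
    """Plaintext analogue of fuzzy search: a slightly misspelled query
    still matches the stored keyword."""
    return edit_distance(query, keyword) <= tolerance
```

The point of a fuzzy searchable encryption scheme is to provide this kind of tolerance while the server sees neither the query nor the keyword in the clear.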
Funding: Supported in part by the National Key Research and Development Program of China (Grant 2021YFB3101100); in part by the National Natural Science Foundation of China (Grants 42461057, 62272123, and 42371470); in part by the Fundamental Research Program of Shanxi Province (Grant 202303021212164); and in part by the Postgraduate Education Innovation Program of Shanxi Province (Grant 2024KY474).
Abstract: In recent years, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still impair the model's ability to generalize. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) during local client training, a Multi-Receptive Field Fusion Classification Model (MRFCM) captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification; (2) during server aggregation, a Federated Dynamic Aggregation Selection Strategy (FDASS) employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate the framework on the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance point cloud classification in secure environments.
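The baseline that a dynamic aggregation strategy builds on is plain federated averaging of client parameters. A minimal sketch of that baseline follows; the skip/reallocate decisions described above are deliberately not shown, and parameter vectors are represented as flat lists of floats for simplicity.

```python
def federated_average(client_params, client_sizes):
    """Size-weighted average of client model parameters (plain FedAvg).
    client_params: list of parameter vectors (lists of floats), one per client.
    client_sizes:  number of local samples per client, used as weights."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [
        sum(p[i] * s for p, s in zip(client_params, client_sizes)) / total
        for i in range(dim)
    ]
```

A selection strategy such as the one described would wrap this call, deciding per round whether to average at all or to route individual client models elsewhere.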
Funding: Supported by the National Key R&D Program of China [grant number 2023YFC3008004].
Abstract: This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization into the CMA-TYM model. The impact of these two parameterization schemes on predictions of the track and intensity of Typhoon Kompasu (2021) is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction appears after 66 h. Further analysis reveals that the two schemes affect the timing and magnitude of extreme TC intensity by influencing the evolution of the TC's warm-core structure.
Funding: Funded by the National Council of Science, Technology, and Technological Innovation (CONCYTEC) and the National Program of Scientific Research and Advanced Studies (PROCIENCIA) under the E041-2022 "Applied Research Projects" competition. Contract number: PE501078609-2022-PROCIENCIA.
Abstract: The continuous improvement of solar thermal technologies is essential to meet the growing demand for sustainable heat generation and to support global decarbonization efforts. This study presents the design, implementation, and validation of a real-time monitoring framework based on the Internet of Things (IoT) and cloud computing to enhance the thermal performance of evacuated tube solar water heaters (ETSWHs). A commercial system and a custom-built prototype were instrumented with Industry 4.0 technologies, including platinum resistance temperature detectors (PT100), solar irradiance and wind speed sensors, a programmable logic controller (PLC), a SCADA interface, and a cloud-connected IoT gateway. Data were processed locally and transmitted to cloud storage for continuous analysis and visualization via a mobile application. Experimental results demonstrated the prototype's superior thermal energy storage capacity (47.4 vs. 36.2 MJ for the commercial system, a 31% improvement), achieved through the novel integration of an Industry 4.0 architecture with an optimized collector design. This improvement is attributed to optimized geometric design parameters, including a reduced tilt angle, increased inter-tube spacing, and the incorporation of an aluminum reflective surface. These modifications collectively enhanced solar heat absorption and reduced optical losses. The framework effectively identified thermal stratification, monitored environmental effects on heat transfer, and enabled real-time system diagnostics. By integrating automation, IoT, and cloud computing, the proposed architecture establishes a scalable and replicable model for the intelligent management of solar thermal systems, facilitating predictive maintenance and future integration with artificial intelligence for performance forecasting. This work provides a practical, data-driven approach to digitizing and optimizing heat transfer systems, promoting more efficient and sustainable solar thermal energy applications.
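The reported 31% gain follows directly from the two storage figures, and sensible-heat storage in a water tank is just Q = m · c_p · ΔT. A small sketch (tank mass and temperature rise in the function call are illustrative, not the study's measurements):

```python
def thermal_energy_mj(mass_kg, delta_t_c, c_p=4186.0):
    """Sensible heat stored in a water tank: Q = m * c_p * dT,
    converted from joules to MJ (c_p of water ~4186 J/(kg*K))."""
    return mass_kg * c_p * delta_t_c / 1e6

# Relative improvement of the prototype over the commercial system,
# from the storage figures reported above (47.4 vs. 36.2 MJ):
improvement = (47.4 - 36.2) / 36.2  # ~0.31, i.e. the reported 31%
```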
Abstract: Human Resource (HR) operations increasingly rely on cloud-based platforms that provide hiring, payroll, employee management, and compliance services. These systems, typically built on multi-tenant microservice architectures, offer scalability and efficiency but also expand the attack surface for adversaries. Ransomware has emerged as a leading threat in this domain, capable of halting workflows and exposing sensitive employee records. Traditional defenses such as static hardening and signature-based detection often fail to meet the dynamic requirements of HR Software as a Service (SaaS), where continuous availability and privacy compliance are critical; many prior defenses for cloud or IoT likewise do not address HR SaaS needs such as uninterrupted sessions, privacy compliance, and live service continuity. This paper presents a Moving Target Defense (MTD) framework for HR SaaS that combines container mutation, IP hopping, and node reassignment to randomize the attack surface without pausing services. The framework runs on Kubernetes and uses a KL-divergence-based anomaly detector that monitors HR access logs across five modules (onboarding, employee records, leave, payroll, and exit). In simulation with realistic HR traffic, the approach reaches 96.9% average detection accuracy with AUC 0.94-0.98, cuts mean time to containment to 91.4 s, and lowers the ransomware encryption rate to 13.2%. Measured overheads for CPU, memory, and per-mutation latency remain modest. Compared with prior MTD and non-MTD baselines, the design provides stronger containment without service interruption and aligns with zero-trust and compliance goals. Its modular implementation and control-plane orchestration support stepwise, enterprise-scale deployment in HR SaaS environments.
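A KL-divergence detector of this kind can be sketched as a divergence between a learned per-module access distribution and the distribution observed in a log window. The baseline and window values below are illustrative, not the paper's data.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) for discrete distributions, e.g. the share of HR
    access-log events per module (onboarding, records, leave, payroll,
    exit) in a recent window (p) vs. a learned baseline (q).
    eps guards against log(0) / division by zero."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

baseline = [0.10, 0.35, 0.20, 0.30, 0.05]   # normal per-module access shares
window = [0.02, 0.05, 0.03, 0.85, 0.05]     # sudden payroll-heavy burst
score = kl_divergence(window, baseline)      # large score -> trigger a mutation
```

A threshold on this score is what would drive the container mutation / IP hopping / node reassignment actions in an MTD control loop.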
Funding: Supported by an Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01842, Artificial Intelligence Graduate School Program (GIST)); by a Korea Planning & Evaluation Institute of Industrial Technology (KEIT) grant funded by the Ministry of Trade, Industry & Energy (MOTIE, Republic of Korea) (RS-2025-25448249, Automotive Industry Technology Development (R&D) Program); and by the Regional Innovation System & Education (RISE) program through the Gwangju RISE Center, funded by the Ministry of Education (MOE) and the Gwangju Metropolitan City, Republic of Korea (2025-RISE-05-001).
Abstract: In real-world autonomous driving tests, unexpected events such as pedestrians or wild animals suddenly entering the driving path can occur. Conducting actual test drives under various weather conditions may also lead to dangerous situations. Furthermore, autonomous vehicles may operate abnormally in bad weather due to limitations of their sensors and GPS. Driving simulators, which replicate driving conditions nearly identical to those in the real world, can drastically reduce the time and cost required for market entry validation and have consequently become widely used. In this paper, we design a virtual driving test environment capable of collecting and verifying SiLS data under adverse weather conditions using multi-source images. The proposed method generates a virtual testing environment that incorporates various events, including weather, time of day, and moving objects, that cannot be easily verified in real-world autonomous driving tests. By setting up scenario-based virtual environment events, multi-source image analysis and verification using real-world DCUs (Data Concentrator Units) with a V2X-Car edge cloud can effectively address risk factors that may arise in real-world situations. We tested and validated the proposed method with scenarios employing V2X communication and multi-source image analysis.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52304139 and 52325403) and the CCTEG Coal Mining Research Institute funding (Grant No. KCYJY-2024-MS-10).
Abstract: 3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method integrating an improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces data volume while preserving multi-scale target topology. Segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, resulting in 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretrained segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. This method, combined with a previously developed machine learning model that links rock bolt load to the deformation field of its bearing plate, provides a robust technique for simultaneously measuring the loads of multiple rock bolts in a single laser scan.
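A common normalization preprocessing step for PointNet-style networks is centering and unit-sphere scaling. This sketch shows that generic step only; the paper's normalization preprocessing head may differ in detail.

```python
def normalize_point_cloud(points):
    """Center a point cloud at the origin and scale it into the unit
    sphere -- a standard normalization step before PointNet-style
    networks. points: list of (x, y, z) tuples."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    centered = [(x - cx, y - cy, z - cz) for x, y, z in points]
    # Scale so the farthest point sits on the unit sphere; 'or 1.0'
    # guards against a degenerate single-point cloud.
    scale = max((x * x + y * y + z * z) ** 0.5 for x, y, z in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]
```

Normalization of this kind removes scale and position variation between scans, which is one reason a preprocessing head can lift segmentation accuracy substantially.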
Abstract: This article explores the evolution of cloud computing, its advantages over traditional on-premises infrastructure, and its impact on information security. The study presents a comprehensive literature review covering various cloud infrastructure offerings and security models. It also analyzes in depth real-life case studies of successful cloud migrations and highlights common information security threats in current cloud computing. The article concludes by offering recommendations to help businesses protect themselves from cloud data breaches and by providing insights into selecting a suitable cloud services provider from an information security perspective.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42305163 and U22A20577); the Construction Project of Weather Modification Ability in Central China (Grant No. ZQC-H22256); the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB0760300); the Projects of the Earth System Numerical Simulation Facility (Grant Nos. 2024-EL-PT-000707, 2023-EL-PT-000482, 2023-EL-ZD-00026, and 2022-EL-PT-00083); and the STS Program of the Inner Mongolia Meteorological Service, Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, and Institute of Atmospheric Physics, Chinese Academy of Sciences (Grant No. 2021CG0047).
Abstract: Accurate descriptions of cloud droplet spectra from aerosol activation to vapor condensation using microphysical parameterization schemes are crucial for numerical simulations of precipitation and climate change in weather forecasting and climate prediction models. Hence, the latest activation and triple-moment condensation schemes were combined to simulate and analyze the evolution of a cloud droplet spectrum from activation to condensation, and the results were compared with a high-resolution Lagrangian bin model and with current double-moment condensation schemes, in which the spectral shape parameter is fixed or diagnosed by an empirical formula. The results demonstrate that the latest schemes effectively capture the evolution of the cloud droplet spectrum during activation and condensation, in line with the performance of the bin model. Simulation with the latest activation and condensation schemes in a parcel model shows that the cloud droplet spectrum gradually widens and exhibits a multimodal distribution during activation, accompanied by a decrease in the spectral shape and slope parameters over time. Conversely, during condensation the cloud droplet spectrum gradually narrows, and the spectral shape and slope parameters increase. The double-moment schemes, however, fail to accurately replicate the evolution of the cloud droplet spectrum and its multimodal distribution. Furthermore, the latest schemes were coupled into a 1.5D cumulus model and an observed case was simulated. The simulations confirm that the cloud droplet spectrum is wider at the supersaturated cloud base and cloud top due to activation, while it is narrower at middle altitudes of the cloud due to condensation growth.
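For a gamma droplet spectrum N(D) proportional to D^mu * exp(-lambda*D), the link between the shape parameter and spectral width is direct: the relative dispersion of diameter is 1/sqrt(mu + 1), so a narrowing spectrum corresponds to a rising shape parameter, consistent with the condensation behavior described above. A one-line sketch (assuming the gamma form, which is the usual choice in bulk schemes):

```python
def relative_dispersion(mu):
    """Relative dispersion (std/mean of droplet diameter) of a gamma
    spectrum N(D) ~ D**mu * exp(-lam*D): mean = (mu+1)/lam,
    std = sqrt(mu+1)/lam, hence dispersion = 1/sqrt(mu+1).
    Narrowing (condensation) raises mu and lowers the dispersion."""
    return (mu + 1) ** -0.5
```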
Funding: Funded by the Korean Center for Atmospheric Sciences and Earthquake Research (2010-1178) and US Department of Energy grant DE-FG02-01ER63257.
Abstract: The cloud phase composition of cold clouds in the Antarctic atmosphere is explored using data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instruments for the period 2000-2006. We used the averaged fraction of liquid-phase clouds out of the total cloud amount at cloud top, since this value is comparable between the two measurements. MODIS data for the winter months (June, July, and August) reveal that the liquid cloud fraction decreases significantly with decreasing cloud-top temperature below 0°C. In addition, CALIOP vertical profiles show that low-lying liquid clouds are distributed below the ice clouds over ~20% of the area. With increasing latitude, the liquid cloud fraction decreases as a function of the local temperature. The MODIS-observed relation between cloud-top liquid fraction and cloud-top temperature is then applied to evaluate the cloud phase parameterization in climate models, in which condensed cloud water is repartitioned between liquid water and ice on the basis of the grid-point temperature. It is found that models assuming overly high cut-offs (-40°C) for the separation of ice clouds from mixed-phase clouds may significantly underestimate the liquid cloud fraction in the winter Antarctic atmosphere. Correcting this bias in the liquid cloud fraction would serve to reduce the large uncertainty in cloud radiative effects.
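The grid-point repartitioning the study evaluates is typically a linear ramp between an all-liquid and an all-ice temperature cut-off. A generic sketch (the cut-off values are illustrative of this family of parameterizations, not of any specific model):

```python
def liquid_fraction(t_celsius, t_all_liquid=0.0, t_all_ice=-40.0):
    """Typical grid-point cloud-phase parameterization: condensate is
    all liquid above t_all_liquid, all ice below t_all_ice, and
    linearly interpolated in between. The study above evaluates how
    the choice of these cut-offs biases the simulated liquid fraction."""
    if t_celsius >= t_all_liquid:
        return 1.0
    if t_celsius <= t_all_ice:
        return 0.0
    return (t_celsius - t_all_ice) / (t_all_liquid - t_all_ice)
```

Replacing this temperature-only ramp with the MODIS-observed liquid-fraction relation is precisely the kind of correction the abstract argues would reduce the cloud radiative uncertainty.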
Funding: Shanxi Province Higher Education Science and Technology Innovation Fund Project (2022-676); Shanxi Soft Science Program Research Fund Project (2016041008-6).
Abstract: To improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. The model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm is established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the system setting. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.