With the rapid expansion of the Internet of Things (IoT), user data has experienced exponential growth, leading to increasing concerns about the security and integrity of data stored in the cloud. Traditional schemes relying on untrusted third-party auditors suffer from both security and efficiency issues, while existing decentralized blockchain-based auditing solutions still face shortcomings in correctness and security. This paper proposes an improved blockchain-based cloud auditing scheme with the following core contributions: identifying critical logical contradictions in the original scheme, thereby establishing the foundation for the correctness of cloud auditing; designing an enhanced mechanism that integrates multiple hashing with dynamic aggregate signatures, binding encrypted blocks through bilinear pairings and BLS signatures, and improving the scheme by setting parameters based on the Computational Diffie-Hellman (CDH) problem, significantly strengthening data integrity protection and anti-forgery capabilities; and introducing a random challenge mechanism and dynamic parameter adjustment strategy, effectively resisting attacks such as forgery, tampering, and deletion, significantly improving the detection probability of malicious Cloud Service Providers (CSPs), and significantly reducing the proof generation overhead for CSPs while maintaining the same computational cost for Data Owners. Theoretical analysis and performance evaluation experiments demonstrate that the proposed scheme achieves significant improvements in both security and efficiency. Finally, the paper explores potential applications of the enhanced security scheme in fields such as healthcare, drone swarms, and government office attendance systems, providing an effective approach for building secure, efficient, and decentralized cloud auditing systems.
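The random challenge mechanism mentioned above rests on a standard sampling argument from provable-data-possession auditing (the bound below is the usual one from that literature, not a formula stated in the abstract): if a malicious CSP corrupts t of n stored blocks and the verifier challenges c blocks chosen uniformly at random, the corruption is detected with probability at least 1 - ((n - t)/n)^c. A minimal sketch:

```python
# Lower bound on detection probability for a random-challenge audit.
# Standard PDP-style bound; the exact parameters of the scheme summarized
# above are not given in the abstract.

def detection_probability(n: int, t: int, c: int) -> float:
    """Chance that a c-block challenge samples at least one of t corrupted
    blocks out of n (sampling with replacement gives a lower bound)."""
    return 1.0 - ((n - t) / n) ** c

# Challenging 460 of 10,000 blocks catches 1% corruption with ~99% probability.
p = detection_probability(n=10_000, t=100, c=460)
```

This is why random challenges stay cheap for the verifier: the number of challenged blocks needed for high detection confidence depends on the corruption fraction, not on the total data size.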
Clouds play an important role in global atmospheric energy and water vapor budgets, and low-cloud simulations suffer from large biases in many atmospheric general circulation models. In this study, cloud microphysical processes such as raindrop evaporation and cloud water accretion in a double-moment six-class cloud microphysics scheme were revised to enhance the simulation of low clouds using the Global-Regional Integrated Forecast System (GRIST) model. Validation of the revised scheme using a single-column version of GRIST demonstrated a reasonable reduction in liquid water biases. The revised parameterization simulated medium- and low-level cloud fractions that were in better agreement with observations than the original scheme. Long-term global simulations indicate mitigation of the originally overestimated low-level cloud fraction and cloud-water mixing ratio in mid- to high-latitude regions, primarily owing to enhanced accretion processes and weakened raindrop evaporation. The reduced low clouds with the revised scheme showed better consistency with satellite observations, particularly at mid- and high latitudes. Further improvements can be observed in the simulated cloud shortwave radiative forcing and the vertical distribution of total cloud cover. Annual precipitation in mid-latitude regions has also improved, particularly over the oceans, with significantly increased large-scale and decreased convective precipitation.
In recent years, fog computing has become an important environment for dealing with the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes in cloud computing. With the large amount of data and user requests, achieving the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy consumption across nodes in fog computing when users execute tasks through the least-cost paths. Task scheduling is developed using modified artificial ecosystem optimization (AEO) combined with negative swarm operators from the Salp Swarm Algorithm (SSA), in order to competitively optimize their capabilities during the exploitation phase of the optimal search process. In addition, the proposed strategy, the Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), attempts to find the most suitable solution to the multi-objective task scheduling optimization problem that combines cost and energy. The knapsack problem is also incorporated to improve both cost and energy in the iFogSim implementation. A comparison was made between the proposed strategy and other strategies in terms of time, cost, energy, and productivity. Experimental results showed that the proposed strategy improved energy consumption, cost, and time over other algorithms. Simulation results demonstrate that the proposed algorithm reduces the average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
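The combined cost/energy objective can be illustrated with a weighted-sum fitness over candidate task-to-node assignments. This is only a sketch: the abstract does not give EAEOSSA's exact objective, and the weights and per-node figures below are illustrative assumptions.

```python
# Hypothetical weighted-sum fitness for the cost/energy trade-off described
# above. Lower is better. An assignment is a list of node indices, one per task.

def schedule_fitness(assignment, cost_per_node, energy_per_node,
                     w_cost=0.5, w_energy=0.5):
    """Weighted sum of total cost and total energy for a task->node assignment."""
    total_cost = sum(cost_per_node[node] for node in assignment)
    total_energy = sum(energy_per_node[node] for node in assignment)
    return w_cost * total_cost + w_energy * total_energy

# Two candidate schedules for three tasks over two fog nodes:
cost = [4.0, 2.5]      # per-task cost on node 0 / node 1
energy = [1.0, 3.0]    # per-task energy on node 0 / node 1
f_a = schedule_fitness([0, 0, 1], cost, energy)  # two tasks on node 0
f_b = schedule_fitness([1, 1, 1], cost, energy)  # all tasks on node 1
```

A metaheuristic such as EAEOSSA searches the assignment space for the candidate minimizing this kind of scalarized objective; the actual algorithm also handles path selection and the knapsack constraint, which this sketch omits.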
Task scheduling in cloud computing is a multi-objective optimization problem, often involving conflicting objectives such as minimizing execution time, reducing operational cost, and maximizing resource utilization. However, traditional approaches frequently rely on single-objective optimization methods, which are insufficient for capturing the complexity of such problems. To address this limitation, we introduce MDMOSA (Multi-objective Dwarf Mongoose Optimization with Simulated Annealing), a hybrid algorithm that integrates multi-objective optimization for efficient task scheduling in Infrastructure-as-a-Service (IaaS) cloud environments. MDMOSA harmonizes the exploration capabilities of the biologically inspired Dwarf Mongoose Optimization (DMO) with the exploitation strengths of Simulated Annealing (SA), achieving a balanced search process. The algorithm aims to optimize task allocation by reducing makespan and financial cost while improving system resource utilization. We evaluate MDMOSA through extensive simulations using the real-world Google Cloud Jobs (GoCJ) dataset within the CloudSim environment. Comparative analysis against benchmark algorithms such as SMOACO, MOTSGWO, and MFPAGWO reveals that MDMOSA consistently achieves superior performance in terms of scheduling efficiency, cost-effectiveness, and scalability. These results confirm the potential of MDMOSA as a robust and adaptable solution for resource scheduling in dynamic and heterogeneous cloud computing infrastructures.
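The exploitation mechanism MDMOSA borrows from Simulated Annealing is the Metropolis acceptance rule: a worse candidate schedule is still accepted with probability exp(-delta/T), where delta is the fitness degradation and T the current temperature. A minimal sketch (the hybrid's full update rule is not given in the abstract):

```python
import math
import random

# Metropolis acceptance step as used in Simulated Annealing. High temperature
# accepts most moves (exploration); low temperature accepts almost only
# improvements (exploitation).

def accept(delta: float, temperature: float, rng=random.random) -> bool:
    """Accept a candidate whose fitness is worse by `delta` (> 0) or
    better-or-equal (delta <= 0)."""
    if delta <= 0:  # improvement or tie: always accept
        return True
    return rng() < math.exp(-delta / temperature)
```

Cooling the temperature over iterations is what lets the SA component refine the solutions proposed by the DMO exploration phase.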
Recently, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still affect the model's ability to generalize effectively. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) during the client local training phase, we propose a Multi-Receptive Field Fusion Classification Model (MRFCM), which captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification; (2) in the server aggregation phase, we introduce a Federated Dynamic Aggregation Selection Strategy (FDASS), which employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate our framework using the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance the field of point cloud classification in a secure environment.
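The server aggregation step that FDASS extends can be contrasted with plain federated averaging. The sketch below shows the baseline FedAvg computation; weighting by client sample count is the usual FedAvg convention, not a detail taken from the abstract.

```python
# Plain federated averaging of client model parameters -- the baseline that a
# hybrid strategy (average / skip / reallocate) would extend.

def fed_avg(client_params, client_sizes):
    """Average per-client parameter vectors, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [
        sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
        for i in range(dim)
    ]

# Three clients with unequal data volumes; the first client counts double.
avg = fed_avg([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]], [2, 1, 1])
```

Skipping aggregation or reallocating local models, as FDASS does, trades some of this global consistency for robustness to the non-IID client distributions the abstract highlights.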
The cloud liquid water content (LWC) of the Tibetan Plateau (TP) is crucial for cloud water conversion. There are very few accurate observations of the LWC on the TP, which makes estimates of LWC and precipitation there inaccurate. This paper introduces an indirect estimation scheme for the LWC profile obtained using a monochromatic radiative transfer model (MonoRTM) and microwave radiometers (MWRs) on the TP. The LWC estimation method was improved by optimizing the difference between the simulated and observed brightness temperature (TB) at specific microwave channels that are sensitive to liquid water. The accuracy of the LWC estimation method depends heavily on the value of the cloud-base environment humidity criterion (CBEHC). Our experiment confirmed that the default CBEHC value of 95% is unsuitable for the TP. For the rainfall scenarios, the optimization method suggested CBEHC values of 81%, 76%, and 83% for the Mangya, Nagqu, and Qamdo stations, respectively. The new CBEHC values produced a 30 K improvement in the TB simulation compared with the 95% CBEHC under rainfall conditions. This demonstrates the robustness of the LWC estimation scheme and its significant improvement in LWC estimation on the TP. For no-rainfall scenarios, the original Karstens model remained suitable for Nagqu station. Adjusting the CBEHC to 94% for Mangya station resulted in a 1 K improvement in its TB simulation, and Qamdo station showed a 2.5 K improvement when the CBEHC was adjusted to 98%. The relationship between the simulated TB error and the maximum relative humidity of the radiosonde profiles weakened after CBEHC optimization. Thus, the method proposed in this article provides a practical estimation approach for LWC in the TP region. It has a higher potential for rainfall days than for no-rainfall days; under no-rainfall conditions, its accuracy is sensitive to TB errors in both measurement and simulation, so accurate LWC estimation in that regime relies more on the equipment and the radiation model.
The Pantone Color of the Year 2026, PANTONE 11-4201 Cloud Dancer, has been introduced as a soft, lofty white symbolizing calm and clarity in an increasingly noisy world. This gentle shade invites a sense of peace and spaciousness, encouraging focus and creating room for creativity and reflection. Cloud Dancer embodies a desire for simplicity and renewal: a blank canvas that allows our minds to wander and new ideas to take shape. Its expansive presence fosters environments where tranquility meets inspiration, offering visual calm that supports wellbeing and mental lightness.
The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance in exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. Formal security analysis proves that the FCS-ABMSE scheme achieves both indistinguishability of ciphertext under chosen keyword attacks (IND-CKA) and indistinguishability of ciphertext under chosen plaintext attacks (IND-CPA). In addition, we construct an enhanced variant based on type-3 pairings. Results demonstrate that the proposed scheme outperforms existing ABSE approaches in terms of functionality, computational cost, and communication cost.
The interactions between clouds and aerosols represent one of the largest uncertainties in assessing the Earth's radiation budget, highlighting the importance of research on the transition zone (TZ) within the cloud-aerosol continuum. This study assesses the global distribution of TZ conditions, analyzes their optical characteristics, and determines the cloud or aerosol types most commonly associated with them, using the cloud-aerosol discrimination (CAD) score of the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument on the CALIPSO satellite. The CAD score classifies clouds and aerosols by the probability density functions of attenuated backscatter, total color ratio, volume depolarization ratio, altitude, and latitude. After applying several filters to avoid artifacts, the TZ was identified as those atmospheric layers that cannot be clearly classified as clouds or aerosols: layers within the no-confidence range (NCR) of the CAD score, and cirrus fringes. The optical characteristics of NCR layers exhibit two main clusters: Cluster 1, with properties between high-altitude ice clouds and aerosols (e.g., wispy cloud fragments), and Cluster 2, with properties between water clouds and aerosols at lower altitudes (e.g., large hydrated aerosols). Our results highlight the significant ubiquity of TZ conditions, which appear in 9.5% of all profiles and comprise 6.4% of the detected layers. Cluster 1 and cirrus-fringe layers predominate near the ITCZ and in mid-latitudes, whereas Cluster 2 layers are more frequent over the oceans along the central West African and East Asian coasts, where elevated smoke and dusty marine aerosols are common.
This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the track and intensity of Typhoon Kompasu (2021) is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that these two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC's warm-core structure.
The continuous improvement of solar thermal technologies is essential to meet the growing demand for sustainable heat generation and to support global decarbonization efforts. This study presents the design, implementation, and validation of a real-time monitoring framework based on the Internet of Things (IoT) and cloud computing to enhance the thermal performance of evacuated tube solar water heaters (ETSWHs). A commercial system and a custom-built prototype were instrumented with Industry 4.0 technologies, including platinum resistance temperature detectors (PT100), solar irradiance and wind speed sensors, a programmable logic controller (PLC), a SCADA interface, and a cloud-connected IoT gateway. Data were processed locally and transmitted to cloud storage for continuous analysis and visualization via a mobile application. Experimental results demonstrated the prototype's superior thermal energy storage capacity (47.4 vs. 36.2 MJ for the commercial system, a 31% increase), achieved through the novel integration of Industry 4.0 architecture with an optimized collector design. This improvement is attributed to optimized geometric design parameters, including a reduced tilt angle, increased inter-tube spacing, and the incorporation of an aluminum reflective surface. These modifications collectively enhanced solar heat absorption and reduced optical losses. The framework effectively identified thermal stratification, monitored environmental effects on heat transfer, and enabled real-time system diagnostics. By integrating automation, IoT, and cloud computing, the proposed architecture establishes a scalable and replicable model for the intelligent management of solar thermal systems, facilitating predictive maintenance and future integration with artificial intelligence for performance forecasting. This work provides a practical, data-driven approach to digitizing and optimizing heat transfer systems, promoting more efficient and sustainable solar thermal energy applications.
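The 31% figure follows directly from the two stored-energy values reported above:

```python
# Relative improvement in stored thermal energy, from the reported figures.
prototype_mj = 47.4    # custom-built prototype
commercial_mj = 36.2   # commercial system
gain = (prototype_mj - commercial_mj) / commercial_mj  # ~0.309, i.e. about 31%
```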
3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method integrating an improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces data volume while preserving multi-scale target topology. Segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, resulting in 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve the Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretrained segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. The proposed method, combined with a previously developed machine learning model that links rock bolt load to the deformation field of its bearing plate, provides a robust technique for simultaneously measuring the loads of multiple rock bolts in a single laser scan.
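The Intersection over Union (IoU) metric behind the reported bearing-plate improvements reduces to a simple per-class ratio over point labels; the label lists below are illustrative, not taken from the paper's dataset.

```python
# Per-class IoU for point cloud segmentation: intersection of predicted and
# ground-truth point sets of one class, divided by their union.

def iou(pred, truth, cls):
    """IoU of class `cls` for two equal-length per-point label lists."""
    inter = sum(1 for p, t in zip(pred, truth) if p == cls and t == cls)
    union = sum(1 for p, t in zip(pred, truth) if p == cls or t == cls)
    return inter / union if union else 1.0  # empty class on both sides: perfect

score = iou([1, 1, 0, 1, 0], [1, 0, 0, 1, 1], cls=1)  # 2 matched / 4 in union
```

IoU is a stricter measure than accuracy for small components such as bearing plates, which is why the ablation gains are larger in IoU than in overall accuracy.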
Human Resource (HR) operations increasingly rely on cloud-based platforms that provide hiring, payroll, employee management, and compliance services. These systems, typically built on multi-tenant microservice architectures, offer scalability and efficiency but also expand the attack surface for adversaries. Ransomware has emerged as a leading threat in this domain, capable of halting workflows and exposing sensitive employee records. Traditional defenses such as static hardening and signature-based detection often fail to address the dynamic requirements of HR Software as a Service (SaaS), where continuous availability and privacy compliance are critical; many prior defenses for cloud or IoT likewise do not meet HR SaaS needs such as uninterrupted sessions and live service continuity. This paper presents a Moving Target Defense (MTD) framework for HR SaaS that combines container mutation, IP hopping, and node reassignment to randomize the attack surface without pausing services. The framework runs on Kubernetes and uses a KL-divergence-based anomaly detector that monitors HR access logs across five modules (onboarding, employee records, leave, payroll, and exit). In simulation with realistic HR traffic, the approach reaches 96.9% average detection accuracy with AUC 0.94-0.98, cuts mean time to containment to 91.4 s, and lowers the ransomware encryption rate to 13.2%. Measured overheads for CPU, memory, and per-mutation latency remain modest. Compared with prior MTD and non-MTD baselines, the design provides stronger containment without service interruption and aligns with zero-trust and compliance goals. Its modular implementation and control-plane orchestration support stepwise, enterprise-scale deployment in HR SaaS environments.
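A KL-divergence detector of the kind described compares an observed access-frequency distribution against a learned baseline. The module ordering, example distributions, and any alert threshold below are illustrative assumptions, not values from the abstract.

```python
import math

# Discrete KL divergence D_KL(p || q) between an observed access-frequency
# distribution p and a baseline q; a small epsilon guards against log(0).

def kl_divergence(p, q, eps=1e-12):
    """KL divergence for two discrete distributions given as equal-length lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

baseline = [0.30, 0.25, 0.15, 0.20, 0.10]  # onboarding, records, leave, payroll, exit
observed = [0.05, 0.05, 0.05, 0.80, 0.05]  # sudden payroll access spike
score = kl_divergence(observed, baseline)  # large score -> candidate alert
```

A divergence score well above the baseline's normal fluctuation range is what would trigger the framework's containment actions (mutation, hopping, reassignment).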
In real-world autonomous driving tests, unexpected events such as pedestrians or wild animals suddenly entering the driving path can occur. Conducting actual test drives under various weather conditions may also lead to dangerous situations. Furthermore, autonomous vehicles may operate abnormally in bad weather due to limitations of their sensors and GPS. Driving simulators, which replicate driving conditions nearly identical to those in the real world, can drastically reduce the time and cost required for market entry validation; consequently, they have become widely used. In this paper, we design a virtual driving test environment capable of collecting and verifying SiLS data under adverse weather conditions using multi-source images. The proposed method generates a virtual testing environment that incorporates various events, including weather, time of day, and moving objects, that cannot easily be verified in real-world autonomous driving tests. By setting up scenario-based virtual environment events, multi-source image analysis and verification using real-world DCUs (Data Concentrator Units) with a V2X-Car edge cloud can effectively address risk factors that may arise in real-world situations. We tested and validated the proposed method with scenarios employing V2X communication and multi-source image analysis.
Funding (blockchain-based cloud auditing scheme): National Natural Science Foundation of China, "New Design and Analysis of Fully Homomorphic Signatures", Grant No. 62172436.
Funding (GRIST low-cloud microphysics study): National Natural Science Foundation of China (42375153, 42105153, 42205157); Development of Science and Technology at the Chinese Academy of Meteorological Sciences (2023KJ038).
Funding (EAEOSSA fog task scheduling study): Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU), grant number IMSIU-DDRSP2503.
Funding (FDASS-MRFCF point cloud classification study): National Key Research and Development Program of China (Grant 2021YFB3101100); National Natural Science Foundation of China (Grants 42461057, 62272123, and 42371470); Fundamental Research Program of Shanxi Province (Grant 202303021212164); Postgraduate Education Innovation Program of Shanxi Province (Grant 2024KY474).
Abstract: Recently, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still impair the model's ability to generalize. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) During the client local training phase, we propose a Multi-Receptive Field Fusion Classification Model (MRFCM), which captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification. (2) In the server aggregation phase, we introduce a Federated Dynamic Aggregation Selection Strategy (FDASS), which employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate our framework on the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance point cloud classification in secure environments.
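For context, the plain parameter-averaging baseline that a dynamic strategy like FDASS extends can be sketched as FedAvg-style weighted averaging, where each client's contribution is proportional to its local dataset size. This is an illustrative sketch only; the layer layout and client sizes are assumptions, not the paper's implementation.

```python
import numpy as np

def fedavg(client_params, client_sizes):
    """FedAvg-style aggregation: return per-layer weighted averages of
    client model parameters, weighting each client by its dataset size.

    client_params: list of clients, each a list of np.ndarray layers.
    client_sizes:  list of local dataset sizes, one per client.
    """
    total = sum(client_sizes)
    weights = [n / total for n in client_sizes]
    n_layers = len(client_params[0])
    return [
        sum(w * layers[i] for w, layers in zip(weights, client_params))
        for i in range(n_layers)
    ]
```

A strategy such as FDASS would wrap a call like this with extra logic to skip a round or reassign local models instead of always averaging.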
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 41975009 and U2442213).
Abstract: The cloud liquid water content (LWC) of the Tibetan Plateau (TP) is crucial for cloud water conversion. Very few accurate observations of the LWC exist on the TP, which makes estimates of LWC and precipitation there inaccurate. This paper introduces an indirect estimation scheme for the LWC profile obtained using a monochromatic radiative transfer model (MonoRTM) and microwave radiometers (MWRs) on the TP. The LWC estimation method was improved by optimizing the difference between the simulated and observed brightness temperature (TB) at specific microwave channels that are sensitive to liquid water. The accuracy of the LWC estimation method depends heavily on the value of the cloud-base environment humidity criterion (CBEHC). Our experiment confirmed that the default CBEHC value of 95% is unsuitable for the TP. For rainfall scenarios, the optimization method suggested CBEHC values of 81%, 76%, and 83% for the Mangya, Nagqu, and Qamdo stations, respectively. The new CBEHC values produced a 30 K improvement in the TB simulation compared with the 95% CBEHC under rainfall conditions, demonstrating the robustness of the LWC estimation scheme and its significant improvement in LWC estimation on the TP. For no-rainfall scenarios, the original Karstens model remained suitable for Nagqu station; adjusting the CBEHC to 94% for Mangya station yielded a 1 K improvement in its TB simulation, and Qamdo station showed a 2.5 K improvement when the CBEHC was adjusted to 98%. The relationship between the TB simulation error and the maximum relative humidity of the radiosonde profiles weakened after CBEHC optimization. Thus, the method proposed in this article provides a practical way to estimate LWC in the TP region, with higher potential on rainfall days than on no-rainfall days. Under no-rainfall conditions, the accuracy of the proposed LWC estimation method is sensitive to TB errors in both measurement and simulation; accurate LWC estimation for no-rainfall conditions relies more on the equipment and the radiation model.
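The core retrieval idea above, choosing the LWC whose simulated brightness temperature best matches the observation, can be sketched as a one-dimensional search over candidate values. This is a toy illustration only: any forward model (MonoRTM in the paper) would be plugged in as `simulate_tb`, and the linear model in the test below is a hypothetical stand-in.

```python
def estimate_lwc(tb_observed, simulate_tb, lwc_grid):
    """Grid search for the LWC candidate whose simulated brightness
    temperature (TB) is closest to the observed TB at a channel
    sensitive to liquid water."""
    return min(lwc_grid, key=lambda lwc: abs(simulate_tb(lwc) - tb_observed))
```

In practice the forward model is expensive, so a real retrieval would use a proper minimizer rather than a dense grid, but the objective (simulated-minus-observed TB) is the same.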
Abstract: The Pantone Color of the Year 2026, PANTONE 11-4201 Cloud Dancer, has been introduced as a soft, lofty white symbolizing calm and clarity in an increasingly noisy world. This gentle shade invites a sense of peace and spaciousness, encouraging focus and creating room for creativity and reflection. Cloud Dancer embodies a desire for simplicity and renewal: a blank canvas that allows our minds to wander and new ideas to take shape. Its expansive presence fosters environments where tranquility meets inspiration, offering visual calm that supports well-being and mental lightness.
Abstract: The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-Based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance in exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. A formal security analysis proves that the FCS-ABMSE scheme achieves both indistinguishability of ciphertexts under chosen-keyword attacks (IND-CKA) and indistinguishability of ciphertexts under chosen-plaintext attacks (IND-CPA). In addition, we construct an enhanced variant based on type-3 pairings. Results demonstrate that the proposed scheme outperforms existing ABSE approaches in functionality, computational cost, and communication cost.
Funding: Funded through project NUBOLOSYTI (PID2023149972NB-100) of the Spanish Ministry of Science and Innovation (MICINN), and supported by an IFUdG 2022 fellowship.
Abstract: The interactions between clouds and aerosols represent one of the largest uncertainties in assessing the Earth's radiation budget, highlighting the importance of research on the transition zone (TZ) within the cloud-aerosol continuum. This study assesses the global distribution of TZ conditions, analyzes their optical characteristics, and determines the cloud or aerosol types most commonly associated with them, using the cloud-aerosol discrimination (CAD) score of the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument on the CALIPSO satellite. The CAD score classifies clouds and aerosols by the probability density functions of attenuated backscatter, total color ratio, volume depolarization ratio, altitude, and latitude. After applying several filters to avoid artifacts, the TZ was identified as those atmospheric layers that cannot be clearly classified as clouds or aerosols: layers within the no-confidence range (NCR) of the CAD score, and cirrus fringes. The optical characteristics of NCR layers exhibit two main clusters: Cluster 1, with properties between high-altitude ice clouds and aerosols (e.g., wispy cloud fragments), and Cluster 2, with properties between water clouds and aerosols at lower altitudes (e.g., large hydrated aerosols). Our results highlight the significant ubiquity of TZ conditions, which appear in 9.5% of all profiles and comprise 6.4% of the detected layers. Cluster 1 and cirrus-fringe layers predominate near the ITCZ and in mid-latitudes, whereas Cluster 2 layers are more frequent over the oceans along the central West African and East Asian coasts, where elevated smoke and dusty marine aerosols are common.
Funding: Supported by the National Key R&D Program of China (grant number 2023YFC3008004).
Abstract: This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on predictions of the track and intensity of Typhoon Kompasu (2021) is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that the two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC's warm-core structure.
Funding: Funded by the National Council of Science, Technology, and Technological Innovation (CONCYTEC) and the National Program of Scientific Research and Advanced Studies (PROCIENCIA) under the E041-2022 "Applied Research Projects" competition. Contract number: PE501078609-2022-PROCIENCIA.
Abstract: The continuous improvement of solar thermal technologies is essential to meet the growing demand for sustainable heat generation and to support global decarbonization efforts. This study presents the design, implementation, and validation of a real-time monitoring framework based on the Internet of Things (IoT) and cloud computing to enhance the thermal performance of evacuated tube solar water heaters (ETSWHs). A commercial system and a custom-built prototype were instrumented with Industry 4.0 technologies, including platinum resistance temperature detectors (PT100), solar irradiance and wind speed sensors, a programmable logic controller (PLC), a SCADA interface, and a cloud-connected IoT gateway. Data were processed locally and transmitted to cloud storage for continuous analysis and visualization via a mobile application. Experimental results demonstrated the prototype's superior thermal energy storage capacity (47.4 vs. 36.2 MJ for the commercial system, a 31% improvement), achieved through the novel integration of an Industry 4.0 architecture with an optimized collector design. This improvement is attributed to optimized geometric design parameters, including a reduced tilt angle, increased inter-tube spacing, and the incorporation of an aluminum reflective surface. These modifications collectively enhanced solar heat absorption and reduced optical losses. The framework effectively identified thermal stratification, monitored environmental effects on heat transfer, and enabled real-time system diagnostics. By integrating automation, IoT, and cloud computing, the proposed architecture establishes a scalable and replicable model for the intelligent management of solar thermal systems, facilitating predictive maintenance and future integration with artificial intelligence for performance forecasting. This work provides a practical, data-driven approach to digitizing and optimizing heat transfer systems, promoting more efficient and sustainable solar thermal energy applications.
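Stored-energy figures like those above come down to sensible heat, Q = m·c·ΔT. A minimal sketch follows; the tank volume and temperatures in the usage are hypothetical, and the paper's 47.4/36.2 MJ values are not reproduced here.

```python
def thermal_energy_mj(volume_l, t_final_c, t_initial_c,
                      specific_heat=4186.0, density_kg_per_l=1.0):
    """Sensible heat stored in a water tank, Q = m * c * dT,
    returned in megajoules.

    specific_heat: J/(kg*K) for water; density: ~1 kg per liter.
    """
    mass_kg = volume_l * density_kg_per_l
    q_joules = mass_kg * specific_heat * (t_final_c - t_initial_c)
    return q_joules / 1e6
```

For example, a hypothetical 200 L tank heated from 20 °C to 70 °C stores 200 × 4186 × 50 J ≈ 41.9 MJ, the same order as the figures reported above.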
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52304139 and 52325403) and CCTEG Coal Mining Research Institute funding (Grant No. KCYJY-2024-MS-10).
Abstract: 3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method that integrates an improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces the data volume while preserving the topology of multi-scale targets. Segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, reaching 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve the Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretrained segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. Combined with a previously developed machine learning model that links rock bolt load to the deformation field of its bearing plate, the proposed method provides a robust technique for simultaneously measuring the loads of multiple rock bolts in a single laser scan.
Abstract: Human Resource (HR) operations increasingly rely on cloud-based platforms that provide hiring, payroll, employee management, and compliance services. These systems, typically built on multi-tenant microservice architectures, offer scalability and efficiency but also expand the attack surface for adversaries. Ransomware has emerged as a leading threat in this domain, capable of halting workflows and exposing sensitive employee records. Traditional defenses such as static hardening and signature-based detection often fail to address the dynamic requirements of HR Software as a Service (SaaS), where continuous availability and privacy compliance are critical; many prior defenses for cloud or IoT environments likewise do not meet HR SaaS needs such as uninterrupted sessions, privacy compliance, and live service continuity. This paper presents a Moving Target Defense (MTD) framework for HR SaaS that combines container mutation, IP hopping, and node reassignment to randomize the attack surface without pausing services. The framework runs on Kubernetes and uses a KL-divergence-based anomaly detector that monitors HR access logs across five modules (onboarding, employee records, leave, payroll, and exit). In simulations with realistic HR traffic, the approach reaches 96.9% average detection accuracy with AUC 0.94-0.98, cuts mean time to containment to 91.4 s, and lowers the ransomware encryption rate to 13.2%. Measured overheads for CPU, memory, and per-mutation latency remain modest. Compared with prior MTD and non-MTD baselines, the design provides stronger containment without service interruption and aligns with zero-trust and compliance goals. Its modular implementation and control-plane orchestration support stepwise, enterprise-scale deployment in HR SaaS environments.
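A KL-divergence drift check of the kind the detector above relies on can be sketched as follows. The module categories, smoothing constant, and threshold are illustrative assumptions, not the paper's tuned values.

```python
import math

def kl_divergence(p, q, eps=1e-9):
    """KL(P || Q) between two discrete distributions, with additive
    smoothing so that zero-probability bins do not produce infinities."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def is_anomalous(baseline_counts, window_counts, threshold=0.5):
    """Flag a log window whose per-module access distribution (e.g.
    onboarding/records/leave/payroll/exit) drifts too far from the
    learned baseline distribution."""
    def normalize(counts):
        total = sum(counts)
        return [c / total for c in counts]
    drift = kl_divergence(normalize(window_counts), normalize(baseline_counts))
    return drift > threshold
```

Ransomware that suddenly hammers one module (e.g. bulk reads of employee records) skews the window distribution and drives the divergence well above a threshold calibrated on benign traffic.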
Funding: Supported by an Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01842, Artificial Intelligence Graduate School Program (GIST)); by a Korea Planning & Evaluation Institute of Industrial Technology (KEIT) grant funded by the Ministry of Trade, Industry & Energy (MOTIE, Republic of Korea) (RS-2025-25448249, Automotive Industry Technology Development (R&D) Program); and by the Regional Innovation System & Education (RISE) program through the Gwangju RISE Center, funded by the Ministry of Education (MOE) and the Gwangju Metropolitan City, Republic of Korea (2025-RISE-05-001).
Abstract: In real-world autonomous driving tests, unexpected events such as pedestrians or wild animals suddenly entering the driving path can occur. Conducting actual test drives under various weather conditions may also lead to dangerous situations. Furthermore, autonomous vehicles may operate abnormally in bad weather due to limitations of their sensors and GPS. Driving simulators, which replicate driving conditions nearly identical to those in the real world, can drastically reduce the time and cost required for market-entry validation and have consequently become widely used. In this paper, we design a virtual driving test environment capable of collecting and verifying SiLS data under adverse weather conditions using multi-source images. The proposed method generates a virtual testing environment that incorporates various events, including weather, time of day, and moving objects, that cannot be easily verified in real-world autonomous driving tests. By setting up scenario-based virtual environment events, multi-source image analysis and verification using real-world DCUs (Data Concentrator Units) with a V2X-Car edge cloud can effectively address risk factors that may arise in real-world situations. We tested and validated the proposed method with scenarios employing V2X communication and multi-source image analysis.