A statistically-based low-level cloud parameterization scheme is introduced, modified, and applied in the Flexible coupled General Circulation Model (FGCM-O). It is found that the low-level cloud scheme improves the simulation of low-level cloud fractions and net surface shortwave radiation fluxes in the subtropical eastern oceans off western coasts. Accompanying the improvement in the net surface shortwave radiation fluxes, the simulated distribution of SSTs is more realistically asymmetrical about the equator in the tropical eastern Pacific, which suppresses, to some extent, the development of the double ITCZ in the model. Warm SST biases in the ITCZ north of the equator are also reduced. However, the equatorial cold tongue is strengthened and extends further westward, which reduces the precipitation rate in the western equatorial Pacific but increases it in the ITCZ north of the equator in the far eastern Pacific. It is demonstrated that the low-level cloud-radiation feedback enhances the cooperative feedback between the equatorial cold tongue and the ITCZ. Surface layer heat budget analyses show that the reduction of SSTs in the eastern equatorial Pacific is attributed both to the thermodynamic cooling associated with the increased cloud fractions and to the oceanic dynamical cooling associated with the strengthened surface wind, whereas in the central and western equatorial Pacific it is mainly attributed to the oceanic dynamical cooling associated with the strengthened surface wind.
Like many other coupled models, the Flexible coupled General Circulation Model (FGCM-0) suffers from a spurious “Double ITCZ”. In order to understand the “Double ITCZ” in FGCM-0, this study first examines the low-level cloud cover and the bulk stability of the lower troposphere over the eastern subtropical Pacific simulated by the National Center for Atmospheric Research (NCAR) Community Climate Model version 3 (CCM3), the atmospheric component of FGCM-0. It is found that the bulk stability of the lower troposphere simulated by CCM3 is very consistent with that derived from the National Centers for Environmental Prediction (NCEP) reanalysis, but the simulated low-level cloud cover is much less than that derived from the International Satellite Cloud Climatology Project (ISCCP) D2 data. Based on regression equations between the low-level cloud cover from the ISCCP data and the bulk stability of the lower troposphere from the NCEP reanalysis, the low-level cloud parameterization scheme in CCM3 is modified and used in sensitivity experiments to examine the impact of low-level cloud over the eastern subtropical Pacific on the spurious “Double ITCZ” in FGCM-0. Results show that the modified scheme locally improves the simulated low-level cloud cover over the cold oceans. Increasing the low-level cloud cover off Peru not only significantly alleviates the warm SST biases in the southeastern tropical Pacific, but also causes the equatorial cold tongue to be strengthened and to extend further west. Increasing the low-level cloud fraction off California effectively reduces the warm SST biases in the ITCZ north of the equator.
In order to examine the feedback between the SST and low-level cloud cover off Peru, one additional sensitivity experiment is performed in which the SST over the cold ocean off Peru is restored. It shows that decreasing the SST has impacts over the wide region from the southeastern tropical Pacific northwestward to the western/central equatorial Pacific similar to those of increasing the low-level cloud cover.
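The regression approach described above ties low-level cloud fraction to the bulk stability of the lower troposphere. A minimal sketch of such a parameterization follows; the coefficients here are illustrative placeholders, not the values fitted in the study from the ISCCP and NCEP data:

```python
def low_cloud_fraction(theta_700, theta_sfc, a=0.057, b=-0.35):
    """Estimate low-level cloud fraction from lower-tropospheric stability.

    Stability is the potential-temperature difference (K) between 700 hPa
    and the surface. The slope a and intercept b are illustrative
    regression coefficients, not the study's fitted values.
    """
    stability = theta_700 - theta_sfc
    frac = a * stability + b
    return min(max(frac, 0.0), 1.0)  # clamp to a physical fraction [0, 1]
```

A more stable lower troposphere (larger inversion) yields a larger cloud fraction, which is the qualitative behavior the modified scheme relies on over the cold subtropical oceans.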
Cloud computing has created a paradigm shift that affects the way in which business applications are developed. Many business organizations use cloud infrastructures as platforms on which to deploy business applications. Increasing numbers of vendors are supplying the cloud marketplace with a wide range of cloud products, and different vendors offer cloud products in different formats. The cost structures for consuming cloud products can be complex, so finding a suitable set of cloud products that meets an application's requirements and budget can be a challenging task. In this paper, an ontology-based resource mapping mechanism is proposed. Domain-specific ontologies are used to specify high-level application requirements, which are translated into high-level infrastructure ontologies and then mapped onto low-level descriptions of cloud resources. Cost ontologies are also proposed for cloud resources. An exemplar media transcoding and delivery service is studied to illustrate how high-level requirements can be modeled and mapped onto cloud resources within a budget constraint. The proposed ontologies provide an application-centric mechanism for specifying cloud requirements, which can then be used to search for suitable resources in a multi-provider cloud environment.
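The budget-constrained matching step described above can be illustrated, stripped of the ontology machinery, as a search over candidate products that satisfy the application's requirements. The catalog entries, fields, and prices below are hypothetical:

```python
def cheapest_match(candidates, min_cpu, min_ram_gb, budget):
    """Return the cheapest product meeting the requirements and budget,
    or None if no candidate qualifies."""
    feasible = [p for p in candidates
                if p["cpu"] >= min_cpu
                and p["ram_gb"] >= min_ram_gb
                and p["monthly_cost"] <= budget]
    return min(feasible, key=lambda p: p["monthly_cost"], default=None)

# Hypothetical offerings from different vendors, normalized to one format.
catalog = [
    {"name": "vm-small",  "cpu": 2, "ram_gb": 4,  "monthly_cost": 30},
    {"name": "vm-medium", "cpu": 4, "ram_gb": 8,  "monthly_cost": 70},
    {"name": "vm-large",  "cpu": 8, "ram_gb": 32, "monthly_cost": 200},
]
```

In the paper's setting, the requirement fields would come from the domain-specific ontology and the normalized catalog from the low-level resource descriptions; this sketch shows only the final selection under a budget constraint.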
3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method integrating an improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces data volume while preserving multi-scale target topology. Segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, resulting in 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve the Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretrained segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. This method, combined with a previously developed machine learning model that links rock bolt load to the deformation field of its bearing plate, provides a robust technique for simultaneously measuring the loads of multiple rock bolts in a single laser scan.
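The evaluation metrics reported above, overall accuracy and per-class IoU, can be computed directly from predicted and ground-truth point labels. A minimal sketch (label encoding is assumed, not taken from the paper):

```python
def accuracy(pred, truth):
    """Fraction of points whose predicted label matches the ground truth."""
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

def iou_per_class(pred, truth, cls):
    """Intersection over Union for one class label over all points."""
    inter = sum(1 for p, t in zip(pred, truth) if p == cls and t == cls)
    union = sum(1 for p, t in zip(pred, truth) if p == cls or t == cls)
    return inter / union if union else 1.0  # absent class: perfect by convention
```

IoU penalizes both missed and spurious points of a class, which is why the bearing-plate IoU gains (58% and 20%) are a stricter signal than the accuracy gains.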
With the rapid expansion of the Internet of Things (IoT), user data has experienced exponential growth, leading to increasing concerns about the security and integrity of data stored in the cloud. Traditional schemes relying on untrusted third-party auditors suffer from both security and efficiency issues, while existing decentralized blockchain-based auditing solutions still face shortcomings in correctness and security. This paper proposes an improved blockchain-based cloud auditing scheme with the following core contributions: (1) identifying critical logical contradictions in the original scheme, thereby establishing the foundation for the correctness of cloud auditing; (2) designing an enhanced mechanism that integrates multiple hashing with dynamic aggregate signatures, binding encrypted blocks through bilinear pairings and BLS signatures, and setting parameters based on the Computational Diffie-Hellman (CDH) problem, significantly strengthening data integrity protection and anti-forgery capabilities; and (3) introducing a random challenge mechanism and a dynamic parameter adjustment strategy, effectively resisting attacks such as forgery, tampering, and deletion, significantly improving the probability of detecting malicious Cloud Service Providers (CSPs), and reducing the proof generation overhead for CSPs while keeping the computational cost for Data Owners unchanged. Theoretical analysis and performance evaluation experiments demonstrate that the proposed scheme achieves significant improvements in both security and efficiency. Finally, the paper explores potential applications of the enhanced scheme in fields such as healthcare, drone swarms, and government office attendance systems, providing an effective approach for building secure, efficient, and decentralized cloud auditing systems.
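The random-challenge idea underlying such auditing schemes can be sketched without the pairing-based cryptography: the owner keeps a tag per block, the auditor challenges random block indices, and the provider's stored data must remain consistent with those tags. This toy version uses plain hashes and omits the BLS signatures and homomorphic aggregation that make the real scheme efficient and publicly verifiable:

```python
import hashlib
import random

def block_tags(blocks):
    """Owner-side: compute one hash tag per data block before upload."""
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def challenge(num_blocks, sample_size, seed=None):
    """Auditor-side: pick random block indices to challenge."""
    rng = random.Random(seed)
    return rng.sample(range(num_blocks), sample_size)

def verify(stored_blocks, tags, indices):
    """Check that the provider's blocks still match the owner's tags."""
    return all(hashlib.sha256(stored_blocks[i]).hexdigest() == tags[i]
               for i in indices)
```

Sampling random indices is what drives the detection probability against a malicious CSP: tampering with a fraction of blocks is caught with probability growing quickly in the challenge size.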
Clouds play an important role in global atmospheric energy and water vapor budgets, yet low-cloud simulations suffer from large biases in many atmospheric general circulation models. In this study, cloud microphysical processes such as raindrop evaporation and cloud water accretion in a double-moment six-class cloud microphysics scheme were revised to improve the simulation of low clouds in the Global-Regional Integrated Forecast System (GRIST) model. Validation of the revised scheme using a single-column version of GRIST demonstrated a reasonable reduction in liquid water biases. The revised parameterization simulated medium- and low-level cloud fractions in better agreement with observations than the original scheme. Long-term global simulations indicate mitigation of the originally overestimated low-level cloud fraction and cloud-water mixing ratio in mid- to high-latitude regions, primarily owing to enhanced accretion and weakened raindrop evaporation. The reduced low clouds in the revised scheme show better consistency with satellite observations, particularly at mid- and high latitudes. Further improvements are seen in the simulated cloud shortwave radiative forcing and the vertical distribution of total cloud cover. Annual precipitation in mid-latitude regions is also improved, particularly over the oceans, with significantly increased large-scale and decreased convective precipitation.
In recent years, fog computing has become an important environment for dealing with the Internet of Things (IoT). Fog computing was developed to handle large-scale big data by scheduling tasks in cooperation with cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes. With large volumes of data and user requests, achieving the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy across fog nodes when users execute tasks along least-cost paths. Task scheduling is developed using a modified Artificial Ecosystem Optimization (AEO) combined with operators from the Salp Swarm Algorithm (SSA), in order to competitively strengthen the exploitation phase of the optimal search process. The proposed strategy, the Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), seeks the most suitable solution to a multi-objective task scheduling problem that combines cost and energy. A knapsack formulation is also incorporated to improve both cost and energy in the iFogSim implementation. The proposed strategy was compared with other strategies in terms of time, cost, energy, and productivity. Experimental results showed that the proposed strategy improved energy consumption, cost, and time over other algorithms. Simulation results demonstrate that the proposed algorithm reduces the average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
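A combined cost-and-energy objective of the kind such metaheuristics minimize can be sketched as a weighted sum over a candidate task-to-node assignment. The weights and the linear cost/energy models below are illustrative, not taken from the paper:

```python
def fitness(assignment, task_len, node_cost, node_power, node_speed,
            w_cost=0.5, w_energy=0.5):
    """Lower is better: weighted cost + energy of a task-to-node assignment.

    assignment[i] is the node executing task i. Execution time is task
    length / node speed; cost and energy scale linearly with that time
    (an illustrative model, not the paper's).
    """
    total_cost = total_energy = 0.0
    for task, node in enumerate(assignment):
        t = task_len[task] / node_speed[node]
        total_cost += t * node_cost[node]
        total_energy += t * node_power[node]
    return w_cost * total_cost + w_energy * total_energy
```

A metaheuristic such as AEO explores the space of assignments and keeps the one with the lowest fitness; the SSA operators in the hybrid sharpen that search near good solutions.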
Task scheduling in cloud computing is a multi-objective optimization problem, often involving conflicting objectives such as minimizing execution time, reducing operational cost, and maximizing resource utilization. However, traditional approaches frequently rely on single-objective optimization methods, which are insufficient for capturing the complexity of such problems. To address this limitation, we introduce MDMOSA (Multi-objective Dwarf Mongoose Optimization with Simulated Annealing), a hybrid algorithm for efficient multi-objective task scheduling in Infrastructure-as-a-Service (IaaS) cloud environments. MDMOSA harmonizes the exploration capabilities of the biologically inspired Dwarf Mongoose Optimization (DMO) with the exploitation strengths of Simulated Annealing (SA), achieving a balanced search process. The algorithm optimizes task allocation by reducing makespan and financial cost while improving resource utilization. We evaluate MDMOSA through extensive simulations using the real-world Google Cloud Jobs (GoCJ) dataset within the CloudSim environment. Comparative analysis against benchmark algorithms such as SMOACO, MOTSGWO, and MFPAGWO reveals that MDMOSA consistently achieves superior performance in terms of scheduling efficiency, cost-effectiveness, and scalability. These results confirm the potential of MDMOSA as a robust and adaptable solution for resource scheduling in dynamic and heterogeneous cloud computing infrastructures.
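Two building blocks of the approach above can be sketched concretely: the makespan objective (finish time of the busiest VM) and the SA acceptance rule that the hybrid borrows for exploitation. The schedule encoding and temperature handling here are illustrative:

```python
import math
import random

def makespan(assignment, task_len, vm_speed):
    """Finish time of the busiest VM under a task-to-VM assignment;
    assignment[i] is the VM executing task i."""
    load = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        load[vm] += task_len[task] / vm_speed[vm]
    return max(load)

def accept(delta, temperature, rng=random):
    """Simulated-annealing rule: always accept improvements (delta <= 0),
    accept worse moves with probability exp(-delta / temperature)."""
    return delta <= 0 or rng.random() < math.exp(-delta / temperature)
```

Accepting occasional worse schedules at high temperature is what lets the hybrid escape local minima that a greedy makespan search would get stuck in.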
Evaluating rock mass quality using three-dimensional (3D) point clouds is crucial for discontinuity extraction and is widely applied in various industrial sectors. However, the utilization of this method in geological surveys remains limited. Notable limitations of current research include the scarcity of validation of discontinuity extraction methods on simple geometric shapes, and the lack of studies targeting both planar and linear discontinuities. To address these gaps, this study proposes a workflow for identifying discontinuity planes and traces in rock outcrops from photogrammetric 3D models, employing the Compass and Facets plugins in the open-source CloudCompare software. Prior to field application, the efficacy of the extraction methods was evaluated using experimental datasets of a cube and an isosceles triangular prism generated under laboratory-controlled conditions. This validation demonstrated exceptional accuracy, with the dip and dip direction (DDD) of extracted structures consistently within ±2° of the actual values. Following this rigorous laboratory validation, the methodology was applied to a more complex natural rock outcrop (Miocene-Pliocene deposits in Japan), demonstrating its applicability to identifying structures in realistic geological settings. The results showed that the dip and dip direction trends of the extracted bedding planes and faults were consistent with field measurements, achieving a time reduction of approximately 40% compared to traditional methods. In conclusion, through strictly controlled initial verification and subsequent successful application to a complex natural setting, this study confirms that the proposed workflow can effectively and efficiently extract discontinuous geological structures from point clouds.
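The dip and dip direction (DDD) values validated above follow directly from the unit normal of a plane fitted to the point cloud. A minimal sketch, assuming the common convention x = east, y = north, z = up (the paper's tooling, CloudCompare, handles this internally):

```python
import math

def dip_and_direction(nx, ny, nz):
    """Dip (0-90 deg) and dip direction (0-360 deg, clockwise from north)
    of a plane with normal (nx, ny, nz), where x=east, y=north, z=up."""
    if nz < 0:  # work with the upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    dip = math.degrees(math.acos(nz / norm))
    # The horizontal projection of the upward normal points in the
    # direction of steepest descent, i.e. the dip direction.
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360
    return dip, dip_dir
```

For example, a plane dipping 45° toward the east has upward normal (1, 0, 1)/√2, giving dip 45° and dip direction 090°, which is the kind of value the workflow compares against compass-clinometer field measurements.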
Recently, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still affect the model's ability to generalize. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) during the client local training phase, we propose a Multi-Receptive Field Fusion Classification Model (MRFCM), which captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification; (2) in the server aggregation phase, we introduce a Federated Dynamic Aggregation Selection Strategy (FDASS), which employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate our framework on the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance point cloud classification in secure environments.
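The server-side averaging that FDASS selects among or skips is, at its core, the standard weighted parameter average (FedAvg-style), with each client weighted by its dataset size. A minimal sketch over flat parameter vectors (the real framework operates on full model state dicts):

```python
def fedavg(client_params, client_sizes):
    """Weighted average of client parameter vectors by dataset size.

    client_params: list of equal-length parameter lists, one per client.
    client_sizes: number of local training samples per client.
    """
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
            for i in range(dim)]
```

FDASS's contribution is deciding, per round, whether to apply this average, skip it, or reallocate local models, precisely because plain averaging can wash out useful local diversity under non-uniform client data.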
The Pantone Color of the Year 2026, PANTONE 11-4201 Cloud Dancer, has been introduced as a soft, lofty white symbolizing calm and clarity in an increasingly noisy world. This gentle shade invites a sense of peace and spaciousness, encouraging focus and creating room for creativity and reflection. Cloud Dancer embodies a desire for simplicity and renewal, a blank canvas that allows our minds to wander and new ideas to take shape. Its expansive presence fosters environments where tranquility meets inspiration, offering visual calm that supports wellbeing and mental lightness.
The cloud liquid water content (LWC) of the Tibetan Plateau (TP) is crucial for cloud water conversion, yet accurate observations of the LWC on the TP are very scarce, making estimates of LWC and precipitation there inaccurate. This paper introduces an indirect estimation scheme for the LWC profile obtained using a monochromatic radiative transfer model (MonoRTM) and microwave radiometers (MWRs) on the TP. The LWC estimation method was improved by optimizing the difference between the simulated and observed brightness temperature (TB) at specific microwave channels sensitive to liquid water. The accuracy of the LWC estimation method depends heavily on the value of the cloud-base environment humidity criterion (CBEHC). Our experiments confirmed that the default CBEHC value of 95% is unsuitable for the TP. For rainfall scenarios, the optimization suggested CBEHC values of 81%, 76%, and 83% for the Mangya, Nagqu, and Qamdo stations, respectively. The new CBEHC values produced a 30 K improvement in the TB simulation compared with the 95% CBEHC under rainfall conditions, demonstrating the robustness of the LWC estimation scheme and its significant improvement of LWC estimation on the TP. For no-rainfall scenarios, the original Karstens model remained suitable for Nagqu station; adjusting the CBEHC to 94% for Mangya station resulted in a 1 K improvement of its TB simulation, and Qamdo station saw a 2.5 K improvement when the CBEHC was adjusted to 98%. The relationship between the simulated TB error and the maximum relative humidity of the radiosonde profiles weakened after CBEHC optimization. Thus, the method proposed in this article provides a practical estimation of LWC in the TP region. It has higher potential for rainfall days than for no-rainfall days: under no-rainfall conditions, its accuracy is sensitive to TB errors in both measurement and simulation, so accurate LWC estimation then relies more on the equipment and the radiation model.
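The retrieval described above amounts to choosing the LWC profile whose simulated brightness temperature best matches the observation at the liquid-water-sensitive channels. A toy sketch with a stand-in linear forward model (the actual scheme runs MonoRTM, and the numbers below are purely illustrative):

```python
def retrieve_scale(tb_obs, tb_clear, sensitivity, candidates):
    """Pick the LWC scaling factor whose simulated TB best matches the
    observed TB.

    Toy forward model: TB = clear-sky TB + sensitivity * scale. In the
    real scheme the forward model is MonoRTM and the candidates are
    full LWC profiles constrained by the CBEHC cloud-base criterion.
    """
    def tb_sim(scale):
        return tb_clear + sensitivity * scale
    return min(candidates, key=lambda s: abs(tb_sim(s) - tb_obs))
```

The CBEHC enters upstream of this step: it decides where cloud base (and hence nonzero LWC) is allowed, which is why tuning it per station changes the simulated TB by tens of kelvin under rainfall conditions.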
The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-Based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance, supporting only exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. Formal security analysis proves that the FCS-ABMSE scheme achieves both indistinguishability of ciphertext under chosen keyword attacks (IND-CKA) and indistinguishability of ciphertext under chosen plaintext attacks (IND-CPA). In addition, we construct an enhanced variant based on type-3 pairings. Results demonstrate that the proposed scheme outperforms existing ABSE approaches in terms of functionality, computational cost, and communication cost.
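The error tolerance that fuzzy search provides can be illustrated in the plaintext domain with an edit-distance match: a query keyword matches a stored keyword if their Levenshtein distance is within a small threshold. The actual scheme performs the equivalent comparison over encrypted trapdoors, never over plaintext keywords:

```python
def edit_distance(a, b):
    """Levenshtein distance via row-by-row dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def fuzzy_match(query, keywords, tolerance=1):
    """Keywords within the edit-distance tolerance of the query."""
    return [k for k in keywords if edit_distance(query, k) <= tolerance]
```

With tolerance 1, a misspelled query such as "clould" still retrieves documents indexed under "cloud", which is the user-facing behavior the encrypted construction preserves.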
The interactions between clouds and aerosols represent one of the largest uncertainties in assessing the Earth's radiation budget, highlighting the importance of research on the transition zone (TZ) within the cloud-aerosol continuum. This study assesses the global distribution of TZ conditions, analyzes their optical characteristics, and determines the cloud or aerosol types most commonly associated with them, using the cloud-aerosol discrimination (CAD) score of the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument on the CALIPSO satellite. The CAD score classifies clouds and aerosols by probability density functions of attenuated backscatter, total color ratio, volume depolarization ratio, altitude, and latitude. After applying several filters to avoid artifacts, the TZ was identified as those atmospheric layers that cannot be clearly classified as clouds or aerosols: layers within the no-confidence range (NCR) of the CAD score, and cirrus fringes. The optical characteristics of NCR layers exhibit two main clusters: Cluster 1, with properties between high-altitude ice clouds and aerosols (e.g., wispy cloud fragments), and Cluster 2, with properties between water clouds and aerosols at lower altitudes (e.g., large hydrated aerosols). Our results highlight the ubiquity of TZ conditions, which appear in 9.5% of all profiles and comprise 6.4% of the detected layers. Cluster 1 and cirrus-fringe layers predominate near the ITCZ and in mid-latitudes, whereas Cluster 2 layers are more frequent over the oceans along the central West African and East Asian coasts, where elevated smoke and dusty marine aerosols are common.
The virtual preassembly of super-high steel bridge towers faces a challenge in the efficient and precise extraction of complex cross-sectional features. Factors such as fabrication errors, gravity-induced deformations, and temperature fluctuations can compromise the accuracy of contour extraction. To address these limitations, an improved Alpha-shape-based point cloud contour extraction method is proposed. The approach uses a hierarchical strategy to process three-dimensional laser scanning point clouds. The processed data are then subjected to curvature-adaptive voxel filtering to reduce acquisition noise. In addition, an enhanced iterative closest point (ICP) variant with correspondence validation accurately aligns the discrete point cloud segments. The curvature-responsive Alpha-shape framework enables multiscale contour delineation through topology-adaptive threshold modulation, which resolves boundary ambiguities in geometrically complex cross-sections. The method was experimentally validated using field-acquired measurement datasets from tower segments of the Zhangjinggao Yangtze River Bridge, confirming its capability to reconstruct noncanonical cross-sectional geometries. Three contour extraction methods, namely Poisson reconstruction, the conventional Alpha-shape algorithm, and random sample consensus with ICP (RANSAC-ICP), were compared to evaluate the performance of the proposed algorithm. The results demonstrate that the proposed method achieves superior contour extraction accuracy and data reduction efficiency, highlighting its effectiveness in contour extraction tasks.
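The criterion at the heart of the Alpha-shape family of methods is simple: a triangle of the Delaunay triangulation belongs to the shape only if its circumradius is below 1/alpha, so larger alpha carves out finer contour detail. A minimal 2D sketch of that test (the paper's contribution is making the threshold curvature-responsive, which this fixed-alpha version does not include):

```python
import math

def circumradius(p, q, r):
    """Circumradius of triangle pqr; large for thin or spread-out triangles."""
    a = math.dist(q, r)
    b = math.dist(p, r)
    c = math.dist(p, q)
    s = (a + b + c) / 2
    area = math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))  # Heron
    if area == 0:
        return float("inf")  # degenerate (collinear) triangle
    return a * b * c / (4 * area)

def keep_triangle(p, q, r, alpha):
    """Alpha-shape criterion: retain triangles with circumradius < 1/alpha."""
    return circumradius(p, q, r) < 1 / alpha
```

Modulating alpha by local curvature, as the proposed framework does, amounts to tightening this radius threshold where the cross-section bends sharply and relaxing it along straight plate edges.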
This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the track and intensity of Typhoon Kompasu (2021) is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that the two schemes affect the timing and magnitude of extreme TC intensity by influencing the evolution of the TC's warm-core structure.
Based on conventional observation data, daily NCAR/NCEP reanalysis data, and TBB data derived from FY-2G infrared cloud images in April 2018, a heavy snowfall event in central Inner Mongolia from April 4 to 6, 2018 was analyzed. The results show that the low trough at 500 hPa, the southerly jet stream at 700 hPa, and the inverted trough at the surface were the main systems causing this blizzard. The transport of warm, humid air by the southerly jet at 700 hPa and intense water vapor convergence provided sufficient water vapor for the blizzard, and the moist layer over the blizzard area was deep. The low-level moist potential vorticity (MPV) in the blizzard area was negative, indicating that the atmosphere was in a state of conditional symmetric instability. The coupling of the upper- and lower-level jets induced strong ascending motion. With the invasion of cold air, a low-level cold pad formed, forcing the warm, humid air to ascend over it. The secondary circulation updraft triggered by the wet Q vector released the conditional symmetric instability energy, intensifying the slantwise motion and producing the heavy snowfall. Meanwhile, the blizzard area corresponded well with the region of large low-level wet Q vector divergence. Mesoscale cloud clusters continuously generating, merging, and moving eastward over the Hetao area were the direct cause of this blizzard, with cloud-cluster TBB ≤ -56°C. The blizzard occurred in the edge gradient zone and large-value area of TBB.
Funding: This study was jointly supported by the National Science Foundation of China under Grant Nos. 40233031 and 40221503, and the National Key Basic Research Project under Grant No. G200078502.
Funding: This work was supported by the National Natural Science Foundation of China under Grant Nos. 40023001 and 40233031, the "Innovation Program" under Grant ZKCX2-SW-210, and the National Key Basic Research Project under Grant G200078502.
Abstract: Like many other coupled models, the Flexible coupled General Circulation Model (FGCM-0) suffers from a spurious "double ITCZ". In order to understand the double ITCZ in FGCM-0, this study first examines the low-level cloud cover and the bulk stability of the lower troposphere over the eastern subtropical Pacific simulated by the National Center for Atmospheric Research (NCAR) Community Climate Model version 3 (CCM3), the atmospheric component of FGCM-0. It is found that the bulk stability of the lower troposphere simulated by CCM3 is very consistent with that derived from the National Centers for Environmental Prediction (NCEP) reanalysis, but the simulated low-level cloud cover is much less than that derived from the International Satellite Cloud Climatology Project (ISCCP) D2 data. Based on regression equations between the low-level cloud cover from the ISCCP data and the bulk stability of the lower troposphere derived from the NCEP reanalysis, the low-level cloud parameterization scheme in CCM3 is modified and used in sensitivity experiments to examine the impact of low-level cloud over the eastern subtropical Pacific on the spurious double ITCZ in FGCM-0. Results show that the modified scheme improves the simulated low-level cloud cover locally over the cold oceans. Increasing the low-level cloud cover off Peru not only significantly alleviates the warm SST biases in the southeastern tropical Pacific, but also causes the equatorial cold tongue to strengthen and extend farther west. Increasing the low-level cloud fraction off California effectively reduces the warm SST biases in the ITCZ north of the equator. To examine the feedback between the SST and low-level cloud cover off Peru, an additional sensitivity experiment is performed in which the SST over the cold ocean off Peru is restored.
The results show that decreasing the SST has impacts similar to those of increasing the low-level cloud cover over the wide region extending from the southeastern tropical Pacific northwestward to the western/central equatorial Pacific.
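The regression approach described above can be sketched as a simple diagnostic: low-level cloud fraction as a linear function of lower-tropospheric bulk stability, clipped to the physical range [0, 1]. The coefficients below are illustrative placeholders, not the fitted values from the ISCCP/NCEP regression equations.

```python
def low_cloud_fraction(stability_k, a=0.06, b=-0.5):
    """Diagnose low-level cloud fraction from bulk lower-tropospheric
    stability (e.g., theta_700 - theta_sfc, in K) via a linear regression.
    Coefficients a and b are illustrative, NOT the fitted ISCCP/NCEP values.
    The result is clipped to [0, 1] so it is a valid cloud fraction."""
    return min(1.0, max(0.0, a * stability_k + b))
```

With any positive slope, a more stable lower troposphere (as over the cold subtropical oceans off Peru and California) diagnoses more low cloud, which is the behavior the modified scheme introduces.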
Abstract: Cloud computing has created a paradigm shift that affects the way in which business applications are developed. Many business organizations use cloud infrastructures as platforms on which to deploy business applications. Increasing numbers of vendors are supplying the cloud marketplace with a wide range of cloud products, offered in different formats, and the cost structures for consuming them can be complex. Finding a suitable set of cloud products that meets an application's requirements and budget can therefore be a challenging task. In this paper, an ontology-based resource mapping mechanism is proposed. Domain-specific ontologies are used to specify an application's high-level requirements. These are then translated into high-level infrastructure ontologies, which can in turn be mapped onto low-level descriptions of cloud resources. Cost ontologies are proposed for cloud resources. An exemplar media transcoding and delivery service is studied to illustrate how high-level requirements can be modeled and mapped onto cloud resources within a budget constraint. The proposed ontologies provide an application-centric mechanism for specifying cloud requirements, which can then be used to search for suitable resources in a multi-provider cloud environment.
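The core of such a requirement-to-resource mapping, stripped of the ontology machinery, is matching requirement attributes against resource descriptions under a budget constraint. The offerings, attribute names, and prices below are entirely hypothetical, used only to illustrate the matching step:

```python
# Toy resource mapper: choose the cheapest offering that satisfies the
# requirement and fits the budget. Catalogue entries and prices are
# hypothetical, not drawn from any real vendor.
OFFERINGS = [
    {"name": "small-vm", "cpus": 2, "ram_gb": 4, "price": 0.05},
    {"name": "medium-vm", "cpus": 4, "ram_gb": 16, "price": 0.12},
    {"name": "gpu-vm", "cpus": 8, "ram_gb": 32, "price": 0.90},
]

def map_requirement(cpus, ram_gb, budget):
    """Return the cheapest offering meeting the requirement, or None."""
    candidates = [o for o in OFFERINGS
                  if o["cpus"] >= cpus and o["ram_gb"] >= ram_gb
                  and o["price"] <= budget]
    return min(candidates, key=lambda o: o["price"]) if candidates else None
```

In the paper's setting the requirement side would come from the domain-specific ontology and the offering side from the low-level cloud-resource descriptions; the selection rule stays the same.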
Fund: Supported by the National Natural Science Foundation of China (Grant Nos. 52304139 and 52325403) and the CCTEG Coal Mining Research Institute funding (Grant No. KCYJY-2024-MS-10).
Abstract: 3D laser scanning technology is widely used in underground openings for high-precision, rapid, and nondestructive structural evaluations. Segmenting large 3D point cloud datasets, particularly in coal mine roadways with multi-scale targets, remains challenging. This paper proposes an enhanced segmentation method integrating an improved PointNet++ with a coverage-voted strategy. The coverage-voted strategy reduces data volume while preserving multi-scale target topology. The segmentation is achieved using an enhanced PointNet++ algorithm with a normalization preprocessing head, resulting in 94% accuracy for common supporting components. Ablation experiments show that the preprocessing head and coverage strategies increase segmentation accuracy by 20% and 2%, respectively, and improve the Intersection over Union (IoU) for bearing plate segmentation by 58% and 20%. The accuracy of the current pretrained segmentation model may be affected by variations in surface support components, but it can be readily enhanced through re-optimization with additional labeled point cloud data. The proposed method, combined with a previously developed machine learning model that links rock bolt load to the deformation field of its bearing plate, provides a robust technique for simultaneously measuring the loads of multiple rock bolts in a single laser scan.
Fund: Funded by the National Natural Science Foundation of China (New Design and Analysis of Fully Homomorphic Signatures, Grant No. 62172436).
Abstract: With the rapid expansion of the Internet of Things (IoT), user data has experienced exponential growth, leading to increasing concerns about the security and integrity of data stored in the cloud. Traditional schemes relying on untrusted third-party auditors suffer from both security and efficiency issues, while existing decentralized blockchain-based auditing solutions still face shortcomings in correctness and security. This paper proposes an improved blockchain-based cloud auditing scheme with the following core contributions: identifying critical logical contradictions in the original scheme, thereby establishing the foundation for the correctness of cloud auditing; designing an enhanced mechanism that integrates multiple hashing with dynamic aggregate signatures, binding encrypted blocks through bilinear pairings and BLS signatures, and setting parameters based on the Computational Diffie-Hellman (CDH) problem, significantly strengthening data integrity protection and anti-forgery capabilities; and introducing a random challenge mechanism and dynamic parameter adjustment strategy, effectively resisting attacks such as forgery, tampering, and deletion, significantly improving the detection probability of malicious Cloud Service Providers (CSPs), and significantly reducing the proof generation overhead for CSPs while maintaining the same computational cost for Data Owners. Theoretical analysis and performance evaluation experiments demonstrate that the proposed scheme achieves significant improvements in both security and efficiency. Finally, the paper explores potential applications of the enhanced security scheme in fields such as healthcare, drone swarms, and government office attendance systems, providing an effective approach for building secure, efficient, and decentralized cloud auditing systems.
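The idea of binding many data blocks into one tamper-evident value through repeated hashing can be illustrated, in a deliberately simplified form, with a Merkle root; the bilinear-pairing and BLS-signature machinery of the actual scheme is omitted here.

```python
import hashlib

def merkle_root(blocks):
    """Bind a list of byte-string blocks into a single digest by repeated
    pairwise SHA-256 hashing. A simplified stand-in for the paper's
    multi-hash + aggregate-signature binding: changing any block
    changes the root."""
    level = [hashlib.sha256(b).digest() for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()
```

An auditor holding only the root can detect any substitution or deletion of a block, which is the integrity property the challenge-response protocol builds on.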
Fund: National Natural Science Foundation of China (42375153, 42105153, 42205157); Development of Science and Technology at the Chinese Academy of Meteorological Sciences (2023KJ038).
Abstract: Clouds play an important role in global atmospheric energy and water vapor budgets, and low-cloud simulations suffer from large biases in many atmospheric general circulation models. In this study, cloud microphysical processes such as raindrop evaporation and cloud water accretion in a double-moment six-class cloud microphysics scheme were revised to enhance the simulation of low clouds using the Global-Regional Integrated Forecast System (GRIST) model. Validation of the revised scheme using a single-column version of GRIST demonstrated a reasonable reduction in liquid water biases. The revised parameterization simulated medium- and low-level cloud fractions in better agreement with observations than the original scheme. Long-term global simulations indicate a mitigation of the originally overestimated low-level cloud fraction and cloud-water mixing ratio in mid- to high-latitude regions, primarily owing to enhanced accretion processes and weakened raindrop evaporation. The reduced low clouds with the revised scheme showed better consistency with satellite observations, particularly at mid- and high latitudes. Further improvements can be observed in the simulated cloud shortwave radiative forcing and the vertical distribution of total cloud cover. Annual precipitation in mid-latitude regions has also improved, particularly over the oceans, with significantly increased large-scale and decreased convective precipitation.
Fund: Supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (Grant No. IMSIU-DDRSP2503).
Abstract: In recent years, fog computing has become an important environment for dealing with the Internet of Things. Fog computing was developed to handle large-scale big data by scheduling tasks via cloud computing. Task scheduling is crucial for efficiently handling IoT user requests, thereby improving system performance, cost, and energy consumption across nodes in cloud computing. With large volumes of data and user requests, achieving the optimal solution to the task scheduling problem is challenging, particularly in terms of cost and energy efficiency. In this paper, we develop novel strategies to save energy consumption across nodes in fog computing when users execute tasks through the least-cost paths. Task scheduling is developed using a modified Artificial Ecosystem Optimization (AEO) combined with operators of the Salp Swarm Algorithm (SSA), in order to competitively optimize their capabilities during the exploitation phase of the optimal search process. The proposed strategy, the Enhancement Artificial Ecosystem Optimization Salp Swarm Algorithm (EAEOSSA), attempts to find the most suitable solution to the multi-objective task scheduling problem that jointly optimizes cost and energy. A knapsack-problem formulation is also added to improve both cost and energy in the iFogSim implementation. A comparison was made between the proposed strategy and other strategies in terms of time, cost, energy, and productivity. Experimental results showed that the proposed strategy improved energy consumption, cost, and time over other algorithms. Simulation results demonstrate that the proposed algorithm reduces the average cost, average energy consumption, and mean service time in most scenarios, with average reductions of up to 21.15% in cost and 25.8% in energy consumption.
Abstract: Task scheduling in cloud computing is a multi-objective optimization problem, often involving conflicting objectives such as minimizing execution time, reducing operational cost, and maximizing resource utilization. However, traditional approaches frequently rely on single-objective optimization methods, which are insufficient for capturing the complexity of such problems. To address this limitation, we introduce MDMOSA (Multi-objective Dwarf Mongoose Optimization with Simulated Annealing), a hybrid algorithm that integrates multi-objective optimization for efficient task scheduling in Infrastructure-as-a-Service (IaaS) cloud environments. MDMOSA harmonizes the exploration capabilities of the biologically inspired Dwarf Mongoose Optimization (DMO) with the exploitation strengths of Simulated Annealing (SA), achieving a balanced search process. The algorithm aims to optimize task allocation by reducing makespan and financial cost while improving system resource utilization. We evaluate MDMOSA through extensive simulations using the real-world Google Cloud Jobs (GoCJ) dataset within the CloudSim environment. Comparative analysis against benchmark algorithms such as SMOACO, MOTSGWO, and MFPAGWO reveals that MDMOSA consistently achieves superior performance in terms of scheduling efficiency, cost-effectiveness, and scalability. These results confirm the potential of MDMOSA as a robust and adaptable solution for resource scheduling in dynamic and heterogeneous cloud computing infrastructures.
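The simulated-annealing component that such hybrid schedulers use for exploitation can be sketched on a single objective (makespan). This is not MDMOSA itself, only the SA acceptance step it builds on; the task lengths and VM speeds below are made up.

```python
import math
import random

def sa_schedule(task_len, vm_speed, iters=5000, t0=10.0, seed=0):
    """Assign tasks to VMs, minimizing makespan with simulated annealing.
    A worse candidate is accepted with probability exp(-delta/T), and the
    temperature decays geometrically; the best assignment seen is kept."""
    rng = random.Random(seed)
    assign = [rng.randrange(len(vm_speed)) for _ in task_len]

    def makespan(a):
        load = [0.0] * len(vm_speed)
        for t, v in zip(task_len, a):
            load[v] += t / vm_speed[v]
        return max(load)

    cost = makespan(assign)
    best, best_cost = assign[:], cost
    temp = t0
    for _ in range(iters):
        cand = assign[:]
        cand[rng.randrange(len(cand))] = rng.randrange(len(vm_speed))  # move one task
        c = makespan(cand)
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            assign, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        temp *= 0.999
    return best, best_cost
```

In MDMOSA this acceptance rule refines candidates produced by the DMO exploration phase, and the scalar objective is replaced by the multi-objective cost/makespan/utilization trade-off.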
Abstract: Evaluating rock mass quality using three-dimensional (3D) point clouds is crucial for discontinuity extraction and is widely applied in various industrial sectors. However, the utilization of this method in geological surveys remains limited. Notable limitations of current research include the scarcity of validation of discontinuity extraction methods against simple geometric shapes, and the lack of studies that target both planar and linear discontinuities. To address these gaps, this study proposes a workflow for identifying discontinuity planes and traces in rock outcrops from photogrammetric 3D modeling, employing the Compass and Facets plugins in the open-source CloudCompare software. Prior to field application, the efficacy of the extraction methods was first evaluated using experimental datasets of a cube and an isosceles triangular prism generated under laboratory-controlled conditions. This validation demonstrated exceptional accuracy, with the dip and dip direction (DDD) of extracted structures consistently within ±2° of the actual values. Following this rigorous laboratory validation, the methodology was applied to a more complex natural rock outcrop (Miocene-Pliocene deposits in Japan), demonstrating its applicability for identifying structures in realistic geological settings. The results showed that the dip and dip direction trends of the extracted bedding planes and faults were consistent with field measurements, achieving a time reduction of approximately 40% compared to traditional methods. In conclusion, through strictly controlled initial verification and subsequent successful application to a complex natural setting, this study confirmed that the proposed workflow can effectively and efficiently extract discontinuous geological structures from point clouds.
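The dip/dip-direction (DDD) values compared above follow directly from the unit normal of each extracted plane. A minimal conversion, assuming the common convention x = east, y = north, z = up:

```python
import math

def normal_to_ddd(nx, ny, nz):
    """Convert a plane normal to (dip, dip direction) in degrees.
    Assumes x = east, y = north, z = up; the normal is flipped to point
    upward, so the plane dips toward the azimuth the normal leans to."""
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    if nz < 0:                              # use the upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.acos(nz))       # angle from horizontal
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360.0  # azimuth, N = 0
    return dip, dip_dir
```

A horizontal plane (normal straight up) gives dip 0°; a normal leaning due north gives a plane dipping toward azimuth 0°. This is the quantity the ±2° laboratory validation is checking.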
Fund: Supported in part by the National Key Research and Development Program of China under Grant 2021YFB3101100; in part by the National Natural Science Foundation of China under Grants 42461057, 62272123, and 42371470; in part by the Fundamental Research Program of Shanxi Province under Grant 202303021212164; and in part by the Postgraduate Education Innovation Program of Shanxi Province under Grant 2024KY474.
Abstract: Recently, large-scale deep learning models have been increasingly adopted for point cloud classification. However, these methods typically require collecting extensive datasets from multiple clients, which may lead to privacy leaks. Federated learning provides an effective solution to data leakage by eliminating the need for data transmission, relying instead on the exchange of model parameters. However, the uneven distribution of client data can still affect the model's ability to generalize effectively. To address these challenges, we propose a new framework for point cloud classification called the Federated Dynamic Aggregation Selection Strategy-based Multi-Receptive Field Fusion Classification Framework (FDASS-MRFCF). Specifically, we tackle these challenges with two key innovations: (1) during the client local training phase, we propose a Multi-Receptive Field Fusion Classification Model (MRFCM), which captures local and global structures in point cloud data through dynamic convolution and multi-scale feature fusion, enhancing the robustness of point cloud classification; and (2) in the server aggregation phase, we introduce a Federated Dynamic Aggregation Selection Strategy (FDASS), which employs a hybrid strategy to average client model parameters, skip aggregation, or reallocate local models to different clients, thereby balancing global consistency and local diversity. We evaluate our framework using the ModelNet40 and ShapeNetPart benchmarks, demonstrating its effectiveness. The proposed method is expected to significantly advance the field of point cloud classification in a secure environment.
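The server-side averaging step that FDASS modifies can be sketched in its plain FedAvg form, weighting each client's parameter vector by its sample count; the dynamic selection/skip/reallocation logic is the paper's contribution and is not reproduced here.

```python
def fed_avg(client_params, client_sizes):
    """Plain FedAvg: element-wise average of per-client parameter vectors,
    weighted by each client's number of training samples."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
            for i in range(dim)]
```

With non-IID client data, this uniform aggregation is exactly what can hurt generalization, which motivates replacing it with a strategy that sometimes skips aggregation or reallocates models.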
Abstract: The Pantone Color of the Year 2026, PANTONE 11-4201 Cloud Dancer, has been introduced as a soft, lofty white symbolizing calm and clarity in an increasingly noisy world. This gentle shade invites a sense of peace and spaciousness, encouraging focus and creating room for creativity and reflection. Cloud Dancer embodies a desire for simplicity and renewal, a blank canvas that allows our minds to wander and new ideas to take shape. Its expansive presence fosters environments where tranquility meets inspiration, offering visual calm that supports wellbeing and mental lightness.
Fund: Supported by the National Natural Science Foundation of China (Grant Nos. 41975009 and U2442213).
Abstract: The cloud liquid water content (LWC) of the Tibetan Plateau (TP) is crucial for cloud water conversion. There are very few accurate observations of the LWC on the TP, which makes estimates of the LWC and precipitation there inaccurate. This paper introduces an indirect estimation scheme for the LWC profile obtained using a monochromatic radiative transfer model (MonoRTM) and microwave radiometers (MWRs) on the TP. The LWC estimation method was improved by optimizing the difference between the simulated and observed brightness temperature (TB) at specific microwave channels that are sensitive to liquid water. The accuracy of the LWC estimation method depends heavily on the value of the cloud-base environment humidity criterion (CBEHC). Our experiment confirmed that the default CBEHC value of 95% is unsuitable for the TP. For rainfall scenarios, the optimization method suggested CBEHC values of 81%, 76%, and 83% for the Mangya, Nagqu, and Qamdo stations, respectively. The new CBEHC values produced a 30 K improvement in the TB simulation compared with the 95% CBEHC under rainfall conditions, demonstrating the robustness of the LWC estimation scheme and its significant improvement of LWC estimation on the TP. For no-rainfall scenarios, the original Karstens model remained suitable for Nagqu station; adjusting the CBEHC to 94% for Mangya station yielded a 1 K improvement in its TB simulation, and Qamdo station showed a 2.5 K improvement when the CBEHC was adjusted to 98%. The relationship between the TB simulation error and the maximum relative humidity of the radiosonde profiles weakened after CBEHC optimization. Thus, the method proposed in this article provides a practical way to estimate LWC in the TP region, with higher potential for rainfall days than for no-rainfall days. Under no-rainfall conditions, the accuracy of the proposed LWC estimation method is sensitive to TB errors in both measurement and simulation, so accurate LWC estimation in that case relies more on the equipment and the radiation model.
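The estimation step, choosing the LWC value whose simulated brightness temperature best matches the observation, amounts to a one-dimensional misfit minimization. The toy forward model below is a purely illustrative stand-in for MonoRTM, not its actual response:

```python
def toy_tb(lwc):
    """Hypothetical stand-in for a radiative-transfer forward model:
    brightness temperature (K) increasing monotonically with LWC."""
    return 50.0 + 120.0 * lwc / (1.0 + lwc)

def estimate_lwc(tb_obs, lo=0.0, hi=5.0, steps=2000):
    """Grid-search the LWC value minimizing |TB_sim(LWC) - TB_obs|."""
    best, best_err = lo, float("inf")
    for i in range(steps + 1):
        lwc = lo + (hi - lo) * i / steps
        err = abs(toy_tb(lwc) - tb_obs)
        if err < best_err:
            best, best_err = lwc, err
    return best
```

In the real scheme the forward model, the liquid-sensitive channels, and the CBEHC-dependent cloud boundaries all enter this misfit; the one-dimensional search here only illustrates why TB errors in measurement or simulation propagate directly into the retrieved LWC.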
Abstract: The Internet of Things (IoT) interconnects devices via network protocols to enable intelligent sensing and control. Resource-constrained IoT devices rely on cloud servers for data storage and processing. However, this cloud-assisted architecture faces two critical challenges: untrusted cloud services and the separation of data ownership from control. Although Attribute-Based Searchable Encryption (ABSE) provides fine-grained access control and keyword search over encrypted data, existing schemes lack error tolerance in exact multi-keyword matching. In this paper, we propose an attribute-based multi-keyword fuzzy searchable encryption scheme with forward ciphertext search (FCS-ABMSE) that avoids computationally expensive bilinear pairing operations on the IoT device side. The scheme supports multi-keyword fuzzy search without requiring explicit keyword fields, thereby significantly enhancing error tolerance in search operations. It further incorporates forward-secure ciphertext search to mitigate trapdoor abuse, as well as offline encryption and verifiable outsourced decryption to minimize user-side computational costs. Formal security analysis proved that the FCS-ABMSE scheme achieves both indistinguishability of ciphertext under chosen keyword attacks (IND-CKA) and indistinguishability of ciphertext under chosen plaintext attacks (IND-CPA). In addition, we constructed an enhanced variant based on type-3 pairings. Results demonstrated that the proposed scheme outperforms existing ABSE approaches in terms of functionality, computational cost, and communication cost.
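The error tolerance that fuzzy keyword search aims for can be illustrated, outside any cryptographic machinery, with character-bigram set similarity; in ABSE schemes the comparison happens over encrypted structures, which this sketch deliberately ignores.

```python
def bigrams(word):
    """Set of adjacent character pairs of a word."""
    return {word[i:i + 2] for i in range(len(word) - 1)}

def dice_similarity(a, b):
    """Dice coefficient of character-bigram sets: close to 1 for words
    differing by a small typo, close to 0 for unrelated words."""
    ba, bb = bigrams(a.lower()), bigrams(b.lower())
    if not ba and not bb:
        return 1.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))
```

A fuzzy scheme accepts a query keyword whose similarity to a stored keyword exceeds a threshold, so "clouds" still matches "cloud"; the cryptographic contribution is doing this matching without revealing either keyword.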
Fund: Funded through project NUBOLOSYTI (PID2023149972NB-100) of the Spanish Ministry of Science and Innovation (MICINN); supported by an IFUdG 2022 fellowship.
Abstract: The interactions between clouds and aerosols represent one of the largest uncertainties in assessing the Earth's radiation budget, highlighting the importance of research on the transition zone (TZ) within the cloud-aerosol continuum. This study assesses the global distribution of TZ conditions, analyzes their optical characteristics, and determines the cloud or aerosol types most commonly associated with them, using the cloud-aerosol discrimination (CAD) score of the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument on the CALIPSO satellite. The CAD score classifies clouds and aerosols by the probability density functions of attenuated backscatter, total color ratio, volume depolarization ratio, altitude, and latitude. After applying several filters to avoid artifacts, the TZ was identified as those atmospheric layers that cannot be clearly classified as clouds or aerosols: layers within the no-confidence range (NCR) of the CAD score, and cirrus fringes. The optical characteristics of NCR layers exhibit two main clusters: Cluster 1, with properties between high-altitude ice clouds and aerosols (e.g., wispy cloud fragments), and Cluster 2, with properties between water clouds and aerosols at lower altitudes (e.g., large hydrated aerosols). Our results highlight the significant ubiquity of TZ conditions, which appear in 9.5% of all profiles and comprise 6.4% of the detected layers. Cluster 1 and cirrus-fringe layers predominate near the ITCZ and in mid-latitudes, whereas Cluster 2 layers are more frequent over the oceans along the central West African and East Asian coasts, where elevated smoke and dusty marine aerosols are common.
Fund: The National Natural Science Foundation of China (No. 52338011); the Start-up Research Fund of Southeast University (No. RF1028624058); the Southeast University Interdisciplinary Research Program for Young Scholars; the National Key Research and Development Program of China (No. 2024YFC3014103).
Abstract: The virtual preassembly of super-high steel bridge towers faces a challenge in the efficient and precise extraction of complex cross-sectional features. Factors such as fabrication errors, gravity-induced deformations, and temperature fluctuations can compromise the accuracy of contour extraction. To address these limitations, an improved Alpha-shape-based point cloud contour extraction method is proposed. The approach uses a hierarchical strategy to process three-dimensional laser scanning point clouds. The processed data are then subjected to curvature-adaptive voxel filtering to reduce acquisition noise. In addition, an enhanced iterative closest point (ICP) variant with correspondence validation accurately aligns the discrete point cloud segments. The proposed curvature-responsive Alpha-shape framework enables multiscale contour delineation through topology-adaptive threshold modulation, which resolves boundary ambiguities in geometrically complex cross-sections. The method was experimentally validated using field-acquired measurement datasets from the Zhangjinggao Yangtze River Bridge tower segments, confirming its capability to reconstruct noncanonical cross-sectional geometries. Three contour extraction methods, namely Poisson reconstruction, the conventional Alpha-shape algorithm, and random sample consensus with ICP (RANSAC-ICP), were compared to evaluate the performance of the proposed Alpha-shape algorithm. The results demonstrate that the proposed method achieves superior contour extraction accuracy and data reduction efficiency, highlighting its effectiveness in contour extraction tasks.
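The voxel-filtering stage used for noise and data reduction can be sketched in its plain (non-curvature-adaptive) form: all points falling in the same voxel are replaced by their centroid. The paper's variant modulates the voxel size by local curvature; here the size is a fixed assumption.

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.1):
    """Replace all points in each voxel-sized cell by their centroid.
    A fixed-size sketch of the curvature-adaptive filter described above."""
    cells = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        cells[key].append((x, y, z))
    return [tuple(sum(coord) / len(pts) for coord in zip(*pts))
            for pts in cells.values()]
```

Making `voxel` smaller in high-curvature regions (as the curvature-adaptive variant does) preserves sharp cross-sectional features while still thinning flat areas aggressively.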
Fund: Supported by the National Key R&D Program of China [grant number 2023YFC3008004].
Abstract: This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the track and intensity of Typhoon Kompasu in 2021 is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that the two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC's warm-core structure.
Fund: Supported by the Meteorological Science and Technology Innovation Project of North China (HBXM202415) and the Research Project of the Meteorological Bureau of Inner Mongolia Autonomous Region (nmqxkjcx202311).
Abstract: Based on conventional observation data, daily NCAR/NCEP reanalysis data, and TBB data derived from FY-2G infrared cloud images in April 2018, a heavy snowfall event in central Inner Mongolia from 4 to 6 April 2018 was analyzed. The results show that a low trough at 500 hPa, a southerly jet stream at 700 hPa, and an inverted trough at the surface were the main systems producing this blizzard. The transport of warm, humid air by the 700-hPa southerly jet and intense water vapor convergence provided sufficient moisture for the blizzard, and the moist layer over the blizzard area was deep. The low-level moist potential vorticity (MPV) in the blizzard area was negative, and the atmosphere was in a state of conditional symmetric instability. The coupling of the upper- and lower-level jets induced strong ascending motion. With the intrusion of cold air, a low-level cold pad formed, forcing the warm, humid air to glide upward over it. The secondary-circulation updraft triggered by the wet Q-vector released the conditional symmetric instability energy, intensifying the slantwise motion and producing the heavy snowfall. Meanwhile, the blizzard area corresponded well to the area of large low-level wet Q-vector divergence. Mesoscale cloud clusters continuously generating, merging, and moving eastward over the Hetao area were the direct cause of this blizzard, with cloud-cluster TBB ≤ -56°C. The blizzard occurred along the large-gradient edge of the TBB large-value area.