A statistically-based low-level cloud parameterization scheme is introduced, modified, and applied in the Flexible coupled General Circulation Model (FGCM-O). The low-level cloud scheme improves the simulated low-level cloud fractions and net surface shortwave radiation fluxes over the subtropical eastern oceans off the western coasts. Accompanying the improvement in the net surface shortwave radiation fluxes, the simulated SST distribution is more realistically asymmetric about the equator in the tropical eastern Pacific, which suppresses, to some extent, the development of the double ITCZ in the model. Warm SST biases in the ITCZ north of the equator are also reduced. However, the equatorial cold tongue is strengthened and extends farther westward, which reduces the precipitation rate in the western equatorial Pacific but increases it in the ITCZ north of the equator in the far eastern Pacific. It is demonstrated that the low-level cloud-radiation feedback enhances the cooperative feedback between the equatorial cold tongue and the ITCZ. Surface-layer heat budget analyses show that the SST reduction in the eastern equatorial Pacific is attributable both to the thermodynamic cooling modified by the increased cloud fraction and to oceanic dynamical cooling associated with the strengthened surface wind, whereas in the central and western equatorial Pacific it is mainly attributable to oceanic dynamical cooling associated with the strengthened surface wind.
Like many other coupled models, the Flexible coupled General Circulation Model (FGCM-0) suffers from a spurious “Double ITCZ”. To understand the “Double ITCZ” in FGCM-0, this study first examines the low-level cloud cover and the bulk stability of the lower troposphere over the eastern subtropical Pacific simulated by the National Center for Atmospheric Research (NCAR) Community Climate Model version 3 (CCM3), the atmospheric component of FGCM-0. The bulk stability of the lower troposphere simulated by CCM3 is quite consistent with that derived from the National Centers for Environmental Prediction (NCEP) reanalysis, but the simulated low-level cloud cover is much smaller than that derived from the International Satellite Cloud Climatology Project (ISCCP) D2 data. Based on regression equations between the ISCCP low-level cloud cover and the bulk stability of the lower troposphere derived from the NCEP reanalysis, the low-level cloud parameterization in CCM3 is modified and used in sensitivity experiments to examine the impact of low-level cloud over the eastern subtropical Pacific on the spurious “Double ITCZ” in FGCM-0. Results show that the modified scheme improves the simulated low-level cloud cover locally over the cold oceans. Increasing the low-level cloud cover off Peru not only significantly alleviates the warm SST biases in the southeastern tropical Pacific, but also causes the equatorial cold tongue to strengthen and extend farther west. Increasing the low-level cloud fraction off California effectively reduces the warm SST biases in the ITCZ north of the equator. To examine the feedback between SST and low-level cloud cover off Peru, an additional sensitivity experiment is performed in which the SST over the cold ocean off Peru is restored. It shows that decreasing the SST produces impacts similar to those of increasing the low-level cloud cover over the broad region extending northwestward from the southeastern tropical Pacific to the western/central equatorial Pacific.
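The fitted regression itself is not reproduced in the abstract; purely as an illustration of its form (the coefficients a and b are placeholders, not the values estimated from the ISCCP and NCEP data), such a relationship can be written as

```latex
% Illustrative linear form only; a and b are placeholder regression coefficients.
C_{\mathrm{low}} = a + b\,S, \qquad S = \theta_{700} - \theta_{\mathrm{sfc}},
```

where C_low is the low-level cloud cover and S is the bulk lower-tropospheric stability, the potential temperature difference between 700 hPa and the surface layer. In published climatologies of this kind, low cloud cover typically increases by roughly 5 to 6% per kelvin of stability.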
Cloud computing has created a paradigm shift that affects the way in which business applications are developed. Many business organizations use cloud infrastructures as platforms on which to deploy business applications, and increasing numbers of vendors are supplying the cloud marketplace with a wide range of cloud products. Different vendors offer cloud products in different formats, and the cost structures for consuming them can be complex. Finding a suitable set of cloud products that meets an application's requirements and budget can therefore be a challenging task. In this paper, an ontology-based resource mapping mechanism is proposed. Domain-specific ontologies are used to specify an application's high-level requirements, which are translated into high-level infrastructure ontologies and then mapped onto low-level descriptions of cloud resources. Cost ontologies are proposed for cloud resources. An exemplar media transcoding and delivery service is studied to illustrate how high-level requirements can be modeled and mapped onto cloud resources within a budget constraint. The proposed ontologies provide an application-centric mechanism for specifying cloud requirements, which can then be used to search for suitable resources in a multi-provider cloud environment.
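The ontology machinery is not shown in the abstract; the toy sketch below only illustrates the final step it enables, matching a high-level requirement to catalog entries under a budget. All resource names, fields, and prices are hypothetical.

```python
# Illustrative sketch only: a toy requirement-to-resource match under a budget,
# not the paper's ontology machinery. All names and prices are hypothetical.
catalog = [
    {"name": "small-vm", "vcpus": 2, "ram_gb": 4, "price_per_hour": 0.05},
    {"name": "gpu-vm", "vcpus": 8, "ram_gb": 32, "gpu": True, "price_per_hour": 0.90},
    {"name": "large-vm", "vcpus": 16, "ram_gb": 64, "price_per_hour": 0.60},
]

def match_resources(requirement, budget_per_hour):
    """Return the cheapest catalog entries satisfying a high-level requirement."""
    candidates = [
        r for r in catalog
        if r["vcpus"] >= requirement.get("min_vcpus", 0)
        and r["ram_gb"] >= requirement.get("min_ram_gb", 0)
        and (not requirement.get("needs_gpu") or r.get("gpu", False))
        and r["price_per_hour"] <= budget_per_hour
    ]
    return sorted(candidates, key=lambda r: r["price_per_hour"])

# A transcoding-like workload: CPU-heavy, moderate memory, capped hourly spend.
print(match_resources({"min_vcpus": 8, "min_ram_gb": 16}, budget_per_hour=0.75))
```

In the paper's setting, the requirement and the catalog would be populated from the domain-specific and cost ontologies rather than hand-written dictionaries.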
This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the movement track and intensity of Typhoon Kompasu in 2021 is examined. Additionally, the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that these two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC's warm-core structure.
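The abstract does not describe the new friction velocity scheme itself; for orientation only, the conventional bulk-aerodynamic relations that any such scheme modifies are

```latex
% Standard bulk-aerodynamic relations (for orientation; not the new scheme itself):
u_{*} = \sqrt{C_D}\,U_{10}, \qquad \tau = \rho_a u_{*}^{2} = \rho_a C_D U_{10}^{2},
```

where U_10 is the 10-m wind speed, C_D the drag coefficient, rho_a the air density, and tau the surface momentum flux; an ocean surface scheme essentially changes how C_D, and hence u_*, depends on wind speed and sea state.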
In this study, a statistical cloud scheme is first introduced and coupled with a first-order turbulence scheme in which the second-order turbulence moments are parameterized by the timescale of the turbulence dissipation and the vertical turbulent diffusion coefficient. The ability of the scheme to simulate cloud fraction is then examined in numerical simulations for different relative humidities, vertical temperature profiles, and turbulent dissipation timescales. The simulated cloud fraction is found to be sensitive to the parameter used in the statistical cloud scheme and to the timescale of the turbulent dissipation. Based on these analyses, the introduced statistical cloud scheme is modified. By combining the modified statistical cloud scheme with a boundary layer cumulus scheme, a new statistically-based low-level cloud scheme is proposed and tentatively applied in the NCAR (National Center for Atmospheric Research) CCM3 (Community Climate Model version 3). With the statistically-based low-level cloud scheme applied in CCM3, the simulation of low-level cloud fraction is markedly improved, and the maxima of low-level cloud fraction over the cold oceans off the western coasts are well captured. This suggests that the new statistically-based low-level cloud scheme has great potential for improving low-level cloud parameterization in general circulation models.
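The abstract does not state which subgrid distribution the scheme assumes; a minimal sketch of a statistical cloud fraction diagnosis, assuming a Gaussian distribution of the subgrid saturation deficit (the width sigma_s would be supplied by the turbulence scheme), is:

```python
import math

def gaussian_cloud_fraction(qt_mean, qs, sigma_s):
    """Diagnose cloud fraction from a Gaussian subgrid distribution of total water.

    qt_mean : grid-mean total water mixing ratio (kg/kg)
    qs      : saturation mixing ratio at the grid-mean temperature (kg/kg)
    sigma_s : standard deviation of the subgrid saturation deficit (kg/kg),
              which a turbulence scheme would relate to second-order moments.
    """
    s = (qt_mean - qs) / (math.sqrt(2.0) * sigma_s)   # normalized saturation excess
    return 0.5 * (1.0 + math.erf(s))                  # fraction of the PDF above saturation

# Larger subgrid variance produces partial cloudiness even when the mean is subsaturated.
print(gaussian_cloud_fraction(qt_mean=9.5e-3, qs=10.0e-3, sigma_s=0.5e-3))  # ~0.16
```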
The rapid advent of artificial intelligence and big data has revolutionized the dynamic requirements placed on computing resources for executing specific tasks in the cloud environment. Achieving autonomic resource management is a herculean task because of the hugely distributed and heterogeneous environment. Moreover, the cloud network needs to provide autonomic resource management and deliver potential services to clients by complying with Quality-of-Service (QoS) requirements without violating Service Level Agreements (SLAs). However, existing autonomic cloud resource management frameworks are not capable of handling cloud resources under dynamic requirements. In this paper, a Coot Bird Behavior Model-based Workload Aware Autonomic Resource Management Scheme (CBBM-WARMS) is proposed for handling the dynamic requirements of cloud resources by estimating the workload that the cloud environment needs to police. CBBM-WARMS first adopts an adaptive density peak clustering algorithm for clustering cloud workloads. It then applies fuzzy logic during workload scheduling to determine the availability of cloud resources. It further uses CBBM for Virtual Machine (VM) deployment, which contributes to the provision of optimal resources. The scheme is designed to achieve optimal QoS with minimized time, energy consumption, SLA cost, and SLA violations. The experimental validation of the proposed CBBM-WARMS confirms a minimized SLA cost of 19.21% and a reduced SLA violation rate of 18.74%, better than the compared autonomic cloud resource management frameworks.
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when facing new evaluation objects, and integrated solutions for distributed storage, processing, and learning of logging big data within such a private cloud have not yet been investigated. This study aims to establish a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model that integrates physical simulation and data models in a large-scale function space, thereby addressing the geo-engineering evaluation problem of geothermal fields. Following the research idea of “logging big data cloud platform - unified logging learning model - large function space - knowledge learning and discovery - application”, the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has clear technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, sharing of data, software, and results, accuracy, speed, and complexity.
With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in ‘bundle’ units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The proposed encryption method addresses the inefficiencies of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. Using this information, the method accurately identifies the modified portions and selectively re-encrypts only those sections. This approach significantly improves the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate the approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that this method provides robust protection against both passive and active attacks.
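The paper's bundle format and cipher are not specified in the abstract; the sketch below only illustrates the bookkeeping idea of re-encrypting a single modified bundle, using AES-GCM from the Python 'cryptography' package as a stand-in cipher and an assumed fixed bundle size.

```python
# Illustrative sketch only: per-bundle encryption so that an update touches just the
# modified bundle. AES-GCM stands in for whatever cipher the paper actually uses;
# bundle size and metadata layout are assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

BUNDLE_SIZE = 4096  # bytes of plaintext per bundle (hypothetical)

def split_into_bundles(data: bytes):
    return [data[i:i + BUNDLE_SIZE] for i in range(0, len(data), BUNDLE_SIZE)]

def encrypt_bundles(aes: AESGCM, bundles):
    """Encrypt each bundle independently; keep a (nonce, ciphertext) pair per bundle."""
    out = []
    for b in bundles:
        nonce = os.urandom(12)
        out.append((nonce, aes.encrypt(nonce, b, None)))
    return out

def update_bundle(aes: AESGCM, encrypted, index, new_plaintext):
    """Re-encrypt only the modified bundle, leaving all others untouched."""
    nonce = os.urandom(12)
    encrypted[index] = (nonce, aes.encrypt(nonce, new_plaintext, None))

key = AESGCM.generate_key(bit_length=256)
aes = AESGCM(key)
doc = os.urandom(10 * BUNDLE_SIZE)                    # stand-in collaborative document
encrypted = encrypt_bundles(aes, split_into_bundles(doc))
update_bundle(aes, encrypted, index=3, new_plaintext=os.urandom(BUNDLE_SIZE))
```

Here an edit touches one bundle instead of the whole file, which is the efficiency gain the paper measures against whole-dataset re-encryption.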
The increasing use of cloud-based devices has brought cybersecurity and unwanted network traffic to a critical point. Cloud environments pose significant challenges in maintaining privacy and security, and global approaches such as intrusion detection systems (IDS) have been developed to tackle these issues. However, most conventional IDS models struggle with unseen cyberattacks and complex high-dimensional data. This paper introduces a novel distributed, explainable, and heterogeneous transformer-based intrusion detection system, named INTRUMER, which offers balanced accuracy, reliability, and security in cloud settings through multiple modules working together. The traffic captured from cloud devices is first passed to the TC&TM module, in which the Falcon Optimization Algorithm optimizes the feature selection process and a Naive Bayes algorithm classifies the features. The selected features are then forwarded to the Heterogeneous Attention Transformer (HAT) module, which takes the contextual interactions of the network traffic into account to classify it as normal or malicious. The classified results are further analyzed by the Explainable Prevention Module (XPM) to ensure trustworthiness by providing interpretable decisions. Using the explanations from the classifier, emergency alarms are transmitted to nearby IDS modules, servers, and underlying cloud devices to enhance preventive measures. Extensive experiments on the benchmark IDS datasets CICIDS 2017, Honeypots, and NSL-KDD demonstrate the efficiency of the INTRUMER model in detecting network traffic of different types with high accuracy. The proposed model outperforms state-of-the-art approaches, obtaining better performance metrics: 98.7% accuracy, 97.5% precision, 96.3% recall, and 97.8% F1-score. These results validate the robustness and effectiveness of INTRUMER in securing diverse cloud environments against sophisticated cyber threats.
Cloud detection is a critical preprocessing step in remote sensing image processing, as the presence of clouds significantly affects the accuracy of remote sensing data and limits its applicability across various domains. This study presents an enhanced cloud detection method based on the U-Net architecture, designed to address the challenges of multi-scale cloud features and long-range dependencies inherent in remote sensing imagery. A Multi-Scale Dilated Attention (MSDA) module is introduced to effectively integrate multi-scale information and model long-range dependencies across different scales, enhancing the model’s ability to detect clouds of varying sizes. Additionally, a Multi-Head Self-Attention (MHSA) mechanism is incorporated to improve the model’s capacity for capturing finer details, particularly in distinguishing thin clouds from surface features. A multi-path supervision mechanism is also devised to ensure the model learns cloud features at multiple scales, further boosting the accuracy and robustness of cloud mask generation. Experimental results demonstrate that the enhanced model achieves superior performance compared to other benchmarked methods in complex scenarios. It significantly improves cloud detection accuracy, highlighting its strong potential for practical applications in cloud detection tasks.
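The exact MSDA design is not given in the abstract; the sketch below shows one generic way to combine dilated convolutions at several rates with a simple learned weighting over scales (PyTorch; the dilation rates, weighting form, and fusion layer are assumptions, not the paper's module).

```python
# Illustrative sketch only: a multi-scale dilated block with learned branch weights,
# loosely in the spirit of the MSDA module described above. The paper's exact design
# is not reproduced here.
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    def __init__(self, channels, dilations=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        )
        # One scalar weight per branch, softmax-normalized: a very simple form of
        # "attention" over scales, not the paper's mechanism.
        self.branch_logits = nn.Parameter(torch.zeros(len(dilations)))
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x):
        weights = torch.softmax(self.branch_logits, dim=0)
        out = sum(w * torch.relu(branch(x)) for w, branch in zip(weights, self.branches))
        return self.fuse(out)

# A 128x128 feature map with 32 channels keeps its spatial size through the block.
feat = torch.randn(1, 32, 128, 128)
print(MultiScaleDilatedBlock(32)(feat).shape)  # torch.Size([1, 32, 128, 128])
```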
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. This model first used mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm was established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
An efficient catalytic system was developed to remove various organic pollutants by simultaneously using low-level cobalt ions, calcium carbonate microparticles, and peroxymonosulfate (PMS). A simple base-induced precipitation was used to successfully load Co-centered reactive sites onto the surface of CaCO_(3) microparticles. Under optimal conditions at 25°C, 10 mg/L methylene blue (MB) could be completely degraded within 10 min with 480 μg/L Co^(2+), 0.4 g/L CaCO_(3) microparticles (or 0.4 g/L Co@CaCO_(3)), and 0.1 g/L PMS. The MB degradation followed pseudo-first-order kinetics with a rate constant of 0.583 min^(−1), being 8.3, 11.5, and 53.0 times that obtained using Co-OH (0.07 min^(−1)), Co^(2+) (0.044 min^(−1)), and CaCO_(3) (0.011 min^(−1)) as the catalyst, respectively. A synergistic catalytic effect between the Co species and the CaCO_(3) particles was confirmed, with the highly dispersed Co species being the major contributor. When Co^(2+)-containing simulated electroplating wastewater was used as the Co^(2+) source, not only was the added MB completely degraded within 5 min in this catalytic system, but the coexisting heavy metal ions were also substantially removed. The developed method was thus applied to treat organic wastewater and heavy metal wastewater simultaneously. The method was also successfully used to efficiently degrade other organic pollutants, including bisphenol A, sulfamethoxazole, rhodamine B, tetrabromobisphenol A, ofloxacin, and benzoic acid. A catalytic mechanism was proposed for the PMS activation by Co@CaCO_(3). The surface of the CaCO_(3) particles favors the adsorption of Co^(2+). More importantly, it provides plentiful surface -OH and -CO_(3)^(2−) groups, and these surface groups complex with Co^(2+) to produce more catalytically active species such as surface [CoOH]^(−), resulting in rapid Co^(2+)/Co^(3+) cycling and electron transfer. These interactions account for the observed synergistic effect between the Co species and the CaCO_(3) particles in PMS activation. Owing to its good cycling stability, strong anti-interference ability, and wide applicability, the new method has broad application prospects.
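As a quick worked example using only the rate constants quoted above, pseudo-first-order decay gives:

```latex
% Worked example from the quoted rate constants (pseudo-first-order decay):
\frac{C(t)}{C_0} = e^{-kt}, \qquad
k = 0.583\ \mathrm{min^{-1}}: \ \frac{C(10\,\mathrm{min})}{C_0} = e^{-5.83} \approx 0.3\%, \qquad
k = 0.011\ \mathrm{min^{-1}}: \ \frac{C(10\,\mathrm{min})}{C_0} = e^{-0.11} \approx 90\%,
```

so the Co@CaCO_(3)/PMS system removes essentially all of the dye within 10 min, whereas CaCO_(3) alone would leave roughly 90% of it after the same time.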
The ease of accessing a virtually unlimited pool of resources makes Infrastructure as a Service (IaaS) clouds an ideal platform for running data-intensive workflow applications comprising hundreds of computational tasks. However, executing scientific workflows in IaaS cloud environments poses significant challenges due to conflicting objectives, such as minimizing execution time (makespan) and reducing resource utilization costs. This study responds to the increasing need for efficient and adaptable optimization solutions in dynamic and complex environments, which are critical for meeting the evolving demands of modern users and applications. This study presents an innovative multi-objective approach for scheduling scientific workflows in IaaS cloud environments. The proposed algorithm, MOS-MWMC, aims to minimize total execution time (makespan) and resource utilization costs by leveraging key features of virtual machine instances, such as a high number of cores and fast local SSD storage. By integrating realistic simulations based on the WRENCH framework, the method effectively dimensions the cloud infrastructure and optimizes resource usage. Experimental results highlight the superiority of MOS-MWMC compared to the benchmark algorithms HEFT and Max-Min. The Pareto fronts obtained for the CyberShake, Epigenomics, and Montage workflows demonstrate closer proximity to the optimal front, confirming the algorithm’s ability to balance conflicting objectives. This study contributes to optimizing scientific workflows in complex environments by providing solutions tailored to specific user needs while minimizing costs and execution times.
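The scheduling heuristics themselves are not reproduced here; the sketch below only shows the Pareto-dominance test underlying the (makespan, cost) fronts mentioned above, with hypothetical schedule values.

```python
# Illustrative sketch only: Pareto-dominance filtering of (makespan, cost) schedules,
# the comparison underlying the Pareto fronts mentioned above. Numbers are hypothetical.
def dominates(a, b):
    """Schedule a dominates b if it is no worse in both objectives and better in one."""
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def pareto_front(schedules):
    """Keep only non-dominated (makespan_seconds, cost_dollars) pairs."""
    return [s for s in schedules if not any(dominates(o, s) for o in schedules if o != s)]

candidates = [(3200, 4.10), (2800, 5.30), (3500, 3.20), (2800, 4.90), (4000, 3.25)]
print(sorted(pareto_front(candidates)))
# [(2800, 4.9), (3200, 4.1), (3500, 3.2)] -- the trade-off curve a scheduler exposes
```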
Accurate descriptions of cloud droplet spectra from aerosol activation to vapor condensation using microphysical parameterization schemes are crucial for numerical simulations of precipitation and climate change in weather forecasting and climate prediction models. Hence, the latest activation and triple-moment condensation schemes were combined to simulate and analyze the evolution characteristics of a cloud droplet spectrum from activation to condensation and compared with a high-resolution Lagrangian bin model and the current double-moment condensation schemes, in which the spectral shape parameter is fixed or diagnosed by an empirical formula. The results demonstrate that the latest schemes effectively capture the evolution characteristics of the cloud droplet spectrum during activation and condensation, in line with the performance of the bin model. The simulation of the latest activation and condensation schemes in a parcel model shows that the cloud droplet spectrum gradually widens and exhibits a multimodal distribution during the activation process, accompanied by a decrease in the spectral shape and slope parameters over time. Conversely, during the condensation process, the cloud droplet spectrum gradually narrows, resulting in increases in the spectral shape and slope parameters. However, the double-moment schemes fail to accurately replicate the evolution of the cloud droplet spectrum and its multimodal distribution characteristics. Furthermore, the latest schemes were coupled into a 1.5D cumulus model, and an observation case was simulated. The simulations confirm that the cloud droplet spectrum appears wider at the supersaturated cloud base and cloud top due to activation, while it becomes narrower at the middle altitudes of the cloud due to condensation growth.
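The "spectral shape and slope parameters" above are those of the gamma form commonly used for droplet spectra in bulk schemes (standard notation, not the paper's specific formulation):

```latex
% Gamma droplet size distribution in standard bulk-microphysics notation:
n(D) = N_0\, D^{\mu}\, e^{-\lambda D},
% N_0: intercept, \mu: spectral shape parameter, \lambda: slope parameter.
```

A double-moment scheme predicts two moments (typically number and mass) and must fix or diagnose mu, whereas a triple-moment scheme predicts a third moment so that mu can evolve freely; a narrowing spectrum corresponds to increasing mu and lambda and a widening spectrum to decreasing values, matching the behavior described above.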
The impact of aerosols on clouds, which remains one of the largest sources of uncertainty in current weather forecasting and climate change research, can be influenced by various factors, such as the underlying surface type, cloud type, cloud phase, and aerosol type. To explore the impact of different underlying surfaces on the effect of aerosols on cloud development, this study focused on the Yangtze River Delta (YRD) and its offshore regions (YRD sea) for a comparative analysis based on multi-source satellite data, while also considering the variations in cloud type and cloud phase. The results show lower cloud-top heights and smaller depths of single-layer clouds over the ocean than over land, and higher liquid cloud in spring over the ocean. Aerosols are found to enhance the cumulus cloud depth through microphysical effects, which is particularly evident over the ocean. Aerosols are also found to decrease the cloud droplet effective radius over the ocean region and during the mature stage of cloud development over the land region, while the opposite is found during the early stage of cloud development over the land region. The quantitative results indicate that the indirect effect is positive (0.05) in the land region at relatively high cloud water path, smaller than that in the ocean region (0.11). These findings deepen our understanding of the influence of aerosols on cloud development and the mechanisms involved, which could be applied to improve the ability to simulate cloud-associated weather processes.
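The abstract does not define the indirect-effect metric behind the values 0.05 and 0.11; one commonly used form (assumed here purely for orientation) expresses aerosol-cloud interaction as the sensitivity of droplet effective radius to an aerosol proxy at fixed cloud water path:

```latex
% A commonly used aerosol indirect-effect metric (assumed form, not necessarily the paper's):
\mathrm{ACI} = -\left.\frac{\partial \ln r_e}{\partial \ln \alpha}\right|_{\mathrm{CWP}},
```

where r_e is the cloud droplet effective radius and alpha an aerosol proxy such as optical depth or aerosol index; positive values mean that more aerosol yields smaller droplets at the same cloud water path.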