Snow cover plays a critical role in global climate regulation and hydrological processes. Accurate monitoring is essential for understanding snow distribution patterns, managing water resources, and assessing the impacts of climate change. Remote sensing has become a vital tool for snow monitoring, with the widely used Moderate-resolution Imaging Spectroradiometer (MODIS) snow products from the Terra and Aqua satellites. However, cloud cover often interferes with snow detection, making cloud removal techniques crucial for reliable snow product generation. This study evaluated the accuracy of four MODIS snow cover datasets generated with different cloud removal algorithms. Using real-time field camera observations from four stations in the Tianshan Mountains, China, it assessed the performance of these datasets during three distinct snow periods: the snow accumulation period (September-November), the snowmelt period (March-June), and the stable snow period (December-February of the following year). The findings showed that the cloud-free snow products generated with the Hidden Markov Random Field (HMRF) algorithm consistently outperformed the others, particularly under cloud cover, while the cloud-free products based on near-day synthesis and on the spatiotemporal adaptive fusion method with error correction (STAR) performed variably depending on terrain complexity and cloud conditions. The study highlights the importance of considering terrain features, land cover types, and snow dynamics when selecting a cloud removal method, particularly in areas with rapid snow accumulation and melting. Future research should focus on improving cloud removal algorithms through the integration of machine learning, multi-source data fusion, and advanced remote sensing technologies. By expanding validation efforts and refining cloud removal strategies, more accurate and reliable snow products can be developed, contributing to enhanced snow monitoring and better management of water resources in alpine and arid areas.
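The three evaluation windows above map cleanly onto calendar months. A minimal sketch of that mapping (the function name and the "off-season" label for July-August are illustrative, not from the study):

```python
def snow_period(month: int) -> str:
    """Map a calendar month to the snow period used in the evaluation.

    Periods follow the study's definitions: accumulation (Sep-Nov),
    stable (Dec-Feb of the following year), melt (Mar-Jun).
    Jul-Aug fall outside the three periods and return "off-season".
    """
    if month in (9, 10, 11):
        return "accumulation"
    if month in (12, 1, 2):
        return "stable"
    if month in (3, 4, 5, 6):
        return "melt"
    return "off-season"
```

Stratifying accuracy statistics by such a period label is what allows per-period comparison of the cloud removal algorithms.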
This study introduces a new ocean surface friction velocity scheme and a modified Thompson cloud microphysics parameterization scheme into the CMA-TYM model. The impact of these two parameterization schemes on the prediction of the track and intensity of Typhoon Kompasu (2021) is examined, and the possible reasons for their effects on tropical cyclone (TC) intensity prediction are analyzed. Statistical results show that both parameterization schemes improve the predictions of Typhoon Kompasu's track and intensity. The influence on track prediction becomes evident after 60 h of model integration, while the significant positive impact on intensity prediction is observed after 66 h. Further analysis reveals that the two schemes affect the timing and magnitude of extreme TC intensity values by influencing the evolution of the TC's warm-core structure.
Snow and cloud discrimination is a main factor contributing to errors in satellite-based snow cover. To address this, satellite-based snow cover products apply snow reclassification tests to the cloud pixels of the cloud mask, but errors still remain. Machine learning (ML) has recently been applied in remote sensing to derive satellite-based meteorological data, and its utility has been demonstrated. In this study, snow and cloud discrimination errors were analyzed for the GK-2A/AMI snow cover product, and ML models (Random Forest and Deep Neural Network) were applied to accurately distinguish snow from clouds. The ML-based snow reclassification was integrated with the GK-2A/AMI snow cover through post-processing. The S-NPP/VIIRS snow cover product and ASOS in situ snow observations were used as satellite-based and ground-truth validation data, respectively, to evaluate whether snow/cloud discrimination improved. The ML-based integrated snow cover detected 33–53% more snow than the GK-2A/AMI snow cover. In terms of performance, the F1-score and overall accuracy of the GK-2A/AMI snow cover were 73.06% and 89.99%, respectively, while those of the integrated snow cover were 76.78–78.28% and 90.93–91.26%.
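The F1-score and overall accuracy reported above are standard functions of the binary confusion matrix. A minimal sketch of their computation (the function name is illustrative):

```python
def binary_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Evaluation metrics for a binary (snow / not-snow) confusion matrix.

    precision = TP / (TP + FP); recall = TP / (TP + FN);
    F1 = harmonic mean of precision and recall;
    overall accuracy = (TP + TN) / all pixels.
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + fp + fn + tn) and (tp + tn) / (tp + fp + fn + tn)
    return {"precision": precision, "recall": recall,
            "f1": f1, "accuracy": accuracy}
```

Note that F1 weights snow commission and omission errors symmetrically, whereas overall accuracy can look high even when snow pixels are rare.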
Because of the similar reflective characteristics of snow and cloud, weather conditions seriously affect snow monitoring with optical remote sensing data. Cloud amount analysis for the 2010 to 2011 snow season shows that cloud cover is the major limitation on snow cover monitoring with MOD10A1 and MYD10A1. Using MODIS daily snow cover products and AMSR-E snow water equivalent (SWE) products, several cloud elimination methods were integrated to produce a new daily cloud-free snow cover product, and snow depth records from 85 climate stations in the Tibetan Plateau area (TP) were used to validate the accuracy of the new composite product. The results indicate that the snow classification accuracy of the new daily snow cover product reaches 91.7% when snow depth exceeds 3 cm, suggesting that the new daily snow cover mapping algorithm is suitable for monitoring snow cover dynamics in the TP.
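The per-pixel logic of such a compositing cascade can be sketched as follows. This is a hedged illustration only: the ordering (Terra first, then Aqua, then the passive-microwave fallback) and the SWE threshold are assumptions, not the paper's exact algorithm.

```python
# Simplified pixel labels found in MODIS daily snow products.
SNOW, LAND, CLOUD = "snow", "land", "cloud"

def composite_pixel(terra: str, aqua: str, swe_mm: float,
                    swe_threshold: float = 5.0) -> str:
    """One possible cloud-elimination cascade for a single pixel.

    1. Prefer a clear-sky Terra (MOD10A1) observation.
    2. Fall back to the same-day Aqua (MYD10A1) observation.
    3. If both are cloudy, classify from the AMSR-E SWE retrieval:
       pixels at or above a (hypothetical) SWE threshold become snow.
    """
    if terra != CLOUD:
        return terra
    if aqua != CLOUD:
        return aqua
    return SNOW if swe_mm >= swe_threshold else LAND
```

Because AMSR-E retrievals are coarse but cloud-penetrating, step 3 guarantees a cloud-free label at every pixel, at the cost of spatial detail.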
Snow enhancement experiments, carried out by injecting AgI and water vapor into orographically enhanced clouds (fog), were conducted to confirm Li and Pitter's forced condensation process in a natural setting. Nine ground-based experiments were conducted at Daegwallyeong in the Taebaek Mountains on easterly foggy days in January-February 2006. The optimized conditions obtained for the Daegwallyeong region were a small AgI seeding rate (1.04 g min^(-1)) for easterly cold fog with the high humidity of Gangneung. Additional experiments are needed to statistically estimate the snowfall increment caused by small AgI seeding into orographic fog (cloud) over the Taebaek Mountains.
Understanding the cloud processes of snowfall is essential to artificial snow enhancement and to the numerical simulation of snowfall. The mesoscale model MM5 is used to simulate a moderate snowfall event in North China during 20-21 December 2002. Thirteen experiments are performed to test the sensitivity of the simulation to the cloud physics, using different cumulus parameterization schemes and different options of the Goddard cloud microphysics parameterization scheme. The choice of cumulus parameterization scheme is shown to have little effect on the simulation result. The results also show that only four classes of water substance, namely cloud water, cloud ice, snow, and vapor, appear in the simulation of this moderate snowfall event. Analysis of the cloud microphysics budgets in the explicit experiment shows that the condensation of supersaturated vapor, the depositional growth of cloud ice, the initiation of cloud ice, the accretion of cloud ice by snow, the accretion of cloud water by snow, the depositional growth of snow, and the Bergeron process of cloud ice are the dominant cloud microphysical processes in the simulation. The accretion of cloud water by snow and the depositional growth of snow are equally important in the development of the snow.
In polar regions, cloud and underlying ice-snow areas are difficult to distinguish in satellite images because of their similarly high albedo in the visible band and the low surface temperature of ice-snow areas in the infrared band. A cloud detection method over ice-snow covered areas in Antarctica is presented. Because cloud and ice-snow areas exhibit different textures, five texture features are extracted based on the grey-level co-occurrence matrix (GLCM). A nonlinear SVM is then used to obtain the optimal classification hyperplane from training data. The experimental results indicate that the algorithm performs well for cloud detection in Antarctica, especially for thin cirrus. Furthermore, when images are resampled to a quarter or 1/16 of the full size, cloud percentages remain at the same level while the processing time decreases exponentially.
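GLCM texture features are the core of this method. A minimal pure-Python sketch of a single-offset GLCM and three common features derived from it (the paper does not specify its five features, so contrast, energy, and homogeneity here are representative assumptions):

```python
def glcm(image, levels, dx=1, dy=0):
    """Grey-level co-occurrence matrix for one pixel offset (dx, dy).

    `image` is a 2-D list of integer grey levels in [0, levels);
    the returned matrix is normalised so its entries sum to 1.
    """
    h, w = len(image), len(image[0])
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def texture_features(p):
    """Contrast, energy, and homogeneity of a normalised GLCM `p`."""
    contrast = energy = homogeneity = 0.0
    for i, row in enumerate(p):
        for j, pij in enumerate(row):
            contrast += (i - j) ** 2 * pij        # local intensity variation
            energy += pij ** 2                    # textural uniformity
            homogeneity += pij / (1 + abs(i - j))  # closeness to the diagonal
    return {"contrast": contrast, "energy": energy, "homogeneity": homogeneity}
```

Smooth ice-snow surfaces concentrate GLCM mass near the diagonal (high homogeneity, low contrast), while cloud edges spread it out, which is what gives the SVM a separable feature space.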
The rapid advent of artificial intelligence and big data has revolutionized the dynamic demands on computing resources for executing specific tasks in the cloud environment. Achieving autonomic resource management is a herculean task because of the hugely distributed and heterogeneous environment. Moreover, the cloud network needs to provide autonomic resource management and deliver potential services to clients by complying with Quality-of-Service (QoS) requirements without impacting Service Level Agreements (SLAs). However, existing autonomic cloud resource management frameworks are not capable of handling cloud resources under dynamic requirements. In this paper, a Coot Bird Behavior Model-based Workload Aware Autonomic Resource Management Scheme (CBBM-WARMS) is proposed for handling the dynamic requirements of cloud resources by estimating the workload that the cloud environment needs to police. CBBM-WARMS first adopts an adaptive density peak clustering algorithm to cluster cloud workloads. It then applies fuzzy logic during workload scheduling to determine the availability of cloud resources, and uses the CBBM for virtual machine (VM) deployment that provides optimal resources. The scheme aims to achieve optimal QoS with minimized time, energy consumption, SLA cost, and SLA violations. Experimental validation of the proposed CBBM-WARMS confirms a minimized SLA cost of 19.21% and a reduced SLA violation rate of 18.74%, better than the compared autonomic cloud resource management frameworks.
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, or mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when facing new evaluation objects, and research on integrating distributed storage, processing, and learning functions into a logging big data private cloud has not yet been carried out. This work establishes a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model that integrates physical simulation and data models in a large-scale function space, thus addressing the geo-engineering evaluation problem of geothermal fields. Following the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, data, software, and results sharing, accuracy, speed, and complexity.
With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools rely primarily on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The proposed method addresses the inefficiency of traditional encryption modes such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. It leverages update-specific information embedded within data bundles, together with metadata that maps the relationship between these bundles and the plaintext data, to accurately identify the modified portions and selectively re-encrypt only those sections. This approach significantly improves the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate the approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. The results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. A security evaluation further confirms that the method provides robust protection against both passive and active attacks.
The increasing use of cloud-based devices has brought cybersecurity and unwanted network traffic to a critical point. Cloud environments pose significant challenges to maintaining privacy and security. Global approaches, such as intrusion detection systems (IDS), have been developed to tackle these issues. However, most conventional IDS models struggle with unseen cyberattacks and complex, high-dimensional data. This paper introduces a novel distributed, explainable, and heterogeneous transformer-based intrusion detection system, named INTRUMER, which offers balanced accuracy, reliability, and security in cloud settings through multiple modules working together. Traffic captured from cloud devices is first passed to the TC&TM module, in which the Falcon Optimization Algorithm optimizes the feature selection process and a Naïve Bayes algorithm classifies the features. The selected features are then forwarded to the Heterogeneous Attention Transformer (HAT) module, which takes the contextual interactions of the network traffic into account to classify it as normal or malicious. The classified results are further analyzed by the Explainable Prevention Module (XPM), which ensures trustworthiness by providing interpretable decisions. Based on the classifier's explanations, emergency alarms are transmitted to nearby IDS modules, servers, and underlying cloud devices to enhance preventive measures. Extensive experiments on the benchmark IDS datasets CICIDS 2017, Honeypots, and NSL-KDD demonstrate the efficiency of the INTRUMER model in detecting different types of network traffic with high accuracy. The proposed model outperforms state-of-the-art approaches, achieving 98.7% accuracy, 97.5% precision, 96.3% recall, and a 97.8% F1-score. These results validate the robustness and effectiveness of INTRUMER in securing diverse cloud environments against sophisticated cyber threats.
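The Naïve Bayes step named in the TC&TM module can be sketched in a few lines. This is a generic Gaussian Naïve Bayes over numeric traffic features, not the paper's implementation; class names and the toy feature values are illustrative:

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian Naive Bayes: per-class feature mean/variance,
    prediction by maximum log-posterior."""

    def fit(self, X, y):
        grouped = defaultdict(list)
        for row, label in zip(X, y):
            grouped[label].append(row)
        self.stats = {}
        n = len(X)
        for label, rows in grouped.items():
            cols = list(zip(*rows))
            means = [sum(c) / len(rows) for c in cols]
            variances = [max(sum((v - m) ** 2 for v in c) / len(rows), 1e-9)
                         for c, m in zip(cols, means)]
            self.stats[label] = (math.log(len(rows) / n), means, variances)
        return self

    def predict(self, row):
        def log_post(label):
            prior, means, variances = self.stats[label]
            return prior + sum(
                -0.5 * math.log(2 * math.pi * var) - (x - m) ** 2 / (2 * var)
                for x, m, var in zip(row, means, variances))
        return max(self.stats, key=log_post)
```

Its independence assumption makes it cheap enough to run as a first-pass filter before the heavier transformer module.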
This is an in-depth journey to experience the ice and snow of Changbai Mountain. In these few days, you will explore Changbai Mountain and enjoy powder skiing; gallop on the ski trail; watch the stunning wonder of snow rime on thousands of trees; conquer the ice and snow wilderness on a snowmobile; and start an in-depth magical mystery tour of Jilin Province.
Snow density is a fundamental variable of snow physical evolution processes, reflecting snowpack condition under thermal and gravitational compaction. It is also the bridge that converts snow depth into snow water equivalent (SWE) in snow water resources research, so understanding its spatiotemporal distribution is important for the appropriate estimation of SWE. In this study, in situ snow densities from more than 6,000 stations in the Northern Hemisphere were used to analyze spatial and temporal variations in snow density. The results show that snow density varies spatially and temporally across the Northern Hemisphere, ranging from below 0.1 to over 0.4 g/cm^(3). The average snow densities in the mountainous regions of western North America, southeastern Canada, and Europe range from approximately 0.24 to 0.26 g/cm^(3), significantly greater than the values of 0.16–0.17 g/cm^(3) observed in Siberia, central Canada, the Great Plains of the United States, and China. The seasonal growth rates also present large spatial heterogeneity: over 0.024 g/cm^(3) per month in southeastern Canada, the western mountains of North America, and Europe, and approximately 0.017 g/cm^(3) per month in Siberia, much larger than the roughly 0.004 g/cm^(3) per month in other regions. Snow cover duration is a critical factor in determining snow density. This study corroborates the small snow densities observed in China at meteorological stations, which result from those stations being predominantly located in plain areas with relatively short snow cover duration and shallow snow.
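The depth-to-SWE conversion that motivates this analysis is a one-line formula: SWE equals depth times the ratio of snow density to water density (1 g/cm^(3)). A sketch with the unit handling made explicit (the function name is illustrative):

```python
def swe_mm(depth_cm: float, density_g_cm3: float) -> float:
    """Snow water equivalent in mm of water.

    SWE = depth * (snow density / water density). With depth in cm,
    density in g/cm^3, and water at 1 g/cm^3, the factor 10 converts
    cm of water to mm.
    """
    return depth_cm * density_g_cm3 * 10.0
```

With the regional averages above, 50 cm of snow holds roughly 120 mm of water in the western North American mountains (density 0.24 g/cm^(3)) but only about 85 mm in Siberia (0.17 g/cm^(3)), which is why a single fixed density biases SWE estimates.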
Cloud detection is a critical preprocessing step in remote sensing image processing, as the presence of clouds significantly affects the accuracy of remote sensing data and limits its applicability across various domains. This study presents an enhanced cloud detection method based on the U-Net architecture, designed to address the challenges of multi-scale cloud features and long-range dependencies inherent in remote sensing imagery. A Multi-Scale Dilated Attention (MSDA) module is introduced to effectively integrate multi-scale information and model long-range dependencies across different scales, enhancing the model’s ability to detect clouds of varying sizes. Additionally, a Multi-Head Self-Attention (MHSA) mechanism is incorporated to improve the model’s capacity for capturing finer details, particularly in distinguishing thin clouds from surface features. A multi-path supervision mechanism is also devised to ensure the model learns cloud features at multiple scales, further boosting the accuracy and robustness of cloud mask generation. Experimental results demonstrate that the enhanced model achieves superior performance compared to other benchmarked methods in complex scenarios. It significantly improves cloud detection accuracy, highlighting its strong potential for practical applications in cloud detection tasks.
In order to improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. The model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints on system resources. Then, a light-induced plant growth simulation algorithm is established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the setting for the system. Experimental results show that when the number of test cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
The ease of accessing a virtually unlimited pool of resources makes Infrastructure as a Service (IaaS) clouds an ideal platform for running data-intensive workflow applications comprising hundreds of computational tasks. However, executing scientific workflows in IaaS cloud environments poses significant challenges due to conflicting objectives, such as minimizing execution time (makespan) and reducing resource utilization costs. This study responds to the increasing need for efficient and adaptable optimization solutions in dynamic and complex environments, which are critical for meeting the evolving demands of modern users and applications. This study presents an innovative multi-objective approach for scheduling scientific workflows in IaaS cloud environments. The proposed algorithm, MOS-MWMC, aims to minimize total execution time (makespan) and resource utilization costs by leveraging key features of virtual machine instances, such as a high number of cores and fast local SSD storage. By integrating realistic simulations based on the WRENCH framework, the method effectively dimensions the cloud infrastructure and optimizes resource usage. Experimental results highlight the superiority of MOS-MWMC compared to benchmark algorithms HEFT and Max-Min. The Pareto fronts obtained for the CyberShake, Epigenomics, and Montage workflows demonstrate closer proximity to the optimal front, confirming the algorithm’s ability to balance conflicting objectives. This study contributes to optimizing scientific workflows in complex environments by providing solutions tailored to specific user needs while minimizing costs and execution times.
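A Pareto front, as used to compare the schedules above, is simply the set of non-dominated (makespan, cost) pairs. A minimal sketch of the dominance test and front extraction (assumes distinct objective pairs; names are illustrative):

```python
def pareto_front(schedules):
    """Return the non-dominated (makespan, cost) pairs.

    Schedule `a` dominates `b` if it is no worse in both objectives
    and strictly better in at least one (both objectives minimised).
    """
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b

    return [s for s in schedules
            if not any(dominates(other, s) for other in schedules)]
```

Comparing algorithms then reduces to asking whose front sits closer to the ideal (low-makespan, low-cost) corner, which is the comparison reported for CyberShake, Epigenomics, and Montage.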
Funding: the Third Xinjiang Scientific Expedition Program (2021xjkk1400), the National Natural Science Foundation of China (42071049), the Natural Science Foundation of Xinjiang Uygur Autonomous Region (2019D01C022), the Xinjiang Uygur Autonomous Region Innovation Environment Construction Special Project & Science and Technology Innovation Base Construction Project (PT2107), and the Tianshan Talent-Science and Technology Innovation Team (2022TSYCTD0006).
Funding: supported by the National Key R&D Program of China [grant number 2023YFC3008004].
Funding: supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) [grant number 2021R1A2C2010976].
Funding: supported by the China State Key Basic Research Project (2013CBA01802), the Chinese National Natural Science Foundation (41101337, 41001197, and 31228021), and the Fundamental Research Funds for the Central Universities (lzujbky-2013-103).
Funding: supported by Korea Science and Engineering Foundation (KOSEF) grants funded by the Korea government (MOST), R01-2006-000-10470-0 and R01-2006-000-11233-0 from the Basic Research Program of KOSEF, and by the "Maintenance and Research of Cloud Physical Observation System" and "Research for the Meteorological Observation Technology and its Application" projects of METRI, KMA.
Funding and acknowledgments: The authors benefited from discussions with Professors C.-H. Sui and Xu Huanbin. The comments of the three anonymous reviewers are acknowledged. This research was supported by the National Natural Science Foundation of China (Grant Nos. 40375036 and 40105006).
Abstract: Understanding the cloud processes of snowfall is essential to artificial snow enhancement and the numerical simulation of snowfall. The mesoscale model MM5 is used to simulate a moderate snowfall event in North China that occurred during 20-21 December 2002. Thirteen experiments are performed to test the sensitivity of the simulation to the cloud physics, using different cumulus parameterization schemes and different options for the Goddard cloud microphysics parameterization scheme. It is shown that the choice of cumulus parameterization scheme has little effect on the simulation result. The results also show that only four classes of water substance, namely cloud water, cloud ice, snow, and vapor, are present in the simulation of the moderate snowfall event. The analysis of the cloud microphysics budgets in the explicit experiment shows that the condensation of supersaturated vapor, the depositional growth of cloud ice, the initiation of cloud ice, the accretion of cloud ice by snow, the accretion of cloud water by snow, the depositional growth of snow, and the Bergeron process of cloud ice are the dominant cloud microphysical processes in the simulation. The accretion of cloud water by snow and the depositional growth of snow are equally important in the development of the snow.
Funding: Supported by the Antarctic Geography Information Acquisition and Environmental Change Research of China (No. 14601402024-04-06).
Abstract: In polar regions, cloud and the underlying ice-snow areas are difficult to distinguish in satellite images because both have high albedo in the visible band and ice-snow surfaces have low temperature in the infrared band. A cloud detection method over ice-snow covered areas in Antarctica is presented. Because cloud and ice-snow areas differ in texture, five texture features are extracted based on the gray-level co-occurrence matrix (GLCM). A nonlinear SVM is then used to obtain the optimal classification hyperplane from training data. The experimental results indicate that this algorithm performs well for cloud detection in Antarctica, especially for thin cirrus. Furthermore, when images are resampled to a quarter or 1/16 of the full size, the detected cloud percentages remain at the same level while the processing time decreases exponentially.
Abstract: The rapid advent of artificial intelligence and big data has revolutionized the dynamic demands on computing resources for executing specific tasks in the cloud environment. Achieving autonomic resource management is a herculean task because of the hugely distributed and heterogeneous environment. Moreover, the cloud network needs to provide autonomic resource management and deliver potential services to clients by complying with Quality-of-Service (QoS) requirements without impacting Service Level Agreements (SLAs). However, existing autonomic cloud resource management frameworks are not capable of handling cloud resources with dynamic requirements. In this paper, a Coot Bird Behavior Model-based Workload Aware Autonomic Resource Management Scheme (CBBM-WARMS) is proposed for handling the dynamic requirements of cloud resources by estimating the workload that the cloud environment must police. CBBM-WARMS first adopts an adaptive density peak clustering algorithm to cluster cloud workloads. It then applies fuzzy logic during workload scheduling to determine the availability of cloud resources, and uses the CBBM for Virtual Machine (VM) deployment, contributing to optimal resource provisioning. The scheme is designed to achieve optimal QoS with minimized time, energy consumption, SLA cost, and SLA violations. Experimental validation of the proposed CBBM-WARMS confirms an SLA cost minimized by 19.21% and an SLA violation rate reduced by 18.74%, better than the compared autonomic cloud resource management frameworks.
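As an illustration of the fuzzy-logic step, here is a minimal sketch of how a VM's availability might be scored from its CPU load using triangular membership functions. The membership shapes and rule weights are invented for illustration; the paper does not publish its rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def availability(cpu_load):
    """Fuzzy availability score of a VM from its CPU load in [0, 1].

    Illustrative rule base: low load -> available, high load -> busy,
    medium load contributes half weight to availability.
    """
    low = tri(cpu_load, -0.01, 0.0, 0.5)    # peaks at idle
    med = tri(cpu_load, 0.2, 0.5, 0.8)
    high = tri(cpu_load, 0.5, 1.0, 1.01)    # peaks at saturation
    score = low + 0.5 * med                 # weighted aggregation
    return score / (low + med + high + 1e-9)  # normalized defuzzification
```

A scheduler would then prefer VMs whose `availability` exceeds some threshold when placing newly clustered workloads.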
Funding: Supported by Grant (PLN2022-14) of the State Key Laboratory of Oil and Gas Reservoir Geology and Exploitation (Southwest Petroleum University).
Abstract: Well logging technology has accumulated a large amount of historical data through four generations of technological development, forming the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, or mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model faces great challenges with new evaluation objects, and research on integrating distributed storage, processing, and learning functions in a logging big data private cloud has not yet been carried out. This study establishes a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model integrating physical simulation and data models in a large-scale function space, thus addressing the geo-engineering evaluation problem of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, sharing of data, software, and results, accuracy, speed, and handling of complexity.
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (RS-2024-00399401, Development of Quantum-Safe Infrastructure Migration and Quantum Security Verification Technologies).
Abstract: With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The proposed encryption method addresses the inefficiency of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. Using this information, it accurately identifies the modified portions and selectively re-encrypts only those sections. This approach significantly improves the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate the approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. The results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that the method provides robust protection against both passive and active attacks.
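The bundle idea can be sketched as follows: each bundle is encrypted independently under a per-bundle, per-version keystream, so an edit re-encrypts exactly one ciphertext while every other bundle stays untouched. The SHA-256 counter keystream and the class layout below are illustrative stand-ins, not the paper's design; a real system would use an authenticated cipher such as AES-GCM:

```python
import hashlib

def _keystream(key: bytes, bundle_id: int, version: int, n: int) -> bytes:
    """Per-bundle, per-version keystream. SHA-256 counter mode is used
    purely for illustration and is not a secure cipher."""
    out = b""
    block = 0
    while len(out) < n:
        out += hashlib.sha256(key + bundle_id.to_bytes(4, "big")
                              + version.to_bytes(4, "big")
                              + block.to_bytes(4, "big")).digest()
        block += 1
    return out[:n]

def xor_crypt(key: bytes, bundle_id: int, version: int, data: bytes) -> bytes:
    """XOR with the keystream; applying it twice recovers the plaintext."""
    ks = _keystream(key, bundle_id, version, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

class BundleStore:
    """A document stored as independently encrypted bundles: an edit
    re-encrypts only the touched bundle, unlike whole-file CBC/CTR."""
    def __init__(self, key: bytes, bundles):
        self.key = key
        self.versions = [0] * len(bundles)
        self.cipher = [xor_crypt(key, i, 0, b) for i, b in enumerate(bundles)]

    def update(self, i: int, new_plaintext: bytes):
        self.versions[i] += 1            # new version => fresh keystream
        self.cipher[i] = xor_crypt(self.key, i, self.versions[i], new_plaintext)

    def read(self, i: int) -> bytes:
        return xor_crypt(self.key, i, self.versions[i], self.cipher[i])
```

The version counter plays the role of the update metadata described in the abstract: it guarantees a fresh keystream per rewrite without touching neighboring bundles.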
Abstract: The increasing use of cloud-based devices has brought cybersecurity and unwanted network traffic to a critical point. Cloud environments pose significant challenges in maintaining privacy and security, and global approaches such as intrusion detection systems (IDS) have been developed to tackle these issues. However, most conventional IDS models struggle with unseen cyberattacks and complex high-dimensional data. This paper introduces a novel distributed, explainable, and heterogeneous transformer-based intrusion detection system, named INTRUMER, which offers balanced accuracy, reliability, and security in cloud settings through multiple modules working together. Traffic captured from cloud devices is first passed to the TC&TM module, in which the Falcon Optimization Algorithm optimizes feature selection and a Naïve Bayes algorithm performs feature classification. The selected features are then forwarded to the Heterogeneous Attention Transformer (HAT) module, which takes the contextual interactions of the network traffic into account to classify it as normal or malicious. The classified results are further analyzed by the Explainable Prevention Module (XPM) to ensure trustworthiness by providing interpretable decisions. With the explanations from the classifier, emergency alarms are transmitted to nearby IDS modules, servers, and underlying cloud devices to enhance preventive measures. Extensive experiments on the benchmark IDS datasets CICIDS 2017, Honeypots, and NSL-KDD demonstrate the efficiency of the INTRUMER model in detecting different types of network traffic with high accuracy. The proposed model outperforms state-of-the-art approaches, obtaining better performance metrics: 98.7% accuracy, 97.5% precision, 96.3% recall, and 97.8% F1-score. These results validate the robustness and effectiveness of INTRUMER in securing diverse cloud environments against sophisticated cyber threats.
Abstract: This is an in-depth journey to experience the ice and snow of Changbai Mountain. Over these few days, you will explore Changbai Mountain and enjoy powder skiing; gallop down the ski trails; watch the stunning wonder of rime on thousands of trees; conquer the ice and snow wilderness on a snowmobile; and begin an in-depth magical mystery tour of Jilin Province.
Funding: Funded by the National Natural Science Foundation of China (Grant Nos. 42125604 & 42171143) and the Second Tibetan Plateau Scientific Expedition and Research Program (STEP) (Grant No. 2019QZKK0201).
Abstract: Snow density is a fundamental variable of the snowpack's physical evolution, reflecting its condition under thermal and gravitational compaction. Snow density is also the bridge for converting snow depth to snow water equivalent (SWE) in snow water resources research, so understanding its spatiotemporal distribution is important for appropriate SWE estimation. In this study, in situ snow densities from more than 6,000 stations in the Northern Hemisphere were used to analyze spatial and temporal variations in snow density. The results show that snow density varies spatially and temporally across the Northern Hemisphere, ranging from below 0.1 to over 0.4 g/cm^(3). The average snow densities in the mountainous regions of western North America, southeastern Canada, and Europe range from approximately 0.24 to 0.26 g/cm^(3), significantly greater than the values of 0.16-0.17 g/cm^(3) observed in Siberia, central Canada, the Great Plains of the United States, and China. The seasonal growth rates also show large spatial heterogeneity: over 0.024 g/cm^(3) per month in southeastern Canada, the western mountains of North America, and Europe, and approximately 0.017 g/cm^(3) per month in Siberia, both much larger than the roughly 0.004 g/cm^(3) per month in other regions. Snow cover duration is a critical factor determining snow density. This study supports the low snow densities reported in China from meteorological station observations, which arise because the stations are predominantly located in plain areas with relatively short snow cover duration and shallow snow.
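The depth-to-SWE conversion that motivates this work is direct: with water at 1 g/cm^(3), depth in cm times bulk density in g/cm^(3) gives SWE in cm of water, i.e. ten times that value in mm. A one-line helper, worked through with the regional mean densities quoted above:

```python
def swe_mm(depth_cm: float, density_g_cm3: float) -> float:
    """Snow water equivalent (mm of water) from snow depth (cm) and bulk
    density (g/cm^3). With water at 1 g/cm^3: SWE_mm = depth_cm * density * 10."""
    return depth_cm * density_g_cm3 * 10.0

# The same 50 cm snowpack holds ~50% more water in the maritime mountain
# regions (density ~0.24 g/cm^3) than in Siberia (~0.16 g/cm^3):
mountain = swe_mm(50, 0.24)   # ~120 mm
siberia = swe_mm(50, 0.16)    # ~80 mm
```

This is why a density bias (e.g. from stations concentrated in shallow-snow plains) propagates directly into SWE estimates.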
Abstract: Cloud detection is a critical preprocessing step in remote sensing image processing, as the presence of clouds significantly affects the accuracy of remote sensing data and limits its applicability across various domains. This study presents an enhanced cloud detection method based on the U-Net architecture, designed to address the challenges of multi-scale cloud features and long-range dependencies inherent in remote sensing imagery. A Multi-Scale Dilated Attention (MSDA) module is introduced to effectively integrate multi-scale information and model long-range dependencies across different scales, enhancing the model's ability to detect clouds of varying sizes. Additionally, a Multi-Head Self-Attention (MHSA) mechanism is incorporated to improve the model's capacity for capturing finer details, particularly in distinguishing thin clouds from surface features. A multi-path supervision mechanism is also devised to ensure the model learns cloud features at multiple scales, further boosting the accuracy and robustness of cloud mask generation. Experimental results demonstrate that the enhanced model achieves superior performance compared to other benchmarked methods in complex scenarios. It significantly improves cloud detection accuracy, highlighting its strong potential for practical applications in cloud detection tasks.
Funding: Shanxi Province Higher Education Science and Technology Innovation Fund Project (2022-676) and Shanxi Soft Science Program Research Fund Project (2016041008-6).
Abstract: To improve the efficiency of cloud-based web services, an improved plant growth simulation algorithm scheduling model is proposed. The model first uses mathematical methods to describe the relationships between cloud-based web services and the constraints of system resources. Then, a light-induced plant growth simulation algorithm is established. The performance of the algorithm was compared across several plant types, and the best plant model was selected as the system setting. Experimental results show that when the number of tested cloud-based web services reaches 2048, the model is 2.14 times faster than PSO, 2.8 times faster than the ant colony algorithm, 2.9 times faster than the bee colony algorithm, and a remarkable 8.38 times faster than the genetic algorithm.
Abstract: The ease of accessing a virtually unlimited pool of resources makes Infrastructure as a Service (IaaS) clouds an ideal platform for running data-intensive workflow applications comprising hundreds of computational tasks. However, executing scientific workflows in IaaS cloud environments poses significant challenges due to conflicting objectives, such as minimizing execution time (makespan) and reducing resource utilization costs. This study responds to the increasing need for efficient and adaptable optimization solutions in dynamic and complex environments, which are critical for meeting the evolving demands of modern users and applications. It presents an innovative multi-objective approach for scheduling scientific workflows in IaaS cloud environments. The proposed algorithm, MOS-MWMC, aims to minimize total execution time (makespan) and resource utilization costs by leveraging key features of virtual machine instances, such as a high number of cores and fast local SSD storage. By integrating realistic simulations based on the WRENCH framework, the method effectively dimensions the cloud infrastructure and optimizes resource usage. Experimental results highlight the superiority of MOS-MWMC compared to the benchmark algorithms HEFT and Max-Min. The Pareto fronts obtained for the CyberShake, Epigenomics, and Montage workflows demonstrate closer proximity to the optimal front, confirming the algorithm's ability to balance conflicting objectives. This study contributes to optimizing scientific workflows in complex environments by providing solutions tailored to specific user needs while minimizing costs and execution times.
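The Pareto-front comparison used in this kind of evaluation can be made concrete with a small helper: a schedule's (makespan, cost) pair is kept only if no other schedule is at least as good in both objectives and strictly better in one. This is a generic sketch of Pareto filtering, not the MOS-MWMC algorithm itself:

```python
def dominates(q, p):
    """q dominates p if q is no worse in both objectives (minimization)
    and strictly better in at least one."""
    return q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])

def pareto_front(points):
    """Non-dominated subset of (makespan, cost) pairs."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

"Closer proximity to the optimal front" then means the filtered set of one algorithm's schedules sits nearer the true trade-off curve than another's.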