The prevailing idea so far about why rainfall occurs was that, after agglutination of water droplets with condensation nuclei, the size of the particle formed by the condensation nuclei connected with droplets of water increased considerably and caused its fall. This idea has led to numerous scientific publications in which empirical distribution functions of cloud water droplet sizes were proposed. Estimated values provided by these empirical distribution functions were, in most cases, validated by comparison with UHF radar measurements. The condensation nuclei concept has not been sufficiently exploited, and this has led meteorologists into error in their attempts to describe clouds, thinking that clouds were formed of liquid water droplets. Indeed, the MBANE BIOUELE paradox (2005) confirms this embarrassing situation. In fact, when applying Archimedes' theorem to a liquid water droplet suspended in the atmosphere, we obtain a meaningless inequality which would imply that the densities of pure water in the liquid and solid phases are much lower than that of the atmosphere at sea level. This meaningless inequality is easy to contradict: of course, if you empty a bottle of pure liquid water into the ocean (where z equals 0), this water will not remain suspended in the air; that is, applying Archimedes' theorem makes it clear that there are no liquid (or solid) water droplets suspended in clouds. Indeed, all liquid (or solid) water droplets that form in clouds fall under the effect of gravity and produce rain. This means that our current description of clouds is totally wrong. In this study, we describe a cloud as a gas composed of dry air and saturated water vapor whose optical properties depend on temperature, i.e., when the temperature of a cloud decreases, the color of this gaseous system tends towards white.
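A back-of-the-envelope check of the buoyancy comparison invoked above, using standard sea-level values: the Archimedes (buoyant) force on a droplet is the droplet's volume times the air density times g, roughly 800 times smaller than its weight. A minimal sketch, in which the droplet radius is an illustrative assumption:

```python
# Weight vs. Archimedes buoyant force for a liquid water droplet at sea
# level; the 10-micron radius is an illustrative assumption.
import math

rho_water = 1000.0   # kg/m^3, liquid water
rho_air = 1.225      # kg/m^3, air at sea level
g = 9.81             # m/s^2
r = 10e-6            # droplet radius (assumed)

vol = 4.0 / 3.0 * math.pi * r ** 3
weight = rho_water * vol * g
buoyancy = rho_air * vol * g
print(f"weight = {weight:.3e} N, buoyancy = {buoyancy:.3e} N")
print(f"ratio  = {weight / buoyancy:.0f}")   # ~816, independent of radius
```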
Light Detection And Ranging (LiDAR) is a well-established active remote sensing technology that can provide accurate digital elevation measurements for the terrain and for non-ground objects such as vegetation and buildings. Non-ground objects need to be removed to create a Digital Terrain Model (DTM), a continuous surface representing only ground surface points. This study aimed at a comparative analysis of three main filtering approaches for stripping off non-ground objects, namely the Gaussian low pass filter, the focal analysis mean filter, and the DTM slope-based filter, at varying window sizes, in the creation of a reliable DTM from airborne LiDAR point clouds. A sample of LiDAR data provided by the ISPRS WG III/4, captured at Vaihingen in Germany over a purely residential area, has been used in the analysis. Visual analysis has indicated that the Gaussian low pass filter gives blurred DTMs with attenuated high-frequency objects and emphasized low-frequency objects, while achieving improved removal of non-ground objects at larger window sizes. The focal analysis mean filter has shown better removal of non-ground objects than the Gaussian low pass filter, especially at large window sizes: details of non-ground objects have almost vanished in the DTMs at window sizes of 25 × 25 and greater. The DTM slope-based filter has created bare-earth models full of gaps at the positions of the non-ground objects, where the sizes and numbers of those gaps increase with the window size of the filter. Those gaps have been closed through spline interpolation in order to obtain a continuous surface representing the bare-earth landscape. Comparative analysis has shown that the minimum elevations of the DTMs increase with increasing filter window size up to 21 × 21 and 31 × 31 for the Gaussian low pass filter and the focal analysis mean filter, respectively. The DTM slope-based filter, on the other hand, has kept the minimum elevation of the original data unchanged, which could be due to noise in the LiDAR data. All three approaches have produced DTMs of decreasing maximum elevation values, and consequently decreasing elevation ranges, as the filter window size increases. Moreover, the standard deviations of the created DTMs from the three filters have decreased with increasing filter window size; the decreases have been continuous and steady for the Gaussian low pass filter and the focal analysis mean filter, while for the DTM slope-based filter the standard deviations decreased at a high rate up to a window size of 31 × 31 and then remained unchanged with further increases in the filter window size.
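For illustration, the three filters compared above can be sketched on a gridded elevation raster as follows. This is a minimal sketch of the general techniques, not the study's exact processing chain; the window size, cell size, and slope threshold are assumptions, and the gaps the slope-based filter leaves (NaNs below) would be closed by spline interpolation as described.

```python
# Three DTM filters applied to a gridded digital surface model `dsm`
# (a 2-D float array); parameter values are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter, minimum_filter

def three_filters(dsm, window=25, cell_size=0.5, max_slope=0.3):
    # 1) Gaussian low-pass: attenuates high-frequency (non-ground) detail.
    dtm_gauss = gaussian_filter(dsm, sigma=window / 6.0)
    # 2) Focal-analysis mean: plain moving average over the window.
    dtm_mean = uniform_filter(dsm, size=window)
    # 3) Slope-based: a cell is non-ground when it rises above the local
    #    minimum faster than a ground surface of slope `max_slope` could.
    local_min = minimum_filter(dsm, size=window)
    tol = max_slope * (window // 2) * cell_size
    dtm_slope = np.where(dsm - local_min > tol, np.nan, dsm)  # NaN = gap
    return dtm_gauss, dtm_mean, dtm_slope
```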
During daylight laser polarization sensing of high-level clouds (HLCs), the lidar receiving system generates a signal caused not only by backscattered laser radiation, but also by scattered solar radiation, whose intensity and polarization depend on the Sun's location. If a cloud contains spatially oriented ice particles, then it becomes anisotropic; that is, the coefficients of directional light scattering of such a cloud depend on the Sun's zenith and azimuth angles. In this work, the possibility of using the effect of anisotropic scattering of solar radiation to improve the predictive ability of machine learning algorithms in the problem of predicting the HLC backscattering phase matrix (BSPM) was evaluated. The hypothesis that solar radiation scattered on HLCs has no effect on the BSPM elements of such clouds determined with a polarization lidar was tested. The operation of two algorithms for predicting the BSPM elements is evaluated. To train the first one, meteorological data were used as input parameters; for the second algorithm, the azimuthal and zenith angles of the Sun's position were added to the meteorological parameters. It is shown that there is no significant improvement in the predictive ability of the algorithm.
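The two-algorithm comparison described above can be sketched as follows: train one regressor on meteorological inputs alone and a second with the Sun's zenith and azimuth angles appended, then compare held-out errors. The data below are synthetic placeholders and the feature set is an assumption, not the authors' actual inputs; with synthetic data in which the sun angles carry no signal, the two errors come out similar, mirroring the reported result.

```python
# Compare a regressor trained on meteorological features alone vs. the
# same features plus the Sun's zenith/azimuth angles (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000
meteo = rng.normal(size=(n, 5))        # e.g. T, p, RH, wind u/v (assumed)
sun = rng.uniform(0, 1, size=(n, 2))   # normalized zenith, azimuth angles
bspm_elem = meteo @ rng.normal(size=5) + rng.normal(scale=0.3, size=n)

for name, X in [("meteo only", meteo),
                ("meteo + sun", np.hstack([meteo, sun]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, bspm_elem, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    print(name, mean_absolute_error(y_te, model.predict(X_te)))
```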
An analysis of global radiation measurements and fractional cloud cover observations made in the Israel Meteorological Service's network of climate stations demonstrated a significant decrease in the transmittance of solar radiation through the atmosphere during the last 60 years. The major cause was the reduced transparency of clouds. Under completely overcast skies, transmission in the industrialized central coastal region decreased from 0.41 in the mid-20th century to 0.21 in the first decade of the 21st century. Under cloudless skies the reduction in the transmission of global radiation was smaller, from 0.79 to 0.71, and not statistically significant. Similar but somewhat smaller changes were observed in the less industrialized central hill region. Multi-linear analysis showed that since 1970, 61% of the measured decline in global radiation was attributable to changes in fractional cloud cover but only 2% to the marked increase in local fuel combustion; there was no statistically significant interaction between the two parameters.
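A hedged sketch of the multi-linear analysis mentioned above: regress annual global radiation on fractional cloud cover, a fuel-combustion index, and their interaction, then compare the variance explained as each term is added. All series below are synthetic placeholders, not the IMS station records.

```python
# Sequential attribution of a radiation decline via multi-linear
# regression on synthetic yearly series (placeholders, not IMS data).
import numpy as np

rng = np.random.default_rng(1)
n = 41                                   # years 1970-2010
cloud = rng.uniform(0.2, 0.6, n)         # fractional cloud cover (synthetic)
fuel = np.linspace(1.0, 3.0, n)          # fuel-combustion index (synthetic)
radiation = 250 - 80 * cloud - 1.0 * fuel + rng.normal(0, 5, n)

def r2(X, y):
    """Fraction of variance in y explained by a least-squares fit on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

ones = np.ones(n)
print("cloud only   :", r2(np.column_stack([ones, cloud]), radiation))
print("cloud + fuel :", r2(np.column_stack([ones, cloud, fuel]), radiation))
print("+ interaction:", r2(np.column_stack([ones, cloud, fuel, cloud * fuel]),
                           radiation))
```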
This study numerically investigates the formation of high-velocity molecular clouds (HVMCs) in the Galactic Center (GC) based on X-ray emission analysis. We employ three-dimensional magnetohydrodynamic simulations to explore the propagation and acceleration of HVMCs with starburst-driven winds, considering vertical, horizontal, and no magnetic field scenarios. Our results reveal that the envelope gas (with a typical T ~ 10^8 K and density ~ 10^(-2) cm^(-3)) of molecular clouds (MCs), produced by the shock interaction, is responsible for the X-ray emission. Additionally, clear boundaries exist between the interstellar medium (ISM), the envelope gas, and the MCs, and the envelope gas protects the MCs in the heated environment of the shock wave. In theory, it is challenging to distinguish between the envelope gas, MCs, and ISM in terms of X-ray emission. Our simulations suggest that the envelope gas has a significant impact on the survival and emission characteristics of MCs, providing insights into the complex interactions arising from supernova feedback mechanisms in the GC.
Cloud is essential in the atmosphere, condensing water vapor and generating strong convective or large-scale persistent precipitation. In this work, the relationships between cloud vertical macro- and microphysical properties, radiative heating rate, and precipitation for convective and stratiform clouds in boreal summer over the Tibetan Plateau (TP) are analyzed and compared with those over its neighboring land and the tropical oceans, based on CloudSat/CALIPSO satellite measurements and TRMM precipitation data. The precipitation intensity caused by convective clouds is twofold stronger than that caused by stratiform clouds. The vertical macrophysics of both cloud types show similar features over the TP, with the region weakening the precipitation intensity and compressing the cloud vertical expansion and the variation in cloud top height, but having an uplifting effect on the average cloud top height. The vertical microphysics of both cloud types under no-rain conditions over the TP are characterized by lower-level ice water, ice particles with a relatively larger range of sizes, and a relatively lower occurrence of denser ice particles. These features are similar to other regions when precipitation strengthens, but convective clouds gather denser and larger ice particles than stratiform clouds over the TP. The atmospheric shortwave (longwave) heating (cooling) rate strengthens with increased precipitation for both cloud types. The longwave cooling layer is thicker when the rainfall rate is less than 100 mm d^(-1), but the net heating layer is typically compressed in the profiles of both cloud types over the TP. This study provides insights into the associations between clouds and precipitation, and an observational basis for improving the simulation of convective and stratiform clouds over the TP in climate models.
To calculate the diffusion law of a chaff cloud launched by an aircraft, taking rectangular chaff as an example, a diffusion model of the chaff cloud is established in this paper. Firstly, the coordinate systems of the chaff are defined and the motion model of the chaff is established. The motion model mainly includes the chaff motion equation and rotation equation, which are obtained by combining the aerodynamic moment and aerodynamic damping. Then, the influence of multi-chaff aerodynamic interference on the movement of the chaff is analyzed. Finally, considering the influence of the overlap area between chaffs and the chaff spacing on the aerodynamic coefficients, the multi-chaff motion model is obtained, and the simulation results are compared with test results to verify the credibility of the model.
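As a much-simplified illustration of the translational part of such a motion model, the sketch below integrates a single chaff strip falling under gravity with quadratic aerodynamic drag. The rotation equation, aerodynamic moments, and multi-chaff interference, which are central to the paper's model, are omitted, and all parameter values are assumptions.

```python
# Explicit integration of one chaff strip's translational motion under
# gravity and quadratic drag; all parameter values are assumptions.
import numpy as np

g = np.array([0.0, 0.0, -9.81])                 # gravity, m/s^2
rho, area, cd, mass = 1.225, 3e-4, 1.2, 2e-5    # air, strip (assumed)

def fall(v0, dt=1e-3, t_end=5.0):
    v, x, traj = np.array(v0, float), np.zeros(3), []
    for _ in range(int(t_end / dt)):
        # Drag acceleration: -0.5 * rho * Cd * A * |v| * v / m
        drag = -0.5 * rho * cd * area * np.linalg.norm(v) * v / mass
        v = v + (g + drag) * dt
        x = x + v * dt
        traj.append(x.copy())
    return np.array(traj)

print(fall([30.0, 0.0, 0.0])[-1])   # position after 5 s of ejection at 30 m/s
```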
Forest is one of the most challenging environments to record in a three-dimensional (3D) digitized geometrical representation, because of the size and complexity of the environment and the data-acquisition constraints imposed by on-site conditions. Previous studies have indicated that the data-acquisition pattern can have more influence on the registration results than other factors. In practice, the ideal short-baseline observations, i.e., the dense collection mode, are rarely feasible, considering the low accessibility of forest environments and the commonly limited labor and time resources. Wide-baseline observations, which cover a forest site using several folds fewer observations than short-baseline observations, are therefore preferable and commonly applied. Nevertheless, the wide-baseline approach is more challenging for data registration, since it typically lacks sufficient overlaps between datasets. Until now, a robust automated registration solution that is independent of special hardware requirements has been missing. That is, the registration accuracy is still far from the required level, and the information extractable from a point cloud merged by automated registration cannot match that from a point cloud merged by manual registration. This paper proposes a discrete overlap search (DOS) method to find correspondences in the point clouds, solving the low-overlap problem in wide-baseline point clouds. The proposed automatic method uses potential correspondences from both the original data and selected feature points to reconstruct rough observation geometries without external knowledge and to retrieve precise registration parameters at the data level. An extensive experiment was carried out with 24 forest datasets of different conditions, categorized into three difficulty levels. The performance of the proposed method was evaluated using various accuracy criteria, as well as on data acquired with different hardware, platforms, and viewing perspectives, and at different points in time. The proposed method achieved a 3D registration accuracy at the 0.50-cm level in all difficulty categories using static terrestrial acquisitions. In the terrestrial-aerial registration, where datasets were collected with different sensors and at different points in time with scene changes, a registration accuracy at the level of the raw data's geometric accuracy was achieved. These results represent the highest automated registration accuracy and the strictest evaluation so far. The proposed method is applicable in multiple scenarios, such as 1) the global positioning of individual under-canopy observations, which is one of the main challenges in applying terrestrial observations lacking a global context; 2) the fusion of point clouds acquired from terrestrial and aerial perspectives, which is required to achieve a complete forest observation; and 3) mobile mapping using a new stop-and-go approach, which solves the problems of the lack of mobility and slow data collection in static terrestrial measurements, as well as the data-quality issue of the continuous mobile approach. Furthermore, this work proposes a new error estimate that unites all parameter-level errors into a single quantity and compensates for the downsides of the widely used parameter- and object-level error estimates; it also proposes a new deterministic point-set registration method as an alternative to the popular sampling methods.
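Whatever method finds the correspondences, the final step of retrieving rigid registration parameters from matched point pairs has a standard closed-form solution (the SVD/Kabsch estimator). The sketch below shows that generic building block only; it is not the DOS algorithm itself.

```python
# Closed-form rigid registration (Kabsch/SVD) from point correspondences.
import numpy as np

def rigid_transform(src, dst):
    """Least-squares R, t such that dst ≈ src @ R.T + t."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Quick check: recover a known rotation and translation of random points.
rng = np.random.default_rng(0)
P = rng.normal(size=(100, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_transform(P, Q)
assert np.allclose(R, R_true, atol=1e-8)
```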
Efficient three-dimensional (3D) building reconstruction from drone imagery often faces data acquisition, storage, and computational challenges because of its reliance on dense point clouds. In this study, we introduce a novel method for efficient and lightweight 3D building reconstruction from drone imagery using line clouds and sparse point clouds. Our approach eliminates the need to generate dense point clouds and thus significantly reduces the computational burden by reconstructing 3D models directly from sparse data. We address the limitations of line clouds for plane detection and reconstruction with a new algorithm that projects 3D line clouds onto a 2D plane, clusters the projections to identify potential planes, and refines them using sparse point clouds to ensure accurate and efficient model reconstruction. Extensive qualitative and quantitative experiments demonstrate the effectiveness of our method and its superiority over existing techniques in terms of simplicity and efficiency.
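A hedged sketch of the projection-and-clustering idea: each 3D line segment, given by its endpoints, is projected onto the horizontal plane, described by its (undirected) direction angle and signed offset from the origin, and those descriptors are clustered; segments falling in one cluster are candidates for a single vertical plane. The feature construction, scaling, and DBSCAN parameters are assumptions, and the refinement against sparse points is omitted.

```python
# Cluster projected line segments into candidate vertical planes.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_vertical_planes(lines):                 # lines: (N, 2, 3) endpoints
    p, q = lines[:, 0, :2], lines[:, 1, :2]         # drop z: project to 2-D
    d = q - p
    theta = np.arctan2(d[:, 1], d[:, 0]) % np.pi    # undirected line angle
    n = np.column_stack([np.sin(theta), -np.cos(theta)])  # 2-D unit normal
    rho = np.einsum("ij,ij->i", n, p)               # signed offset from origin
    # cos/sin of 2*theta handles the pi-periodicity of undirected lines;
    # the 0.1 weight on rho is an arbitrary scaling assumption.
    feats = np.column_stack([np.cos(2 * theta), np.sin(2 * theta), 0.1 * rho])
    return DBSCAN(eps=0.15, min_samples=3).fit_predict(feats)
```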
Mixed-phase clouds (MPCs) involve complex microphysical and dynamical processes of cloud formation and dissipation, which are crucial for numerical weather prediction and cloud-climate feedback. However, satellite remote sensing of MPC properties is still challenging, and few MPC results have been inferred from passive spectral observations. This study examines the spectral characteristics of MPCs in the shortwave-infrared (SWIR) channels over the wavelength range 0.4–2.5 μm, and evaluates the potential of current operational satellite spectroradiometer channels for MPC retrievals. With optical properties of MPCs based on the assumption of uniform mixing of ice and liquid water particles, the effects of the MPC ice optical thickness fraction (IOTF) and effective radius on the associated optical properties are analyzed. As expected, the results indicate that the MPC optical properties show features of both ice and liquid water clouds, and their spectral variations differ noticeably from those of homogeneous cases. A radiative transfer method is employed to examine the sensitivity of the SWIR channels to a given MPC cloud water path (CWP) and IOTF. MPCs have unique signal characteristics in the SWIR spectrum. The 0.87-μm channel is most sensitive to CWP. Meanwhile, the 1.61- and 2.13-μm channels are more sensitive to water-dominated MPCs (IOTF approaching 0), and the 2.25-μm channel is sensitive to both water-dominated and ice-dominated MPCs (IOTF approaching 1). Such spectral differences could potentially be used to infer MPC properties from radiometer observations, which will be investigated in future studies.
Rock discontinuities control rock mechanical behaviors and significantly influence the stability of rock masses. However, existing discontinuity mapping algorithms are susceptible to noise, and their calculation results cannot be fed back to users in a timely manner. To address this issue, we propose a human-machine interaction (HMI) method for discontinuity mapping, in which users help the algorithm identify noise and make real-time judgments on the results and parameter adjustments. A regular cube was selected to illustrate the workflow: (1) a point cloud was acquired using remote sensing; (2) the HMI method was employed to select reference points and angle thresholds to detect group discontinuities; (3) individual discontinuities were extracted from the group discontinuities using a density-based clustering algorithm; and (4) the orientation of each discontinuity was measured using a plane-fitting algorithm. The method was applied to a well-studied highway road cut and a complex natural slope. The consistency of the computational results with field measurements demonstrates its good accuracy, with an average error in dip direction and dip angle of less than 3° for both cases. Finally, the computational time of the proposed method was compared with that of two other popular algorithms, and a reduction in computational time by tens of times proves its high computational efficiency. This method provides geologists and geological engineers with a new way to map rock structures rapidly and accurately under large amounts of noise or unclear features.
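Step (4) above, measuring an orientation from a fitted plane, reduces to converting the plane's unit normal into dip direction and dip angle. A minimal sketch, assuming an x = east, y = north, z = up frame (a common convention, not necessarily the paper's):

```python
# Fit a plane to one discontinuity's points by PCA and convert the
# normal to dip direction / dip angle (degrees), x=east, y=north, z=up.
import numpy as np

def plane_orientation(points):                    # points: (N, 3) array
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]                                    # least-variance axis = normal
    if n[2] < 0:                                  # make the normal point upward
        n = -n
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0  # azimuth from north
    return dip_dir, dip
```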
DNAN-based insensitive melt-cast explosives have been widely utilized in insensitive munitions in recent years. When constrained DNAN-based melt-cast explosives are ignited under thermal stimulation, the base explosive exists in a molten liquid state, where high-temperature gases expand and react in the form of bubble clouds within the liquid explosive; this process is distinctly different from the dynamic crack propagation observed in solid explosives. In this study, a control model for the reaction evolution of burning-bubble clouds was established to describe the reaction process and quantify the reaction violence of DNAN-based melt-cast explosives, considering the size distribution and activation mechanism of the burning-bubble clouds. The feasibility of the model was verified against experimental results. The results revealed that under geometrically similar conditions, with identical confinement strength and aspect ratio, larger charge structures led to extended initial gas flow and surface burning processes, resulting in greater reaction equivalence and violence at the casing fracture. Under constant charge volume and size, stronger casing confinement accelerated self-enhanced burning, increasing the internal pressure, reaction degree, and reaction violence. Under constant casing thickness and radius, higher aspect ratios led to greater reaction violence at the casing fracture. Moreover, under constant charge volume and casing thickness, higher aspect ratios resulted in higher internal pressure, increased reaction degree, and greater reaction violence at the casing fracture. Further, larger ullage volumes extended the reaction evolution time and increased the reaction violence under constant casing dimensions. Through a matched design of the opening threshold of the pressure relief holes and the relief structure area, a stable burning reaction could be maintained until completion, thereby achieving control of the reaction violence. The proposed model effectively reflects the effects of the intrinsic burning rate, casing confinement strength, charge size, ullage volume, and pressure relief structure on the reaction evolution process and reaction violence, providing a theoretical method for the thermal safety design and reaction violence evaluation of melt-cast explosives.
The degree of spatial similarity plays an important role in map generalization, yet there has been no quantitative research into it. To fill this gap, this study first defines map scale change and the spatial similarity degree/relation in multi-scale map spaces, and then proposes a model for calculating the degree of spatial similarity between a point cloud at one scale and its generalized counterpart at another scale. After validation, the new model yields 16 points with map scale change as the x coordinate and the degree of spatial similarity as the y coordinate. Finally, by curve fitting, the model yields an empirical formula that can calculate the degree of spatial similarity using map scale change as the sole independent variable, and vice versa. This formula can be used to automate algorithms for point feature generalization and to determine when to terminate them during generalization.
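The final curve-fitting step can be sketched as below: fit a one-variable formula to the model's (scale change, similarity) points and invert it. The sample points and the power-law form are illustrative assumptions, not the paper's published data or empirical formula.

```python
# Fit similarity degree as a function of map scale change and invert it;
# both the data points and the power-law form are assumptions.
import numpy as np
from scipy.optimize import curve_fit

x = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])       # scale change (assumed)
y = np.array([1.0, 0.72, 0.55, 0.41, 0.30, 0.22])    # similarity (assumed)

def model(x, a, b):
    return a * x ** (-b)                 # one candidate functional form

(a, b), _ = curve_fit(model, x, y, p0=(1.0, 0.5))
print(f"similarity ≈ {a:.3f} * change^(-{b:.3f})")
# Inverting the same formula: change ≈ (a / similarity) ** (1 / b)
```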
The remote data integrity auditing technology can guarantee the integrity of outsourced data in clouds. Users can periodically run an integrity auditing protocol by interacting with the cloud server to verify the latest status of their outsourced data. Integrity auditing requires the user to perform massive time-consuming computations, which would not be affordable by weak devices. In this paper, we propose a privacy-preserving TPA-aided remote data integrity auditing scheme based on Li et al.'s data integrity auditing scheme without bilinear pairings, where a third-party auditor (TPA) is employed to perform integrity auditing on outsourced data for users. The privacy of outsourced data is guaranteed against the TPA in the sense that the TPA cannot infer its contents from the proofs returned in the integrity auditing phase. Our construction is as efficient as Li et al.'s scheme; that is, each procedure takes the same time-consuming operations in both schemes, and our solution does not increase the sizes of the processed data, challenge, and proof.
Recognizing discontinuities within rock masses is a critical aspect of rock engineering. The development of remote sensing technologies has significantly enhanced the quality and quantity of the point clouds collected from rock outcrops. In response, we propose a workflow that balances accuracy and efficiency to extract discontinuities from massive point clouds. The proposed method employs voxel filtering to downsample point clouds, constructs a point cloud topology using K-d trees, utilizes principal component analysis to calculate the point cloud normals, and employs the pointwise clustering (PWC) algorithm to extract discontinuities from rock outcrop point clouds. This method provides information on the location and orientation (dip direction and dip angle) of the discontinuities, and the modified whale optimization algorithm (MWOA) is utilized to identify the major discontinuity sets and their average orientations. Performance evaluations based on three real cases demonstrate that the proposed method significantly reduces computational time without sacrificing accuracy. In particular, the method yields more reasonable extraction results for discontinuities with certain undulations. The presented approach offers a novel tool for efficiently extracting discontinuities from large-scale point clouds.
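The first stages of the workflow, voxel downsampling, K-d-tree neighborhoods, and PCA normals, are standard and can be sketched compactly. The voxel size and neighbor count below are illustrative assumptions, and the PWC and MWOA stages are omitted.

```python
# Voxel downsampling plus K-d-tree / PCA normal estimation for a point
# cloud `pts` of shape (N, 3); parameter values are assumptions.
import numpy as np
from scipy.spatial import cKDTree

def voxel_downsample(pts, voxel=0.05):
    keys = np.floor(pts / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)  # one point per voxel
    return pts[np.sort(idx)]

def pca_normals(pts, k=20):
    tree = cKDTree(pts)
    _, nbrs = tree.query(pts, k=k)
    normals = np.empty_like(pts)
    for i, nb in enumerate(nbrs):
        patch = pts[nb] - pts[nb].mean(axis=0)
        # Eigenvector of the smallest eigenvalue = local surface normal.
        _, vecs = np.linalg.eigh(patch.T @ patch)
        normals[i] = vecs[:, 0]
    return normals
```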
Existing reverse-engineering methods struggle to directly generate editable, parametric CAD models from scanned data. To address this limitation, this paper proposes a reverse-modeling approach that reconstructs parametric CAD models from multi-view RGB-D point clouds. Multi-frame point-cloud registration and fusion are first employed to obtain a complete 3-D point cloud of the target object. A region-growing algorithm that jointly exploits color and geometric information segments the cloud, while RANSAC robustly detects and fits basic geometric primitives. These primitives serve as nodes in a graph whose edge features are inferred by a graph neural network to capture spatial constraints. From the detected primitives and their constraints, a high-accuracy, fully editable parametric CAD model is finally exported. Experiments show an average parameter error of 0.3 mm for key dimensions and an overall geometric reconstruction accuracy of 0.35 mm. The work offers an effective technical route toward automated, intelligent 3-D reverse modeling.
An increasing number of enterprises have adopted cloud computing in recent years to manage their important business applications in distributed green cloud (DGC) systems, for low response time and high cost-effectiveness. Task scheduling and resource allocation in DGCs have gained attention in both academia and industry, as DGCs are costly to manage because of their high energy consumption. Many factors in DGCs, e.g., power grid prices and the amount of green energy, show strong spatial variations. The dramatic increase in arriving tasks makes it a big challenge to minimize the energy cost of a DGC provider in a market where all the above factors possess spatial variations. This work adopts a G/G/1 queuing system to analyze the performance of servers in DGCs. Based on it, a single-objective constrained optimization problem is formulated and solved by a proposed simulated-annealing-based bees algorithm (SBA). SBA can minimize the energy cost of a DGC provider by optimally allocating the tasks of heterogeneous applications among multiple DGCs, and by specifying the running speed of each server and the number of powered-on servers in each GC, while strictly meeting the response time limits of the tasks of all applications. Experimental results based on realistic data prove that SBA achieves lower energy cost than several benchmark scheduling methods.
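Below is a generic simulated-annealing skeleton for the kind of allocation SBA performs, assigning tasks to clouds to minimize an energy-cost objective. The cost function and neighborhood move are toy placeholders; the actual SBA hybridizes the bees algorithm, also sets server speeds and powered-on counts, and enforces per-application response-time limits.

```python
# Generic simulated annealing over a task-to-cloud assignment.
import math, random

def anneal(init, cost, neighbor, t0=1.0, alpha=0.995, steps=20000):
    x, fx = init, cost(init)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        # Always accept improvements; accept worse moves with Boltzmann prob.
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha                      # geometric cooling schedule
    return best, fbest

# Toy instance: assign 50 tasks to 4 clouds with per-cloud energy prices.
prices = [1.0, 0.7, 1.3, 0.9]

def cost(assign):
    return sum(prices[c] for c in assign)

def neighbor(assign):
    a = list(assign)
    a[random.randrange(len(a))] = random.randrange(len(prices))
    return a

best, energy = anneal([random.randrange(4) for _ in range(50)], cost, neighbor)
print(energy)   # converges toward 50 * 0.7 = 35.0
```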
Many enterprises and individuals are inclined to outsource their data to public clouds, but security and privacy are two critical problems that cannot be ignored. The cloud provider's defenses may be broken, and the data may also be mined by the providers themselves to find valuable information. In this paper, a secure and efficient storage file (SES FS) system is proposed that distributes files across several clouds and allows users to search the files securely and efficiently. In the proposed system, keywords are transformed into integers and secretly shared in a defined finite field, and the shares are then mapped to random numbers in a specified random domain in each cloud. Files are encrypted with distinct secret keys and scattered across different clouds. Information about keyword/file associations is secretly shared among the cloud providers. Legal users can search the clouds to find the correct encrypted files and reconstruct the corresponding secret key. No adversary can find or detect the real file information even if they collude with all the servers. Manipulation of shares by one or more clouds can be detected with high probability. The system can also detect malicious servers through introduced virtual points. One interesting property of the scheme is that new keywords can be added easily, which is difficult and usually inefficient in many searchable symmetric encryption systems. Detailed experimental results show that, with tolerable uploading delay, the scheme exhibits excellent performance in data retrieval.
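The keyword-splitting idea rests on secret sharing over a finite field: any t shares reconstruct the keyword integer, while fewer reveal nothing. A minimal Shamir sketch follows; the subsequent mapping of shares into per-cloud random domains described above is omitted.

```python
# Shamir (t, n) secret sharing of a keyword integer over a prime field.
import random

P = 2**127 - 1                                  # a Mersenne prime field

def share(secret, t, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]           # evaluate poly at x = 1..n

def reconstruct(shares):                        # Lagrange interpolation at 0
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

kw = int.from_bytes(b"cloud", "big")            # keyword -> integer
assert reconstruct(share(kw, t=3, n=5)[:3]) == kw   # any 3 of 5 shares suffice
```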
Data-intensive computing is expected to be the next-generation IT computing paradigm. Data-intensive workflows in clouds are becoming more and more popular, and how to schedule them efficiently has become a key issue. In this paper, first, we build a directed hypergraph model for data-intensive workflows, since hypergraphs can more accurately model communication volume and better represent asymmetric problems, and the cut metric of hypergraphs is well suited to minimizing the total volume of communication. Second, we propose the concept of data supportive ability to help represent data-intensive workflow applications and provide the merge operation details that take the data supportive ability into account. Third, we present an optimized hypergraph multi-level partitioning algorithm. Finally, we introduce a data-reduced scheduling policy, HEFT-P, for data-intensive workflows. Through simulation, we compare HEFT-P with three typical workflow scheduling policies. The results indicate that HEFT-P obtains reduced data scheduling and reduces the makespan of executing data-intensive workflows.
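HEFT-P builds on HEFT, whose core priority measure is the upward rank: a task's rank is its mean computation cost plus the costliest communication-plus-rank path through its successors, and tasks are scheduled in decreasing rank order. A minimal sketch on a toy DAG with illustrative costs:

```python
# HEFT upward-rank computation on a toy workflow DAG (costs assumed).
from functools import lru_cache

succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
w = {"A": 10, "B": 6, "C": 8, "D": 4}           # mean computation cost
comm = {("A", "B"): 2, ("A", "C"): 3, ("B", "D"): 1, ("C", "D"): 2}

@lru_cache(maxsize=None)
def rank_u(task):
    children = succ[task]
    if not children:
        return w[task]
    return w[task] + max(comm[task, c] + rank_u(c) for c in children)

order = sorted(w, key=rank_u, reverse=True)     # HEFT scheduling priority
print([(t, rank_u(t)) for t in order])          # A=27, C=14, B=11, D=4
```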
文摘The prevailing idea so far about why the rainfall occurs was that after agglutination of water droplets with condensation nuclei, the size of the particle formed by the condensation nuclei connected with droplets of water increased considerably and caused its fall. This idea has led to numerous scientific publications in which empirical distribution functions of clouds’ water droplets sizes were proposed. Estimates values provided by these empirical distribution functions, in most cases, were validated by comparison with UHF Radar measurements. The condensation nuclei concept has not been sufficiently exploited and this has led meteorologists to error, in their attempt to describe the clouds, thinking that clouds were formed by liquid water droplets. Indeed, MBANE BIOUELE paradox (2005) confirms this embarrassing situation. In fact, when applying Archimedes theorem to a liquid water droplet suspended in the atmosphere, we obtain a meaningless inequality ?which makes believe that the densities of pure water in liquid and solid phases are much lower than that of the atmosphere considered at the sea level. This meaningless inequality is easy to contradict: of course, if you empty a bottle of pure liquid water in the ocean (where z is equal to 0), this water will not remain suspended in the air, i.e., application of Archimedes’ theorem allows realizing that there is no liquid (or solid) water droplet, suspended in the clouds. Indeed, all liquid (or solid) water droplets which are formed in clouds, fall under the effect of gravity and produce rains. This means that our current description of the clouds is totally wrong. In this study, we describe the clouds as a gas composed of dry air and saturated water vapor whose optical properties depend on temperature, i.e., when the temperature of a cloud decreases, the color of this gaseous system tends towards white.
文摘Light Detection And Ranging (LiDAR) is a well-established active remote sensing technology that can provide accurate digital elevation measurements for the terrain and non-ground objects such as vegetations and buildings, etc. Non-ground objects need to be removed for creation of a Digital Terrain Model (DTM) which is a continuous surface representing only ground surface points. This study aimed at comparative analysis of three main filtering approaches for stripping off non-ground objects namely;Gaussian low pass filter, focal analysis mean filter and DTM slope-based filter of varying window sizes in creation of a reliable DTM from airborne LiDAR point clouds. A sample of LiDAR data provided by the ISPRS WG III/4 captured at Vaihingen in Germany over a pure residential area has been used in the analysis. Visual analysis has indicated that Gaussian low pass filter has given blurred DTMs of attenuated high-frequency objects and emphasized low-frequency objects while it has achieved improved removal of non-ground object at larger window sizes. Focal analysis mean filter has shown better removal of nonground objects compared to Gaussian low pass filter especially at large window sizes where details of non-ground objects almost have diminished in the DTMs from window sizes of 25 × 25 and greater. DTM slope-based filter has created bare earth models that have been full of gabs at the positions of the non-ground objects where the sizes and numbers of that gabs have increased with increasing the window sizes of filter. Those gaps have been closed through exploitation of the spline interpolation method in order to get continuous surface representing bare earth landscape. Comparative analysis has shown that the minimum elevations of the DTMs increase with increasing the filter widow sizes till 21 × 21 and 31 × 31 for the Gaussian low pass filter and the focal analysis mean filter respectively. On the other hand, the DTM slope-based filter has kept the minimum elevation of the original data, that could be due to noise in the LiDAR data unchanged. Alternatively, the three approaches have produced DTMs of decreasing maximum elevation values and consequently decreasing ranges of elevations due to increases in the filter window sizes. Moreover, the standard deviations of the created DTMs from the three filters have decreased with increasing the filter window sizes however, the decreases have been continuous and steady in the cases of the Gaussian low pass filter and the focal analysis mean filters while in the case of the DTM slope-based filter the standard deviations of the created DTMs have decreased with high rates till window size of 31 × 31 then they have kept unchanged due to more increases in the filter window sizes.
基金supported by the Government of the Russian Federation grant number 075-15-2025-009 of 28 February 2025 and by the Russian Science Foundation,Grant No.24-72-10127.
文摘During daylight laser polarization sensing of high-level clouds(HLCs),the lidar receiving system generates a signal caused by not only backscattered laser radiation,but also scattered solar radiation,the intensity and polarization of which depends on the Sun’s location.If a cloud contains spatially oriented ice particles,then it becomes anisotropic,that is,the coefficients of directional light scattering of such a cloud depend on the Sun’s zenith and azimuth angles.In this work,the possibility of using the effect of anisotropic scattering of solar radiation on the predictive ability of machine learning algorithms in solving the problem of predicting the HLC backscattering phase matrix(BSPM)was evaluated.The hypothesis that solar radiation scattered on HLCs has no effect on the BSPM elements of such clouds determined with a polarization lidar was tested.The operation of two algorithms for predicting the BSPM elements is evaluated.To train the first one,meteorological data were used as input parameters;for the second algorithm,the azi-muthal and zenith angles of the Sun’s position were added to the meteorological parameters.It is shown that there is no significant improvement in the predictive ability of the algorithm.
文摘An analysis of global radiation measurements and fractional cloud cover observations made in the Israel Meteorological Service’s network of climate stations demonstrated a significant decrease in the transmittance of solar radiation through the atmosphere during the last 60 years. The major cause was the reduced transparency of clouds. Under completely overcast skies with complete cloud cover transmission in the industrialized central coastal region decreased from 0.41 in the mid-20th century to 0.21 in the first decade of the 21st century. Under cloudless skies the reduction in the transmission of global radiation was less, from 0.79 to 0.71, and not statistically significant. Similar but somewhat smaller changes were observed in the less industrialized central hill region. Multi-linear analysis showed that since 1970, 61% of the measured decline in global radiation was attributable to changes in fractional cloud cover but only 2% to the marked increase in local fuel combustion;there was no statistically significant interaction between the two parameters.
基金the cosmology simulation database(CSD)in the National Basic Science Data Center(NBSDC)and its funds the NBSDC-DB-10the support from the National Key Research and Development Program of China(2022YFA1602930)+3 种基金the National Natural Science Foundation of China(NSFC,grant Nos.11825303 and 11861131006)the science research grants from the China Manned Space project with No.CMS-CSST 2021-A03,CMSCSST-2021-A04the Fundamental Research Funds for the Central Universities of China(2262022-00216)the startup funding of Zhejiang University。
文摘This study numerically investigates the formation of high-velocity molecular clouds(HVMCs)in the Galactic Center(GC)based on the X-ray emission analysis.We employ three-dimensional magnetohydrodynamic simulations to explore the propagation and acceleration of HVMCs with starburst-driven winds,considering vertical,horizontal,and no magnetic field scenarios.Our results reveal that the envelope gas(with a typical T~10~8 K and density~10^(-2)cm^(-3))of molecular clouds(MCs)as a result of the shock interaction is responsible for X-ray emission.Additionally,some clear boundary exists between the interstellar medium(ISM),envelope gas and MCs,and the envelope gas protects the MCs in the heated environment of the shock wave.In theory,it is challenging to distinguish between the envelope gas,MCs and ISM in terms of X-ray emission.Our simulations suggest that the envelope gas has a significant impact on the survival and emission characteristics of MCs,providing insights into the complex interactions from the supernova feedback mechanisms in the GC.
基金jointly supported by the National Natural Science Foundation of China (Grant Nos. 91437219, 91637312 and 91637101)the Key Research Program of Frontier Sciences, Chinese Academy of Sciences (Grant No. QYZDY-SSWDQC018)The CloudSat/CALIPSO data were obtained from the CloudSat Data Processing Center (http://www.cloudsat.cira. colostate.edu/order-data) funded by NASA’s CloudSat project
文摘Cloud is essential in the atmosphere, condensing water vapor and generating strong convective or large-scale persistent precipitation. In this work, the relationships between cloud vertical macro- or microphysical properties, radiative heating rate, and precipitation for convective and stratiform clouds in boreal summer over the Tibetan Plateau (TP) are analyzed and compared with its neighboring land and tropical oceans based on CloudSat/CALIPSO satellite measurements and TRMM precipitation data. The precipitation intensity caused by convective clouds is twofold stronger than that by stratiform clouds. The vertical macrophysics of both cloud types show similar features over the TP, with the region weakening the precipitation intensity and compressing the cloud vertical expansion and variation in cloud top height, but having an uplift effect on the average cloud top height. The vertical microphysics of both cloud types under conditions of no rain over the TP are characterized by lower-level ice water, ice particles with a relatively larger range of sizes, and a relatively lower occurrence of denser ice particles. The features are similar to other regions when precipitation enhances, but convective clouds gather denser and larger ice particles than stratiform clouds over the TP. The atmospheric shortwave (longwave) heating (cooling) rate strengthens with increased precipitation for both cloud types. The longwave cooling layer is thicker when the rainfall rate is less than 100 mm d?1, but the net heating layer is typically compressed for the profiles of both cloud types over the TP. This study provides insights into the associations between clouds and precipitation, and an observational basis for improving the simulation of convective and stratiform clouds over the TP in climate models.
基金This work is supported by the National Natural Science Foundation of China(grant number 61471390).
文摘To calculate the diffusion law of chaff cloud launched by aircraft,taking rectangular chaff as an example,the diffusion model of chaff cloud is established in this paper.Firstly,the coordinate systems of chaff are defined and the motion model of chaff is established.The motion model mainly includes chaff motion equation and rotation equation,which are obtained by combining the aerodynamic moment and aerodynamic damping.Then,the influence of multi-chaff aerodynamic interference on the movement of chaff is analyzed.Finally,considering the influence of overlap area between chaffs and chaff spacing on the aerodynamic coefficients,the multi-chaff motion model is obtained,and the simulation results are compared with the test results to verify the credibility of the model.
基金financial support from the National Natural Science Foundation of China(Grant Nos.32171789,32211530031)Wuhan University(No.WHUZZJJ202220)Academy of Finland(Nos.334060,334829,331708,344755,337656,334830,293389/314312,334830,319011)。
文摘Forest is one of the most challenging environments to be recorded in a three-dimensional(3D)digitized geometrical representation,because of the size and the complexity of the environment and the data-acquisition constraints brought by on-site conditions.Previous studies have indicated that the data-acquisition pattern can have more influence on the registration results than other factors.In practice,the ideal short-baseline observations,i.e.,the dense collection mode,is rarely feasible,considering the low accessibility in forest environments and the commonly limited labor and time resources.The wide-baseline observations that cover a forest site using a few folds less observations than short-baseline observations,are therefore more preferable and commonly applied.Nevertheless,the wide-baseline approach is more challenging for data registration since it typically lacks the required sufficient overlaps between datasets.Until now,a robust automated registration solution that is independent of special hardware requirements has still been missing.That is,the registration accuracy is still far from the required level,and the information extractable from the merged point cloud using automated registration could not match that from the merged point cloud using manual registration.This paper proposes a discrete overlap search(DOS)method to find correspondences in the point clouds to solve the low-overlap problem in the wide-baseline point clouds.The proposed automatic method uses potential correspondences from both original data and selected feature points to reconstruct rough observation geometries without external knowledge and to retrieve precise registration parameters at data-level.An extensive experiment was carried out with 24 forest datasets of different conditions categorized in three difficulty levels.The performance of the proposed method was evaluated using various accuracy criteria,as well as based on data acquired from different hardware,platforms,viewing perspectives,and at different points of time.The proposed method achieved a 3D registration accuracy at a 0.50-cm level in all difficulty categories using static terrestrial acquisitions.In the terrestrial-aerial registration,data sets were collected from different sensors and at different points of time with scene changes,and a registration accuracy at the raw data geometric accuracy level was achieved.These results represent the highest automated registration accuracy and the strictest evaluation so far.The proposed method is applicable in multiple scenarios,such as 1)the global positioning of individual under-canopy observations,which is one of the main challenges in applying terrestrial observations lacking a global context,2)the fusion of point clouds acquired from terrestrial and aerial perspectives,which is required in order to achieve a complete forest observation,3)mobile mapping using a new stop-and-go approach,which solves the problems of lacking mobility and slow data collection in static terrestrial measurements as well as the data-quality issue in the continuous mobile approach.Furthermore,this work proposes a new error estimate that units all parameter-level errors into a single quantity and compensates for the downsides of the widely used parameter-and object-level error estimates;it also proposes a new deterministic point sets registration method as an alternative to the popular sampling methods.
基金Supported by the Guangdong Major Project of Basic and Applied Basic Research (2023B0303000016)the National Natural Science Foundation of China (U21A20515)。
文摘Efficient three-dimensional(3D)building reconstruction from drone imagery often faces data acquisition,storage,and computational challenges because of its reliance on dense point clouds.In this study,we introduced a novel method for efficient and lightweight 3D building reconstruction from drone imagery using line clouds and sparse point clouds.Our approach eliminates the need to generate dense point clouds,and thus significantly reduces the computational burden by reconstructing 3D models directly from sparse data.We addressed the limitations of line clouds for plane detection and reconstruction by using a new algorithm.This algorithm projects 3D line clouds onto a 2D plane,clusters the projections to identify potential planes,and refines them using sparse point clouds to ensure an accurate and efficient model reconstruction.Extensive qualitative and quantitative experiments demonstrated the effectiveness of our method,demonstrating its superiority over existing techniques in terms of simplicity and efficiency.
基金supported by the National Natural Science Foundation of China[Grant Nos.42205086 and 42122038]。
文摘Mixed-phase clouds(MPCs)involve complex microphysical and dynamical processes of cloud formation and dissipation,which are crucial for numerical weather prediction and cloud-climate feedback.However,satellite remote sensing of MPC properties is still challenging,and there is seldom MPC result inferred from passive spectral observations.This study examines the spectral characteristics of MPCs in the shortwave-infrared(SWIR)channels over the wavelength of 0.4–2.5μm,and evaluates the potential of current operational satellite spectroradiometer channels for MPC retrievals.With optical properties of MPCs based on the assumption of uniform mixing of both ice and liquid water particles,the effects of MPC ice optical thickness fraction(IOTF)and effective radius on associated optical properties are analyzed.As expected,results indicate that the MPC optical properties show features for ice and liquid water clouds,and their spectral variations show noticeable differences from those for homogeneous cases.A radiative transfer method is employed to examine the sensitivity of SWIR channels to given MPC cloud water path(CWP)and IOTF.MPCs have unique signal characteristics in the SWIR spectrum.The 0.87-μm channel is most sensitive to CWP.Meanwhile,the 1.61-and 2.13-μm channels are more sensitive to water-dominated MPCs(IOTF approaching 0),and the 2.25-μm channel is sensitive to both water-dominated and ice-dominated MPCs(IOTF approaching 1).Such spectral differences are potentially possible to be used to infer MPC properties based on radiometer observations,which will be investigated in future studies.
基金supported by the National Key R&D Program of China(No.2023YFC3081200)the National Natural Science Foundation of China(No.42077264)the Scientific Research Project of PowerChina Huadong Engineering Corporation Limited(HDEC-2022-0301).
文摘Rock discontinuities control rock mechanical behaviors and significantly influence the stability of rock masses.However,existing discontinuity mapping algorithms are susceptible to noise,and the calculation results cannot be fed back to users timely.To address this issue,we proposed a human-machine interaction(HMI)method for discontinuity mapping.Users can help the algorithm identify the noise and make real-time result judgments and parameter adjustments.For this,a regular cube was selected to illustrate the workflows:(1)point cloud was acquired using remote sensing;(2)the HMI method was employed to select reference points and angle thresholds to detect group discontinuity;(3)individual discontinuities were extracted from the group discontinuity using a density-based cluster algorithm;and(4)the orientation of each discontinuity was measured based on a plane fitting algorithm.The method was applied to a well-studied highway road cut and a complex natural slope.The consistency of the computational results with field measurements demonstrates its good accuracy,and the average error in the dip direction and dip angle for both cases was less than 3.Finally,the computational time of the proposed method was compared with two other popular algorithms,and the reduction in computational time by tens of times proves its high computational efficiency.This method provides geologists and geological engineers with a new idea to map rapidly and accurately rock structures under large amounts of noises or unclear features.
基金supported by the National Natural Science Foundation of China (Grant No. 12002044)。
文摘DNAN-based insensitive melt-cast explosives have been widely utilized in insensitive munition in recent years. When constrained DNAN-based melt-cast explosives are ignited under thermal stimulation, the base explosive exists in a molten liquid state, where high-temperature gases expand and react in the form of bubble clouds within the liquid explosive;this process is distinctly different from the dynamic crack propagation process observed in the case of solid explosives. In this study, a control model for the reaction evolution of burning-bubble clouds was established to describe the reaction process and quantify the reaction violence of DNAN-based melt-cast explosives, considering the size distribution and activation mechanism of the burning-bubble clouds. The feasibility of the model was verified through experimental results. The results revealed that under geometrically similar conditions, with identical confinement strength and aspect ratio, larger charge structures led to extended initial gas flow and surface burning processes, resulting in greater reaction equivalence and violence at the casing fracture.Under constant charge volume and size, a stronger casing confinement accelerated self-enhanced burning, increasing the internal pressure, reaction degree, and reaction violence. Under a constant casing thickness and radius, higher aspect ratios led to a greater reaction violence at the casing fracture.Moreover, under a constant charge volume and casing thickness, higher aspect ratios resulted in a higher internal pressure, increased reaction degree, and greater reaction violence at the casing fracture. Further,larger ullage volumes extended the reaction evolution time and increased the reaction violence under constant casing dimensions. Through a matching design of the opening threshold of the pressure relief holes and the relief structure area, a stable burning reaction could be maintained until completion,thereby achieving a control of the reaction violence. The proposed model could effectively reflect the effects of the intrinsic burning rate, casing confinement strength, charge size, ullage volume, and pressure relief structure on the reaction evolution process and reaction violence, providing a theoretical method for the thermal safety design and reaction violence evaluation of melt-cast explosives.
基金funded by the Natural Science Foundation Committee,China(41364001,41371435)
文摘The degree of spatial similarity plays an important role in map generalization, yet there has been no quantitative research into it. To fill this gap, this study first defines map scale change and spatial similarity degree/relation in multi-scale map spaces and then proposes a model for calculating the degree of spatial similarity between a point cloud at one scale and its gener- alized counterpart at another scale. After validation, the new model features 16 points with map scale change as the x coordinate and the degree of spatial similarity as the y coordinate. Finally, using an application for curve fitting, the model achieves an empirical formula that can calculate the degree of spatial similarity using map scale change as the sole independent variable, and vice versa. This formula can be used to automate algorithms for point feature generalization and to determine when to terminate them during the generalization.
基金the National Natural Science Foundation of China under projects 61772150 and 61862012the Guangxi Key R&D Program under project AB17195025+3 种基金the Guangxi Natural Science Foundation under grants 2018GXNSFDA281054 and 2018GXNSFAA281232the National Cryptography Development Fund of China under project MMJJ20170217the Guangxi Young Teachers’ Basic Ability Improvement Program under Grant 2018KY0194and the open program of Guangxi Key Laboratory of Cryptography and Information Security under projects GCIS201621 and GCIS201702.
Abstract: Remote data integrity auditing can guarantee the integrity of outsourced data in clouds. Users can periodically run an integrity auditing protocol, interacting with the cloud server, to verify the latest status of their outsourced data. However, integrity auditing requires the user to perform massive, time-consuming computations that weak devices cannot afford. In this paper, we propose a privacy-preserving TPA-aided remote data integrity auditing scheme, based on Li et al.'s data integrity auditing scheme without bilinear pairings, in which a third-party auditor (TPA) performs integrity auditing on outsourced data on behalf of users. The privacy of the outsourced data is guaranteed against the TPA in the sense that the TPA cannot infer its contents from the proofs returned in the integrity auditing phase. Our construction is as efficient as Li et al.'s scheme: each procedure involves the same time-consuming operations in both schemes, and our solution does not increase the sizes of the processed data, the challenge, or the proof.
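To make the auditing flow concrete, here is a toy hash-based challenge-response sketch. It is emphatically not Li et al.'s pairing-free scheme nor the proposed privacy-preserving construction (a real scheme returns a short homomorphic proof and keeps the data content hidden from the TPA); it only illustrates who computes what in the setup, challenge, proof, and verification phases. All names and sizes are hypothetical.

```python
# Toy hash-based challenge-response, illustrating only the auditing flow.
import hashlib, os, random

BLOCK, N_BLOCKS = 4096, 32
data = os.urandom(BLOCK * N_BLOCKS)                 # stand-in outsourced file
blocks = [data[i * BLOCK:(i + 1) * BLOCK] for i in range(N_BLOCKS)]

# Setup: the user computes per-block tags before outsourcing and keeps them.
tags = [hashlib.sha256(b"tag" + i.to_bytes(4, "big") + blk).digest()
        for i, blk in enumerate(blocks)]

# Challenge: the auditor samples a random subset of block indices.
challenge = random.sample(range(N_BLOCKS), 5)

# Proof: the server answers with the challenged blocks (a real scheme would
# return a short aggregated proof instead of the raw blocks).
proof = {i: blocks[i] for i in challenge}

# Verify: the auditor recomputes each tag and compares against the stored one.
ok = all(hashlib.sha256(b"tag" + i.to_bytes(4, "big") + proof[i]).digest()
         == tags[i] for i in challenge)
print("integrity verified:", ok)
```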
Funding: supported by the National Natural Science Foundation of China (Grant No. 42407232) and the Sichuan Science and Technology Program (Grant No. 2024NSFSC0826).
Abstract: Recognizing discontinuities within rock masses is a critical aspect of rock engineering. The development of remote sensing technologies has significantly enhanced the quality and quantity of point clouds collected from rock outcrops. In response, we propose a workflow that balances accuracy and efficiency in extracting discontinuities from massive point clouds. The proposed method employs voxel filtering to downsample the point clouds, constructs a point cloud topology using K-d trees, applies principal component analysis to compute the point cloud normals, and uses the pointwise clustering (PWC) algorithm to extract discontinuities from rock outcrop point clouds. The method provides the location and orientation (dip direction and dip angle) of the discontinuities, and a modified whale optimization algorithm (MWOA) is used to identify major discontinuity sets and their average orientations. Performance evaluations on three real cases demonstrate that the proposed method significantly reduces computational time without sacrificing accuracy. In particular, it yields more reasonable extraction results for discontinuities with certain undulations. The approach offers a novel tool for efficiently extracting discontinuities from large-scale point clouds.
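The first three stages of this workflow (voxel downsampling, K-d tree neighborhood search, PCA normal estimation) map onto standard point cloud operations; a minimal numpy/scipy sketch follows. The voxel size and neighborhood size are placeholder values, and the PWC clustering and MWOA set-identification stages are omitted.

```python
# Sketch of the preprocessing steps: voxel downsampling, K-d tree neighborhood
# search, and PCA normal estimation. Voxel size and k are hypothetical.
import numpy as np
from scipy.spatial import cKDTree

def voxel_downsample(pts, voxel=0.1):
    # Keep one representative point (the centroid) per occupied voxel.
    keys = np.floor(pts / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    out = np.zeros((inv.max() + 1, 3))
    np.add.at(out, inv, pts)                       # sum points per voxel
    counts = np.bincount(inv).reshape(-1, 1)
    return out / counts

def pca_normals(pts, k=15):
    tree = cKDTree(pts)                            # K-d tree topology
    _, idx = tree.query(pts, k=k)                  # k nearest neighbors
    normals = np.empty_like(pts)
    for i, nb in enumerate(idx):
        q = pts[nb] - pts[nb].mean(axis=0)
        # The normal is the direction of least variance of the neighborhood,
        # i.e. the right-singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(q, full_matrices=False)
        normals[i] = vt[-1]
    return normals

pts = np.random.rand(5000, 3)          # stand-in for an outcrop point cloud
down = voxel_downsample(pts, voxel=0.05)
nrm = pca_normals(down, k=15)
print(down.shape, nrm.shape)
```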
Abstract: Existing reverse-engineering methods struggle to generate editable, parametric CAD models directly from scanned data. To address this limitation, this paper proposes a reverse-modeling approach that reconstructs parametric CAD models from multi-view RGB-D point clouds. Multi-frame point-cloud registration and fusion are first employed to obtain a complete 3-D point cloud of the target object. A region-growing algorithm that jointly exploits color and geometric information segments the cloud, while RANSAC robustly detects and fits basic geometric primitives. These primitives serve as nodes in a graph whose edge features are inferred by a graph neural network to capture spatial constraints. From the detected primitives and their constraints, a high-accuracy, fully editable parametric CAD model is finally exported. Experiments show an average parameter error of 0.3 mm for key dimensions and an overall geometric reconstruction accuracy of 0.35 mm. The work offers an effective technical route toward automated, intelligent 3-D reverse modeling.
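The RANSAC primitive-fitting step can be illustrated for the plane case as follows. The thresholds, iteration counts, and synthetic data are hypothetical, and the cylinder/sphere primitives and GNN constraint inference of the full pipeline are beyond this snippet.

```python
# Minimal RANSAC plane-fitting sketch for the primitive-detection step.
import numpy as np

def ransac_plane(pts, n_iter=500, tol=0.01, rng=np.random.default_rng(0)):
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue                      # degenerate (collinear) sample
        n /= norm
        d = -n @ sample[0]
        dist = np.abs(pts @ n + d)        # point-to-plane distances
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers

# Noisy z ~ 0 plane plus uniform outliers, a stand-in for one segmented region.
rng = np.random.default_rng(1)
plane = np.column_stack([rng.uniform(-1, 1, (800, 2)),
                         rng.normal(0, 0.003, 800)])
noise = rng.uniform(-1, 1, (200, 3))
(n, d), mask = ransac_plane(np.vstack([plane, noise]))
print("normal:", np.round(n, 3), "inliers:", mask.sum())
```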
Funding: supported in part by the National Natural Science Foundation of China (61802015, 61703011); the Major Science and Technology Program for Water Pollution Control and Treatment of China (2018ZX07111005); the National Defense Pre-Research Foundation of China (41401020401, 41401050102); and the Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah (D-422-135-1441).
Abstract: In recent years, an increasing number of enterprises have adopted cloud computing to manage their important business applications in distributed green cloud (DGC) systems for low response time and high cost-effectiveness. Task scheduling and resource allocation in DGCs have gained attention in both academia and industry because they are costly to manage owing to high energy consumption. Many factors in DGCs, e.g., power grid prices and the amount of green energy, exhibit strong spatial variations. The dramatic increase in arriving tasks makes it challenging to minimize the energy cost of a DGC provider in a market where all of the above factors vary spatially. This work adopts a G/G/1 queuing system to analyze the performance of servers in DGCs. Based on it, a single-objective constrained optimization problem is formulated and solved by a proposed simulated-annealing-based bees algorithm (SBA). SBA minimizes the energy cost of a DGC provider by optimally allocating the tasks of heterogeneous applications among multiple DGCs and by specifying the running speed of each server and the number of powered-on servers in each DGC, while strictly meeting the response time limits of the tasks of all applications. Experimental results based on realistic data prove that SBA achieves a lower energy cost than several benchmark scheduling methods.
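A generic simulated-annealing loop conveys the flavor of this kind of search, though not SBA itself (which hybridizes simulated annealing with the bees algorithm). The prices, loads, and the quadratic overload penalty standing in for the response-time constraint are all hypothetical.

```python
# Toy simulated annealing for allocating tasks among clouds with spatially
# varying energy prices. NOT the paper's SBA; all parameters are hypothetical.
import math, random

random.seed(0)
n_tasks, n_clouds = 50, 3
price = [0.12, 0.08, 0.15]               # energy price per cloud (hypothetical)
load = [random.uniform(1, 5) for _ in range(n_tasks)]  # energy units per task

def cost(assign):
    # Energy cost plus a quadratic overload penalty, a crude stand-in for
    # the response-time constraint of the real formulation.
    per_cloud = [0.0] * n_clouds
    for t, c in enumerate(assign):
        per_cloud[c] += load[t]
    return (sum(price[c] * per_cloud[c] for c in range(n_clouds))
            + sum(max(0.0, p - 60.0) ** 2 for p in per_cloud))

assign = [random.randrange(n_clouds) for _ in range(n_tasks)]
cur, T = cost(assign), 10.0
for _ in range(20_000):
    t = random.randrange(n_tasks)         # propose moving one task
    old = assign[t]
    assign[t] = random.randrange(n_clouds)
    new = cost(assign)
    if new <= cur or random.random() < math.exp((cur - new) / T):
        cur = new                         # accept (always if not worse)
    else:
        assign[t] = old                   # revert the move
    T *= 0.9995                           # geometric cooling schedule
print(f"final energy cost: {cur:.2f}")
```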
Funding: Demonstration on the Construction of Guangdong Survey and Geomatics Industry Technology Innovation Alliance (2017B090907030); the Demonstration of Big Data Application for Land Resource Management and Service (2015B010110006). Qiong Huang is supported by Guangdong Natural Science Funds for Distinguished Young Scholar (No. 2014A030306021), Guangdong Program for Special Support of Top-notch Young Professionals (No. 2015TQ01X796), Pearl River Nova Program of Guangzhou (No. 201610010037), and the National Natural Science Foundation of China (Nos. 61472146, 61672242).
Abstract: Many enterprises and individuals are inclined to outsource their data to public clouds, but security and privacy are two critical problems that cannot be ignored. A cloud provider's defenses may be breached, and providers themselves may mine the data for valuable information. In this paper, a secure and efficient storage file (SES FS) system is proposed that distributes files across several clouds and allows users to search the files securely and efficiently. In the proposed system, keywords are transformed into integers and secretly shared over a defined finite field, and the shares are then mapped to random numbers in a specified random domain in each cloud. Files are encrypted with distinct secret keys and scattered across the clouds. Information about keyword/file associations is secretly shared among the cloud providers. Authorized users can search the clouds to find the correct encrypted files and reconstruct the corresponding secret key. No adversary can find or detect the real file information even if all the servers collude. Manipulation of shares by one or more clouds can be detected with high probability. The system can also detect malicious servers through introduced virtual points. One interesting property of the scheme is that new keywords can be added easily, which is difficult, and usually inefficient, in many searchable symmetric encryption systems. Detailed experimental results show that, with tolerable uploading delay, the scheme exhibits excellent data-retrieval performance.
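The keyword-sharing step rests on classic secret sharing over a finite field; the minimal Shamir-style sketch below shows splitting and reconstruction with an illustrative field prime. The random-domain mapping and the virtual points used for malicious-server detection in the full scheme are omitted.

```python
# Minimal Shamir secret-sharing sketch: a keyword is mapped to an integer and
# split into n shares over GF(P), one per cloud; any t shares reconstruct it
# via Lagrange interpolation. The prime and parameters are illustrative.
import random

P = 2**61 - 1                      # field prime (hypothetical choice)

def share(secret, t, n, rng=random.Random(0)):
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    # Share for cloud x is the degree-(t-1) polynomial evaluated at x.
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at 0 recovers the constant term (the secret).
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

keyword = int.from_bytes(b"cloud", "big")     # keyword -> integer
shares = share(keyword, t=3, n=5)             # 5 clouds, threshold 3
assert reconstruct(shares[:3]) == keyword     # any 3 shares suffice
print("reconstructed:", reconstruct(shares[1:4]).to_bytes(5, "big"))
```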
Abstract: Data-intensive computing is expected to be the next-generation IT computing paradigm, and data-intensive workflows in clouds are becoming increasingly popular; how to schedule them efficiently has become a key issue. In this paper, we first build a directed hypergraph model for data-intensive workflows, since hypergraphs can model communication volume more accurately, better represent asymmetric problems, and offer a cut metric well suited to minimizing the total volume of communication. Second, we propose the concept of data supportive ability to aid the representation of data-intensive workflow applications, and we detail the merge operation that takes the data supportive ability into account. Third, we present an optimized hypergraph multi-level partitioning algorithm. Finally, we propose a data-reduced scheduling policy, HEFT-P, for data-intensive workflows. Through simulation, we compare HEFT-P with three typical workflow scheduling policies. The results indicate that HEFT-P achieves reduced data scheduling and shortens the makespan of executing data-intensive workflows.
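The priority computation underlying HEFT-style policies is the upward rank, sketched below on a tiny DAG. The hypergraph partitioning and data-supportive-ability merge that distinguish HEFT-P are not reproduced here; the task costs and communication volumes are hypothetical.

```python
# Upward-rank computation at the heart of HEFT-style list scheduling.
from functools import lru_cache

# DAG: task -> list of (successor, communication cost); costs are hypothetical.
succ = {0: [(1, 4), (2, 6)], 1: [(3, 2)], 2: [(3, 3)], 3: []}
compute = {0: 5.0, 1: 8.0, 2: 3.0, 3: 4.0}   # average execution costs

@lru_cache(maxsize=None)
def upward_rank(t):
    # rank_u(t) = w(t) + max over successors s of (c(t, s) + rank_u(s))
    children = succ[t]
    if not children:
        return compute[t]
    return compute[t] + max(c + upward_rank(s) for s, c in children)

order = sorted(succ, key=upward_rank, reverse=True)
print("scheduling priority:", order)   # tasks are scheduled in decreasing rank
```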