Computational phantoms play an essential role in radiation dosimetry and health physics. Although mesh-type phantoms offer high resolution and adjustability, their use in dose calculations is limited by their slow computational speed. Progress in heterogeneous computing has allowed for substantial acceleration in the computation of mesh-type phantoms by utilizing hardware accelerators. In this study, a GPU-accelerated Monte Carlo method was developed to expedite dose calculation for mesh-type computational phantoms. This involved designing and implementing the entire procedural flow of a GPU-accelerated Monte Carlo program. We employed acceleration structures to process the mesh-type phantom, optimized the traversal methodology, and adopted a flattened structure to overcome the limitations of GPU stack depths. Particle transport methods were realized within the mesh-type phantom, encompassing particle location and intersection techniques. For typical external irradiation scenarios, we used Geant4 alongside the GPU program and its CPU serial code for dose calculations, assessing both computational accuracy and efficiency. Compared with the benchmark simulated using Geant4 on a single CPU thread, the relative differences in the organ doses calculated by the GPU program lay predominantly within a margin of 5%, whereas the computational time was reduced by a factor ranging from 120 to 2700. To the best of our knowledge, this study achieved a GPU-accelerated dose calculation method for mesh-type phantoms for the first time, reducing the computational time from hours to seconds per simulation of ten million particles and offering a swift and precise Monte Carlo method for dose calculation in mesh-type computational phantoms.
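The flattened acceleration structure mentioned above (a stackless node layout used to work around GPU stack-depth limits) can be sketched as follows. This is an illustrative CPU-side Python sketch, not the authors' GPU code; the node layout, the `skip` links, and the demo boxes are all assumptions.

```python
import numpy as np

class FlatNode:
    """One node of a BVH flattened into depth-first order.
    `skip` is the index to jump to when the ray misses this node's box,
    so traversal needs no recursion and no stack."""
    def __init__(self, lo, hi, skip, prims=None):
        self.lo, self.hi = lo, hi    # axis-aligned bounding box corners
        self.skip = skip             # next node index on a miss
        self.prims = prims           # primitive ids if this is a leaf

def ray_hits_aabb(o, d, lo, hi, eps=1e-12):
    # Standard slab test; zero direction components are padded with eps.
    inv = 1.0 / np.where(np.abs(d) < eps, eps, d)
    t1, t2 = (lo - o) * inv, (hi - o) * inv
    tmin = np.minimum(t1, t2).max()
    tmax = np.maximum(t1, t2).min()
    return tmax >= max(tmin, 0.0)

def traverse(nodes, o, d):
    """Stackless traversal: step to i+1 on a hit (children follow in
    depth-first order), jump to `skip` on a miss."""
    hits, i = [], 0
    while i < len(nodes):
        n = nodes[i]
        if ray_hits_aabb(o, d, n.lo, n.hi):
            if n.prims is not None:
                hits.extend(n.prims)
            i += 1
        else:
            i = n.skip
    return hits

# Tiny demo tree: a root box split into two leaf boxes along x.
nodes = [
    FlatNode(np.array([0., 0., 0.]), np.array([4., 1., 1.]), skip=3),
    FlatNode(np.array([0., 0., 0.]), np.array([2., 1., 1.]), skip=2, prims=[0]),
    FlatNode(np.array([2., 0., 0.]), np.array([4., 1., 1.]), skip=3, prims=[1]),
]
```

On a GPU the same loop would run per thread over a structure-of-arrays node buffer; the Python class merely stands in for that layout.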
Large-scale point cloud datasets form the basis for training various deep learning networks and achieving high-quality network processing tasks. Because of the diversity and robustness constraints of the data, data augmentation (DA) methods are used to expand dataset diversity and scale. However, owing to the complex and distinct characteristics of LiDAR point cloud data from different platforms (such as missile-borne and vehicular LiDAR data), directly applying traditional 2D visual-domain DA methods to 3D data can yield networks that do not robustly accomplish the corresponding tasks. To address this issue, the present study explores DA for missile-borne LiDAR point clouds using a Monte Carlo (MC) simulation method that closely resembles practical application. First, a model of the multi-sensor imaging system is established, taking into account the joint errors arising from the platform itself and from the relative motion during the imaging process. A distortion simulation method based on MC simulation for augmenting missile-borne LiDAR point cloud data is then proposed, underpinned by an analysis of the combined errors between different modal sensors, achieving high-quality augmentation of point cloud data. The effectiveness of the proposed method in addressing imaging-system errors and distortion simulation is validated using the imaging scene dataset constructed in this paper. Comparative experiments between the proposed point cloud DA algorithm and current state-of-the-art algorithms on point cloud detection and single-object tracking tasks demonstrate that the proposed method improves the network performance obtained from unaugmented datasets by over 17.3% and 17.9%, respectively, surpassing the SOTA performance of current point cloud DA algorithms.
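The error-sampling idea can be sketched minimally: each augmented copy of a cloud is produced by drawing platform attitude, lever-arm/translation, and per-return range errors from assumed Gaussian models and applying them as a rigid-plus-radial distortion. The error magnitudes and the Gaussian form are illustrative assumptions, not the paper's calibrated multi-sensor error model.

```python
import numpy as np

rng = np.random.default_rng(0)

def rot_xyz(rx, ry, rz):
    """Rotation matrix from Euler angles (rad), applied in X-Y-Z order."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def mc_augment(points, n_copies=8, ang_sigma=0.01,
               trans_sigma=0.05, range_sigma=0.02):
    """Monte Carlo distortion: each copy gets one sampled attitude error,
    one translation error, and independent radial (range) noise per return."""
    out = []
    for _ in range(n_copies):
        R = rot_xyz(*rng.normal(0.0, ang_sigma, 3))   # attitude jitter
        t = rng.normal(0.0, trans_sigma, 3)           # lever-arm / sync error
        p = points @ R.T + t
        radial = p / np.linalg.norm(p, axis=1, keepdims=True)
        p = p + radial * rng.normal(0.0, range_sigma, (len(p), 1))
        out.append(p)
    return np.stack(out)

# Stand-in scene: 100 returns roughly 50 m in front of a sensor at the origin.
cloud = rng.normal(0.0, 5.0, (100, 3)) + np.array([0.0, 0.0, 50.0])
aug = mc_augment(cloud)
```

Each distorted copy keeps the point correspondence with the original cloud, so labels (boxes, tracks) can be transformed alongside the points.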
The Monte Carlo (MC) method offers significant advantages in handling complex geometries and physical processes in particle transport problems and has become a widely used approach in reactor physics analysis, radiation shielding design, and medical physics. However, with the rapid advancement of new nuclear energy systems, the MC method faces challenges in efficiency, accuracy, and adaptability, limiting its effectiveness in meeting modern design requirements. Overcoming technical obstacles related to high-fidelity coupling, high-resolution computation, and intelligent design is essential for using the MC method as a reliable tool in numerical analysis for these new nuclear energy systems. To address these challenges, the Nuclear Energy and Application Laboratory (NEAL) team at the University of South China developed a multifunctional and generalized intelligent code platform called MagicMC, based on the Monte Carlo particle transport method. MagicMC is a developing tool dedicated to nuclear applications, incorporating intelligent methodologies. It consists of two primary components: a basic unit and a functional unit. The basic unit, which functions similarly to a standard Monte Carlo particle transport code, includes seven modules: geometry, source, transport, database, tally, output, and auxiliary. The functional unit builds on the basic unit by adding functional modules to address complex and diverse applications in nuclear analysis. MagicMC introduces a dynamic Monte Carlo particle transport algorithm to address time-space particle transport problems within emerging nuclear energy systems and incorporates a CPU-GPU heterogeneous parallel framework to enable high-efficiency, high-resolution simulations for large-scale computational problems. Anticipating future trends in intelligent design, MagicMC integrates several advanced features, including CAD-based geometry modeling, global variance reduction methods, multi-objective shielding optimization, high-resolution activation analysis, multi-physics coupling, and radiation therapy. In this paper, various numerical benchmarks (spanning reactor transient simulations, material activation analysis, radiation shielding optimization, and medical dosimetry analysis) are presented to validate MagicMC. The numerical results demonstrate MagicMC's efficiency, accuracy, and reliability in these preliminary applications, underscoring its potential to support technological advancements in developing high-fidelity, high-resolution, and high-intelligence MC-based tools for advanced nuclear applications.
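To give a feel for what the basic unit's source/transport/tally loop does, here is a deliberately tiny analog Monte Carlo sketch (a purely absorbing slab; this has nothing to do with MagicMC's actual implementation): sample a free path for each source particle and tally whether it crosses the slab.

```python
import math
import random

def slab_transmission(sigma_t, thickness, n=200_000, seed=1):
    """Analog MC for a purely absorbing slab: free path s = -ln(xi)/sigma_t;
    the tally is the transmitted fraction, whose expectation is
    exp(-sigma_t * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):                              # "source" + "transport"
        s = -math.log(1.0 - rng.random()) / sigma_t  # sampled free path
        if s > thickness:                           # "geometry" check
            transmitted += 1                        # "tally"
    return transmitted / n

est = slab_transmission(sigma_t=1.0, thickness=2.0)
```

A production code replaces the one-line geometry check with general surfaces and cells, and adds scattering, energy dependence, and variance reduction, but the sample-and-tally loop has the same shape.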
A microscopic understanding of the complex solute-defect interaction is pivotal for optimizing an alloy's macroscopic mechanical properties. Simulating solute segregation in a plastically deformed crystalline system at atomic resolution remains challenging. The objective is to efficiently model and predict a physically informed segregated solute distribution rather than simulating a series of diffusion kinetics. To address this objective, we coupled molecular dynamics (MD) and Monte Carlo (MC) methods using a novel method based on the virtual-atoms technique. We applied our MD-MC coupling approach to model off-lattice carbon (C) solute segregation in nanoindented Fe-C samples containing complex dislocation networks. Our coupling framework yielded the final configuration through efficient parallelization and localized energy computations, showing C Cottrell atmospheres near dislocations. Different initial C concentrations resulted in a consistent trend of C atoms migrating from regions of low crystalline distortion to regions of high crystalline distortion. Besides unraveling the strong spatial correlation between local C concentration and defect regions, our results revealed two crucial aspects of solute segregation preferences: (1) the defect energetics hierarchy and (2) tensile strain fields near dislocations. The proposed approach is generic and can be applied to other material systems as well.
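The MC half of such a coupling can be caricatured with a Metropolis loop that relocates solutes using only local energy differences. Everything below (the 1D site lattice, the binding energy of -1 near a "defect", beta = 8) is a toy assumption chosen to show the segregation trend, not the paper's off-lattice Fe-C energetics.

```python
import math
import random

def metropolis_segregation(site_E, n_solutes, beta=8.0, steps=20_000, seed=2):
    """Relocate solutes with the Metropolis rule; only the local energy
    difference dE between the old and trial site is evaluated, mirroring
    the localized-energy idea of MD-MC coupling schemes."""
    rng = random.Random(seed)
    occ = rng.sample(range(len(site_E)), n_solutes)
    occupied = set(occ)
    for _ in range(steps):
        i = rng.randrange(n_solutes)          # pick a solute
        new = rng.randrange(len(site_E))      # pick a trial site
        if new in occupied:
            continue                          # enforce single occupancy
        dE = site_E[new] - site_E[occ[i]]
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            occupied.discard(occ[i])
            occupied.add(new)
            occ[i] = new
    return sorted(occ)

# 100 sites; sites 40-49 mimic a dislocation core that binds solutes.
E = [0.0] * 100
for i in range(40, 50):
    E[i] = -1.0
final = metropolis_segregation(E, n_solutes=8)
```

After equilibration the solutes cluster in the binding sites, a minimal analogue of a Cottrell atmosphere forming near a dislocation.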
Scramjet is the most promising propulsion system for the Air-breathing Hypersonic Vehicle (AHV), and the infrared (IR) radiation it emits is critical for early warning, detection, and identification of such weapons. This work proposes an Adaptive Reverse Monte Carlo (ARMC) method and develops an analytical model for the IR radiation of a scramjet under gaseous-kerosene-fueled and hydrogen-fueled conditions. The evaluation studies show that at a global equivalence ratio of 0.8, the IR radiation from the hydrogen-fueled plume is predominantly from H_(2)O, with a spectral peak of 1.53 kW·sr^(-1)·μm^(-1) at the 2.7 μm band, while the kerosene-fueled plume exhibits a spectral intensity approaching 7.0 kW·sr^(-1)·μm^(-1) at the 4.3 μm band. At the backward detection angle, both types of scramjets exhibit spectral peaks within the 1.3-1.4 μm band, with intensities around 10 kW·sr^(-1)·μm^(-1). The integral radiation intensity of the hydrogen-fueled scramjet is generally higher than that of the kerosene-fueled scramjet, particularly in the 1-3 μm band. Meanwhile, at wide detection angles, the solid walls become the predominant radiation source. The radiation intensity is highest in the 1-3 μm band and weakest in the 8-14 μm band, with values of 21.5 kW·sr^(-1) and 0.57 kW·sr^(-1) at the backward detection angles, respectively. Significant variations in the radiation contributions from gases and solids are observed across different bands under the two fuel conditions, especially within the 3-5 μm band. This research provides valuable insights into the IR radiation characteristics of scramjets, which can aid in the development of IR detection systems for AHVs.
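The core of any reverse (backward) Monte Carlo radiation model is to sample emission points along the line of sight from the detector and weight each by the transmittance back to the detector. A homogeneous-gas toy version (constant absorption coefficient kappa and source term i_b, both hypothetical numbers, not the paper's ARMC code) has the closed form i_b*(1 - exp(-kappa*L)) to check against:

```python
import math
import random

def reverse_mc_radiance(kappa, i_b, length, n=100_000, seed=3):
    """Backward MC estimate of I = integral of kappa * i_b * exp(-kappa*s) ds
    over the line of sight: sample the emission location s uniformly and
    weight it by the transmittance exp(-kappa*s) back to the detector."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        s = rng.random() * length          # sampled emission point
        acc += length * kappa * i_b * math.exp(-kappa * s)
    return acc / n

# kappa = 0.5 m^-1, source term 10 (arbitrary units), 4 m path length.
est = reverse_mc_radiance(kappa=0.5, i_b=10.0, length=4.0)
```

In the inhomogeneous, spectral case kappa and i_b become per-cell, per-band tables and the exponent becomes an accumulated optical depth; the "adaptive" part of an ARMC scheme then concentrates samples where the integrand is largest.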
In contrast to conventional reservoirs, tight formations have more complex pore structures and a significant boundary layer effect, making it difficult to determine the effective permeability. To address this, this paper first proposes a semi-empirical model for calculating boundary layer thickness based on dimensional analysis, using published experimental data on microcapillary flow. Furthermore, considering the non-uniform distribution of fluid viscosity in the flow channels of tight reservoirs, a theoretical model for boundary layer thickness is established based on fractal theory, and permeability predictions are conducted through Monte Carlo simulations. Finally, sensitivity analyses of the various influencing parameters are performed. The results show that, compared with other fractal-based analytical models, the proposed probabilistic permeability model integrates the parameters affecting fluid flow with random numbers, reflecting both the fractal and the random characteristics of the capillary size distribution. The computational results exhibit the highest consistency with experimental data. Among the factors affecting the boundary layer, in addition to certain conventional physical and mechanical parameters, different microstructure parameters also significantly influence the boundary layer. A higher tortuosity fractal dimension results in a thicker boundary layer, while increases in pore fractal dimension, porosity, and maximum capillary size help mitigate the boundary layer effect. It is also observed that the permeability of large pores exhibits greater sensitivity to changes in the various influencing parameters. By considering micro-scale flow effects, the proposed model enhances the understanding of the physical mechanisms of fluid transport in dense porous media.
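The Monte Carlo step can be illustrated by sampling capillary radii from a fractal number-size law and summing Hagen-Poiseuille conductances with a boundary layer removed from each radius. The specific distribution N(>=r) = (r_max/r)^Df, the constant boundary-layer thickness, and all numbers below are illustrative assumptions, not the paper's semi-empirical closure.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_fractal_radii(r_min, r_max, d_f, n):
    """Inverse-CDF sampling from the fractal number-size law
    N(>=r) = (r_max / r)^d_f, truncated at r_max."""
    u = rng.random(n)
    r = r_min / (1.0 - u) ** (1.0 / d_f)
    return np.minimum(r, r_max)

def mc_permeability_ratio(r_min, r_max, d_f, delta, n=200_000):
    """Each capillary conducts ~ r^4 (Hagen-Poiseuille); a boundary layer of
    thickness delta shrinks the effective flow radius. The return value is
    the bundle permeability relative to the delta = 0 case."""
    r = sample_fractal_radii(r_min, r_max, d_f, n)
    r_eff = np.clip(r - delta, 0.0, None)
    return (r_eff ** 4).sum() / (r ** 4).sum()

k_free  = mc_permeability_ratio(1e-6, 1e-4, 1.5, delta=0.0)
k_thin  = mc_permeability_ratio(1e-6, 1e-4, 1.5, delta=2e-7)
k_thick = mc_permeability_ratio(1e-6, 1e-4, 1.5, delta=5e-7)
```

The r^4 weighting makes the estimate dominated by the largest capillaries, which is consistent with the abstract's observation that large pores are the most sensitive to parameter changes.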
Volcanic terrains exhibit a complex structure of pyroclastic deposits interspersed with sedimentary processes, resulting in irregular lithological sequences that lack lateral continuity and distinct stratigraphic patterns. This complexity poses significant challenges for slope stability analysis, requiring the development of specialized techniques. This research presents a numerical methodology that incorporates spatial variability, nonlinear material characterization, and probabilistic analysis within a Monte Carlo framework to address this issue. The heterogeneous structure is represented by randomly assigning different lithotypes across the slope while maintaining predefined global proportions. This contrasts with the more common approach of applying probabilistic variability to mechanical parameters within a homogeneous slope model. The material behavior is defined using complex nonlinear failure criteria, such as the Hoek-Brown model and a parabolic model with collapse, both implemented through linearization techniques. The Discontinuity Layout Optimization (DLO) method, a novel numerical approach based on limit analysis, is employed to efficiently incorporate these advances and compute the factor of safety of the slope. Within this framework, the Monte Carlo procedure is used to assess slope stability by conducting a large number of simulations, each with a different lithotype distribution. Based on the results, a hybrid method is proposed that combines probabilistic modeling with deterministic design principles for slope stability assessment. As a case study, the methodology is applied to a 20-m-high vertical slope composed of three lithotypes (altered scoria, welded scoria, and basalt) randomly distributed in proportions of 15%, 60%, and 25%, respectively. The results show convergence of the mean values after approximately 400 simulations and highlight the significant influence of spatial heterogeneity, with variations of the factor of safety between 5 and 12 in 85% of cases. They also reveal non-circular and mid-slope failure wedges not captured by traditional stability methods. Finally, an equivalent normal probability distribution is proposed as a reliable approximation of the factor of safety for use in risk analysis and engineering decision-making.
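The Monte Carlo scheme described above can be sketched as follows: lithotypes are assigned to cells at random while honouring the global proportions, a factor of safety is computed per realization, and the resulting sample feeds the distribution fit. The strength numbers and the harmonic-mean placeholder below are illustrative assumptions; a real run would call a DLO limit-analysis solver where the stub sits.

```python
import numpy as np

rng = np.random.default_rng(5)

LITHO_PROPS = {"altered_scoria": 0.15, "welded_scoria": 0.60, "basalt": 0.25}
# Stand-in strength score per lithotype; a real model would carry the
# Hoek-Brown / parabolic parameters instead of a single number.
STRENGTH = {"altered_scoria": 0.4, "welded_scoria": 1.0, "basalt": 2.0}

def random_lithology(n_cells):
    """Random lithotype assignment that honours the global proportions
    exactly (up to rounding), as in the heterogeneous slope model."""
    names = list(LITHO_PROPS)
    counts = [round(LITHO_PROPS[k] * n_cells) for k in names]
    counts[-1] = n_cells - sum(counts[:-1])   # absorb rounding error
    cells = np.repeat(names, counts)
    rng.shuffle(cells)
    return cells

def factor_of_safety(cells):
    """Placeholder FoS: harmonic-mean strength along a random failure path.
    A real analysis would call the DLO limit-analysis solver here."""
    path = rng.choice(cells, 30, replace=False)
    s = np.array([STRENGTH[c] for c in path])
    return len(s) / (1.0 / s).sum()

fos = np.array([factor_of_safety(random_lithology(500)) for _ in range(400)])
mean_fos, std_fos = fos.mean(), fos.std()   # feeds the equivalent-normal fit
```

Tracking the running mean of `fos` against the simulation count is the convergence check the abstract refers to; the final mean and standard deviation parameterize the equivalent normal distribution.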
GPU-based Monte Carlo (MC) simulations are highly valued for their potential to improve both the computational efficiency and the accuracy of radiotherapy. However, in proton therapy, these methods often simplify human tissues as water for nuclear reactions, disregarding their true elemental composition and thereby potentially compromising calculation accuracy. Consequently, this study developed the program gMCAP (GPU-based proton MC Algorithm for Proton therapy), incorporating precise discrete interactions, and established a refined nuclear reaction model (REFINED) that considers the actual materials of the human body. Compared with the approximate water model (APPROX), the REFINED model demonstrated an improvement in calculation accuracy of 3%. In particular, in high-density tissue regions, the maximum dose deviation between the REFINED and APPROX models was up to 15%. In summary, the gMCAP program can efficiently simulate 1 million protons within 1 s while significantly enhancing dose calculation accuracy in high-density tissues, thus providing a more precise and efficient engine for proton radiotherapy dose calculations in clinical practice.
Monte Carlo (MC) simulations have been performed to refine the estimation of the correction-to-scaling exponent ω in the 2D φ^(4) model, which belongs to one of the most fundamental universality classes. If the corrections have the form ∝ L^(-ω), then we find ω = 1.546(30) and ω = 1.509(14) as the best estimates. These are obtained from the finite-size scaling of the susceptibility data in the range of linear lattice sizes L ∈ [128, 2048] at the critical value of the Binder cumulant, and from the scaling of the corresponding pseudocritical couplings within L ∈ [64, 2048]. These values agree with several other MC estimates under the assumption of power-law corrections and are comparable with the known results of the ε-expansion. In addition, we have tested the consistency with scaling corrections of the form ∝ L^(-4/3), ∝ L^(-4/3) ln L, and ∝ L^(-4/3)/ln L, which might be expected from some considerations of the renormalization group and the Coulomb gas model. The latter option is consistent with our MC data. Our MC results served as a basis for a critical reconsideration of some earlier theoretical conjectures and scaling assumptions. In particular, we have corrected and refined our previous analysis based on grouping of Feynman diagrams. The renewed analysis gives ω ≈ 4 - d - 2η as an approximation for spatial dimensions d < 4, or ω ≈ 1.5 in two dimensions.
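The fitting procedure behind estimates like ω = 1.546(30) can be sketched as a three-parameter least-squares fit of Q(L) = Q∞ + a·L^(-ω). The data below are synthetic (hypothetical Q∞, a, ω, and noise level), standing in for the susceptibility-based observables of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def scaling_form(L, q_inf, a, omega):
    """Leading finite-size behaviour with one power-law correction term."""
    return q_inf + a * L ** (-omega)

L = np.array([64, 128, 256, 512, 1024, 2048], dtype=float)
rng = np.random.default_rng(6)
true_params = (0.9160, 0.8, 1.5)            # hypothetical Q_inf, a, omega
q = scaling_form(L, *true_params) + rng.normal(0.0, 1e-6, L.size)

# Fit all three parameters; pcov gives the 1-sigma error on omega.
popt, pcov = curve_fit(scaling_form, L, q, p0=(0.9, 1.0, 1.0))
omega_err = np.sqrt(pcov[2, 2])
```

Testing the alternative correction forms mentioned above (L^(-4/3), L^(-4/3) ln L, L^(-4/3)/ln L) amounts to swapping `scaling_form` for the corresponding fixed-exponent expression and comparing fit quality.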
Gassy soils are distributed in relatively shallow layers of the Quaternary deposit in the Hangzhou Bay area. These shallow gassy soils significantly affect the construction of underground projects, so proper characterization of their spatial distribution is indispensable prior to construction in the area. Owing to the costly conditions required in site investigation for gassy soils, only a limited number of gas pressure data can be obtained in engineering practice, which leads to uncertainty in characterizing the spatial distribution of gassy soils. Determining the number of boreholes for investigating gassy soils and their corresponding locations is pivotal to reducing the construction risk induced by gassy soils; however, this primarily relies on engineering experience in current site investigation practice. This study develops a probabilistic site investigation optimization method for planning investigation schemes (including the number and locations of boreholes) for gassy soils based on the conditional random field and Monte Carlo simulation. The proposed method aims to provide an optimal investigation scheme before the site investigation based on prior knowledge. Finally, the proposed approach is illustrated using a case study.
This study presents the results of a Monte Carlo simulation to compare the statistical power of the Siegel-Tukey and Savage tests. The main purpose of the study is to evaluate the statistical power of both tests in scenarios involving normal, platykurtic, and skewed distributions over different sample sizes and standard deviation values. In the study, the standard deviation ratios were set as 2, 3, 4, 1/2, 1/3, and 1/4, and power comparisons were made between small and large sample sizes. For equal sample sizes, small sample sizes of 5, 8, 10, 12, 16, and 20 and large sample sizes of 25, 50, 75, and 100 were used. For unequal sample sizes, the combinations of (4,16), (8,16), (10,20), (16,4), (16,8), and (20,10) for small samples and (10,30), (30,10), (50,75), (50,100), (75,50), (75,100), (100,50), and (100,75) for large samples were examined in detail. According to the findings, the power analysis under variance heterogeneity shows that the Siegel-Tukey test has higher statistical power than the nonparametric Savage test at both small and large sample sizes. In particular, the Siegel-Tukey test was found to offer higher precision and power under variance heterogeneity, regardless of whether the sample sizes are equal or different.
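The power comparison itself is a straightforward Monte Carlo loop: generate two samples whose standard deviations differ by the chosen ratio, run the scale test, and count rejections. SciPy ships neither the Siegel-Tukey nor the Savage test, so the closely related Ansari-Bradley scale test stands in below; the simulation design (normal samples, sd ratio, alpha) mirrors the study's setup.

```python
import numpy as np
from scipy.stats import ansari

rng = np.random.default_rng(7)

def mc_power(n1, n2, sd_ratio, n_rep=400, alpha=0.05):
    """Estimated rejection rate of a rank-based scale test when the second
    sample's standard deviation is sd_ratio times the first's."""
    rejections = 0
    for _ in range(n_rep):
        x = rng.normal(0.0, 1.0, n1)
        y = rng.normal(0.0, sd_ratio, n2)
        if ansari(x, y).pvalue < alpha:
            rejections += 1
    return rejections / n_rep

power_alt = mc_power(40, 40, sd_ratio=2.0)  # variance heterogeneity
size_null = mc_power(40, 40, sd_ratio=1.0)  # should sit near alpha
```

Substituting an implementation of the Siegel-Tukey or Savage statistic for `ansari`, and sweeping the sample-size pairs and sd ratios listed in the abstract, reproduces the study's full design.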
Funding (GPU-accelerated dose calculation for mesh-type phantoms): supported by the National Natural Science Foundation of China (Nos. U2167209 and 12375312) and the Open-end Fund Projects of the China Institute for Radiation Protection Scientific Research Platform (No. CIRP-HYYFZH-2023ZD001).
Funding (missile-borne LiDAR point cloud augmentation): Postgraduate Innovation Top-notch Talent Training Project of Hunan Province (No. CX20220045); Scientific Research Project of National University of Defense Technology (No. 22-ZZCX-07); New Era Education Quality Project of Anhui Province (No. 2023cxcysj194); National Natural Science Foundation of China (Nos. 62201597, 62205372, and 1210456); Foundation of Hefei Comprehensive National Science Center (No. KY23C502).
Funding (MagicMC): supported by the National Natural Science Foundation of China (Nos. 12475174 and U2267207), YueLuShan Center Industrial Innovation (No. 2024YCII0108), the Natural Science Foundation of Hunan Province (No. 2022JJ40345), the Science and Technology Innovation Project of Hengyang (No. 202250045336), and the Project of the State Key Laboratory of Radiation Medicine and Protection, Soochow University (No. GZK12023031).
Funding (MD-MC solute segregation): the German Research Foundation (DFG), BE 5360/1-1, and ThyssenKrupp Europe.
Funding (scramjet IR radiation): supported by the National Natural Science Foundation of China (No. 12102356).
Funding (tight-formation boundary layer and permeability): supported by the Hebei Provincial Natural Science Foundation of China (No. D2023402012) and the Major Science and Technology Project of China National Petroleum Corporation (No. 2024DJ87).
Funding: Supported by project PID2022-139202OB-I00, "Neural Networks and Optimization Techniques for the Design and Safe Maintenance of Transportation Infrastructures: Volcanic Rock Geotechnics and Slope Stability" (IA-Pyroslope), funded by the Spanish State Research Agency of the Ministry of Science, Innovation and Universities of Spain and the European Regional Development Fund, MCIN/AEI/10.13039/501100011033/FEDER, EU.
Abstract: Volcanic terrains exhibit a complex structure of pyroclastic deposits interspersed with sedimentary processes, resulting in irregular lithological sequences that lack lateral continuity and distinct stratigraphic patterns. This complexity poses significant challenges for slope stability analysis and requires specialized techniques to address it. This research presents a numerical methodology that incorporates spatial variability, nonlinear material characterization, and probabilistic analysis within a Monte Carlo framework. The heterogeneous structure is represented by randomly assigning different lithotypes across the slope while maintaining predefined global proportions, in contrast to the more common approach of applying probabilistic variability to mechanical parameters within a homogeneous slope model. Material behavior is defined using complex nonlinear failure criteria, such as the Hoek-Brown model and a parabolic model with collapse, both implemented through linearization techniques. The Discontinuity Layout Optimization (DLO) method, a novel numerical approach based on limit analysis, is employed to incorporate these advances efficiently and compute the factor of safety of the slope. Within this framework, the Monte Carlo procedure assesses slope stability by running a large number of simulations, each with a different lithotype distribution. Based on the results, a hybrid method is proposed that combines probabilistic modeling with deterministic design principles for slope stability assessment. As a case study, the methodology is applied to a 20-m-high vertical slope composed of three lithotypes (altered scoria, welded scoria, and basalt) randomly distributed in proportions of 15%, 60%, and 25%, respectively. The results show convergence of mean values after approximately 400 simulations and highlight the significant influence of spatial heterogeneity, with variations of the factor of safety between 5 and 12 in 85% of cases. They also reveal non-circular, mid-slope failure wedges not captured by traditional stability methods. Finally, an equivalent normal probability distribution is proposed as a reliable approximation of the factor of safety for use in risk analysis and engineering decision-making.
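The sampling step of such a Monte Carlo loop, assigning lithotypes to cells while honouring exact global proportions, can be sketched as below. The strength values and the stability proxy are hypothetical stand-ins: a real study would call a limit-analysis solver such as DLO where `stability_proxy` appears; here the weakest vertical strip of the grid governs, loosely mimicking failure-path behaviour.

```python
import numpy as np

# Hypothetical lithotype "strengths" (illustrative stand-ins, not calibrated)
STRENGTH = {"altered_scoria": 0.5, "welded_scoria": 1.0, "basalt": 2.0}
PROPORTIONS = {"altered_scoria": 0.15, "welded_scoria": 0.60, "basalt": 0.25}

def random_lithotype_field(n_cells, rng):
    """Assign lithotypes to cells while honouring the exact global
    proportions, by shuffling a fixed-composition vector."""
    counts = {k: round(p * n_cells) for k, p in PROPORTIONS.items()}
    field = np.repeat(list(counts), list(counts.values()))
    rng.shuffle(field)
    return field

def stability_proxy(field, ncols=20):
    """Placeholder stability score: mean strength of the weakest vertical
    column of the grid.  A real analysis would run a DLO computation here."""
    grid = np.array([STRENGTH[c] for c in field]).reshape(-1, ncols)
    return grid.mean(axis=0).min()

rng = np.random.default_rng(7)
fos = np.array([stability_proxy(random_lithotype_field(400, rng))
                for _ in range(300)])   # one realization per simulation
mean_fos = fos.mean()
```

Because the global composition is identical in every realization, all variability in `fos` comes purely from the spatial arrangement of lithotypes, which is exactly the source of scatter the study investigates.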
Abstract: GPU-based Monte Carlo (MC) simulations are highly valued for their potential to improve both the computational efficiency and accuracy of radiotherapy. However, in proton therapy, these methods often simplify human tissues as water for nuclear reactions, disregarding their true elemental composition and thereby potentially compromising calculation accuracy. Consequently, this study developed the program gMCAP (GPU-based proton MC Algorithm for Proton therapy), incorporating precise discrete interactions, and established a refined nuclear reaction model (REFINED) that considers the actual materials of the human body. Compared with the approximate water model (APPROX), the REFINED model improved calculation accuracy by 3%. In particular, in high-density tissue regions, the maximum dose deviation between the REFINED and APPROX models was up to 15%. In summary, the gMCAP program can efficiently simulate 1 million protons within 1 s while significantly enhancing dose calculation accuracy in high-density tissues, thus providing a more precise and efficient engine for proton radiotherapy dose calculations in clinical practice.
Abstract: Monte Carlo (MC) simulations have been performed to refine the estimation of the correction-to-scaling exponent ω in the 2D φ^(4) model, which belongs to one of the most fundamental universality classes. If corrections have the form ∝ L^(-ω), then we find ω = 1.546(30) and ω = 1.509(14) as the best estimates. These are obtained from the finite-size scaling of the susceptibility data in the range of linear lattice sizes L ∈ [128, 2048] at the critical value of the Binder cumulant, and from the scaling of the corresponding pseudocritical couplings within L ∈ [64, 2048]. These values agree with several other MC estimates under the assumption of power-law corrections and are comparable with the known results of the ε-expansion. In addition, we have tested the consistency with scaling corrections of the form ∝ L^(-4/3), ∝ L^(-4/3) ln L, and ∝ L^(-4/3)/ln L, which might be expected from some considerations of the renormalization group and the Coulomb gas model. The latter option is consistent with our MC data. Our MC results served as a basis for a critical reconsideration of some earlier theoretical conjectures and scaling assumptions. In particular, we have corrected and refined our previous analysis based on grouping Feynman diagrams. The renewed analysis gives ω ≈ 4 - d - 2η as an approximation for spatial dimensions d < 4, or ω ≈ 1.5 in two dimensions.
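The structure of such a finite-size-scaling fit can be illustrated with a self-contained toy: for the ansatz χ(L) = A·L^(γ/ν)·(1 + B·L^(-ω)), the amplitudes A and A·B enter linearly once ω is fixed, so ω can be found by a grid search with a linear least-squares solve at each trial value. The data below are synthetic, generated with ω = 1.5 and then recovered; they are not the study's susceptibility data.

```python
import numpy as np

def fit_omega(L, chi, exponent, omegas):
    """Grid search for the correction-to-scaling exponent omega in
    chi(L) = A * L**exponent * (1 + B * L**(-omega)).  For each trial
    omega the amplitudes A and A*B enter linearly, so they are solved
    by least squares; the omega with the smallest residual wins."""
    z = chi / L ** exponent          # divide out the leading power law
    best_res, best_w = np.inf, None
    for w in omegas:
        X = np.column_stack([np.ones_like(L), L ** (-w)])
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
        res = np.sum((z - X @ beta) ** 2)
        if res < best_res:
            best_res, best_w = res, w
    return best_w

# Synthetic susceptibility data with a known omega = 1.5 (assumed amplitudes)
L = np.array([64.0, 128.0, 256.0, 512.0, 1024.0, 2048.0])
gamma_over_nu = 7.0 / 4.0            # 2D Ising universality-class value
chi = L ** gamma_over_nu * (1.0 + 0.5 * L ** (-1.5))
omega_hat = fit_omega(L, chi, gamma_over_nu, np.arange(1.0, 2.0, 0.01))
```

On real MC data the residual landscape is noisy and the quoted error bars come from resampling; the grid-plus-linear-solve structure, however, is the same.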
Abstract: Gassy soils are distributed in relatively shallow layers of the Quaternary deposit in the Hangzhou Bay area. These shallow gassy soils significantly affect the construction of underground projects, so proper characterization of their spatial distribution is indispensable prior to construction in the area. Due to the costly conditions required in site investigation for gassy soils, only a limited number of gas pressure data can be obtained in engineering practice, which leads to uncertainty in characterizing the spatial distribution of gassy soils. Determining the number of boreholes for investigating gassy soils and their corresponding locations is pivotal to reducing the construction risk induced by gassy soils; however, in current site investigation practice this primarily relies on engineering experience. This study develops a probabilistic site investigation optimization method for planning investigation schemes (including the number and locations of boreholes) for gassy soils based on conditional random fields and Monte Carlo simulation. The proposed method aims to provide an optimal investigation scheme before the site investigation based on prior knowledge. Finally, the proposed approach is illustrated using a case study.
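A 1-D sketch of the underlying machinery, offered here under stated assumptions rather than as the paper's method: with an exponential covariance model, simple kriging gives the conditional variance of a random field given existing boreholes, and a greedy scheme can place the next borehole where the remaining uncertainty is largest. The covariance model, correlation length, and domain are all hypothetical.

```python
import numpy as np

def cond_variance(x_obs, x_grid, sill=1.0, corr_len=10.0):
    """Simple-kriging conditional variance on a 1-D grid given borehole
    locations x_obs, under an exponential covariance model
    C(h) = sill * exp(-|h| / corr_len)."""
    def cov(a, b):
        return sill * np.exp(-np.abs(a[:, None] - b[None, :]) / corr_len)
    K = cov(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))  # jitter for stability
    k = cov(x_grid, x_obs)
    return sill - np.einsum("ij,ij->i", k @ np.linalg.inv(K), k)

def next_borehole(x_obs, x_grid):
    """Greedy placement: drill next where conditional variance peaks."""
    v = cond_variance(np.asarray(x_obs, dtype=float), x_grid)
    return x_grid[np.argmax(v)]

grid = np.linspace(0.0, 100.0, 201)
existing = [0.0, 100.0]                  # hypothetical initial boreholes
new_site = next_borehole(existing, grid)  # lands near the middle of the gap
```

In the full method, candidate schemes would instead be ranked by running Monte Carlo realizations of the conditional random field of gas pressure and evaluating a risk-based objective, but the conditional-variance update shown here is the common building block.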
Abstract: This study presents the results of a Monte Carlo simulation comparing the statistical power of the Siegel-Tukey and Savage tests. The main purpose of the study is to evaluate the statistical power of both tests in scenarios involving normal, platykurtic, and skewed distributions over different sample sizes and standard deviation values. In the study, the standard deviation ratios were set to 2, 3, 4, 1/2, 1/3, and 1/4, and power comparisons were made between small and large sample sizes. For equal sample sizes, small samples of 5, 8, 10, 12, 16, and 20 and large samples of 25, 50, 75, and 100 were used. For unequal sample sizes, the small-sample combinations (4, 16), (8, 16), (10, 20), (16, 4), (16, 8), and (20, 10) and the large-sample combinations (10, 30), (30, 10), (50, 75), (50, 100), (75, 50), (75, 100), (100, 50), and (100, 75) were examined in detail. According to the findings, the power analysis under variance heterogeneity shows that the Siegel-Tukey test has higher statistical power than the nonparametric Savage test at both small and large sample sizes. In particular, the Siegel-Tukey test offers higher precision and power under variance heterogeneity, regardless of whether the sample sizes are equal or different.
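The shape of such a power simulation can be sketched as follows: assign Siegel-Tukey ranks (rank 1 to the smallest pooled value, ranks 2-3 to the two largest, ranks 4-5 to the next two smallest, and so on), apply the Wilcoxon rank-sum normal approximation, and estimate power as the rejection rate over repeated draws. This is a simplified illustration with no tie handling, not the study's full design, and the chosen sample size and scale ratio are assumptions.

```python
import numpy as np

def siegel_tukey_ranks(n):
    """Ranks for positions 0..n-1 of an ascending-sorted pooled sample:
    rank 1 to the smallest, ranks 2-3 to the two largest, ranks 4-5 to
    the next two smallest, alternating end to end."""
    lo, hi, seq = 0, n - 1, [0]
    lo += 1
    from_top = True
    while lo <= hi:
        for _ in range(2):
            if lo > hi:
                break
            if from_top:
                seq.append(hi); hi -= 1
            else:
                seq.append(lo); lo += 1
        from_top = not from_top
    ranks = np.empty(n)
    ranks[seq] = np.arange(1, n + 1)
    return ranks

def siegel_tukey_reject(x, y, z_crit=1.959964):
    """Two-sided test: Wilcoxon rank-sum normal approximation applied
    to Siegel-Tukey ranks (assumes continuous data, equal locations)."""
    pooled = np.concatenate([x, y])
    ranks = siegel_tukey_ranks(len(pooled))[np.argsort(np.argsort(pooled))]
    n1, n2, n = len(x), len(y), len(pooled)
    w = ranks[:n1].sum()
    mean = n1 * (n + 1) / 2.0
    sd = np.sqrt(n1 * n2 * (n + 1) / 12.0)
    return abs((w - mean) / sd) > z_crit

def power(sd_ratio, n1=20, n2=20, reps=1000, rng=None):
    """Rejection rate over repeated normal samples with a scale shift."""
    if rng is None:
        rng = np.random.default_rng(42)
    hits = sum(siegel_tukey_reject(rng.normal(0, 1, n1),
                                   rng.normal(0, sd_ratio, n2))
               for _ in range(reps))
    return hits / reps
```

Under the null (`sd_ratio=1`) the rejection rate should hover near the nominal 5%, while a scale ratio of 4 should be detected with high probability at n = 20 per group. For a vetted related scale test, SciPy provides the Ansari-Bradley test as `scipy.stats.ansari`.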