Scramjet is the most promising propulsion system for the Air-breathing Hypersonic Vehicle (AHV), and the infrared (IR) radiation it emits is critical for early warning, detection, and identification of such weapons. This work proposes an Adaptive Reverse Monte Carlo (ARMC) method and develops an analytical model for the IR radiation of scramjets under gaseous-kerosene- and hydrogen-fueled conditions. The evaluation studies show that at a global equivalence ratio of 0.8, the IR radiation from the hydrogen-fueled plume is predominantly from H₂O, with a spectral peak of 1.53 kW·sr⁻¹·μm⁻¹ at the 2.7 μm band, while the kerosene-fueled plume exhibits a spectral intensity approaching 7.0 kW·sr⁻¹·μm⁻¹ at the 4.3 μm band. At the backward detection angle, both types of scramjets exhibit spectral peaks within the 1.3-1.4 μm band, with intensities around 10 kW·sr⁻¹·μm⁻¹. The integral radiation intensity of the hydrogen-fueled scramjet is generally higher than that of the kerosene-fueled scramjet, particularly in the 1-3 μm band. Meanwhile, at wide detection angles, the solid walls become the predominant radiation source. The radiation intensity is highest in the 1-3 μm band and weakest in the 8-14 μm band, with values of 21.5 kW·sr⁻¹ and 0.57 kW·sr⁻¹ at the backward detection angles, respectively. Significant variations in the radiation contributions from gases and solids are observed across different bands under the two fuel conditions, especially within the 3-5 μm band. This research provides valuable insights into the IR radiation characteristics of scramjets, which can aid in the development of IR detection systems for AHVs.
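The reverse (backward) Monte Carlo idea, tracing rays from the detector back through the emitting gas and accumulating attenuated emission along the path, can be sketched for a single homogeneous line of sight. This is a hedged toy illustration, not the ARMC method itself; the absorption coefficient, Planck-term value, and path length are made-up numbers chosen only so the estimate can be checked against the analytic result B(1 - e^(-κL)).

```python
import math
import random

def reverse_mc_intensity(kappa, B, L, n_rays, n_steps, rng):
    """Backward ray marching: sum kappa*B*exp(-tau)*ds from the detector
    into a homogeneous emitting/absorbing gas column of length L."""
    ds = L / n_steps
    total = 0.0
    for _ in range(n_rays):
        s = rng.random() * ds          # jitter the march start per ray
        I, tau = 0.0, 0.0
        while s < L:
            I += kappa * B * math.exp(-tau) * ds   # attenuated emission
            tau += kappa * ds                      # optical depth grows backward
            s += ds
        total += I
    return total / n_rays

rng = random.Random(3)
est = reverse_mc_intensity(kappa=0.8, B=2.0, L=5.0, n_rays=50, n_steps=2000, rng=rng)
exact = 2.0 * (1.0 - math.exp(-0.8 * 5.0))   # analytic homogeneous-slab solution
```

For a homogeneous column the estimate should sit within a few tenths of a percent of the closed-form value; the real ARMC method would instead march through spatially varying temperature and species fields.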
Volcanic terrains exhibit a complex structure of pyroclastic deposits interspersed with sedimentary processes, resulting in irregular lithological sequences that lack lateral continuity and distinct stratigraphic patterns. This complexity poses significant challenges for slope stability analysis, requiring the development of specialized techniques to address these issues. This research presents a numerical methodology that incorporates spatial variability, nonlinear material characterization, and probabilistic analysis using a Monte Carlo framework. The heterogeneous structure is represented by randomly assigning different lithotypes across the slope while maintaining predefined global proportions. This contrasts with the more common approach of applying probabilistic variability to mechanical parameters within a homogeneous slope model. The material behavior is defined using complex nonlinear failure criteria, such as the Hoek-Brown model and a parabolic model with collapse, both implemented through linearization techniques. The Discontinuity Layout Optimization (DLO) method, a novel numerical approach based on limit analysis, is employed to efficiently incorporate these advances and compute the factor of safety of the slope. Within this framework, the Monte Carlo procedure is used to assess slope stability by conducting a large number of simulations, each with a different lithotype distribution. Based on the results, a hybrid method is proposed that combines probabilistic modeling with deterministic design principles for slope stability assessment. As a case study, the methodology is applied to a 20-m-high vertical slope composed of three lithotypes (altered scoria, welded scoria, and basalt) randomly distributed in proportions of 15%, 60%, and 25%, respectively. The results show convergence of mean values after approximately 400 simulations and highlight the significant influence of spatial heterogeneity, with variations of the factor of safety between 5 and 12 in 85% of cases. They also reveal non-circular and mid-slope failure wedges not captured by traditional stability methods. Finally, an equivalent normal probability distribution is proposed as a reliable approximation of the factor of safety for use in risk analysis and engineering decision-making.
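The probabilistic core of the procedure, randomly assigning lithotypes across the slope cells while honoring fixed global proportions, can be sketched as follows. This is an assumed minimal version, not the paper's DLO workflow; the cell count is arbitrary, and the lithotype names and 15/60/25% split simply mirror the case study.

```python
import random

def random_lithotype_field(n_cells, proportions, rng):
    """One Monte Carlo realization: shuffle a pool of lithotype labels
    whose counts match the prescribed global proportions."""
    pool = []
    for litho, frac in proportions.items():
        pool += [litho] * round(n_cells * frac)
    rng.shuffle(pool)          # spatial arrangement varies, proportions do not
    return pool

rng = random.Random(11)
props = {"altered_scoria": 0.15, "welded_scoria": 0.60, "basalt": 0.25}
field = random_lithotype_field(400, props, rng)
```

Each realization would then be handed to the stability solver to produce one factor-of-safety sample; repeating this a few hundred times builds the distribution whose mean the paper reports as converging after roughly 400 simulations.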
In contrast to conventional reservoirs, tight formations have more complex pore structures and a significant boundary layer effect, making it difficult to determine the effective permeability. To address this, this paper first proposes a semi-empirical model for calculating boundary layer thickness based on dimensional analysis, using published experimental data on microcapillary flow. Furthermore, considering the non-uniform distribution of fluid viscosity in the flow channels of tight reservoirs, a theoretical model for boundary layer thickness is established based on fractal theory, and permeability predictions are conducted through Monte Carlo simulations. Finally, sensitivity analyses of the various influencing parameters are performed. The results show that, compared with other fractal-based analytical models, the proposed probabilistic permeability model integrates the parameters affecting fluid flow with random numbers, reflecting both the fractal and the random characteristics of the capillary size distribution. The computational results exhibit the highest consistency with experimental data. Among the factors affecting the boundary layer, in addition to certain conventional physical and mechanical parameters, different microstructure parameters significantly influence the boundary layer as well. A higher tortuosity fractal dimension results in a thicker boundary layer, while increases in pore fractal dimension, porosity, and maximum capillary size help mitigate the boundary layer effect. It is also observed that the permeability of large pores exhibits greater sensitivity to changes in the various influencing parameters. By considering micro-scale flow effects, the proposed model enhances the understanding of the physical mechanisms of fluid transport in dense porous media.
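A minimal sketch of the capillary-bundle picture behind such a model: sample fractal-distributed capillary radii by inverse-transform sampling of a truncated power law, shrink each radius by a boundary layer, and accumulate Hagen-Poiseuille-type r⁴ contributions. The fractal exponent, radius bounds, and a fixed boundary-layer fraction are illustrative assumptions, not the paper's calibrated thickness model (which varies with capillary size).

```python
import random

def mc_permeability(n, r_min, r_max, df, delta_frac, rng):
    """Monte Carlo capillary bundle: returns (k with boundary layer, k without),
    both proportional to sum(r^4) / (8 * sum(r^2))."""
    k_bl, k_free, area = 0.0, 0.0, 0.0
    c = 1.0 - (r_min / r_max) ** df
    for _ in range(n):
        u = rng.random()
        r = r_min * (1.0 - u * c) ** (-1.0 / df)   # truncated-Pareto (fractal) radius
        r_eff = r * (1.0 - delta_frac)             # boundary layer narrows the channel
        k_free += r ** 4
        k_bl += r_eff ** 4
        area += r ** 2
    return k_bl / (8.0 * area), k_free / (8.0 * area)

rng = random.Random(5)
k_bl, k_free = mc_permeability(50_000, 1e-7, 1e-5, 1.6, 0.2, rng)
```

With a fixed fractional boundary layer the permeability ratio collapses to (1 - δ)⁴, which is why the sketch is only a starting point; a size-dependent thickness is what makes large pores respond differently, as the abstract observes.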
Funding: Supported by the National Key R&D Program of China (No. 2022YFC2404604), the Chongqing Research Institution Performance Incentive Guidance Special Project (No. CSTB2023JXJL-YFX0080), and the Chongqing Medical Scientific Research Project (joint project of the Chongqing Health Commission and Science and Technology Bureau) (No. 2022DBXM005).
Abstract: This study aimed to integrate Monte Carlo (MC) simulation with deep learning (DL)-based denoising techniques to achieve fast and accurate prediction of high-quality electronic portal imaging device (EPID) transmission dose (TD) for patient-specific quality assurance (PSQA). A total of 100 lung cases were used to obtain the noisy EPID TD with the ARCHER MC code under four particle numbers (1×10⁶, 1×10⁷, 1×10⁸, and 1×10⁹), and the original EPID TD was denoised by the SUNet neural network. The denoised EPID TD was assessed both qualitatively and quantitatively using the structural similarity (SSIM), peak signal-to-noise ratio (PSNR), and gamma passing rate (GPR), with the 1×10⁹ result as the reference. The computation times for both the MC simulation and the DL-based denoising were recorded. As the number of particles increased, both the quality of the noisy EPID TD and the computation time increased significantly (1×10⁶: 1.12 s, 1×10⁷: 1.72 s, 1×10⁸: 8.62 s, and 1×10⁹: 73.89 s). In contrast, the DL-based denoising time remained at 0.13-0.16 s. The denoised EPID TD shows a smoother visual appearance and profile curves, but differences between 1×10⁶ and 1×10⁹ still remain. SSIM improves from 0.61 to 0.95 for 1×10⁶, from 0.70 to 0.96 for 1×10⁷, and from 0.90 to 0.97 for 1×10⁸. PSNR increases by more than 20% for 1×10⁶ and 1×10⁷, and by more than 10% for 1×10⁸. GPR improves from 48.47% to 89.10% for 1×10⁶, from 61.04% to 94.35% for 1×10⁷, and from 91.88% to 99.55% for 1×10⁸. The method that combines MC simulation with DL-based denoising for EPID TD generation can accelerate TD prediction while maintaining high accuracy, offering a promising solution for efficient PSQA.
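One of the quality metrics used above, PSNR, is simple enough to compute directly: 10·log₁₀(MAX²/MSE) between a test dose map and the high-statistics reference. The sketch below is a generic implementation on flat lists, not the paper's evaluation pipeline; the dose values are invented, with "clean" closer to the reference than "noisy" to mimic what denoising should achieve.

```python
import math

def psnr(reference, test, max_val=1.0):
    """Peak signal-to-noise ratio (dB) between two dose maps given as flat lists."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")          # identical maps
    return 10.0 * math.log10(max_val ** 2 / mse)

# Hypothetical normalized dose samples: denoising should raise PSNR vs. the reference.
ref   = [0.20, 0.50, 0.80, 1.00, 0.70, 0.30]
noisy = [0.25, 0.45, 0.85, 0.90, 0.75, 0.35]
clean = [0.21, 0.49, 0.80, 0.98, 0.71, 0.31]
```

SSIM and the gamma passing rate involve local windows and distance-to-agreement searches, so in practice they come from dedicated libraries rather than a few lines like this.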
Funding: Supported by the National Natural Science Foundation of China (Nos. U2167209 and 12375312) and the Open-end Fund Projects of the China Institute for Radiation Protection Scientific Research Platform (No. CIRP-HYYFZH-2023ZD001).
Abstract: Computational phantoms play an essential role in radiation dosimetry and health physics. Although mesh-type phantoms offer high resolution and adjustability, their use in dose calculations is limited by their slow computational speed. Progress in heterogeneous computing has allowed for substantial acceleration in the computation of mesh-type phantoms by utilizing hardware accelerators. In this study, a GPU-accelerated Monte Carlo method was developed to expedite dose calculation for mesh-type computational phantoms. This involved designing and implementing the entire procedural flow of a GPU-accelerated Monte Carlo program. We employed acceleration structures to process the mesh-type phantom, optimized the traversal methodology, and adopted a flattened structure to overcome the limitations of GPU stack depths. Particle transport methods were realized within the mesh-type phantom, encompassing particle location and intersection techniques. For typical external irradiation scenarios, we used Geant4 along with the GPU program and its CPU serial code for dose calculations, assessing both computational accuracy and efficiency. Compared with the benchmark simulated using Geant4 on the CPU with one thread, the relative differences in the organ doses calculated by the GPU program predominantly lay within a margin of 5%, whereas the computational time was reduced by a factor ranging from 120 to 2700. To the best of our knowledge, this study achieved a GPU-accelerated dose calculation method for mesh-type phantoms for the first time, reducing the computational time from hours to seconds per simulation of ten million particles and offering a swift and precise Monte Carlo method for dose calculation in mesh-type computational phantoms.
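The "particle intersection techniques" inside a mesh-type phantom ultimately reduce to ray-triangle tests against the tetrahedral or surface mesh. The standard Möller-Trumbore test is sketched below in plain Python; the GPU version described above would run this branch-free over a flattened acceleration structure, and the specific vertices and rays here are just hypothetical test data.

```python
def ray_triangle(o, d, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore intersection: returns hit distance t along ray (o + t*d),
    or None if the ray misses the triangle (v0, v1, v2)."""
    sub = lambda a, b: (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(d, e2)
    det = dot(e1, p)
    if abs(det) < eps:                  # ray parallel to the triangle's plane
        return None
    inv = 1.0 / det
    t_vec = sub(o, v0)
    u = dot(t_vec, p) * inv             # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(d, q) * inv                 # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None       # only hits in front of the origin count

hit  = ray_triangle((0.25, 0.25, -2.0), (0.0, 0.0, 1.0),
                    (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
miss = ray_triangle((2.0, 2.0, -1.0), (0.0, 0.0, 1.0),
                    (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
```

On a GPU the recursion-free, fixed-register shape of this kernel is exactly why a flattened BVH traversal (no deep stacks) pays off.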
Funding: Under the auspices of the Natural Science Foundation of China (Nos. 32371875 and 32001249).
Abstract: Stand age plays a crucial role in forest biomass estimation and carbon cycle modeling. Assessing the uncertainty of stand age prediction models and identifying the key driving factors in the modeling process have become major challenges in forestry research. In this study, we selected the Shaanxi-Gansu-Ningxia region of Northeast China as the research area and utilized multi-source datasets from the summer of 2019 to extract information on spectral, textural, climatic, water balance, and stand characteristics. By integrating the Random Forest (RF) model with Monte Carlo (MC) simulation, we constructed six regression models based on different combinations of features and evaluated the uncertainty of each model. Furthermore, we investigated the driving factors influencing stand age modeling by analyzing the effects of different types of features on age inversion. Model performance and accuracy were assessed using the root mean square error (RMSE), mean absolute error (MAE), and coefficient of determination (R²), while the relative root mean square error (rRMSE) was employed to quantify model uncertainty. The results indicate that the scenarios with the most pronounced improvement in accuracy and effective reduction in uncertainty were Scenario 3, with the inclusion of climate and water balance information (RMSE = 25.54 yr, MAE = 18.03 yr, R² = 0.51, rRMSE = 19.17%), and Scenario 5, with the inclusion of stand characterization information (RMSE = 18.47 yr, MAE = 13.05 yr, R² = 0.74, rRMSE = 16.99%). Scenario 6, incorporating all feature types, achieved the highest accuracy (RMSE = 17.60 yr, MAE = 12.06 yr, R² = 0.77, rRMSE = 14.19%). In this study, elevation, minimum temperature, and diameter at breast height (DBH) emerged as the key drivers of stand age modeling. The proposed method can be used to identify drivers and to quantify uncertainty in stand age estimation, providing a useful reference for improving model accuracy and uncertainty assessment.
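The accuracy and uncertainty metrics quoted above have direct definitions: RMSE and MAE in years, and rRMSE as RMSE normalized by the mean observed value, expressed as a percentage. A minimal sketch, with hypothetical observed and predicted stand ages standing in for the paper's RF outputs:

```python
import math

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def rrmse(obs, pred):
    """Relative RMSE (%): RMSE divided by the mean observed value."""
    return 100.0 * rmse(obs, pred) / (sum(obs) / len(obs))

# Hypothetical stand ages (yr) and model predictions for five plots.
ages  = [40.0, 55.0, 70.0, 35.0, 90.0]
preds = [45.0, 50.0, 75.0, 30.0, 95.0]
```

In the Monte Carlo setting of the study, these metrics would be recomputed over many simulation draws so that the spread of rRMSE characterizes model uncertainty rather than a single fit.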
Funding: Supported by the Hubei Provincial Technology Innovation Special Project and the Natural Science Foundation of Hubei Province under Grants 2023BEB024 and 2024AFC066, respectively.
Abstract: Underwater images frequently suffer from chromatic distortion, blurred details, and low contrast, posing significant challenges for enhancement. This paper introduces AquaTree, a novel underwater image enhancement (UIE) method that reformulates the task as a Markov Decision Process (MDP) through the integration of Monte Carlo Tree Search (MCTS) and deep reinforcement learning (DRL). The framework employs an action space of 25 enhancement operators, strategically grouped for basic attribute adjustment, color component balance, correction, and deblurring. Exploration within MCTS is guided by a dual-branch convolutional network, enabling intelligent sequential operator selection. Our core contributions include: (1) a multimodal state representation combining CIELab color histograms with deep perceptual features, (2) a dual-objective reward mechanism optimizing chromatic fidelity and perceptual consistency, and (3) an alternating training strategy co-optimizing enhancement sequences and network parameters. We further propose two inference schemes: an MCTS-based approach prioritizing accuracy at higher computational cost, and an efficient network policy enabling real-time processing with minimal quality loss. Comprehensive evaluations on the UIEB dataset, together with color correction and haze removal comparisons on the U45 dataset, demonstrate AquaTree's superiority, significantly outperforming nine state-of-the-art methods across five established underwater image quality metrics.
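The selection step at the heart of MCTS balances exploiting operators with high observed reward against exploring under-tried ones, typically via a UCB-style score. Below is a generic UCB1 bandit over a few enhancement-operator names; the operator names, reward values, and exploration constant are illustrative assumptions, not AquaTree's learned network-guided policy.

```python
import math
import random

def ucb1_select(stats, total, c=1.4):
    """Pick the action maximizing mean reward + c*sqrt(ln(total)/visits);
    unvisited actions score +inf so each gets tried at least once."""
    best, best_score = None, -float("inf")
    for op, (n, reward_sum) in stats.items():
        score = float("inf") if n == 0 else (
            reward_sum / n + c * math.sqrt(math.log(total) / n))
        if score > best_score:
            best, best_score = op, score
    return best

random.seed(0)
# Hypothetical mean quality gain of three operators on some image.
true_reward = {"white_balance": 0.8, "gamma": 0.5, "sharpen": 0.3}
stats = {op: (0, 0.0) for op in true_reward}
for t in range(1, 2001):
    op = ucb1_select(stats, t)
    r = true_reward[op] + random.gauss(0.0, 0.05)   # noisy reward signal
    n, s = stats[op]
    stats[op] = (n + 1, s + r)
```

In the full method this score would be blended with the dual-branch network's prior over operators, and the "reward" would come from the chromatic-fidelity and perceptual-consistency objectives rather than fixed constants.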
Funding: Postgraduate Innovation Top-notch Talent Training Project of Hunan Province (Grant No. CX20220045); Scientific Research Project of the National University of Defense Technology (Grant No. 22-ZZCX-07); New Era Education Quality Project of Anhui Province (Grant No. 2023cxcysj194); National Natural Science Foundation of China (Grant Nos. 62201597, 62205372, and 1210456); Foundation of the Hefei Comprehensive National Science Center (Grant No. KY23C502).
Abstract: Large-scale point cloud datasets form the basis for training various deep learning networks and achieving high-quality network processing tasks. Because of the diversity and robustness constraints of the data, data augmentation (DA) methods are used to expand dataset diversity and scale. However, owing to the complex and distinct characteristics of LiDAR point cloud data from different platforms (such as missile-borne and vehicular LiDAR data), directly applying traditional 2D visual-domain DA methods to 3D data can leave the resulting networks unable to perform the corresponding tasks robustly. To address this issue, the present study explores DA for missile-borne LiDAR point clouds using a Monte Carlo (MC) simulation method that closely resembles practical application. First, a model of the multi-sensor imaging system is established, taking into account the joint errors arising from the platform itself and the relative motion during the imaging process. A distortion simulation method based on MC simulation for augmenting missile-borne LiDAR point cloud data is then proposed, underpinned by an analysis of combined errors between different modal sensors, achieving high-quality augmentation of point cloud data. The effectiveness of the proposed method in addressing imaging system errors and distortion simulation is validated using the imaging scene dataset constructed in this paper. Comparative experiments between the proposed point cloud DA algorithm and current state-of-the-art algorithms on point cloud detection and single-object tracking tasks demonstrate that the proposed method can improve the performance of networks trained on unaugmented datasets by over 17.3% and 17.9%, respectively, surpassing the SOTA performance of current point cloud DA algorithms.
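The basic mechanic of MC-based distortion augmentation is to draw one random realization of the combined pose error (rotation plus translation) and apply it rigidly to the cloud. The sketch below shows only a single yaw-plus-offset draw with made-up error magnitudes; the paper's model covers the full joint error budget of a multi-sensor missile-borne system, which is far richer than this.

```python
import math
import random

def augment(points, sigma_rot=0.01, sigma_trans=0.05, rng=random):
    """One Monte Carlo draw of a small pose error (yaw about z + 3D offset)
    applied rigidly to a point cloud given as (x, y, z) tuples."""
    yaw = rng.gauss(0.0, sigma_rot)                 # rotation error in radians
    tx, ty, tz = (rng.gauss(0.0, sigma_trans) for _ in range(3))
    c, s = math.cos(yaw), math.sin(yaw)
    return [(c * x - s * y + tx,
             s * x + c * y + ty,
             z + tz) for x, y, z in points]

random.seed(1)
cloud = [(float(i), float(i % 3), 0.0) for i in range(100)]   # toy cloud
aug = augment(cloud)
```

Drawing many such realizations per scene multiplies the dataset while keeping each distorted copy physically plausible, which is the property 2D-style augmentations lack for LiDAR data.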
Funding: Supported by the National Natural Science Foundation of China (Nos. 12475174 and U2267207); the YueLuShan Center Industrial Innovation (No. 2024YCII0108); the Natural Science Foundation of Hunan Province (No. 2022JJ40345); the Science and Technology Innovation Project of Hengyang (No. 202250045336); and the Project of the State Key Laboratory of Radiation Medicine and Protection, Soochow University (No. GZK12023031).
Abstract: The Monte Carlo (MC) method offers significant advantages in handling complex geometries and physical processes in particle transport problems and has become a widely used approach in reactor physics analysis, radiation shielding design, and medical physics. However, with the rapid advancement of new nuclear energy systems, the MC method faces challenges in efficiency, accuracy, and adaptability, limiting its effectiveness in meeting modern design requirements. Overcoming technical obstacles related to high-fidelity coupling, high-resolution computation, and intelligent design is essential for establishing the MC method as a reliable tool in numerical analysis for these new nuclear energy systems. To address these challenges, the Nuclear Energy and Application Laboratory (NEAL) team at the University of South China developed a multifunctional and generalized intelligent code platform called MagicMC, based on the Monte Carlo particle transport method. MagicMC is a developing tool dedicated to nuclear applications, incorporating intelligent methodologies. It consists of two primary components: a basic unit and a functional unit. The basic unit, which functions similarly to a standard Monte Carlo particle transport code, includes seven modules: geometry, source, transport, database, tally, output, and auxiliary. The functional unit builds on the basic unit by adding functional modules to address complex and diverse applications in nuclear analysis. MagicMC introduces a dynamic Monte Carlo particle transport algorithm to address time-space particle transport problems within emerging nuclear energy systems and incorporates a CPU-GPU heterogeneous parallel framework to enable high-efficiency, high-resolution simulations for large-scale computational problems. Anticipating future trends in intelligent design, MagicMC integrates several advanced features, including CAD-based geometry modeling, global variance reduction methods, multi-objective shielding optimization, high-resolution activation analysis, multi-physics coupling, and radiation therapy. In this paper, various numerical benchmarks, spanning reactor transient simulations, material activation analysis, radiation shielding optimization, and medical dosimetry analysis, are presented to validate MagicMC. The numerical results demonstrate MagicMC's efficiency, accuracy, and reliability in these preliminary applications, underscoring its potential to support technological advancements in developing high-fidelity, high-resolution, and high-intelligence MC-based tools for advanced nuclear applications.
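The basic-unit pipeline described above (geometry, source, transport, tally) follows the standard analog Monte Carlo flow. A minimal illustrative sketch of that flow for a purely absorbing 1-D slab, where the transmitted fraction can be checked against the analytic answer exp(-Σ_t L); this is not MagicMC code, and the geometry, cross section, and function names are assumptions for illustration:

```python
import math
import random

def simulate_slab(n_particles, sigma_t, slab_thickness, seed=1):
    """Minimal analog Monte Carlo transport in a 1-D homogeneous slab.

    Illustrates the source -> transport -> tally flow of a basic MC unit.
    sigma_t is the total macroscopic cross section (1/cm); absorption only.
    Returns the fraction of source particles transmitted through the slab.
    """
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        x = 0.0          # source: particle born at the slab entrance
        mu = 1.0         # travelling along +x (monodirectional beam)
        # transport: sample the free-flight length from the exponential law
        distance = -math.log(rng.random()) / sigma_t
        x += mu * distance
        if x >= slab_thickness:
            transmitted += 1   # tally: particle leaks through the far face
    return transmitted / n_particles

# Analytic answer for a purely absorbing slab is exp(-sigma_t * L)
est = simulate_slab(200_000, sigma_t=1.0, slab_thickness=2.0)
print(est, math.exp(-2.0))
```

A production code replaces the single exponential flight with boundary-aware tracking through a general geometry and adds variance reduction, but the loop structure is the same.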
Fund: The authors acknowledge funding from the German Research Foundation (DFG), BE 5360/1-1, and ThyssenKrupp Europe.
Abstract: A microscopic understanding of the complex solute-defect interaction is pivotal for optimizing an alloy's macroscopic mechanical properties. Simulating solute segregation in a plastically deformed crystalline system at atomic resolution remains challenging. The objective is to efficiently model and predict a physically informed segregated solute distribution rather than simulating a series of diffusion kinetics. To address this objective, we coupled molecular dynamics (MD) and Monte Carlo (MC) methods using a novel method based on the virtual atoms technique. We applied our MD-MC coupling approach to model off-lattice carbon (C) solute segregation in nanoindented Fe-C samples containing complex dislocation networks. Our coupling framework yielded the final configuration through efficient parallelization and localized energy computations, showing C Cottrell atmospheres near dislocations. Different initial C concentrations resulted in a consistent trend of C atoms migrating from regions of low crystalline distortion to regions of high crystalline distortion. Besides unraveling the strong spatial correlation between local C concentration and defect regions, our results revealed two crucial aspects of solute segregation preferences: (1) the defect energetics hierarchy and (2) tensile strain fields near dislocations. The proposed approach is generic and can be applied to other material systems as well.
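The MC half of such a coupling typically rests on a Metropolis acceptance test driven by a localized energy change, so that solutes accumulate at low-energy (defect-like) sites. A toy on-lattice sketch under stated assumptions (the site energies, trap depth, and function names are hypothetical; this is not the paper's off-lattice virtual-atoms implementation):

```python
import math
import random

def metropolis_accept(dE, kT, rng):
    """Metropolis criterion: accept a trial solute move with probability
    min(1, exp(-dE/kT)). dE is the local energy change (eV) of moving a
    C atom to a candidate site; kT is the thermal energy (eV)."""
    return dE <= 0.0 or rng.random() < math.exp(-dE / kT)

def segregate(site_energies, n_solutes, kT=0.025, steps=50_000, seed=0):
    """Toy lattice MC: n_solutes atoms hop among sites with fixed
    segregation energies; low-energy (defect-like) sites fill first."""
    rng = random.Random(seed)
    n = len(site_energies)
    occupied = set(range(n - n_solutes, n))   # start all solutes in the bulk
    for _ in range(steps):
        src = rng.choice(tuple(occupied))     # pick a solute to move
        dst = rng.randrange(n)                # pick a candidate site
        if dst in occupied:
            continue                          # site already taken
        dE = site_energies[dst] - site_energies[src]
        if metropolis_accept(dE, kT, rng):
            occupied.remove(src)
            occupied.add(dst)
    return occupied

# Sites 0-4 mimic dislocation cores (deep traps); sites 5-99 are bulk.
energies = [-0.5] * 5 + [0.0] * 95
final = segregate(energies, n_solutes=5)
print(sorted(final))   # the five solutes end up in the five trap sites
```

At kT = 0.025 eV a 0.5 eV trap is effectively irreversible (escape probability ~exp(-20)), which is the lattice analogue of the defect-energetics hierarchy the abstract describes.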
Fund: Supported by the National Natural Science Foundation of China (No. 12102356).
Abstract: The scramjet is the most promising propulsion system for Air-breathing Hypersonic Vehicles (AHVs), and the infrared (IR) radiation it emits is critical for the early warning, detection, and identification of such weapons. This work proposes an Adaptive Reverse Monte Carlo (ARMC) method and develops an analytical model for the IR radiation of a scramjet under gaseous kerosene- and hydrogen-fueled conditions. The evaluation studies show that at a global equivalence ratio of 0.8, the IR radiation from the hydrogen-fueled plume is predominantly from H_(2)O, with a spectral peak of 1.53 kW·sr^(-1)·μm^(-1) at the 2.7 μm band, while the kerosene-fueled plume exhibits a spectral intensity approaching 7.0 kW·sr^(-1)·μm^(-1) at the 4.3 μm band. At the backward detection angle, both types of scramjets exhibit spectral peaks within the 1.3-1.4 μm band, with intensities around 10 kW·sr^(-1)·μm^(-1). The integral radiation intensity of the hydrogen-fueled scramjet is generally higher than that of the kerosene-fueled scramjet, particularly in the 1-3 μm band. Meanwhile, at wide detection angles, the solid walls become the predominant radiation source. The radiation intensity is highest in the 1-3 μm band and weakest in the 8-14 μm band, with values of 21.5 kW·sr^(-1) and 0.57 kW·sr^(-1) at the backward detection angles, respectively. Significant variations in the radiation contributions from gases and solids are observed across different bands under the two fuel conditions, especially within the 3-5 μm band. This research provides valuable insights into the IR radiation characteristics of scramjets, which can aid in the development of IR detection systems for AHVs.
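A reverse (backward) Monte Carlo estimator traces rays from the detector into the medium and samples where each photon was emitted, which is efficient when only a few detector directions matter. A 1-D no-scattering sketch under stated assumptions (uniform column, hypothetical coefficients and names; not the ARMC implementation, which is adaptive and spectral):

```python
import math
import random

def reverse_mc_intensity(kappa_cells, B_cells, ds, B_wall,
                         n_samples=100_000, seed=2):
    """Backward Monte Carlo line-of-sight estimator for a 1-D emitting,
    absorbing gas column (no scattering). March from the detector inward:
    sample an optical depth tau ~ Exp(1); the photon 'originates' in the
    cell where that optical depth is reached, otherwise at the far wall."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        tau = -math.log(rng.random())     # sampled emission optical depth
        acc = 0.0
        emitted = B_wall                  # default: ray reaches the wall
        for kappa, B in zip(kappa_cells, B_cells):
            acc += kappa * ds             # accumulate optical depth
            if tau < acc:
                emitted = B               # photon born in this cell
                break
        total += emitted
    return total / n_samples

# Uniform column: analytic I = B_gas*(1 - e^-tauL) + B_wall*e^-tauL
kappa, B_gas, B_wall, L, n_cells = 0.5, 2.0, 1.0, 2.0, 50
I = reverse_mc_intensity([kappa] * n_cells, [B_gas] * n_cells,
                         L / n_cells, B_wall)
print(I)
```

With tau_L = 1 the analytic intensity is 2 - e^(-1) ≈ 1.632, which the estimator reproduces; a spectral model repeats this per wavelength band with band-dependent absorption coefficients.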
Fund: This work is part of the project PID2022-139202OB-I00, Neural Networks and Optimization Techniques for the Design and Safe Maintenance of Transportation Infrastructures: Volcanic Rock Geotechnics and Slope Stability (IA-Pyroslope), funded by the Spanish State Research Agency of the Ministry of Science, Innovation and Universities of Spain and the European Regional Development Fund, MCIN/AEI/10.13039/501100011033/FEDER, EU.
Abstract: Volcanic terrains exhibit a complex structure of pyroclastic deposits interspersed with sedimentary processes, resulting in irregular lithological sequences that lack lateral continuity and distinct stratigraphic patterns. This complexity poses significant challenges for slope stability analysis, requiring the development of specialized techniques to address these issues. This research presents a numerical methodology that incorporates spatial variability, nonlinear material characterization, and probabilistic analysis using a Monte Carlo framework to address this issue. The heterogeneous structure is represented by randomly assigning different lithotypes across the slope while maintaining predefined global proportions. This contrasts with the more common approach of applying probabilistic variability to mechanical parameters within a homogeneous slope model. The material behavior is defined using complex nonlinear failure criteria, such as the Hoek-Brown model and a parabolic model with collapse, both implemented through linearization techniques. The Discontinuity Layout Optimization (DLO) method, a novel numerical approach based on limit analysis, is employed to efficiently incorporate these advances and compute the factor of safety of the slope. Within this framework, the Monte Carlo procedure is used to assess slope stability by conducting a large number of simulations, each with a different lithotype distribution. Based on the results, a hybrid method is proposed that combines probabilistic modeling with deterministic design principles for slope stability assessment. As a case study, the methodology is applied to a 20-m-high vertical slope composed of three lithotypes (altered scoria, welded scoria, and basalt) randomly distributed in proportions of 15%, 60%, and 25%, respectively. The results show convergence of mean values after approximately 400 simulations and highlight the significant influence of spatial heterogeneity, with variations of the factor of safety between 5 and 12 in 85% of cases. They also reveal non-circular and mid-slope failure wedges not captured by traditional stability methods. Finally, an equivalent normal probability distribution is proposed as a reliable approximation of the factor of safety for use in risk analysis and engineering decision-making.
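The core of such a Monte Carlo workflow is generating random lithotype fields that preserve the global proportions exactly and then evaluating a stability measure per realization. A sketch under stated assumptions (the harmonic-window "factor of safety", strengths, and names are hypothetical stand-ins for the paper's DLO limit-analysis solver):

```python
import random
import statistics

def random_lithotype_field(n_cells, proportions, rng):
    """Assign lithotypes to slope cells while preserving the global
    proportions exactly: build a fixed composition, then shuffle it."""
    field = []
    for litho, frac in proportions.items():
        field += [litho] * round(frac * n_cells)
    rng.shuffle(field)
    return field

def toy_factor_of_safety(field, strength, band=20, driving=3.0):
    """Illustrative stand-in for a DLO analysis: the 'slip surface' is the
    weakest contiguous band of cells, so FoS depends on the spatial
    arrangement of lithotypes, not just their proportions."""
    vals = [strength[c] for c in field]
    weakest = min(sum(vals[i:i + band]) / band
                  for i in range(len(vals) - band + 1))
    return weakest / driving

# Case-study proportions from the abstract; strengths are invented.
proportions = {"altered_scoria": 0.15, "welded_scoria": 0.60, "basalt": 0.25}
strength = {"altered_scoria": 2.0, "welded_scoria": 6.0, "basalt": 10.0}
rng = random.Random(3)
fos = [toy_factor_of_safety(random_lithotype_field(400, proportions, rng),
                            strength)
       for _ in range(500)]
print(statistics.mean(fos), statistics.stdev(fos))
```

Because every realization has the same composition, the spread in `fos` comes entirely from spatial heterogeneity, the effect the abstract highlights; the resulting sample can then be fitted with an equivalent normal distribution for risk analysis.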
Fund: Supported by the Hebei Provincial Natural Science Foundation of China (No. D2023402012) and the Major Science and Technology Project of the China National Petroleum Corporation (No. 2024DJ87).
Abstract: In contrast to conventional reservoirs, tight formations have more complex pore structures and a significant boundary layer effect, making it difficult to determine the effective permeability. To address this, this paper first proposes a semi-empirical model for calculating boundary layer thickness based on dimensional analysis, using published experimental data on microcapillary flow. Furthermore, considering the non-uniform distribution of fluid viscosity in the flow channels of tight reservoirs, a theoretical model for boundary layer thickness is established based on fractal theory, and permeability predictions are conducted through Monte Carlo simulations. Finally, sensitivity analyses of various influencing parameters are performed. The results show that, compared to other fractal-based analytical models, the proposed permeability probabilistic model integrates the parameters affecting fluid flow with random numbers, reflecting both the fractal and randomness characteristics of the capillary size distribution. The computational results exhibit the highest consistency with experimental data. Among the factors affecting the boundary layer, in addition to certain conventional physical and mechanical parameters, different microstructure parameters significantly influence the boundary layer as well. A higher tortuosity fractal dimension results in a thicker boundary layer, while increases in the pore fractal dimension, porosity, and maximum capillary size help mitigate the boundary layer effect. It is also observed that the permeability of large pores exhibits greater sensitivity to changes in the various influencing parameters. By considering micro-scale flow effects, the proposed model enhances the understanding of the physical mechanisms of fluid transport in dense porous media.
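The "fractal plus randomness" idea in such a probabilistic permeability model can be illustrated by inverse-CDF sampling of capillary radii from the fractal (power-law) size distribution N(>r) ∝ r^(-Df) and averaging the Hagen-Poiseuille conductance, which scales as r^4. A sketch under stated assumptions (the radii, fractal dimension, and function names are hypothetical, and the boundary layer correction is omitted):

```python
import random

def sample_fractal_radius(r_min, r_max, Df, u):
    """Inverse-CDF sample of a capillary radius from the truncated fractal
    size distribution N(>r) ~ r^(-Df) on [r_min, r_max]; u is uniform(0,1)."""
    ratio = (r_min / r_max) ** Df
    return r_min / (1.0 - u * (1.0 - ratio)) ** (1.0 / Df)

def mc_permeability(r_min, r_max, Df, n_tubes=200_000, seed=4):
    """Monte Carlo estimate of the mean per-tube Hagen-Poiseuille
    conductance (proportional to r^4) over the fractal capillary bundle;
    an illustrative stand-in for the paper's probabilistic model."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_tubes):
        r = sample_fractal_radius(r_min, r_max, Df, rng.random())
        total += r ** 4        # per-tube volumetric conductance ~ r^4
    return total / n_tubes

est = mc_permeability(r_min=0.01, r_max=1.0, Df=1.6)
print(est)
```

Because flow scales as r^4, the estimate is dominated by the rare large capillaries, which is consistent with the abstract's observation that large-pore permeability is most sensitive to the influencing parameters; the boundary layer would enter by shrinking each sampled r by the local layer thickness before taking the fourth power.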