Mathematical morphology is widely applied in digital image processing. A variety of morphological structures and algorithms have been developed for different image-processing tasks. The basic idea of mathematical morphology is to use structuring elements to measure image morphology in order to solve image-understanding problems. This article presents an advanced cellular neural network that forms a mathematical morphological cellular neural network (MMCNN) equation suited to mathematical morphology filtering. It gives theorems on the dynamic range and stable states of the MMCNN, and proves that the mathematical morphology filter is reached through the steady state of the dynamic process under definite conditions.
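The erosion and dilation operators that underlie morphological filters can be sketched as follows. This is a minimal NumPy illustration of binary morphology with a structuring element; the MMCNN formulation itself is not reproduced here.

```python
import numpy as np

def erode(img, se):
    """Binary erosion: 1 where the structuring element fits entirely inside the foreground."""
    h, w = img.shape
    sh, sw = se.shape
    padded = np.pad(img, ((sh // 2, sh // 2), (sw // 2, sw // 2)), constant_values=0)
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + sh, j:j + sw]
            out[i, j] = int(np.all(window[se == 1] == 1))
    return out

def dilate(img, se):
    """Binary dilation: 1 where the structuring element hits any foreground pixel."""
    h, w = img.shape
    sh, sw = se.shape
    padded = np.pad(img, ((sh // 2, sh // 2), (sw // 2, sw // 2)), constant_values=0)
    out = np.zeros_like(img)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + sh, j:j + sw]
            out[i, j] = int(np.any(window[se == 1] == 1))
    return out

img = np.zeros((5, 5), dtype=int)
img[1:4, 1:4] = 1                    # 3x3 foreground square
se = np.ones((3, 3), dtype=int)      # 3x3 structuring element
print(erode(img, se))                # only the centre pixel survives
print(dilate(img, se))               # the square grows to fill the image
```

Opening (erosion then dilation) and closing (dilation then erosion) filters are built by composing these two primitives.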
Rapidly-exploring Random Tree (RRT) and its variants have become foundational in path-planning research, yet in complex three-dimensional off-road environments their uniform blind sampling and limited safety guarantees lead to slow convergence and force an unfavorable trade-off between path quality and traversal safety. To address these challenges, we introduce HS-APF-RRT*, a novel algorithm that fuses layered sampling, an enhanced Artificial Potential Field (APF), and a dynamic neighborhood-expansion mechanism. First, the workspace is hierarchically partitioned into macro, meso, and micro sampling layers, progressively biasing random samples toward safer, lower-energy regions. Second, we augment the traditional APF by incorporating a slope-dependent repulsive term, enabling stronger avoidance of steep obstacles. Third, a dynamic expansion strategy adaptively switches between 8- and 16-connected neighborhoods based on local obstacle density, striking an effective balance between search efficiency and collision-avoidance precision. In simulated off-road scenarios, HS-APF-RRT* is benchmarked against RRT*, Goal-Biased RRT*, and APF-RRT*, and demonstrates significantly faster convergence, lower path-energy consumption, and enhanced safety margins.
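The slope-dependent repulsion can be illustrated with a small sketch. The abstract does not give the exact potential, so the following uses the classic APF repulsive term scaled by a hypothetical slope gain (`k_slope` is an assumed parameter, not from the paper):

```python
import math

def repulsive_force(dist, slope_deg, d0=5.0, eta=1.0, k_slope=0.5):
    """Classic APF repulsion eta*(1/d - 1/d0)/d^2 inside the influence radius d0,
    scaled by a hypothetical slope-dependent gain: steeper obstacles push harder."""
    if dist >= d0:
        return 0.0                       # outside the obstacle's influence radius
    base = eta * (1.0 / dist - 1.0 / d0) / dist ** 2
    slope_gain = 1.0 + k_slope * math.tan(math.radians(slope_deg))
    return base * slope_gain

print(repulsive_force(2.0, 0.0))         # flat obstacle
print(repulsive_force(2.0, 30.0))        # steeper obstacle, stronger repulsion
```

At equal distance, the 30-degree obstacle exerts a larger push than the flat one, which is the qualitative behavior the paper's enhanced APF targets.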
Honeycombing Lung (HCL) is a chronic lung condition marked by advanced fibrosis, resulting in enlarged air spaces with thick fibrotic walls, which are visible on Computed Tomography (CT) scans. Differentiating between normal lung tissue, honeycombing lungs, and Ground Glass Opacity (GGO) in CT images is often challenging for radiologists and may lead to misinterpretations. Although earlier studies have proposed models to detect and classify HCL, many faced limitations such as high computational demands, lower accuracy, and difficulty distinguishing between HCL and GGO. CT images are highly effective for lung classification due to their high resolution, 3D visualization, and sensitivity to tissue density variations. This study introduces Honeycombing Lungs Network (HCL Net), a novel classification algorithm inspired by ResNet50V2 and enhanced to overcome the shortcomings of previous approaches. HCL Net incorporates additional residual blocks, refined preprocessing techniques, and selective parameter tuning to improve classification performance. The dataset, sourced from the University Malaya Medical Centre (UMMC) and verified by expert radiologists, consists of CT images of normal, honeycombing, and GGO lungs. Experimental evaluations across five assessments demonstrated that HCL Net achieved an outstanding classification accuracy of approximately 99.97%. It also recorded strong performance in other metrics, achieving 93% precision, 100% sensitivity, 89% specificity, and an AUC-ROC score of 97%. Comparative analysis with baseline feature engineering methods confirmed the superior efficacy of HCL Net. The model significantly reduces misclassification, particularly between honeycombing and GGO lungs, enhancing diagnostic precision and reliability in lung image analysis.
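The metrics reported above all derive from the confusion-matrix counts of a classifier; a minimal sketch of how precision, sensitivity, and specificity are computed for a binary case (the counts here are illustrative, not the paper's):

```python
def binary_metrics(tp, fp, tn, fn):
    """Precision, sensitivity (recall), and specificity from confusion-matrix counts."""
    precision = tp / (tp + fp)       # of predicted positives, how many were right
    sensitivity = tp / (tp + fn)     # of actual positives, how many were found
    specificity = tn / (tn + fp)     # of actual negatives, how many were rejected
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return precision, sensitivity, specificity, accuracy

prec, sens, spec, acc = binary_metrics(tp=90, fp=10, tn=95, fn=5)
print(round(prec, 2), round(sens, 3), round(spec, 3))  # 0.9 0.947 0.905
```

Note that a model can combine near-perfect accuracy with lower precision or specificity, as in the figures quoted above, when the classes are imbalanced.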
Regular expression matching plays an important role in deep inspection. The rapid development of SDN and NFV makes the network more dynamic, bringing serious challenges to traditional deep inspection matching engines. However, state-of-the-art matching methods often require a significant amount of pre-processing time and hence are not suitable for this fast-updating scenario. In this paper, a novel matching engine called BFA is proposed to achieve high-speed regular expression matching with fast pre-processing. Experiments demonstrate that BFA achieves 5 to 20 times faster rule updates than existing regular expression matching methods, and scales well on multi-core platforms.
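The abstract does not reveal BFA's internals, but the pre-processing cost it targets is typically the compilation of regular expressions into automata. A minimal sketch of table-driven matching with a hand-built DFA for the pattern `ab+c`, anchored at the start of the input (a simplification of how production engines scan):

```python
# DFA for "ab+c": state 0 --a--> 1 --b--> 2 --b--> 2 --c--> 3 (accepting)
TABLE = {(0, 'a'): 1, (1, 'b'): 2, (2, 'b'): 2, (2, 'c'): 3}
ACCEPTING = {3}

def run_dfa(table, accepting, text, start=0):
    """Scan text through a DFA transition table; accept as soon as an accepting
    state is reached, reject on a missing transition."""
    state = start
    for ch in text:
        state = table.get((state, ch))
        if state is None:
            return False             # no transition: the anchored match fails
        if state in accepting:
            return True
    return state in accepting

print(run_dfa(TABLE, ACCEPTING, "abbbc"))  # True
print(run_dfa(TABLE, ACCEPTING, "ac"))     # False
```

Building `TABLE` is exactly the pre-processing step whose cost fast-update engines try to reduce: the matching loop itself is a cheap table lookup per byte.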
In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models which can reflect actual situations and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a widely used stress-analysis code) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision and high speed, which satisfy the actual needs of floor grid cutting.
The Low Earth Orbit (LEO) remote sensing satellite mega-constellation is characterized by a large number of satellites of various types, which gives it unique advantages for executing multiple concurrent tasks. However, the large number of tasks and satellites increases the complexity of resource allocation. Therefore, the primary problem in implementing concurrent multiple tasks via a LEO mega-constellation is to pre-process the tasks and observation resources. To address this challenge, we propose a pre-processing algorithm for the mega-constellation based on highly Dynamic Spatio-Temporal Grids (DSTG). First, this paper describes the management model of the mega-constellation and the multiple tasks. Then, the coding method of DSTG is proposed, on the basis of which the description of complex mega-constellation observation resources is realized. Third, the DSTG algorithm is used to process concurrent multiple tasks at multiple levels, such as task spatial attributes, temporal attributes and grid-based task importance evaluation. Finally, simulation results for a constellation case are given to verify the effectiveness of concurrent multi-task pre-processing based on DSTG: the autonomous decomposition and fusion of tasks mapped to grids, and the convenient indexing of time windows, are both verified.
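The DSTG coding method itself is not specified in the abstract. Purely as a hypothetical illustration of the idea, a spatio-temporal grid code can be as simple as quantizing latitude, longitude, and time into a discrete cell key that tasks and observation windows are indexed by (cell size and time slot below are assumed values):

```python
def grid_code(lat, lon, t, cell_deg=1.0, slot_s=3600):
    """Quantize latitude, longitude (degrees) and epoch time (seconds) into a
    (row, col, slot) grid key usable as a dictionary index for tasks."""
    row = int((lat + 90.0) // cell_deg)    # 0..179 for 1-degree cells
    col = int((lon + 180.0) // cell_deg)   # 0..359 for 1-degree cells
    slot = int(t // slot_s)                # hourly time slots
    return (row, col, slot)

# Two tasks near Tokyo in the same hour fall into the same grid cell:
print(grid_code(35.5, 139.7, 7200))
print(grid_code(35.2, 139.9, 7500))
```

Grouping tasks by such keys turns spatial/temporal matching of tasks to satellite passes into cheap hash lookups, which is the kind of indexing benefit the abstract describes.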
In order to meet the demands for high transmission rates and high service quality in broadband wireless communication systems, orthogonal frequency division multiplexing (OFDM) has been adopted in some standards. However, the inter-block interference (IBI) and inter-carrier interference (ICI) in an OFDM system affect the performance. To mitigate IBI and ICI, some pre-processing approaches have been proposed based on full channel state information (CSI), which improved the system performance. A pre-processing filter based on partial CSI at the transmitter is designed and investigated. The filter coefficient is obtained by optimization, the symbol error rate (SER) is tested, and the computation complexity of the proposed scheme is analyzed. Computer simulation results show that the proposed pre-processing filter can effectively mitigate IBI and ICI and improve performance. Compared with transmitter pre-processing approaches based on full CSI, the proposed scheme has high spectral efficiency, limited CSI feedback and low computation complexity.
The Chang'e-3 (CE-3) mission is China's first exploration mission on the surface of the Moon that uses a lander and a rover. The eight instruments that form the scientific payloads have the following objectives: (1) investigate the morphological features and geological structures at the landing site; (2) perform integrated in-situ analysis of minerals and chemical compositions; (3) conduct integrated exploration of the structure of the lunar interior; (4) explore the lunar-terrestrial space environment and lunar surface environment, and acquire Moon-based ultraviolet astronomical observations. The Ground Research and Application System (GRAS) is in charge of data acquisition and pre-processing, management of the payload in orbit, and managing the data products and their applications. The Data Pre-processing Subsystem (DPS) is a part of GRAS. The task of DPS is the pre-processing of raw data from the eight instruments of CE-3, including channel processing, unpacking, package sorting, calibration and correction, identification of geographical location, calculation of probe azimuth and zenith angles and solar azimuth and zenith angles, and quality checks. These processes produce Level 0, Level 1 and Level 2 data. The computing platform of this subsystem comprises a high-performance computing cluster, including a real-time subsystem used for processing Level 0 data and a post-time subsystem for generating Level 1 and Level 2 data. This paper describes the CE-3 data pre-processing method, the data pre-processing subsystem, data classification, data validity and the data products that are used for scientific studies.
High-resolution ice core records covering long time spans enable reconstruction of past climatic and environmental conditions, allowing investigation of the earth system's evolution. Preprocessing of ice cores has a direct impact on data quality control for further analysis, since conventional ice core processing is time-consuming, produces only qualitative data, leads to ice mass loss, and risks potential secondary pollution. Over the past several decades, however, preprocessing of ice cores has received less attention than improvements in ice drilling, the analytical methodology of various indices, and research on the climatic and environmental significance of ice core records. Therefore, this paper reviews the development of ice core processing, including framework, design and materials, and analyzes the technical advantages and disadvantages of the different systems. Continuous flow analysis (CFA) has been successfully applied to process polar ice cores. However, it is not suitable for ice cores outside the polar regions because of their high particle load, the memory effect between samples, and the need for filtration before injection. Ice core processing is a subtle and professional operation due to the fragility of the non-metallic materials and the random distribution of particles and air bubbles in ice cores, which aggravates uncertainty in the measurements. Future developments of CFA are discussed with respect to preprocessing, the memory effect, the challenge of brittle ice, coupling with real-time analysis, and optimization of CFA in the field. Furthermore, non-polluting cutters with different configurations could be designed to cut and scrape in multiple directions and to separate the inner and outer portions of the core. Such a system also needs to be coupled with streamlined packaging, coding and stacking operations that can be implemented at high resolution and rate, avoiding manual intervention. At the same time, information from the longitudinal sections could be scanned, identified, and then classified to obtain quantitative data. In addition, irregular ice volume and weight could be measured accurately. These improvements would be recorded automatically via user-friendly interfaces, and may be applied to other paleo-archives with similar features and needs.
There is a considerable amount of dirty data in the observation data sets derived from an integrated ocean observing network system. Thus, the data must be carefully and reasonably processed before being used for forecasting or analysis. This paper proposes a data pre-processing model based on intelligent algorithms. Firstly, we introduce the integrated network platform of ocean observation. Next, the pre-processing model of the data is presented, and an intelligent data-cleaning model is proposed. Based on fuzzy clustering, the Kohonen clustering network is improved to perform the parallel calculation of fuzzy c-means clustering. The proposed dynamic algorithm can automatically find the new clustering center as sample data are updated. The rapid and dynamic performance of the model makes it suitable for real-time calculation, and its efficiency and accuracy are demonstrated by test results on observation data.
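The standard fuzzy c-means iteration that the paper's Kohonen-based parallel scheme builds on can be sketched as follows. This is the plain serial algorithm, not the paper's improved network:

```python
import numpy as np

def fcm(X, c, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means: alternate fuzzy membership and centroid updates.
    X: (n_samples, n_features); c: number of clusters; m: fuzzifier (> 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1))
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
centers, U = fcm(X, c=2)
print(np.round(centers, 1))   # two centers, near (0, 0.5) and (10, 10.5)
```

Each sample's membership row is a soft assignment over clusters, which is what makes the incremental re-centering on newly arriving observations, as described above, possible.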
A signal pre-processing method based on optimal variational mode decomposition (OVMD) is proposed to improve the efficiency and accuracy of local data filtering and analysis at edge nodes in distributed electromechanical systems. Firstly, the singular points of the original signals are effectively eliminated using the first-order difference method. Then the OVMD method is applied for signal modal decomposition. Furthermore, correlation analysis is conducted to determine the degree of correlation between each mode and the original signal, so as to accurately separate the real operating signal from the noise. On the basis of theoretical analysis and simulation, an edge node pre-processing system for distributed electromechanical systems is designed. Finally, the signal pre-processing effect is evaluated by means of the signal-to-noise ratio (SNR) and root-mean-square error (RMSE) indicators. The experimental results show that the OVMD-based edge node pre-processing system can extract signals with different characteristics and improve the SNR of reconstructed signals. Due to its high fidelity and reliability, the system can also provide data quality assurance for subsequent system health monitoring and fault diagnosis.
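The two evaluation indicators named above are standard and easy to state precisely; a minimal sketch of SNR (in dB) and RMSE for a reconstructed signal against a clean reference:

```python
import numpy as np

def snr_db(clean, denoised):
    """Signal-to-noise ratio of a reconstruction, in dB: higher is better."""
    noise = clean - denoised
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def rmse(clean, denoised):
    """Root-mean-square error of a reconstruction: lower is better."""
    return float(np.sqrt(np.mean((clean - denoised) ** 2)))

clean = np.array([1.0, 1.0, 1.0, 1.0])
denoised = np.array([1.0, 1.0, 1.0, 0.0])
print(round(snr_db(clean, denoised), 3))  # 6.021
print(rmse(clean, denoised))              # 0.5
```

A denoising stage "improving the SNR of reconstructed signals", as the abstract claims, means exactly that `snr_db` computed against the reference rises after filtering.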
The solution of systems of linear equations is applied in oil exploration, structural vibration analysis, computational fluid dynamics, and other fields. For in-depth analysis of large or very large complicated structures, parallel algorithms on high-performance computers must be used to solve these complex problems. This paper introduces the implementation of a parallel solver for sparse systems of linear equations.
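The paper's own parallel implementation is not described in the abstract. As a serial sketch of the kind of iterative kernel such solvers parallelize, here is the conjugate gradient method for a sparse symmetric positive-definite system; its inner operations (matrix-vector product, dot product, vector update) are precisely the ones distributed across processors on HPC clusters:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient for a symmetric positive-definite system A x = b."""
    x = np.zeros_like(b)
    r = b - A @ x                  # residual
    p = r.copy()                   # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Tridiagonal system from a 1-D discretized PDE (stored dense here for brevity):
n = 5
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))       # True
```

In a real sparse solver the dense matrix would be replaced by a compressed sparse format so that `A @ p` costs time proportional to the number of nonzeros.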
Microarray data is inherently noisy due to noise contamination from various sources during the preparation of the microarray slide, which greatly affects the accuracy of gene expression measurements. How to eliminate the effect of this noise constitutes a challenging problem in microarray analysis. Efficient denoising is often a necessary first step, taken before the image data is analyzed, to compensate for data corruption and allow effective use of the data. Hence preprocessing of the microarray image is essential to eliminate background noise, enhance image quality and enable effective quantification. Existing denoising techniques based on transform domains have been used for microarray noise reduction, but with limitations. The objective of this paper is to introduce novel preprocessing techniques, optimized spatial resolution (OSR) and spatial domain filtering (SDF), for reducing noise in microarray data and reducing error during quantification, so that microarray spots can be estimated accurately to determine gene expression levels. In addition, a combination of optimized spatial resolution and spatial filtering is proposed and found to improve denoising of microarray data with effective quantification of spots. The proposed method has been validated on microarray images of gene expression profiles of myeloid leukemia from the Stanford Microarray Database, using various quality measures such as signal-to-noise ratio, peak signal-to-noise ratio, image fidelity, structural content, absolute average difference and correlation quality. Quantitative analysis shows that the proposed technique is more efficient for denoising the microarray image, making it suitable for effective quantification.
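Spatial-domain filtering of the kind the paper discusses can be illustrated with the classic median filter, which removes impulse noise while preserving spot edges. This is a generic sketch, not the paper's specific SDF scheme:

```python
import numpy as np

def median_filter(img, k=3):
    """Spatial-domain denoising: replace each pixel by the median of its
    k x k neighbourhood; edges use reflected padding."""
    p = k // 2
    padded = np.pad(img, p, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

img = np.full((5, 5), 5)
img[2, 2] = 255                 # a single "salt" impulse on a flat background
out = median_filter(img)
print(out[2, 2])                # 5: the impulse is removed
```

Unlike a mean filter, the median never invents intermediate intensities, which matters when spot intensities are later integrated for expression quantification.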
Backfill is often employed in mining operations for ground support, with its positive impact on ground stability acknowledged in many underground mines. However, existing studies have predominantly focused on the stress development within the backfill material, leaving the influence of stope backfilling on stress distribution in the surrounding rock mass and on ground stability largely unexplored. Therefore, this paper presents numerical models in FLAC3D to investigate, for the first time, the time-dependent stress redistribution around a vertical backfilled stope and its implications for ground stability, considering the creep of the surrounding rock mass. Using the Soft Soil constitutive model, the compressibility of backfill under large pressure was captured. It is found that the creep deformation of the rock mass compresses the backfill, resulting in a lower void ratio and an increased modulus for the fill material. The compacted backfill in turn influences the stress distribution and ground stability of the rock mass, as a combined effect of wall creep and backfill compressibility. With increasing time and/or creep deformation, the minimum principal stress in the rocks surrounding the backfilled stope increases towards the pre-mining stress state, while the deviatoric stress decreases, leading to an increased factor of safety and improved ground stability. This improvement effect of backfill on ground stability increases with mine depth and stope height, and is more pronounced for narrow stopes, backfill with a smaller compression index, and soft rocks with a smaller viscosity coefficient. Furthermore, the results emphasize the importance of minimizing the empty time and backfilling an extracted stope as soon as possible for ground control. Reducing the filling gap height enhances the local stability around the roof of the stope.
The recent upsurge in metro construction emphasizes the necessity of understanding the mechanical performance of metro shield tunnels subjected to the influence of ground fissures. In this study, a large-scale experiment, in combination with numerical simulation, was conducted to investigate the influence of ground fissures on a metro shield tunnel. The results indicate that the lining contact pressure at the vault increases in the hanging wall and decreases in the footwall, resulting in a two-dimensional stress state of vertical shear and axial tension-compression, and simultaneous vertical dislocation and axial tilt of the segments around the ground fissure. In addition, the damage to curved bolts includes tensile yield, flexural yield, and shear twist, leading to obvious concrete lining damage, particularly at the vault, arch bottom, and hance, indicating that the joints in these positions are weak areas. The shield tunnel orthogonal to the ground fissure ultimately experiences shear failure, suggesting that the maximum dislocation of the ground fissure that the structure can withstand is approximately 20 cm, and that five segment rings in the hanging wall and six segment rings in the footwall need to be reinforced. This study provides a reference for metro design at ground fissure sites.
The deformation caused by tunnel excavation is quite important for safety, especially when the excavation is adjacent to an existing tunnel. Nevertheless, the investigation of deformation characteristics in overlapped curved shield tunneling remains inadequate. An analytical solution for the deformation of the ground and the existing tunnel induced by overlapped curved shield tunneling is derived using Mirror theory, the Mindlin solution and the Euler-Bernoulli-Pasternak model, and subsequently validated through both finite element simulation and field monitoring. It is determined that overcutting plays a crucial role in the ground settlement resulting from curved shield tunneling compared to straight shield tunneling. The longitudinal settlement distribution can be categorized into five areas, with the area near the tunnel surface experiencing the most dramatic settlement changes. The deformation of the existing tunnel varies most significantly with turning radius, compared to tunnel clearance and grouting pressure, especially when the turning radius is less than 30 times the tunnel diameter. The tunnel crown exhibits larger displacement than the tunnel bottom, resulting in a distinctive 'vertical egg' shape. Furthermore, an optimized overcutting mode is proposed, involving precise control of the extension speed and angular velocity of the overcutting cutter, which effectively mitigates ground deformation and protects the existing tunnel during construction.
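The paper's full analytical solution is beyond the scope of the abstract. As a point of reference, the classical Gaussian (Peck) trough is the standard first approximation for transverse greenfield settlement above a tunnel, and is often the baseline such analytical solutions are compared against; the parameter values below are purely illustrative:

```python
import math

def peck_settlement(x, s_max, i):
    """Greenfield surface settlement at transverse offset x (m) from the tunnel
    axis, per the classical Peck Gaussian trough: S(x) = S_max * exp(-x^2 / (2 i^2)).
    s_max: settlement above the axis (mm); i: trough-width parameter (m)."""
    return s_max * math.exp(-x ** 2 / (2.0 * i ** 2))

# Illustrative trough: 10 mm maximum settlement, trough width i = 5 m.
for x in (0.0, 5.0, 10.0, 15.0):
    print(x, round(peck_settlement(x, 10.0, 5.0), 2))
```

The trough-width parameter `i` is usually estimated from cover depth and ground type; effects specific to curved driving, such as overcutting, show up as changes to `s_max` and `i` rather than to the shape of the curve.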
Ground source heat pump systems demonstrate significant potential for northern rural heating applications; however, the effectiveness of these systems is often limited by challenging geological conditions. For instance, in certain regions, the installation of buried pipes for heat exchangers may be complicated, and these pipes may not always serve as efficient low-temperature heat sources for the heat pumps. To address this issue, the current study explored the use of solar-energy-collecting equipment to supplement the buried pipes. In this design, both solar energy and geothermal energy provide low-temperature heat to the heat pump. First, a simulation model of a solar–ground source heat pump coupling system was established using TRNSYS. The accuracy of this model was validated through experiments and simulations on various system configurations, including varying numbers of buried pipes, different areas of solar collectors, and varying volumes of water tanks. The simulations examined the coupling characteristics of these components and their influence on system performance. The results revealed that the operating parameters of the system remained consistent across the following configurations: three buried pipes, burial depth of 20 m, collector area of 6 m² and water tank volume of 0.5 m³; four buried pipes, burial depth of 20 m, collector area of 3 m² and water tank volume of 0.5 m³; and five buried pipes with a burial depth of 20 m. Furthermore, the heat collection capacity of solar collectors spanning an area of 3 m² was found to be equivalent to that of one buried pipe. Moreover, the solar–ground source heat pump coupling system demonstrated a lower annual cumulative energy consumption than the ground source heat pump system alone, presenting a reduction of 5.31%.
Leachate sludge, a byproduct of municipal solid waste leachate treated through biochemical processes, is characterized by high water content (761.1%) and significant organic matter content (71.2%). Cement, which is commonly used for solidifying leachate sludge, has shown limited effectiveness. To address this issue, an alkali-activated ground-granulated blast-furnace slag (GGBS) geopolymer blended with polypropylene fibers was developed to solidify leachate sludge. Unconfined compressive strength (UCS) and immersion tests, as well as X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR) and scanning electron microscope (SEM) tests, were conducted to investigate the solidification effect and mechanism of the GGBS-based geopolymer and fibers on leachate sludge. The results show that the 28-d UCS of the solidified sludge with 20% and 30% GGBS is 0.35 MPa and 1.85 MPa, respectively, decreasing to 0.18 MPa and 1.13 MPa after being soaked in water for 28 d. Notably, the UCS of the solidified sludge with 30% GGBS satisfies the strength requirement for roadbed materials. Polypropylene fibers significantly enhance the strength, ductility and water stability of the solidified sludge, with an optimal fiber content of 0.3%. The alkali-activated GGBS geopolymer generates three-dimensional, cross-linked geopolymeric gels within the solidified sludge, cementing sludge particles and filling intergranular pores to form a stable cementitious structure, thereby achieving effective solidification. Furthermore, incorporating polypropylene fibers improves the bonding and anchoring between fiber and solidified sludge, constrains lateral deformation of the solidified sludge, restricts crack propagation, and enhances the engineering performance of the solidified leachate sludge.
The level of ground shaking, as measured by the peak ground acceleration (PGA), can be used to analyze seismic hazard at a given location and is crucial for constructing earthquake-resistant structures. Predicting the PGA immediately after an earthquake occurs allows an earthquake early warning system to issue a warning. In this study, we propose a deep learning model, ConvMixer, to predict the PGA recorded by weak-motion velocity seismometers in Japan. We use 5-s three-component seismograms, from 2 s before until 3 s after the P-wave arrival time of the earthquake. Our dataset comprises more than 50,000 single-station waveforms recorded by 10 seismic stations in the K-NET, KiK-net, and Hi-net networks between 2004 and 2023. The proposed ConvMixer is a patch-based model that extracts global features from the input seismic data and predicts the PGA of an earthquake by combining depthwise and pointwise convolutions. The proposed ConvMixer network had a mean absolute error of 2.143 on the test set and outperformed benchmark deep learning models. In addition, ConvMixer demonstrated the ability to predict the PGA at the corresponding station site from 1-s waveforms obtained immediately after the arrival of the P-wave.
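The depthwise/pointwise split that ConvMixer-style models rely on can be sketched numerically. The NumPy illustration below shows the two primitives in isolation; it is not the paper's architecture, and the shapes and kernels are illustrative only:

```python
import numpy as np

def depthwise_conv(x, k):
    """Depthwise conv: each channel is convolved with its own kernel
    (valid padding). x: (C, H, W); k: (C, kh, kw)."""
    C, H, W = x.shape
    _, kh, kw = k.shape
    out = np.zeros((C, H - kh + 1, W - kw + 1))
    for c in range(C):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[c, i, j] = np.sum(x[c, i:i + kh, j:j + kw] * k[c])
    return out

def pointwise_conv(x, w):
    """Pointwise (1x1) conv mixes information across channels at each
    position. x: (C_in, H, W); w: (C_out, C_in)."""
    return np.tensordot(w, x, axes=([1], [0]))

x = np.ones((2, 4, 4))          # 2 channels, 4x4 "patch grid"
k = np.ones((2, 3, 3))          # one 3x3 kernel per channel
y = depthwise_conv(x, k)        # spatial mixing, per channel
z = pointwise_conv(y, np.array([[1.0, 1.0]]))  # channel mixing
print(y.shape, z.shape)
```

Depthwise convolution mixes spatial locations within a channel, and pointwise convolution mixes channels at a location; alternating the two is the core of the ConvMixer design the abstract refers to.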
Funding (HS-APF-RRT* study): supported in part by a 14th Five-Year National Key R&D Program project (Project No. 2023YFB3211001) and the National Natural Science Foundation of China (62273339, U24A201397).
Abstract: Honeycombing Lung (HCL) is a chronic lung condition marked by advanced fibrosis, resulting in enlarged air spaces with thick fibrotic walls, which are visible on Computed Tomography (CT) scans. Differentiating between normal lung tissue, honeycombing lungs, and Ground Glass Opacity (GGO) in CT images is often challenging for radiologists and may lead to misinterpretations. Although earlier studies have proposed models to detect and classify HCL, many faced limitations such as high computational demands, lower accuracy, and difficulty distinguishing between HCL and GGO. CT images are highly effective for lung classification due to their high resolution, 3D visualization, and sensitivity to tissue density variations. This study introduces Honeycombing Lungs Network (HCL Net), a novel classification algorithm inspired by ResNet50V2 and enhanced to overcome the shortcomings of previous approaches. HCL Net incorporates additional residual blocks, refined preprocessing techniques, and selective parameter tuning to improve classification performance. The dataset, sourced from the University Malaya Medical Centre (UMMC) and verified by expert radiologists, consists of CT images of normal, honeycombing, and GGO lungs. Experimental evaluations across five assessments demonstrated that HCL Net achieved an outstanding classification accuracy of approximately 99.97%. It also recorded strong performance on other metrics, achieving 93% precision, 100% sensitivity, 89% specificity, and an AUC-ROC score of 97%. Comparative analysis with baseline feature engineering methods confirmed the superior efficacy of HCL Net. The model significantly reduces misclassification, particularly between honeycombing and GGO lungs, enhancing diagnostic precision and reliability in lung image analysis.
Funding: Supported by the National Key Technology R&D Program of China under Grant No. 2015BAK34B00 and the National Key Research and Development Program of China under Grant No. 2016YFB1000102.
Abstract: Regular expression matching plays an important role in deep inspection. The rapid development of SDN and NFV makes the network more dynamic, bringing serious challenges to traditional deep inspection matching engines. State-of-the-art matching methods often require a significant amount of pre-processing time and hence are not suitable for this fast-updating scenario. In this paper, a novel matching engine called BFA is proposed to achieve high-speed regular expression matching with fast pre-processing. Experiments demonstrate that BFA achieves 5 to 20 times faster rule updates than existing regular expression matching methods, and scales well on multi-core platforms.
Funding: Projects 50221402, 50490271 and 50025413 supported by the National Natural Science Foundation of China; the National Basic Research Program of China (2009CB219603, 2009CB724601, 2006CB202209 and 2005CB221500); the Key Project of the Ministry of Education (306002); and the Program for Changjiang Scholars and Innovative Research Teams in Universities of MOE (IRT0408).
Abstract: In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models which reflect actual situations and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be directly used in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a widely used stress-analysis code) are seamlessly connected, realizing the format conversion between the two systems and the pre- and post-processing in simulation computation. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision, and high speed, which satisfy the actual needs of floor grid cutting.
Funding: Supported by the National Natural Science Foundation of China (Nos. 62003115 and 11972130), the Shenzhen Science and Technology Program, China (JCYJ20220818102207015), and the Heilongjiang Touyan Team Program, China.
Abstract: The Low Earth Orbit (LEO) remote sensing satellite mega-constellation is characterized by large numbers of satellites of various types, which give it unique advantages in carrying out concurrent multiple tasks. However, the large number of tasks and satellites increases the complexity of resource allocation. Therefore, the primary problem in implementing concurrent multiple tasks via a LEO mega-constellation is to pre-process tasks and observation resources. To address this challenge, we propose a pre-processing algorithm for the mega-constellation based on highly Dynamic Spatio-Temporal Grids (DSTG). In the first stage, this paper describes the management model of the mega-constellation and the multiple tasks. Then, the coding method of DSTG is proposed, based on which the description of complex mega-constellation observation resources is realized. In the third part, the DSTG algorithm is used to process concurrent multiple tasks at multiple levels, such as task space attributes, time attributes, and grid task importance evaluation. Finally, simulation results for a constellation case verify the effectiveness of concurrent multi-task pre-processing based on DSTG, including the autonomous decomposition and fusion of tasks and their mapping to grids, and the convenient indexing of time windows.
Funding: Supported by the National Natural Science Foundation of China (60902045) and the National High-Tech Research and Development Program of China (863 Program) (2011AA01A105).
Abstract: In order to meet the demands for high transmission rates and high service quality in broadband wireless communication systems, orthogonal frequency division multiplexing (OFDM) has been adopted in some standards. However, inter-block interference (IBI) and inter-carrier interference (ICI) in an OFDM system degrade performance. To mitigate IBI and ICI, some pre-processing approaches have been proposed based on full channel state information (CSI), which improve system performance. Here, a pre-processing filter based on partial CSI at the transmitter is designed and investigated. The filter coefficients are obtained by optimization, the symbol error rate (SER) is tested, and the computational complexity of the proposed scheme is analyzed. Computer simulation results show that the proposed pre-processing filter can effectively mitigate IBI and ICI and improve performance. Compared with pre-processing approaches at the transmitter based on full CSI, the proposed scheme has high spectral efficiency, limited CSI feedback and low computational complexity.
Abstract: The Chang'e-3 (CE-3) mission is China's first exploration mission on the surface of the Moon that uses a lander and a rover. Eight instruments form the scientific payloads, with the following objectives: (1) investigate the morphological features and geological structures at the landing site; (2) perform integrated in-situ analysis of minerals and chemical compositions; (3) conduct integrated exploration of the structure of the lunar interior; (4) explore the lunar-terrestrial space environment and the lunar surface environment, and acquire Moon-based ultraviolet astronomical observations. The Ground Research and Application System (GRAS) is in charge of data acquisition and pre-processing, management of the payload in orbit, and managing the data products and their applications. The Data Pre-processing Subsystem (DPS) is a part of GRAS. The task of DPS is the pre-processing of raw data from the eight instruments on CE-3, including channel processing, unpacking, package sorting, calibration and correction, identification of geographical location, calculation of the probe azimuth angle, probe zenith angle, solar azimuth angle, and solar zenith angle, and quality checks. These processes produce Level 0, Level 1 and Level 2 data. The computing platform of this subsystem comprises a high-performance computing cluster, including a real-time subsystem used for processing Level 0 data and a post-time subsystem for generating Level 1 and Level 2 data. This paper describes the CE-3 data pre-processing method, the data pre-processing subsystem, data classification, data validity, and the data products that are used for scientific studies.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 41630754), the State Key Laboratory of Cryospheric Science (SKLCS-ZZ-2017), the CAS Key Technology Talent Program, and the Open Foundation of the State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering (2017490711).
Abstract: High-resolution ice core records covering long time spans enable reconstruction of past climatic and environmental conditions, allowing investigation of the earth system's evolution. Pre-processing of ice cores has a direct impact on data quality control for further analysis, since conventional ice core processing is time-consuming, produces qualitative data, causes ice mass loss, and risks potential secondary pollution. However, over the past several decades, pre-processing of ice cores has received less attention than improvements in ice drilling, the analytical methodology of various indices, and research on the climatic and environmental significance of ice core records. Therefore, this paper reviews the development of ice core processing, including frameworks, designs and materials, and analyzes the technical advantages and disadvantages of the different systems. In the past, continuous flow analysis (CFA) has been successfully applied to process polar ice cores. However, it is not suitable for ice cores outside the polar regions because of their high level of particles, the memory effect between samples, and the filtration required before injection. Ice core processing is a subtle and professional operation due to the fragility of the nonmetallic materials and the random distribution of particles and air bubbles in ice cores, which aggravates uncertainty in the measurements. Future developments of CFA are discussed with respect to pre-processing, the memory effect, the challenge of brittle ice, coupling with real-time analysis, and optimization of CFA in the field. Furthermore, non-polluting cutters with many different configurations could be designed to cut and scrape in multiple directions and to separate the inner and outer portions of the core. Such a system also needs to be coupled with streamlined packaging, coding, and stacking that can be implemented at high resolution and rate, avoiding manual intervention. At the same time, information on the longitudinal sections could be scanned, identified, and then classified to obtain quantitative data. In addition, irregular ice volume and weight can also be measured accurately. These improvements would be recorded automatically via user-friendly interfaces. These innovations may be applied to other paleo-media with similar features and needs.
Funding: Key Science and Technology Project of the Shanghai Committee of Science and Technology, China (No. 06dz1200921); Major Basic Research Project of the Shanghai Committee of Science and Technology (No. 08JC1400100); Shanghai Talent Developing Foundation, China (No. 001); and the Specialized Foundation for Excellent Talent of Shanghai, China.
Abstract: There are a number of dirty data in the observation data set derived from the integrated ocean observing network system. Thus, the data must be carefully and reasonably processed before they are used for forecasting or analysis. This paper proposes a data pre-processing model based on intelligent algorithms. Firstly, we introduce the integrated network platform of ocean observation. Next, the pre-processing model of data is presented, and an intelligent cleaning model of data is proposed. Based on fuzzy clustering, the Kohonen clustering network is improved to fulfill the parallel calculation of fuzzy c-means clustering. The proposed dynamic algorithm can automatically find the new clustering center with the updated sample data. The rapid and dynamic performance of the model makes it suitable for real-time calculation, and the efficiency and accuracy of the model are proved by test results through observation data analysis.
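The abstract builds on fuzzy c-means clustering; as a point of reference, a plain (non-parallel, non-Kohonen) fuzzy c-means sketch is given below. It is a simplified stand-in for the paper's accelerated variant, whose network structure is not specified here:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means clustering.

    X -- (n, d) data matrix, c -- number of clusters, m -- fuzzifier (>1).
    Returns (centers, membership matrix U of shape (c, n)).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        # centers are membership-weighted means of the data
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # squared distances from every center to every point
        d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(-1) + 1e-12
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)           # standard FCM membership update
    return centers, U

# Two well-separated blobs: each point's dominant membership should
# match its blob.
X = np.vstack([np.zeros((10, 2)), 10 + np.zeros((10, 2))])
centers, U = fuzzy_c_means(X, c=2)
```

The dynamic variant described in the abstract would additionally re-seed the cluster centers as new samples arrive instead of restarting from random memberships.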
Funding: National Natural Science Foundation of China (No. 61903291) and the Industrialization Project of Shaanxi Provincial Department of Education (No. 18JC018).
Abstract: A signal pre-processing method based on optimal variational mode decomposition (OVMD) is proposed to improve the efficiency and accuracy of local data filtering and analysis at edge nodes in distributed electromechanical systems. Firstly, the singular points of the original signals are eliminated effectively by using the first-order difference method. Then the OVMD method is applied for signal modal decomposition. Furthermore, correlation analysis is conducted to determine the degree of correlation between each mode and the original signal, so as to accurately separate the real operating signal from the noise signal. On the basis of theoretical analysis and simulation, an edge node pre-processing system for distributed electromechanical systems is designed. Finally, by virtue of the signal-to-noise ratio (SNR) and root-mean-square error (RMSE) indicators, the signal pre-processing effect is evaluated. The experimental results show that the OVMD-based edge node pre-processing system can extract signals with different characteristics and improve the SNR of reconstructed signals. Due to its high fidelity and reliability, this system can also provide data quality assurance for subsequent system health monitoring and fault diagnosis.
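The correlation-analysis step above can be sketched independently of the decomposition method: given a set of modes (from OVMD or any other decomposition), retain only those whose correlation with the original signal exceeds a threshold. The 0.5 threshold and function names are assumptions for illustration; the paper's criterion is not given in the abstract:

```python
import numpy as np

def select_modes(signal, modes, threshold=0.5):
    """Keep decomposition modes that correlate with the original signal.

    signal -- 1-D original signal; modes -- list of 1-D mode arrays.
    Returns the denoised reconstruction (sum of retained modes).
    Threshold value is an illustrative assumption.
    """
    kept = []
    for mode in modes:
        r = np.corrcoef(signal, mode)[0, 1]   # Pearson correlation
        if abs(r) >= threshold:
            kept.append(mode)
    return np.sum(kept, axis=0) if kept else np.zeros_like(signal)

# Toy example: a slow sine plus weak white noise, "decomposed" into its
# two known parts; only the sine survives the correlation test.
t = np.linspace(0, 1, 500)
tone = np.sin(2 * np.pi * 5 * t)
noise = 0.1 * np.random.default_rng(1).standard_normal(500)
recon = select_modes(tone + noise, [tone, noise])
```

Because the noise component carries only a small share of the total variance, its correlation with the composite signal stays below the threshold and it is discarded.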
Abstract: The solution of systems of linear equations is applied in oil exploration, structural vibration analysis, computational fluid dynamics, and other fields. In-depth analysis of large or very large complicated structures requires parallel algorithms running on high-performance computers to solve such complex problems. This paper introduces the implementation process of a parallel solver for sparse systems of linear equations.
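The abstract does not name a particular solver; the conjugate gradient method is shown below as one representative iterative scheme for sparse symmetric positive-definite systems, since its cost is dominated by matrix-vector products that parallelize well. This is an illustrative sketch, not the paper's implementation:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient for a symmetric positive-definite system Ax = b."""
    x = np.zeros_like(b)
    r = b - A @ x           # residual
    p = r.copy()            # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p          # the only expensive (and parallelizable) step
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Tridiagonal SPD test matrix (a 1-D Laplacian), solved to high accuracy.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
```

In a distributed setting, the matrix-vector product `A @ p` is the step partitioned across processors, with only a few scalar reductions per iteration requiring communication.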
Abstract: Microarray data is inherently noisy due to contamination from various sources during the preparation of the microarray slide, and this greatly affects the accuracy of the measured gene expression. How to eliminate the effect of the noise constitutes a challenging problem in microarray analysis. Efficient denoising is often a necessary first step, taken before the image data is analyzed, to compensate for data corruption and to enable effective utilization of these data. Hence preprocessing of the microarray image is essential to eliminate background noise in order to enhance image quality and enable effective quantification. Existing denoising techniques based on transformed domains have been utilized for microarray noise reduction, each with its own limitations. The objective of this paper is to introduce novel preprocessing techniques, namely optimized spatial resolution (OSR) and spatial domain filtering (SDF), for reducing noise in microarray data and reducing error during the quantification process, estimating the microarray spots accurately to determine the expression level of genes. In addition, combined optimized spatial resolution and spatial filtering is proposed, and is found to improve denoising of microarray data with effective quantification of spots. The proposed method has been validated on microarray images of gene expression profiles of Myeloid Leukemia from the Stanford Microarray Database with various quality measures, such as signal-to-noise ratio, peak signal-to-noise ratio, image fidelity, structural content, absolute average difference, and correlation quality. Quantitative analysis shows that the proposed technique is more efficient for denoising the microarray image, making it suitable for effective quantification.
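As a concrete instance of spatial-domain filtering of the kind the SDF step relies on (the paper's exact filters are not given in the abstract), a basic 3x3 median filter removes impulsive noise while preserving flat regions:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter, a basic spatial-domain denoiser.

    Edges are handled by reflecting the image border. Illustrative only;
    the paper's SDF stage is not specified in the abstract.
    """
    padded = np.pad(img, 1, mode="reflect")
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            # median of the 3x3 neighborhood centered on (i, j)
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

# A single impulsive ("salt") pixel on a flat background is removed.
img = np.zeros((5, 5))
img[2, 2] = 255.0
clean = median_filter3(img)
```

Each output pixel takes the median of its 9-pixel neighborhood, so an isolated outlier is outvoted by its neighbors, which is why median filtering suits the salt-and-pepper artifacts common in scanned slides.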
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52304101 and 52004206) and the China Postdoctoral Science Foundation (Grant No. 2023MD734215).
Abstract: Backfill is often employed in mining operations for ground support, with its positive impact on ground stability acknowledged in many underground mines. However, existing studies have predominantly focused only on the stress development within the backfill material, leaving the influence of stope backfilling on stress distribution in the surrounding rock mass and on ground stability largely unexplored. Therefore, this paper presents numerical models in FLAC3D to investigate, for the first time, the time-dependent stress redistribution around a vertical backfilled stope and its implications for ground stability, considering the creep of the surrounding rock mass. Using the Soft Soil constitutive model, the compressibility of backfill under large pressure was captured. It is found that the creep deformation of the rock mass compresses the backfill and results in a lower void ratio and increased modulus for the fill material. The compacted backfill conversely influences the stress distribution and ground stability of the rock mass, a combined effect of wall creep and backfill compressibility. With the increase of time or creep deformation, the minimum principal stress in the rocks surrounding the backfilled stope increases towards the pre-mining stress state, while the deviatoric stress reduces, leading to an increased factor of safety and improved ground stability. This improvement effect of backfill on ground stability increases with mine depth and stope height, and is more pronounced for narrow stopes, for backfill with a smaller compression index, and for soft rocks with a smaller viscosity coefficient. Furthermore, the results emphasize the importance of minimizing empty time and backfilling the extracted stope as soon as possible for ground control. Reduction of the filling gap height enhances local stability around the roof of the stope.
Funding: Supported by the National Key Research & Development Program of China (Grant No. 2023YFC3008404) and the Key Laboratory of Earth Fissures Geological Disaster, Ministry of Natural Resources, China (Grant Nos. EFGD20240609 and EFGD20240610).
Abstract: The recent upsurge in metro construction emphasizes the necessity of understanding the mechanical performance of metro shield tunnels subjected to the influence of ground fissures. In this study, a large-scale experiment, in combination with numerical simulation, was conducted to investigate the influence of ground fissures on a metro shield tunnel. The results indicate that the lining contact pressure at the vault increases in the hanging wall while it decreases in the footwall, resulting in a two-dimensional stress state of vertical shear and axial tension-compression, and simultaneous vertical dislocation and axial tilt of the segments around the ground fissure. In addition, the damage to curved bolts includes tensile yield, flexural yield, and shear twist, leading to obvious concrete lining damage, particularly at the vault, arch bottom, and haunch, indicating that the joints at these positions are weak areas. The shield tunnel orthogonal to the ground fissure ultimately experiences shear failure, suggesting that the maximum actual dislocation of the ground fissure that the structure can withstand is approximately 20 cm, and that five segment rings in the hanging wall and six segment rings in the footwall also need to be reinforced. This study could provide a reference for metro design at ground fissure sites.
Funding: Financially supported by the National Natural Science Foundation of China (Grant No. 52078334), the National Key Research and Development Program of China (Grant No. 2017YFC0805402), and the Tianjin Research Innovation Project for Postgraduate Students (Grant No. 2021YJSB141).
Abstract: The deformation caused by tunnel excavation is quite important for safety, especially when the excavation is adjacent to an existing tunnel. Nevertheless, the investigation of deformation characteristics in overlapped curved shield tunneling remains inadequate. The analytical solution for calculating the deformation of the ground and the existing tunnel induced by overlapped curved shield tunneling is derived using Mirror theory, the Mindlin solution, and the Euler-Bernoulli-Pasternak model, and subsequently validated through both finite element simulation and field monitoring. It is determined that overcutting plays a crucial role in the ground settlement resulting from curved shield tunneling compared to straight shield tunneling. The longitudinal settlement distribution can be categorized into five areas, with the area near the tunnel surface experiencing the most dramatic settlement changes. The deformation of the existing tunnel varies most significantly with turning radius compared to tunnel clearance and grouting pressure, especially when the turning radius is less than 30 times the tunnel diameter. The tunnel crown exhibits larger displacement than the tunnel bottom, resulting in a distinctive 'vertical egg' shape. Furthermore, an optimized overcutting mode is proposed, involving precise control of the extension speed and angular velocity of the overcutting cutter, which effectively mitigates ground deformation and protects the existing tunnel during construction.
Funding: Supported by the 2024 Central Guidance Local Science and Technology Development Fund Project "Study on the mechanism and evaluation method of thermal pollution in water bodies, as well as research on thermal carrying capacity" (Grant 246Z4506G); the Key Research and Development Project in Hebei Province "Key Technologies and Equipment Research and Demonstration of Multiple Energy Complementary (Electricity, Heat, Cold System) for Solar Energy, Geothermal Energy, Phase Change Energy" (Grant 236Z4310G); and the Hebei Academy of Sciences Key Research and Development Program "Research on Heat Transfer Mechanisms and Efficient Applications of Intermediate and Deep Geothermal Energy" (22702).
Abstract: Ground source heat pump systems demonstrate significant potential for northern rural heating applications; however, the effectiveness of these systems is often limited by challenging geological conditions. For instance, in certain regions, the installation of buried pipes for heat exchangers may be complicated, and these pipes may not always serve as efficient low-temperature heat sources for the heat pumps. To address this issue, the current study explored the use of solar-energy-collecting equipment to supplement buried pipes. In this design, both solar energy and geothermal energy provide low-temperature heat to the heat pump. First, a simulation model of a solar-ground source heat pump coupling system was established using TRNSYS. The accuracy of this model was validated through experiments and simulations on various system configurations, including varying numbers of buried pipes, different areas of solar collectors, and varying volumes of water tanks. The simulations examined the coupling characteristics of these components and their influence on system performance. The results revealed that the operating parameters of the system remained consistent across the following configurations: three buried pipes, burial depth of 20 m, collector area of 6 m^2, and water tank volume of 0.5 m^3; four buried pipes, burial depth of 20 m, collector area of 3 m^2, and water tank volume of 0.5 m^3; and five buried pipes with a burial depth of 20 m. Furthermore, the heat collection capacity of solar collectors spanning an area of 3 m^2 was found to be equivalent to that of one buried pipe. Moreover, the solar-ground source heat pump coupling system demonstrated a 5.31% lower annual cumulative energy consumption than the ground source heat pump system alone.
Funding: Financially supported by the National Natural Science Foundation of China (Grant No. 52078142).
Abstract: Leachate sludge, a byproduct of municipal solid waste leachate treated through biochemical processes, is characterized by high water content (761.1%) and significant organic matter content (71.2%). Cement, commonly used for solidifying leachate sludge, has shown limited effectiveness. To address this issue, an alkali-activated ground-granulated blast-furnace slag (GGBS) geopolymer blended with polypropylene fibers was developed to solidify leachate sludge. Unconfined compressive strength (UCS), immersion, X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), and scanning electron microscope (SEM) tests were conducted to investigate the solidification effect and mechanism of the GGBS-based geopolymer and fibers on leachate sludge. The results show that the 28-d UCS of the solidified sludge with 20% and 30% GGBS is 0.35 MPa and 1.85 MPa, respectively, decreasing to 0.18 MPa and 1.13 MPa after being soaked in water for 28 d. Notably, the UCS of the solidified sludge with 30% GGBS satisfies the strength requirement for roadbed materials. Polypropylene fibers significantly enhance the strength, ductility, and water stability of the solidified sludge, with an optimal fiber content of 0.3%. The alkali-activated GGBS geopolymer generates three-dimensional, cross-linked geopolymeric gels within the solidified sludge, cementing sludge particles and filling intergranular pores to form a stable cementitious structure, thereby achieving effective solidification. Furthermore, incorporating polypropylene fibers improves the bonding and anchoring between fiber and solidified sludge, constrains lateral deformation of the solidified sludge, restricts crack propagation, and enhances the engineering performance of the solidified leachate sludge.
Funding: The authors thank the National Research Institute of Astronomy and Geophysics (NRIAG) for supporting this work.
Abstract: The level of ground shaking, as determined by the peak ground acceleration (PGA), can be used to analyze seismic hazard at a given location and is crucial for constructing earthquake-resistant structures. Predicting the PGA immediately after an earthquake occurs allows a warning to be issued by an earthquake early warning system. In this study, we propose a deep learning model, ConvMixer, to predict the PGA recorded by weak-motion velocity seismometers in Japan. We use 5-s three-component seismograms, from 2 s before until 3 s after the P-wave arrival time of the earthquake. Our dataset comprises more than 50,000 single-station waveforms recorded by 10 seismic stations in the K-NET, KiK-net, and Hi-net networks between 2004 and 2023. The proposed ConvMixer is a patch-based model that extracts global features from input seismic data and predicts the PGA of an earthquake by combining depthwise and pointwise convolutions. The proposed ConvMixer network had a mean absolute error of 2.143 when applied to the test set and outperformed benchmark deep learning models. In addition, the proposed ConvMixer demonstrated the ability to predict the PGA at the corresponding station site based on 1-s waveforms obtained immediately after the arrival time of the P-wave.
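The depthwise-then-pointwise mixing that ConvMixer-style models rely on can be illustrated in isolation. The sketch below is a bare 1-D version on a toy 3-channel "seismogram"; the real model adds activations, normalization, residual connections, and a patch-embedding front end, none of which are detailed in the abstract:

```python
import numpy as np

def depthwise_conv1d(x, kernels):
    """x: (C, T) feature map; kernels: (C, K), one filter per channel.

    'Same' padding; each channel is convolved independently, mixing
    information along time but not across channels.
    """
    C, T = x.shape
    K = kernels.shape[1]
    pad = K // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    out = np.zeros_like(x, dtype=float)
    for c in range(C):
        for t in range(T):
            out[c, t] = xp[c, t:t + K] @ kernels[c]
    return out

def pointwise_conv(x, W):
    """1x1 convolution mixing channels only: W has shape (C_out, C_in)."""
    return W @ x

# One mixing step: temporal mixing per channel, then channel mixing.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 16))                   # 3 components, 16 samples
h = depthwise_conv1d(x, rng.standard_normal((3, 5)))
y = pointwise_conv(h, rng.standard_normal((4, 3))) # expand to 4 channels
```

Separating the two steps is what keeps the parameter count low: a full convolution would need C_out x C_in x K weights, while the factored form needs only C_in x K plus C_out x C_in.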