The surrounding rock is prone to large-scale loosening and failure after the excavation of shallow large-span caverns because of the thin overlying strata and large cross-section span. As a common support means, the rational design of bolt support is very important to the safety control of the surrounding rock. The control mechanism and design method of bolt support for shallow-buried large-span caverns are investigated. A calculation method for bolt prestress and length based on the arched-failure and collapse-failure modes is established, and the influence of different factors on bolt prestress and length is clarified. At the same time, a constant-resistance energy-absorbing bolt with high strength and high toughness is developed, and comparative tests of its mechanical properties are carried out. On this basis, a design method of high-prestress bolt support for shallow-buried large-span caverns is put forward, and a field test is carried out at a Qingdao metro station in China. The monitoring results show that the maximum roof settlement is 6.8 mm after the new design method is adopted, realizing effective control of the shallow-buried large-span cavern. The research results can provide theoretical and technical support for the safety control of shallow-buried large-span caverns.
Ancient stellar observations are a valuable cultural heritage, profoundly influencing both cultural domains and modern astronomical research. Shi's Star Catalog (石氏星经), the oldest extant star catalog in China, faces controversy regarding its observational epoch. Determining this epoch via precession assumes accurate ancient coordinates and correspondence with contemporary stars, posing significant challenges. This study introduces a novel method using the Generalized Hough Transform to ascertain the catalog's observational epoch. This approach statistically accommodates errors in ancient coordinates and discrepancies between ancient and modern stars, addressing limitations in prior methods. Our findings date Shi's Star Catalog to the 4th century BCE, with 2nd-century CE adjustments. In comparison, the Western tradition's oldest known catalog, the Ptolemaic Star Catalog (2nd century CE), likely derives from the Hipparchus Star Catalog (2nd century BCE). Thus, Shi's Star Catalog is identified as the world's oldest known star catalog. Beyond establishing its observation period, this study aims to consolidate and digitize these cultural artifacts.
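The dating method above lets star records vote for an observational epoch. As a hedged toy sketch of that voting idea (not the paper's actual Generalized Hough Transform, which works on two coordinates with a full precession model): assume each star's ecliptic longitude drifts uniformly at about 50.29″ per year, and let every ancient-modern pairing vote for the epoch it implies. Correct pairings vote coherently; mismatched pairings scatter across the search window. All names and parameters here are illustrative.

```python
import numpy as np

RATE = 50.29 / 3600.0   # approximate precession in ecliptic longitude, deg/yr

def hough_epoch(ancient_lon, modern_lon, epochs):
    """Every (ancient, modern) longitude pair votes for the epoch that would
    carry the modern position back to the ancient one by uniform precession;
    true pairings vote coherently, mismatches scatter across the window."""
    step = epochs[1] - epochs[0]
    acc = np.zeros(epochs.size)
    for la in ancient_lon:
        for lm in modern_lon:
            years_before_j2000 = ((lm - la) % 360.0) / RATE
            epoch = 2000.0 - years_before_j2000
            idx = int(np.argmin(np.abs(epochs - epoch)))
            if abs(epochs[idx] - epoch) < step:   # ignore out-of-window votes
                acc[idx] += 1.0
    return float(epochs[np.argmax(acc)]), acc

# toy catalog: true epoch 360 BCE, 0.3 deg of measurement noise
rng = np.random.default_rng(7)
true_epoch = -360.0
modern = rng.uniform(0.0, 360.0, 60)
ancient = (modern - RATE * (2000.0 - true_epoch) + rng.normal(0.0, 0.3, 60)) % 360.0
epochs = np.arange(-1500.0, 500.0, 25.0)
est, acc = hough_epoch(ancient, modern, epochs)
```

With 0.3° of coordinate noise each vote is only good to roughly 20 years, yet the accumulator peak still lands within two bins of the true epoch, which is the statistical robustness the abstract attributes to the Hough approach.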
The clock difference between the ensemble pulsar timescale (PT) and International Atomic Time (TAI), PT-TAI, derived from the International Pulsar Timing Array (IPTA) data set shows a variation trend very similar to that of Terrestrial Time, TT(BIPMXXXX)-TAI, but PT has a larger measurement error. In this paper, we discuss smoothing PT with a combined smoothing filter and compare the results with those from other filters. The clock difference sequence PT-TAI and the first time derivative series of TT(BIPMXXXX)-TAI can be combined by the filter to yield two smooth curves tied by constraints ensuring that the latter is the derivative of the former. The ensemble pulsar time IPTA2016 with respect to TAI published by G. Hobbs et al. and the first time derivative series of TT(BIPM2017)-TAI, with quadratic polynomial terms removed, are processed by the combined smoothing filter to demonstrate the properties of the smoothed results. How to correctly estimate the two smoothing coefficients is described, and the output of the combined smoothing filter is analyzed. The results show that the combined smoothing method efficiently removes high-frequency noise from the two input data series, and the smoothed PT-TAI data combine the long-term fractional frequency stability of the pulsar time with the frequency accuracy of the terrestrial time. Fractional frequency stability analysis indicates that both the short- and medium-interval stability of the smoothed PT-TAI is improved while its original long-term frequency stability level is preserved. The combined smoothing filter is more suitable for smoothing observational pulsar timescale data than any filter that only smooths a single pulsar time series. The pulsar time smoothed by the combined smoothing filter is a pulsar-atomic combined timescale, which can also serve as terrestrial time.
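The combined filter described above can be sketched as a joint least-squares problem: find a smooth curve that fits the PT-TAI samples while its finite-difference slope simultaneously fits the independently measured rate series. The discretization and penalty weights below are our own minimal choices, not the authors' filter:

```python
import numpy as np

def combined_smooth(a, b, dt, lam, mu):
    """Jointly smooth a clock-difference series `a` and an independent
    measurement `b` of its first derivative: minimize
    ||f - a||^2 + lam*||D f - b||^2 + mu*||D2 f||^2,
    tying the smoothed curve to the constraint that `b` is its slope."""
    n = a.size
    D = (np.eye(n, k=1) - np.eye(n))[:-1] / dt                   # forward difference
    D2 = (np.eye(n, k=2) - 2 * np.eye(n, k=1) + np.eye(n))[:-2] / dt**2
    A = np.eye(n) + lam * D.T @ D + mu * D2.T @ D2
    return np.linalg.solve(A, a + lam * D.T @ b)

# toy series: linear "PT-TAI" drift with white noise, plus a separately
# measured noisy rate series standing in for d(TT(BIPM)-TAI)/dt
rng = np.random.default_rng(0)
n, dt = 200, 0.05
t = np.arange(n) * dt
truth = 2.0 * t + 1.0
a = truth + rng.normal(0.0, 1.0, n)          # noisy clock-difference samples
b = 2.0 + rng.normal(0.0, 0.5, n - 1)        # noisy derivative samples
f = combined_smooth(a, b, dt, lam=dt**2, mu=1.0)
rms_raw = float(np.sqrt(np.mean((a - truth) ** 2)))
rms_smooth = float(np.sqrt(np.mean((f - truth) ** 2)))
```

Because the roughness penalty vanishes on slowly varying curves, the high-frequency noise of both inputs is suppressed while the long-term trend of the clock-difference series is preserved, mirroring the behavior the abstract reports.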
The 21 cm radiation of neutral hydrogen provides crucial information for studying the early universe and its evolution. To advance this research, countries have made significant investments in constructing large low-frequency radio telescope arrays, such as the Low Frequency Array and the Square Kilometre Array Phase 1 Low Frequency. These instruments are pivotal for radio astronomy research. However, challenges such as ionospheric plasma interference, ambient radio noise, and instrument-related effects have become increasingly prominent, posing major obstacles in cosmology research. To address these issues, this paper proposes an efficient signal processing method that combines the wavelet transform and mathematical morphology. The method involves the following steps: (1) background subtraction: background interference in the radio observation signals is eliminated; (2) wavelet transform: the signal, after background removal, undergoes a two-dimensional discrete wavelet transform, and threshold processing is applied to the wavelet coefficients to remove interference components; (3) wavelet inversion: the processed signal is reconstructed using the inverse wavelet transform; (4) mathematical morphology: the reconstructed signal is further optimized using mathematical morphology to refine the results. Experimental verification was conducted using solar observation data from the Xinjiang Observatory and the Yunnan Observatory. The results demonstrate that this method successfully removes interference signals while preserving useful signals, thus improving the accuracy of radio astronomy observations and reducing the impact of radio frequency interference.
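The wavelet-thresholding and morphology steps above can be illustrated in one dimension without external wavelet libraries: a multilevel Haar transform with soft thresholding of the detail coefficients, followed by a binary morphological opening that drops interference flags narrower than the structuring element. This is a simplified 1D analogue of the paper's 2D pipeline; thresholds and level counts are illustrative.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def haar_dwt(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def wavelet_denoise(x, levels=3, k=3.0):
    """Multilevel Haar transform, soft-threshold the details, reconstruct."""
    approx, details = x, []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        thr = k * np.median(np.abs(d)) / 0.6745          # robust noise scale
        d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft threshold
        details.append(d)
    for d in reversed(details):
        approx = haar_idwt(approx, d)
    return approx

def opening_1d(mask, width=3):
    """Binary opening (erosion then dilation): removes flags narrower than width."""
    pad = width // 2
    eroded = sliding_window_view(np.pad(mask, pad), width).all(axis=1)
    return sliding_window_view(np.pad(eroded, pad), width).any(axis=1)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2.0 * np.pi * 5.0 * t)
noisy = clean + rng.normal(0.0, 0.3, t.size)
denoised = wavelet_denoise(noisy)
rms_before = float(np.sqrt(np.mean((noisy - clean) ** 2)))
rms_after = float(np.sqrt(np.mean((denoised - clean) ** 2)))
# opening removes the isolated flag but keeps the 3-sample-wide block
mask = np.array([0, 0, 1, 0, 0, 1, 1, 1, 0, 0], dtype=bool)
opened = opening_1d(mask)
```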
Condensed and hydrolysable tannins are non-toxic natural polyphenols that are a commercial commodity, industrialized for tanning hides to obtain leather and for a growing number of other industrial applications, mainly to substitute petroleum-based products. They are a definite class of sustainable materials of the forestry industry. They have been used for hundreds of years to manufacture leather and are now applied in a variety of other industries, such as wood adhesives, metal coating, and pharmaceutical/medical applications, among several others. This review presents the main sources, either already commercial or potentially so, of these forestry by-products; their industrial and laboratory extraction systems; and their methods of analysis with their advantages and drawbacks, whether these methods are simple enough to appear primitive yet of proven effectiveness, or very modern and instrumental. It constitutes a basic but essential summary of what one needs to know about these sustainable materials. In doing so, the review highlights some of the main challenges that remain to be addressed to deliver the quality and economics of tannin supply necessary to fulfill industrial production requirements for some materials-based uses.
In the two-dimensional positioning method of pulsars, the grid method is used to provide non-sensitive direction and positional estimates. However, the grid method has a high computational load and low accuracy due to the interval of the grid. To improve estimation accuracy and reduce the computational load, we propose a fast two-dimensional positioning method for the Crab pulsar based on multiple optimization algorithms (FTPCO). The FTPCO uses the Levenberg-Marquardt (LM) algorithm, the three-point orientation (TPO) method, particle swarm optimization (PSO), and the Newton-Raphson-based optimizer (NRBO) to substitute for the grid method. First, to avoid the influence of the non-sensitive direction on positioning, we take an orbital error and the distortion of the pulsar profile as optimization objectives and combine the grid method with the LM algorithm or PSO to search for the non-sensitive direction. Then, on the sensitive plane perpendicular to the non-sensitive direction, the TPO method is proposed to rapidly search the sensitive and sub-sensitive directions. Finally, the NRBO is employed along the sensitive and sub-sensitive directions to achieve two-dimensional positioning of the Crab pulsar. The simulation results show that, compared with the grid method, the computational load of the FTPCO is reduced by 89.4% and its positioning accuracy is improved by approximately 38%. The FTPCO has the advantage of high real-time accuracy and does not fall into local optima.
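Particle swarm optimization, one of the grid-method substitutes named above, is easy to sketch: particles move under inertia plus attraction toward their personal best and the swarm's global best. A minimal implementation on a toy quadratic cost standing in for the profile-distortion objective (all parameters illustrative):

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: inertia plus attraction toward each
    particle's personal best and the swarm's global best."""
    rng = np.random.default_rng(seed)
    b = np.asarray(bounds, dtype=float)
    lo, hi = b[:, 0], b[:, 1]
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                 # keep particles in bounds
        val = np.array([f(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# toy cost surface with its minimum at (3, -2)
cost = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2
best, best_val = pso_minimize(cost, [(-10.0, 10.0), (-10.0, 10.0)])
```

Unlike a grid search, accuracy is not limited by a fixed grid interval, and the cost function is evaluated only where the swarm actually travels.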
In this paper, two types of fractional nonlinear equations in the Caputo sense, the time-fractional Newell-Whitehead equation (FNWE) and the time-fractional generalized Hirota-Satsuma coupled KdV system (HS-cKdVS), are investigated by means of the q-homotopy analysis method (q-HAM). The approximate solutions of the proposed equations are constructed in the form of a convergent series and are compared with the corresponding exact solutions. Due to the presence of the auxiliary parameter h in this method, just a few terms of the series solution are required to obtain a good approximation. For the sake of visualization, the numerical results obtained in this paper are graphically displayed with the help of Maple.
Electroencephalography (EEG) is a non-invasive measurement method for brain activity. Due to its safety, high resolution, and hypersensitivity to dynamic changes in brain neural signals, EEG has aroused much interest in scientific research and medical fields. This article reviews the types of EEG signals, multiple EEG signal analysis methods, and the application of relevant methods in the neuroscience field and for diagnosing neurological diseases. First, three types of EEG signals, including time-invariant EEG, accurate event-related EEG, and random event-related EEG, are introduced. Second, five main directions for EEG analysis methods, including power spectrum analysis, time-frequency analysis, connectivity analysis, source localization methods, and machine learning methods, are described in the main section, along with different sub-methods and effect evaluations for solving the same problem. Finally, the application scenarios of different EEG analysis methods are emphasized, and the advantages and disadvantages of similar methods are distinguished. This article is expected to assist researchers in selecting suitable EEG analysis methods based on their research objectives, provide references for subsequent research, and summarize current issues and prospects for the future.
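Power spectrum analysis, the first method family listed above, typically reduces to integrating a periodogram over a clinical frequency band. A minimal sketch (the band edges and the synthetic alpha-rhythm signal are illustrative):

```python
import numpy as np

def band_power(x, fs, band):
    """Power of `x` in the band [band[0], band[1]) Hz via the periodogram."""
    n = x.size
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * n)   # one-sided periodogram
    psd[1:-1] *= 2.0                                # fold negative frequencies
    sel = (freqs >= band[0]) & (freqs < band[1])
    return float(psd[sel].sum() * (freqs[1] - freqs[0]))

# synthetic "EEG": a 10 Hz alpha rhythm buried in white noise, 2 s at 256 Hz
rng = np.random.default_rng(8)
fs = 256.0
t = np.arange(512) / fs
x = 2.0 * np.sin(2.0 * np.pi * 10.0 * t) + rng.normal(0.0, 0.5, t.size)
alpha = band_power(x, fs, (8.0, 13.0))   # alpha band captures the rhythm
beta = band_power(x, fs, (13.0, 30.0))   # beta band sees only noise
```

The alpha-band power dominates because the 10 Hz oscillation falls inside it; the ratio of such band powers is a common feature in the machine learning pipelines the review also covers.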
Most existing star-galaxy classifiers depend on the reduced information from catalogs, necessitating careful data processing and feature extraction. In this study, we employ a supervised machine learning method (GoogLeNet) to automatically classify stars and galaxies in the COSMOS field. Unlike traditional machine learning methods, we introduce several preprocessing techniques, including noise reduction and the unwrapping of denoised images in polar coordinates, applied to our carefully selected samples of stars and galaxies. By dividing the selected samples into training and validation sets in an 8:2 ratio, we evaluate the performance of the GoogLeNet model in distinguishing between stars and galaxies. The results indicate that the GoogLeNet model is highly effective, achieving accuracies of 99.6% and 99.9% for stars and galaxies, respectively. Furthermore, by comparing the results with and without preprocessing, we find that preprocessing can significantly improve classification accuracy (by approximately 2.0% to 6.0%) when the images are rotated. In preparation for the future launch of the China Space Station Telescope (CSST), we also evaluate the performance of the GoogLeNet model on the CSST simulation data. These results demonstrate a high level of accuracy (approximately 99.8%), indicating that this model can be effectively utilized for future observations with the CSST.
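The polar unwrapping preprocessing mentioned above can be sketched with nearest-neighbor resampling: once a cutout is resampled onto an (r, θ) grid about its center, a rotation of the source becomes a circular shift along the θ axis, which is why rotated images classify more consistently. A minimal numpy sketch (grid sizes are illustrative):

```python
import numpy as np

def unwrap_polar(img, n_r=32, n_theta=64):
    """Nearest-neighbor resampling of a square image onto an (r, theta)
    grid about its center, so rotations become shifts along theta."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.linspace(0.0, min(cy, cx), n_r)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return img[ys, xs]

# a 90-degree rotation of the input shows up as a circular shift of
# n_theta/4 columns in the unwrapped image
rng = np.random.default_rng(2)
img = rng.random((65, 65))
u1 = unwrap_polar(img)
u2 = unwrap_polar(np.rot90(img))
```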
To address the problem of real-time processing of ultra-wide-bandwidth pulsar baseband data, we designed and implemented a pulsar baseband data processing algorithm (PSRDP) based on GPU parallel computing technology. PSRDP can perform operations such as baseband data unpacking, channel separation, coherent dedispersion, Stokes detection, phase and folding-period prediction, and folding integration on GPU clusters. We tested the algorithm using the J0437-4715 pulsar baseband data generated by the CASPSR and Medusa backends of the Parkes telescope, and the J0332+5434 pulsar baseband data generated by the self-developed backend of the Nan Shan Radio Telescope, obtaining the pulse profile of each data set. Through experimental analysis, we found that the pulse profiles generated by the PSRDP algorithm are essentially consistent with the processing results of the Digital Signal Processing Software for Pulsar Astronomy (DSPSR), which verifies the effectiveness of PSRDP. Furthermore, using the same baseband data, we compared the processing speed of PSRDP with that of DSPSR, and the results showed that PSRDP is not slower than DSPSR. The theoretical and technical experience gained from the PSRDP research lays a foundation for the real-time processing of ultra-wide-bandwidth pulsar baseband data from the QTT (Qi Tai radio Telescope).
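Coherent dedispersion, one of the PSRDP stages above, multiplies the Fourier transform of the baseband voltage by the inverse of the interstellar dispersion kernel. A minimal sketch using the usual cold-plasma chirp (the constant is 4.148808e9 when frequencies are in MHz and DM in pc cm⁻³; sign conventions vary between backends, and the DM and bandwidth here are toy values): dispersing a unit impulse with the kernel and then dedispersing it should recover the impulse exactly.

```python
import numpy as np

def dedisperse(voltage, fs_mhz, f0_mhz, dm):
    """Coherent dedispersion: undo the cold-plasma dispersion chirp in the
    Fourier domain. Frequencies in MHz, DM in pc cm^-3."""
    n = voltage.size
    f = np.fft.fftfreq(n, d=1.0 / fs_mhz)              # baseband offsets, MHz
    phase = 2.0 * np.pi * 4.148808e9 * dm * f ** 2 / (f0_mhz ** 2 * (f0_mhz + f))
    h = np.exp(1j * phase)                              # dispersion kernel
    return np.fft.ifft(np.fft.fft(voltage) * np.conj(h))

# roundtrip check: disperse a unit impulse with the kernel, then dedisperse
fs, f0, dm, n = 4.0, 1400.0, 5.0, 4096                  # toy values
impulse = np.zeros(n, dtype=complex)
impulse[n // 2] = 1.0
f = np.fft.fftfreq(n, d=1.0 / fs)
h = np.exp(2j * np.pi * 4.148808e9 * dm * f ** 2 / (f0 ** 2 * (f0 + f)))
dispersed = np.fft.ifft(np.fft.fft(impulse) * h)        # smeared impulse
recovered = dedisperse(dispersed, fs, f0, dm)
```

Because the kernel has unit magnitude, multiplying by its conjugate is an exact inverse; in a real pipeline the same FFT-multiply-IFFT pattern is what maps naturally onto GPU batch FFT libraries.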
The Solar Polar-orbit Observatory (SPO), proposed by Chinese scientists, is designed to observe the solar polar regions in an unprecedented way with a spacecraft traveling in an orbit with a large solar inclination angle and small ellipticity. However, one of the most significant challenges lies in ultra-long-distance data transmission, particularly for the Magnetic and Helioseismic Imager (MHI), which is the most important payload and generates the largest volume of data on SPO. In this paper, we propose a tailored lossless data compression method based on the measurement mode and characteristics of MHI data. The background outside the solar disk is removed to decrease the number of pixels in an image under compression. Multiple predictive coding methods are combined to eliminate redundancy by exploiting the correlations (spatial, spectral, and polarization) in the data set, improving the compression ratio. Experimental results demonstrate that our method achieves an average compression ratio of 3.67. The compression time is also less than the general observation period. The method exhibits strong feasibility and can be easily adapted to MHI.
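The predictive-coding idea above can be illustrated with the simplest spatial predictor: store each pixel as the difference from its left neighbor. The transform is exactly invertible (lossless), and on a smooth image it concentrates the histogram near zero, lowering the entropy an entropy coder would see. This left-neighbor predictor is our illustration only, not the paper's actual combination of predictors:

```python
import numpy as np

def predict_residuals(img):
    """Left-neighbor predictive coding: keep the first column, store every
    other pixel as the difference from its left neighbor."""
    res = img.astype(np.int64).copy()
    res[:, 1:] = img[:, 1:].astype(np.int64) - img[:, :-1].astype(np.int64)
    return res

def reconstruct(res):
    """Exact inverse of the predictor: cumulative sum along each row."""
    return np.cumsum(res, axis=1)

def entropy_bits(a):
    """Empirical zeroth-order entropy in bits per symbol."""
    _, counts = np.unique(a, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# smooth synthetic disk-like image: neighboring pixels are highly correlated
y, x = np.mgrid[0:64, 0:64]
img = ((np.sin(x / 9.0) + np.cos(y / 7.0)) * 100.0 + 500.0).astype(np.int64)
res = predict_residuals(img)
lossless = bool(np.array_equal(reconstruct(res), img))
h_raw, h_res = entropy_bits(img), entropy_bits(res)
```

The drop from `h_raw` to `h_res` is the headroom a subsequent entropy coder exploits; combining several predictors across space, spectrum, and polarization, as the abstract describes, pushes the residual entropy lower still.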
Fast and reliable localization of high-energy transients is crucial for characterizing burst properties and guiding follow-up observations. Localization based on the relative counts of different detectors has been widely used for all-sky gamma-ray monitors. There are two major methods for this count-distribution localization: the χ² minimization method and the Bayesian method. Here we propose a modified Bayesian method that takes advantage of both the accuracy of the Bayesian method and the simplicity of the χ² method. With comprehensive simulations, we find that our Bayesian method with a Poisson likelihood is generally more applicable to various bursts than the χ² method, especially for weak bursts. We further propose a location-spectrum iteration approach based on Bayesian inference, which alleviates the problems caused by the spectral difference between the burst and the location templates. Our method is well suited to scenarios with limited computational resources or time-sensitive applications, such as in-flight localization software and low-latency localization for rapid follow-up observations.
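The Poisson-likelihood localization above amounts to scoring each candidate direction's expected detector counts against the observed counts and keeping the maximum. A toy sketch with hypothetical templates (the detector responses and direction labels are invented for illustration):

```python
import numpy as np

def poisson_loglike(observed, expected):
    """Poisson log-likelihood of observed counts (constant log n! dropped)."""
    return float(np.sum(observed * np.log(expected) - expected))

def localize(observed, templates):
    """Score each candidate direction's expected counts, keep the best."""
    scores = {k: poisson_loglike(observed, v) for k, v in templates.items()}
    return max(scores, key=scores.get)

# hypothetical 4-detector responses for 3 candidate directions
templates = {
    "A": np.array([120.0, 40.0, 10.0, 30.0]),
    "B": np.array([60.0, 90.0, 35.0, 15.0]),
    "C": np.array([20.0, 30.0, 110.0, 40.0]),
}
rng = np.random.default_rng(3)
observed = rng.poisson(templates["B"]).astype(float)   # burst from direction B
best = localize(observed, templates)
```

For weak bursts the Poisson likelihood keeps its validity where a Gaussian χ² approximation degrades, which matches the advantage the simulations in the abstract report.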
BACKGROUND The prevalence and severity of noncommunicable chronic diseases (NCDs) among Chinese residents have been increasing, with mental health emerging as a critical challenge in disease management. AIM To examine the interactions between depression, anxiety symptoms, and related factors, and to identify key factors in the Chinese population with NCDs. METHODS Data from the Psychology and Behavior Investigation of Chinese Residents were used in a cross-sectional survey of 6182 individuals with NCDs. This study measured depression and anxiety symptoms as well as their influencing factors, including social environments, individual behaviors and lifestyles, and subjective indicators. A network analysis approach was used for data assessment. RESULTS Network analysis identified several central factors (media exposure, family health, problematic internet use, suboptimal health status, intimate relationship violence, feeling tired or having little energy, and feeling nervous/anxious/on edge) and bridge factors (media exposure, problematic internet use, intimate partner violence, health literacy, and suboptimal health status) that significantly influenced the co-occurrence and interconnectedness of depression and anxiety symptoms. Additionally, gender, ethnicity, residency, and living status did not significantly influence the overall network strength. CONCLUSION Depression and anxiety are prevalent among the Chinese population with NCDs. Effective interventions should focus on managing key symptoms, promoting correct media use for health information, and fostering healthier family relationships.
The underwater anechoic coating technology, which considers pressure resistance and low-frequency broadband sound absorption, has become a research hotspot in underwater acoustics and has received wide attention, in order to address increasingly advanced low-frequency sonar detection technology and adapt to the working environment of underwater vehicles in deep submergence. On the one hand, controlling low-frequency sound waves in water is more challenging than in air. On the other hand, in addition to initiating structural deformation, hydrostatic pressure also changes material parameters, both of which have a major effect on the sound absorption performance of the anechoic coating. Therefore, reconciling the pressure resistance and acoustic performance of underwater acoustic coatings is difficult. In particular, a bottleneck problem that must be addressed in this field is the design of acoustic structures with low-frequency broadband sound absorption under high hydrostatic pressure. Based on the influence of hydrostatic pressure on underwater anechoic coatings, this paper reviews the research status of underwater acoustic structures under hydrostatic pressure from the aspects of sound absorption mechanisms, analysis methods, and structural designs. Finally, the challenges and research trends of underwater anechoic coating technology under hydrostatic pressure are summarized, providing a reference for the design and research of low-frequency broadband anechoic coatings.
Objective To establish a scientific, reliable, objective, and effective clinical comprehensive evaluation system for drugs procured through government-centralized bidding, and to conduct reliability and validity tests and an empirical analysis of the evaluation index system through simulated measurement. Methods The literature research method was used to select comprehensive evaluation indicators for centralized drug bidding procurement. The Delphi method was then applied to screen the final evaluation indicators, and the weight of each indicator was determined using the analytic hierarchy process. Results and Conclusion The final clinical efficacy evaluation index system for centralized drug bidding procurement includes 5 primary indicators and 13 secondary indicators. The experts' authority coefficient in this study is high, and their opinions largely coincide. The empirical research shows that the reliability and structural validity of the indicator system are good. This indicator system enriches the methods and tools for scientifically evaluating the clinical efficacy of drugs procured through centralized bidding.
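The analytic hierarchy process step above derives indicator weights from pairwise comparison judgments. A common sketch uses the row geometric-mean method plus a consistency check; the comparison matrix and criteria below are hypothetical, not the study's actual indicators:

```python
import numpy as np

def ahp_weights(M):
    """Priority weights from a pairwise comparison matrix via the row
    geometric-mean method, plus the consistency ratio CR = CI / RI."""
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    gm = np.prod(M, axis=1) ** (1.0 / n)       # row geometric means
    w = gm / gm.sum()                           # normalized priority weights
    lam_max = float(np.mean((M @ w) / w))       # principal-eigenvalue estimate
    ci = (lam_max - n) / (n - 1)                # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # random-index table
    return w, ci / ri

# hypothetical 3-criteria comparison (e.g. efficacy vs safety vs cost):
# criterion 1 is twice as important as 2 and four times as important as 3
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w, cr = ahp_weights(M)
```

A CR below 0.1 is the conventional threshold for accepting the expert judgments; this perfectly consistent toy matrix yields CR ≈ 0 and weights 4/7, 2/7, 1/7.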
The Mini-SiTian (MST) project is a pathfinder for China's next-generation large-scale time-domain survey, SiTian, aimed at discovering variable stars, transients, and explosive events. MST generates hundreds of thousands of transient alerts every night, approximately 99% of which are false alarms, posing a significant challenge to its scientific goals. To mitigate the impact of false positives, we propose a deep learning-based solution and systematically evaluate 13 convolutional neural networks. The results show that ResNet achieves exceptional specificity (99.70%), EfficientNet achieves the highest recall rate (98.68%), and DenseNet provides balanced performance with a recall rate of 94.55% and a specificity of 98.66%. Leveraging these complementary strengths, we developed a bagging-based ensemble classifier that integrates ResNet18, DenseNet121, and EfficientNet_B0 using a soft-voting strategy. This classifier achieved the best AUC value (0.9961) among all models, with a recall rate of 95.37% and a specificity of 99.25%. It has now been successfully deployed in the MST real-time data processing pipeline. Validation using 5000 practically processed samples with a classification threshold of 0.798 showed that the classifier achieved 88.31% accuracy, 91.89% recall, and 99.82% specificity, confirming its effectiveness and robustness under real application conditions.
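Soft voting, the ensemble strategy above, simply averages the class-probability vectors of the member networks and takes the argmax. A minimal sketch with made-up probabilities standing in for the three networks' outputs:

```python
import numpy as np

def soft_vote(prob_list, weights=None):
    """Soft voting: (weighted) average of per-model class probabilities,
    then argmax over classes."""
    probs = np.asarray(prob_list, dtype=float)   # (models, samples, classes)
    if weights is None:
        weights = np.full(probs.shape[0], 1.0 / probs.shape[0])
    avg = np.tensordot(weights, probs, axes=1)   # (samples, classes)
    return avg, avg.argmax(axis=1)

# made-up outputs for three alerts; class 0 = bogus, class 1 = real transient
p_resnet   = [[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]]
p_densenet = [[0.8, 0.2], [0.3, 0.7], [0.4, 0.6]]
p_effnet   = [[0.7, 0.3], [0.6, 0.4], [0.1, 0.9]]
avg, labels = soft_vote([p_resnet, p_densenet, p_effnet])
```

In deployment, the averaged real-transient probability would be compared against a tuned threshold (the abstract's 0.798) rather than a plain argmax, trading recall against specificity.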
As a pathfinder of the SiTian project, the Mini-SiTian (MST) Array, employing three commercial CMOS cameras, represents a next-generation, cost-effective optical time-domain survey project. This paper focuses primarily on the precise data processing pipeline designed for wide-field, CMOS-based devices, including the removal of instrumental effects, astrometry, photometry, and flux calibration. Applying this pipeline to approximately 3000 observations taken in the Field 02 (f02) region by MST, the results demonstrate a remarkable astrometric precision of approximately 70-80 mas (about 0.1 pixel), an impressive calibration accuracy of approximately 1 mmag in the MST zero points, and a photometric accuracy of about 4 mmag for bright stars. Our studies demonstrate that MST CMOS cameras can achieve photometric accuracy comparable to that of CCDs, highlighting the feasibility of large-scale CMOS-based optical time-domain surveys and their potential for cost optimization in future large-scale time-domain surveys such as the SiTian project.
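Flux calibration at the mmag level above rests on a robust zero point: the sigma-clipped median offset between instrumental and catalog magnitudes of matched stars. A minimal sketch on synthetic matches (the zero point, scatter, and outlier fraction are invented):

```python
import numpy as np

def zero_point(inst_mag, cat_mag, clip=3.0, iters=3):
    """Photometric zero point as the sigma-clipped median of
    (catalog magnitude - instrumental magnitude)."""
    d = np.asarray(cat_mag) - np.asarray(inst_mag)
    mask = np.ones(d.size, dtype=bool)
    for _ in range(iters):
        med, std = np.median(d[mask]), np.std(d[mask])
        mask = np.abs(d - med) < clip * std if std > 0 else mask
    return float(np.median(d[mask]))

# synthetic matches: true ZP = 25.3 mag, 2 mmag scatter, plus two outliers
rng = np.random.default_rng(4)
inst = rng.uniform(-12.0, -8.0, 200)
cat = inst + 25.3 + rng.normal(0.0, 0.002, 200)
cat[:2] += 1.5          # e.g. blended or variable stars
zp = zero_point(inst, cat)
```

The clipping loop first rejects the gross outliers (which inflate the initial scatter estimate) and then converges on the tight mmag-level core, which is the regime the pipeline's 1 mmag zero-point accuracy implies.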
In the task of classifying massive celestial data, the accurate classification of galaxies, stars, and quasars usually relies on spectral labels. However, spectral data account for only a small fraction of all astronomical observation data, and the target-source classification information in vast photometric data has not been accurately measured. To address this, we propose a novel deep learning-based algorithm, YL8C4Net, for the automatic detection and classification of target sources in photometric images. This algorithm combines the YOLOv8 detection network with the Conv4Net classification network. Additionally, we propose a novel magnitude-based labeling method for target-source annotation. In the performance evaluation, YOLOv8 achieves impressive performance with average precision scores of 0.824 for AP@0.5 and 0.795 for AP@0.5:0.95. Meanwhile, the constructed Conv4Net attains an accuracy of 0.8895. Overall, YL8C4Net offers the advantages of fewer parameters, faster processing speed, and higher classification accuracy, making it particularly suitable for large-scale data processing tasks. Furthermore, we employed the YL8C4Net model to conduct target-source detection and classification on photometric images from 20 sky regions in SDSS-DR17. As a result, a catalog containing about 9.39 million target-source classification results has been preliminarily constructed, providing valuable reference data for astronomical research.
Stellar classification is a fundamental task in astronomical data analysis. Photometric data offer a significant advantage over spectral data in terms of data volume. Their lower acquisition cost and broader coverage make them more suitable for stellar classification applications. This study selects photometric data from SDSS DR18. Instead of using traditional RGB image formats, a series of preprocessing steps were applied to generate five-channel NumPy files as the data set. To enhance stellar classification performance, we propose a deep learning model based on photometric feature fusion, the Stellar Photometric Features Fusion Network. Additionally, we introduce the Dynamic Enhanced Stellar Squeeze-and-Excitation module, designed to optimize the weight allocation of different photometric bands in the classification task, and investigate the impact of each band's features on classification performance. Ultimately, we found that the information from the r and z bands played a more crucial role in the stellar classification task, achieving a final classification accuracy of 87.47%, thereby demonstrating the effectiveness of photometric data in stellar classification.
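The Squeeze-and-Excitation idea behind the module above reweights channels (here, photometric bands) from globally pooled statistics: pool each channel to a scalar, pass through a two-layer bottleneck, and gate each channel with a sigmoid weight. A numpy sketch with random stand-in weights (the paper's Dynamic Enhanced variant will differ in its details):

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """SE block on a (C, H, W) feature map: global average pool, bottleneck
    FC -> ReLU -> FC -> sigmoid, then per-channel rescaling."""
    s = x.mean(axis=(1, 2))                        # squeeze: (C,)
    z = np.maximum(w1 @ s, 0.0)                    # excitation, reduced dim
    gates = 1.0 / (1.0 + np.exp(-(w2 @ z)))        # per-channel weights in (0,1)
    return x * gates[:, None, None], gates

# toy 8-channel map (think photometric bands and derived maps);
# the weights here are random stand-ins for learned parameters
rng = np.random.default_rng(5)
C, r = 8, 2
x = rng.normal(size=(C, 4, 4))
w1 = 0.5 * rng.normal(size=(C // r, C))            # reduction by ratio r
w2 = 0.5 * rng.normal(size=(C, C // r))            # expansion back to C
out, gates = squeeze_excite(x, w1, w2)
```

After training, inspecting the learned gate values per band is one way to quantify statements like "the r and z bands matter most" in the abstract.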
BFOSC and YFOSC are the most frequently used instruments on the Xinglong 2.16 m telescope and the Lijiang 2.4 m telescope, respectively. We developed a software package named "BYSpec" (BFOSC and YFOSC Spectra Reduction Package) dedicated to automatically reducing the long-slit and echelle spectra obtained by these two instruments. The package supports bias and flat-fielding correction, order location, background subtraction, automatic wavelength calibration, and absolute flux calibration. The optimal extraction method maximizes the signal-to-noise ratio and removes most of the cosmic rays imprinted in the spectra. A comparison with the 1D spectra reduced with IRAF verifies the reliability of the results. This open-source software is publicly available to the community.
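The optimal extraction mentioned above is typically Horne-style inverse-variance weighting: with a known spatial profile P (normalized to sum to 1) and pixel variances V, the flux estimate f = Σ(P·D/V)/Σ(P²/V) has lower variance than a plain aperture sum. A minimal one-column sketch on synthetic data (all values illustrative):

```python
import numpy as np

def optimal_extract(data, profile, var):
    """Horne-style optimal extraction of one wavelength column:
    inverse-variance-optimal flux for the model data = flux*profile + noise,
    with the spatial profile normalized to sum to 1."""
    return float(np.sum(profile * data / var) / np.sum(profile ** 2 / var))

# one synthetic column: Gaussian spatial profile, true flux 1000 counts,
# read-noise-dominated pixel variance
rng = np.random.default_rng(6)
y = np.arange(15)
profile = np.exp(-0.5 * ((y - 7.0) / 1.5) ** 2)
profile /= profile.sum()
var = np.full(y.size, 4.0)
data = 1000.0 * profile + rng.normal(0.0, 2.0, y.size)
f_opt = optimal_extract(data, profile, var)
var_opt = 1.0 / np.sum(profile ** 2 / var)   # variance of the optimal estimate
var_sum = float(var.sum())                    # variance of a plain aperture sum
```

Down-weighting low-signal wings is also why single-pixel cosmic-ray hits barely perturb the estimate, which in practice is combined with profile-based outlier rejection as in the package described above.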
Funding: Project (2023YFC3805700) supported by the National Key Research and Development Program of China; Projects (42477166, 42277174) supported by the National Natural Science Foundation of China; Project (2024JCCXSB01) supported by the Fundamental Research Funds for the Central Universities, China; Project (KFJJ24-01M) supported by the State Key Laboratory of Explosion Science and Safety Protection, Beijing Institute of Technology, China; Project (HLCX-2024-04) supported by the Open Foundation of Collaborative Innovation Center of Green Development and Ecological Restoration of Mineral Resources, China.
Funding: Supported by the China National Astronomical Data Center (NADC), the CAS Astronomical Data Center, and the Chinese Virtual Observatory (China-VO), and by the Astronomical Big Data Joint Research Center, co-founded by the National Astronomical Observatories, Chinese Academy of Sciences and Alibaba Cloud.
Abstract: Ancient stellar observations are a valuable cultural heritage, profoundly influencing both cultural domains and modern astronomical research. Shi's Star Catalog (石氏星经), the oldest extant star catalog in China, faces controversy regarding its observational epoch. Determining this epoch via precession assumes accurate ancient coordinates and a correct correspondence with contemporary stars, posing significant challenges. This study introduces a novel method using the Generalized Hough Transform to ascertain the catalog's observational epoch. The approach statistically accommodates errors in the ancient coordinates and discrepancies between ancient and modern stars, addressing the limitations of prior methods. Our findings date Shi's Star Catalog to the 4th century BCE, with adjustments made in the 2nd century CE. In comparison, the oldest known catalog in the Western tradition, the Ptolemaic Star Catalog (2nd century CE), likely derives from the Hipparchus Star Catalog (2nd century BCE). Shi's Star Catalog is therefore identified as the world's oldest known star catalog. Beyond establishing its observation period, this study aims to consolidate and digitize these cultural artifacts.
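The epoch-voting idea behind a Hough-style dating method can be illustrated with a toy sketch: for each candidate age, precess the modern star positions back and count how many land near an "ancient" catalog entry; the accumulator peaks at the true epoch. All numbers below (precession rate, noise level, star count) are illustrative assumptions, not the paper's data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative values): modern ecliptic longitudes of 60 stars,
# and an "ancient" catalog made by precessing them back an unknown number
# of years and adding measurement noise.
rate = 50.3 / 3600.0               # precession in longitude, deg per year
true_age = 2300.0                  # years before the modern epoch
modern = rng.uniform(0.0, 360.0, 60)
ancient = (modern - rate * true_age + rng.normal(0.0, 0.3, 60)) % 360.0

def votes(age):
    """Hough-style accumulator score: how many modern stars, precessed back
    by `age` years, land within 1 degree of some ancient-catalog star."""
    predicted = (modern - rate * age) % 360.0
    d = np.abs(predicted[:, None] - ancient[None, :])
    d = np.minimum(d, 360.0 - d)           # wrap-around angular distance
    return int((d.min(axis=1) < 1.0).sum())

ages = np.arange(0.0, 4000.0, 25.0)
scores = np.array([votes(a) for a in ages])
best_age = float(ages[scores.argmax()])
```

Because each star votes independently, the method tolerates individual coordinate errors and a few wrong star correspondences, which is the key advantage the abstract claims over direct precession fitting.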
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (grant No. XDA0350502), the National SKA Program of China (grant No. 2020SKA0120103), and the National Natural Science Foundation of China (NSFC, grant Nos. U1831130 and 11973046).
Abstract: The clock difference PT-TAI between the ensemble pulsar timescale (PT) and International Atomic Time (TAI), derived from the International Pulsar Timing Array (IPTA) data set, shows a variation trend very similar to that of Terrestrial Time, TT(BIPMXXXX)-TAI, but with larger measurement errors. In this paper, we discuss smoothing PT with a combined smoothing filter and compare the results with those from other filters. The PT-TAI clock-difference sequence and the first-derivative series of TT(BIPMXXXX)-TAI can be processed by the combined smoothing filter to yield two smooth curves, tied by the constraint that the latter is the derivative of the former. To demonstrate the properties of the smoothed results, the ensemble pulsar time IPTA2016 with respect to TAI published by G. Hobbs et al. and the first-derivative series of TT(BIPM2017)-TAI, with quadratic polynomial terms removed, are processed by the combined smoothing filter. We describe how to correctly estimate the two smoothing coefficients and analyze the filter output. The results show that the combined smoothing method efficiently removes high-frequency noise from both input series, and the smoothed PT-TAI data combine the long-term fractional frequency stability of the pulsar time with the frequency accuracy of the terrestrial time. Fractional frequency stability analysis indicates that both the short- and medium-term stability of the smoothed PT-TAI is improved while its original long-term stability is preserved. The combined smoothing filter is therefore better suited to smoothing observational pulsar-timescale data than any filter that smooths a single pulsar time series alone. The pulsar time smoothed by the combined filter is a combined pulsar-atomic timescale, which can also serve as a terrestrial time.
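The core effect the abstract describes (a smoother removing high-frequency measurement noise from a clock-difference series while preserving the slow drift) can be illustrated with a much simpler stand-in filter. This is a generic zero-phase moving average on synthetic data, not the paper's combined filter, and all scales are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "clock difference" series (illustrative, not IPTA data): a slow drift
# of order a microsecond plus white measurement noise.
t = np.arange(512.0)
drift = 1e-6 * np.sin(2.0 * np.pi * t / 256.0)
noisy = drift + rng.normal(0.0, 2e-7, t.size)

def smooth(x, half_width):
    """Zero-phase moving-average smoother; edges handled by reflection."""
    k = 2 * half_width + 1
    padded = np.pad(x, half_width, mode="reflect")
    kernel = np.ones(k) / k
    return np.convolve(padded, kernel, mode="same")[half_width:-half_width]

smoothed = smooth(noisy, 15)
rms_before = float(np.sqrt(np.mean((noisy - drift) ** 2)))
rms_after = float(np.sqrt(np.mean((smoothed - drift) ** 2)))
```

The combined filter of the paper goes further by smoothing the PT-TAI series and the TT-TAI derivative series jointly, under the constraint that one curve is the derivative of the other; the sketch above only shows the noise-suppression half of that idea.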
Funding: Funded by the intergovernmental International Science and Technology Innovation Cooperation project of the National Key Research and Development Program, "Remote Sensing and Radio Astronomy Observation of Space Weather in Low and Middle Latitudes" (project No. 2022YFE0140000), and supported by the International Partnership Program of the Chinese Academy of Sciences, grant No. 114A11KYSB20200001.
Abstract: The 21 cm radiation of neutral hydrogen provides crucial information for studying the early universe and its evolution. To advance this research, many countries have invested heavily in large low-frequency radio telescope arrays, such as the Low Frequency Array and the Square Kilometre Array Phase 1 Low Frequency, which are pivotal instruments for radio astronomy. However, ionospheric plasma interference, ambient radio noise, and instrument-related effects have become increasingly prominent obstacles in cosmology research. To address these issues, this paper proposes an efficient signal-processing method that combines the wavelet transform with mathematical morphology, in four steps: (1) background subtraction, removing background interference from the radio observation signal; (2) wavelet transform, applying a two-dimensional discrete wavelet transform to the background-subtracted signal and thresholding the wavelet coefficients to remove interference components; (3) wavelet inversion, reconstructing the processed signal; and (4) mathematical morphology, further refining the reconstructed signal. Experimental verification with solar observation data from the Xinjiang Observatory and the Yunnan Observatory demonstrates that the method removes interference signals while preserving useful signals, improving the accuracy of radio astronomy observations and reducing the impact of radio-frequency interference.
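The wavelet-thresholding step (step 2 above) can be sketched in one dimension with a hand-rolled Haar transform: noise spreads thinly over the detail coefficients, so hard-thresholding them and inverting removes much of it while keeping the smooth signal. This is a minimal single-level illustration, not the paper's 2D pipeline, and the signal and noise levels are assumed.

```python
import numpy as np

def haar_forward(x):
    """One level of an orthonormal 1D Haar wavelet transform (even length)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail (high-pass)
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2.0 * np.pi * 4.0 * t)            # smooth "useful" signal
noisy = clean + rng.normal(0.0, 0.4, t.size)     # plus broadband interference

a, d = haar_forward(noisy)
d = np.where(np.abs(d) > 3.0 * 0.4, d, 0.0)      # hard-threshold the details
denoised = haar_inverse(a, d)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
```

In practice a multi-level 2D transform with a data-driven threshold (and the morphological post-processing of step 4) performs far better than this one-level sketch, but the principle is the same.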
Abstract: Condensed and hydrolysable tannins are non-toxic natural polyphenols, industrialized as a commercial commodity for tanning hides into leather and, increasingly, for a growing number of other industrial applications, mainly as substitutes for petroleum-based products. They are a distinct class of sustainable materials from the forestry industry. They have been used for hundreds of years to manufacture leather and now serve a growing number of other industries, such as wood adhesives, metal coatings, and pharmaceutical/medical applications. This review presents the main sources of these forestry by-products, whether already commercial or potentially so; their industrial and laboratory extraction systems; and their methods of analysis with their advantages and drawbacks, whether simple to the point of appearing primitive yet of proven effectiveness, or highly modern and instrumental. It constitutes a basic but essential summary of what one needs to know about these sustainable materials. In doing so, the review highlights some of the main challenges that remain to be addressed to deliver the quality and economics of tannin supply necessary to meet industrial production requirements for materials-based uses.
Funding: Supported by the National Natural Science Foundation of China (Nos. 61873196 and 62373030) and the Innovation Program for Quantum Science and Technology (No. 2021ZD0303400).
Abstract: In the two-dimensional positioning method for pulsars, the grid method is used to estimate the non-sensitive direction and the position. However, the grid method has a high computational load and low accuracy due to the grid interval. To improve estimation accuracy and reduce the computational load, we propose a fast two-dimensional positioning method for the Crab pulsar based on multiple optimization algorithms (FTPCO). The FTPCO replaces the grid method with the Levenberg–Marquardt (LM) algorithm, a three-point orientation (TPO) method, particle swarm optimization (PSO), and a Newton–Raphson-based optimizer (NRBO). First, to avoid the influence of the non-sensitive direction on positioning, we take the orbital error and the distortion of the pulsar profile as optimization objectives and combine the grid method with the LM algorithm or PSO to search for the non-sensitive direction. Then, on the sensitive plane perpendicular to the non-sensitive direction, the TPO method is proposed to quickly search for the sensitive and sub-sensitive directions. Finally, the NRBO is employed along the sensitive and sub-sensitive directions to achieve two-dimensional positioning of the Crab pulsar. Simulation results show that, compared with the grid method, the FTPCO reduces the computational load by 89.4% and improves the positioning accuracy by approximately 38%. The FTPCO offers high real-time accuracy and does not fall into local optima.
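Of the optimizers named above, PSO is the easiest to sketch. The toy below minimizes a stand-in quadratic cost surface; the real objectives (orbital error, profile distortion) and the FTPCO's hybrid staging are not reproduced, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def pso(f, lo, hi, n_particles=30, n_iter=100, seed=0):
    """Minimal particle swarm optimizer over a box-bounded 2D space."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, (n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        # Inertia + cognitive + social terms (0.7 / 1.5 / 1.5 are assumed).
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Toy quadratic objective standing in for the orbital-error cost surface.
best, val = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2, -5.0, 5.0)
```

Unlike a grid search, the swarm concentrates evaluations near promising regions, which is why such population methods can cut the computational load while refining the estimate below the grid spacing.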
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12271433).
Abstract: In this paper, two types of fractional nonlinear equations in the Caputo sense, the time-fractional Newell–Whitehead equation (FNWE) and the time-fractional generalized Hirota–Satsuma coupled KdV system (HS-cKdVS), are investigated by means of the q-homotopy analysis method (q-HAM). Approximate solutions of the proposed equations are constructed as convergent series and compared with the corresponding exact solutions. Owing to the auxiliary parameter h in this method, only a few terms of the series solution are required to obtain a good approximation. For visualization, the numerical results are displayed graphically with the help of Maple.
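For readers unfamiliar with the notation, a sketch of the Caputo operator and one common form of the time-fractional Newell–Whitehead equation follows; the specific coefficients used in the paper are not given in the abstract, so this particular form is an assumption.

```latex
% Caputo fractional derivative of order 0 < \alpha \le 1:
{}^{C}D_t^{\alpha}u(x,t)
  = \frac{1}{\Gamma(1-\alpha)}
    \int_0^t (t-\tau)^{-\alpha}\,
    \frac{\partial u(x,\tau)}{\partial \tau}\, d\tau .

% One common form of the time-fractional Newell--Whitehead equation,
% recovering the classical equation as \alpha \to 1:
{}^{C}D_t^{\alpha}u = u_{xx} + 2u - 3u^{2}, \qquad 0 < \alpha \le 1 .
```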
Funding: Supported by the STI2030 Major Projects (2021ZD0204300) and the National Natural Science Foundation of China (61803003, 62003228).
Abstract: Electroencephalography (EEG) is a non-invasive method for measuring brain activity. Owing to its safety, high resolution, and sensitivity to dynamic changes in neural signals, EEG has attracted much interest in scientific research and medical fields. This article reviews the types of EEG signals, the main EEG analysis methods, and the application of these methods in neuroscience and in diagnosing neurological diseases. First, three types of EEG signals are introduced: time-invariant EEG, accurate event-related EEG, and random event-related EEG. Second, five main directions in EEG analysis are described: power spectrum analysis, time-frequency analysis, connectivity analysis, source localization, and machine learning, along with their sub-methods and evaluations of their effectiveness on common problems. Finally, the application scenarios of the different EEG analysis methods are emphasized, and the advantages and disadvantages of similar methods are distinguished. This article is intended to help researchers select suitable EEG analysis methods for their research objectives, provide references for subsequent research, and summarize current issues and future prospects.
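Power spectrum analysis, the first method family listed, reduces to estimating how signal power is distributed over frequency. A minimal periodogram sketch on a synthetic alpha-band trace (sampling rate and noise level are assumed, not taken from the review):

```python
import numpy as np

fs = 256.0                              # sampling rate in Hz (assumed)
t = np.arange(0.0, 4.0, 1.0 / fs)
rng = np.random.default_rng(3)
# Toy "EEG" trace: a 10 Hz alpha-band rhythm buried in white noise.
x = np.sin(2.0 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)

# Periodogram: squared FFT magnitude, scaled to a power spectral density.
freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * x.size)

peak_freq = float(freqs[psd.argmax()])
```

Real EEG work typically averages windowed periodograms (Welch's method) to reduce estimator variance, but the single-window version above already recovers the dominant rhythm.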
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (grant No. XDB41000000), the National Natural Science Foundation of China (NSFC, grant Nos. 12233008 and 11973038), the China Manned Space Project (No. CMS-CSST-2021-A07), the Cyrus Chun Ying Tang Foundations, and the Hong Kong Innovation and Technology Fund through the Research Talent Hub program (GSP028).
Abstract: Most existing star-galaxy classifiers depend on reduced catalog information, necessitating careful data processing and feature extraction. In this study, we employ a supervised machine learning method (GoogLeNet) to automatically classify stars and galaxies in the COSMOS field. Unlike traditional machine learning methods, we introduce several preprocessing techniques, including noise reduction and unwrapping the denoised images into polar coordinates, applied to carefully selected samples of stars and galaxies. Dividing the samples into training and validation sets in an 8:2 ratio, we evaluate the performance of the GoogLeNet model in distinguishing stars from galaxies. The model is highly effective, achieving accuracies of 99.6% for stars and 99.9% for galaxies. Furthermore, comparing results with and without preprocessing, we find that preprocessing significantly improves classification accuracy (by approximately 2.0% to 6.0%) when the images are rotated. In preparation for the future launch of the China Space Station Telescope (CSST), we also evaluate the model on CSST simulation data and obtain a high accuracy (approximately 99.8%), indicating that the model can be effectively utilized for future CSST observations.
Funding: Supported by the National Key R&D Program of China (Nos. 2021YFC2203502 and 2022YFF0711502), the National Natural Science Foundation of China (NSFC) (12173077 and 12003062), the Tianshan Innovation Team Plan of Xinjiang Uygur Autonomous Region (2022D14020), the Tianshan Talent Project of Xinjiang Uygur Autonomous Region (2022TSYCCX0095), the Scientific Instrument Developing Project of the Chinese Academy of Sciences (grant No. PTYQ2022YZZD01), the China National Astronomical Data Center (NADC), the Operation, Maintenance and Upgrading Fund for Astronomical Telescopes and Facility Instruments, budgeted from the Ministry of Finance of China (MOF) and administrated by the Chinese Academy of Sciences (CAS), and the Natural Science Foundation of Xinjiang Uygur Autonomous Region (2022D01A360).
Abstract: To address the real-time processing of ultra-wide-bandwidth pulsar baseband data, we designed and implemented a pulsar baseband data processing algorithm (PSRDP) based on GPU parallel computing. PSRDP performs baseband data unpacking, channel separation, coherent dedispersion, Stokes detection, phase and folding-period prediction, and folding integration on GPU clusters. We tested the algorithm with J0437-4715 baseband data generated by the CASPSR and Medusa backends of the Parkes telescope, and with J0332+5434 baseband data generated by the self-developed backend of the Nan Shan Radio Telescope, obtaining pulse profiles for each data set. The profiles generated by PSRDP are essentially consistent with those produced by the Digital Signal Processing Software for Pulsar Astronomy (DSPSR), verifying the effectiveness of the algorithm. Furthermore, on the same baseband data, PSRDP was no slower than DSPSR in processing speed. The theoretical and technical experience gained from this work lays a foundation for the real-time processing of ultra-wide-bandwidth pulsar baseband data from the QTT (Qi Tai radio Telescope).
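Dedispersion, the costliest step in such pipelines, undoes the frequency-dependent delay a pulse accumulates in the interstellar medium. PSRDP implements the coherent (phase-domain) variant on GPUs; the sketch below shows the simpler incoherent variant (shift channels, then sum) on a simulated pulse, with toy channelization and DM values assumed.

```python
import numpy as np

K = 4.148808e3                      # dispersion constant, s MHz^2 / (pc cm^-3)
DM = 50.0                           # toy dispersion measure, pc cm^-3
freqs = np.linspace(1400.0, 1100.0, 32)   # channel centres, MHz
dt = 1e-3                                 # sample interval, s
n = 1024

# Simulate a dispersed pulse: each channel delayed per the cold-plasma law.
data = np.zeros((freqs.size, n))
for i, f in enumerate(freqs):
    delay = K * DM * (f ** -2 - freqs[0] ** -2)      # delay vs. top channel
    data[i, 100 + int(round(delay / dt))] = 1.0

def dedisperse(data, dm):
    """Incoherent dedispersion: undo each channel's delay, then sum."""
    out = np.zeros(n)
    for i, f in enumerate(freqs):
        shift = int(round(K * dm * (f ** -2 - freqs[0] ** -2) / dt))
        out += np.roll(data[i], -shift)
    return out

profile = dedisperse(data, DM)
```

Coherent dedispersion instead deconvolves the dispersion chirp from the raw voltage data in the Fourier domain, removing the smearing within each channel that the incoherent method leaves behind; that FFT-heavy workload is what makes GPU parallelism attractive.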
Funding: Supported by the National Key R&D Program of China (grant No. 2022YFF0503800), the National Natural Science Foundation of China (NSFC) (grant No. 11427901), the Strategic Priority Research Program of the Chinese Academy of Sciences (CAS-SPP) (grant No. XDA15320102), and the Youth Innovation Promotion Association (CAS No. 2022057).
Abstract: The Solar Polar-orbit Observatory (SPO), proposed by Chinese scientists, is designed to observe the solar polar regions in an unprecedented way, with a spacecraft traveling in an orbit of large solar inclination and small ellipticity. One of the most significant challenges, however, lies in ultra-long-distance data transmission, particularly for the Magnetic and Helioseismic Imager (MHI), the most important payload and the one generating the largest data volume on SPO. In this paper, we propose a tailored lossless data compression method based on the measurement mode and characteristics of MHI data. The background outside the solar disk is removed to reduce the number of pixels to be compressed, and multiple predictive coding methods are combined to eliminate redundancy by exploiting the spatial, spectral, and polarization correlations in the data set, improving the compression ratio. Experimental results demonstrate that our method achieves an average compression ratio of 3.67, with a compression time shorter than the general observation period. The method is highly feasible and can be readily adapted to MHI.
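The gain from predictive coding comes from residuals being far more compressible than raw pixels: a good predictor leaves a narrow, low-entropy residual distribution. A minimal sketch with a one-pixel horizontal predictor on a synthetic smooth image (the actual MHI predictors and data characteristics are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy "solar disk" image: a smooth gradient plus mild noise (12-bit-like).
x, y = np.meshgrid(np.arange(64), np.arange(64))
img = (2000 + 4 * x + 3 * y + rng.integers(-2, 3, (64, 64))).astype(np.int32)

def entropy_bits(values):
    """Shannon entropy in bits/sample: a lower bound on lossless code length."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Horizontal predictive coding: encode each pixel minus its left neighbour.
residual = np.diff(img, axis=1)

raw_bits = entropy_bits(img)
coded_bits = entropy_bits(residual)
```

The paper's method stacks several such predictors across the spatial, spectral, and polarization axes and picks whichever decorrelates best, then entropy-codes the residuals; this sketch shows only the single-axis spatial case.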
Funding: Supported by the National Key R&D Program of China (2021YFA0718500), the Strategic Priority Research Program on Space Science of the Chinese Academy of Sciences (grant Nos. XDA15360102, XDA15360300, XDA15052700 and E02212A02S), the National Natural Science Foundation of China (grant Nos. 12173038 and U2038106), and the National HEP Data Center (grant No. E029S2S1).
Abstract: Fast and reliable localization of high-energy transients is crucial for characterizing burst properties and guiding follow-up observations. Localization based on the relative counts of different detectors has been widely used for all-sky gamma-ray monitors, with two major approaches: χ² minimization and the Bayesian method. Here we propose a modified Bayesian method that combines the accuracy of the Bayesian approach with the simplicity of the χ² method. Comprehensive simulations show that our Bayesian method with a Poisson likelihood is generally more applicable to various bursts than the χ² method, especially for weak bursts. We further propose a location-spectrum iteration approach based on Bayesian inference, which alleviates the problems caused by spectral differences between the burst and the location templates. Our method is well suited to scenarios with limited computational resources or time-sensitive applications, such as in-flight localization software and low-latency localization for rapid follow-up observations.
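Count-distribution localization boils down to asking, for each candidate direction, how likely the observed per-detector counts are under that direction's response template. A toy Poisson-likelihood version (random templates and a simple amplitude-profiling step, both assumptions for illustration; the paper's templates come from the instrument response):

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy templates: expected counts in 4 detectors for 100 candidate directions.
template = rng.uniform(5.0, 50.0, (100, 4))
true_dir = 42
observed = rng.poisson(template[true_dir] * 200.0)   # a bright burst

def poisson_loglike(obs, model):
    """Poisson log-likelihood (up to a constant), with the unknown burst
    amplitude profiled out by scaling the model to the observed total."""
    m = model * (obs.sum() / model.sum())
    return float((obs * np.log(m) - m).sum())

scores = np.array([poisson_loglike(observed, template[i]) for i in range(100)])
best_dir = int(scores.argmax())
```

For weak bursts the counts are small and the Gaussian approximation underlying χ² breaks down, which is exactly where the Poisson likelihood retains its advantage.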
Abstract: BACKGROUND: The prevalence and severity of noncommunicable chronic diseases (NCDs) among Chinese residents have been increasing, with mental health emerging as a critical challenge in disease management. AIM: To examine the interactions between depression, anxiety symptoms, and related factors, and to identify key factors in the Chinese population with NCDs. METHODS: Data from the Psychology and Behavior Investigation of Chinese Residents were used in a cross-sectional survey of 6182 individuals with NCDs. The study measured depression and anxiety symptoms as well as their influencing factors, including social environments, individual behaviors and lifestyles, and subjective indicators, and assessed the data with a network analysis approach. RESULTS: Network analysis identified several central factors (media exposure, family health, problematic internet use, suboptimal health status, intimate-relationship violence, tiredness or low energy, and nervousness/anxiety/feeling on edge) and bridge factors (media exposure, problematic internet use, intimate partner violence, health literacy, and suboptimal health status) that significantly influenced the co-occurrence and interconnectedness of depression and anxiety symptoms. Gender, ethnicity, residency, and living status did not significantly influence the overall network strength. CONCLUSION: Depression and anxiety are prevalent among the Chinese population with NCDs. Effective interventions should focus on managing key symptoms, promoting appropriate media use for health information, and fostering healthier family relationships.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52271309), the Natural Science Foundation of Heilongjiang Province of China (Grant No. YQ2022E104), and the Doctoral Science and Technology Innovation Fund of Harbin Engineering University (Grant No. 3072023GIP0302).
Abstract: Underwater anechoic coating technology that combines pressure resistance with low-frequency broadband sound absorption has become a research hotspot in underwater acoustics, attracting wide attention as a response to increasingly advanced low-frequency sonar detection and to the deep-submergence working environment of underwater vehicles. On the one hand, controlling low-frequency sound waves in water is more challenging than in air. On the other hand, hydrostatic pressure not only causes structural deformation but also changes material parameters, both of which strongly affect the sound absorption performance of the coating. Reconciling pressure resistance with acoustic performance is therefore difficult, and the design of acoustic structures with low-frequency broadband absorption under high hydrostatic pressure is a bottleneck problem that must be addressed. Starting from the influence of hydrostatic pressure on underwater anechoic coatings, this paper reviews the state of research on underwater acoustic structures under hydrostatic pressure in terms of sound absorption mechanisms, analysis methods, and structural designs. Finally, the challenges and research trends in underwater anechoic coating technology under hydrostatic pressure are summarized, providing a reference for the design and study of low-frequency broadband anechoic coatings.
Abstract: Objective: To establish a scientific, reliable, objective, and effective clinical comprehensive evaluation system for government-organized centralized bidding procurement of drugs, and to test the reliability and validity of the evaluation index system through simulated measurement and empirical analysis. Methods: Literature research was used to select candidate evaluation indicators, the Delphi method was applied to screen the final indicators, and the indicator weights were determined using the analytic hierarchy process. Results and Conclusion: The final clinical efficacy evaluation index system for centralized bidding procurement of drugs includes 5 primary indicators and 13 secondary indicators. The experts' authority coefficient in this study is high, and their opinions largely coincide. The empirical research shows that the reliability and structural validity of the indicator system are good. The system enriches the methods and tools for scientifically evaluating the clinical efficacy of drugs under centralized bidding procurement.
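In the analytic hierarchy process, indicator weights are the principal eigenvector of a pairwise comparison matrix, checked for consistency. A minimal sketch with a hypothetical 3-criteria matrix (the criteria, judgments, and matrix size are invented for illustration; the paper's system has 5 primary and 13 secondary indicators):

```python
import numpy as np

# Pairwise comparison matrix on the Saaty 1-9 scale for three hypothetical
# criteria (efficacy, safety, cost): efficacy judged 3x safety, 5x cost.
A = np.array([
    [1.0,     3.0, 5.0],
    [1.0 / 3, 1.0, 2.0],
    [1.0 / 5, 0.5, 1.0],
])

def ahp_weights(A, iters=100):
    """Indicator weights as the principal eigenvector (power iteration)."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

w = ahp_weights(A)

# Consistency check: CR = CI / RI, with RI = 0.58 for a 3x3 matrix.
lam_max = float((A @ w / w).mean())
ci = (lam_max - 3.0) / (3.0 - 1.0)
cr = ci / 0.58
```

A consistency ratio below 0.1 is the conventional threshold for accepting the expert judgments; larger values indicate contradictory pairwise comparisons that should be revisited.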
Funding: Supported by the National Key Basic R&D Program of China (2023YFA1608303), the Strategic Priority Research Program of the Chinese Academy of Sciences (XDB0550103), and the National Natural Science Foundation of China (grant Nos. 12273076, 12133001, 12422303 and 12261141690).
Abstract: The Mini-SiTian (MST) project is a pathfinder for SiTian, China's next-generation large-scale time-domain survey, aimed at discovering variable stars, transients, and explosive events. MST generates hundreds of thousands of transient alerts every night, approximately 99% of which are false alarms, posing a significant challenge to its scientific goals. To mitigate the impact of false positives, we propose a deep learning-based solution and systematically evaluate 13 convolutional neural networks. ResNet achieves exceptional specificity (99.70%), EfficientNet achieves the highest recall (98.68%), and DenseNet provides balanced performance with 94.55% recall and 98.66% specificity. Leveraging these complementary strengths, we developed a bagging-based ensemble classifier that integrates ResNet18, DenseNet121, and EfficientNet_B0 using a soft voting strategy. This classifier achieved the best AUC (0.9961) among all models, with 95.37% recall and 99.25% specificity, and has been deployed in the MST real-time data processing pipeline. Validation on 5000 operationally processed samples, with a classification threshold of 0.798, showed 88.31% accuracy, 91.89% recall, and 99.82% specificity, confirming the classifier's effectiveness and robustness under real application conditions.
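Soft voting simply averages the member models' predicted class probabilities before taking the argmax, so a confident model can outvote two lukewarm ones. A minimal sketch with hypothetical probabilities (the numbers below are invented; only the three model names come from the abstract):

```python
import numpy as np

def soft_vote(prob_sets):
    """Average per-model class probabilities, then take the argmax class."""
    return np.mean(prob_sets, axis=0).argmax(axis=1)

# Hypothetical probabilities for 3 alerts over classes [bogus, real];
# the three arrays stand in for ResNet18, DenseNet121 and EfficientNet_B0.
p_resnet = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
p_densenet = np.array([[0.8, 0.2], [0.6, 0.4], [0.1, 0.9]])
p_effnet = np.array([[0.7, 0.3], [0.3, 0.7], [0.4, 0.6]])

labels = soft_vote([p_resnet, p_densenet, p_effnet])   # 0 = bogus, 1 = real
```

Thresholding the averaged "real" probability (the abstract's 0.798 operating point) rather than taking a plain argmax is how the deployed pipeline trades recall against specificity.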
Funding: Supported by the National Key Basic R&D Program of China (2023YFA1608303), the Strategic Priority Research Program of the Chinese Academy of Sciences (XDB0550103), the National Science Foundation of China (12422303, 12403024, 12222301, 12173007, and 12261141690), the Postdoctoral Fellowship Program of CPSF (grant No. GZB20240731), the Young Data Scientist Project of the National Astronomical Data Center, the China Postdoctoral Science Foundation (No. 2023M743447), and the NSFC through grant Nos. 12303039 and 12261141690.
Abstract: As a pathfinder for the SiTian project, the Mini-SiTian (MST) Array, which employs three commercial CMOS cameras, represents a next-generation, cost-effective optical time-domain survey. This paper focuses on the precise data processing pipeline designed for wide-field, CMOS-based devices, including the removal of instrumental effects, astrometry, photometry, and flux calibration. Applying this pipeline to approximately 3000 observations of the Field 02 (f02) region taken by MST yields a remarkable astrometric precision of approximately 70–80 mas (about 0.1 pixel), a calibration accuracy of approximately 1 mmag in the MST zero points, and a photometric accuracy of about 4 mmag for bright stars. Our studies demonstrate that MST CMOS cameras can achieve photometric accuracy comparable to that of CCDs, highlighting the feasibility of large-scale CMOS-based optical time-domain surveys and their potential for cost optimization in future large-scale time-domain surveys such as the SiTian project.
Funding: Supported by the National Natural Science Foundation of China (NSFC, Grant No. U1731128).
Abstract: In classifying massive celestial data, the accurate classification of galaxies, stars, and quasars usually relies on spectral labels. However, spectral data account for only a small fraction of all astronomical observations, and the classification of target sources in the vast body of photometric data has not been accurately measured. To address this, we propose a novel deep learning algorithm, YL8C4Net, for the automatic detection and classification of target sources in photometric images, combining the YOLOv8 detection network with the Conv4Net classification network. We also propose a novel magnitude-based labeling method for target-source annotation. In the performance evaluation, YOLOv8 achieves average precision scores of 0.824 for AP@0.5 and 0.795 for AP@0.5:0.95, while Conv4Net attains an accuracy of 0.8895. Overall, YL8C4Net offers fewer parameters, faster processing, and higher classification accuracy, making it particularly suitable for large-scale data processing. Using YL8C4Net, we performed target-source detection and classification on photometric images from 20 sky regions in SDSS DR17 and preliminarily constructed a catalog of about 9.39 million classified target sources, providing valuable reference data for astronomical research.
Abstract: Stellar classification is a fundamental task in astronomical data analysis. Photometric data offer a significant advantage over spectral data in volume, and their lower acquisition cost and broader coverage make them well suited to stellar classification. This study selects photometric data from SDSS DR18. Instead of using traditional RGB image formats, a series of preprocessing steps is applied to generate five-channel NumPy files as the data set. To enhance classification performance, we propose a deep learning model based on photometric feature fusion, the Stellar Photometric Features Fusion Network, and introduce a Dynamic Enhanced Stellar Squeeze-and-Excitation module designed to optimize the weight allocation of the different photometric bands, investigating the impact of each band's features on classification performance. We find that information from the r and z bands plays the most important role in stellar classification, and the final classification accuracy reaches 87.47%, demonstrating the effectiveness of photometric data for stellar classification.
Funding: Supported by the National Natural Science Foundation of China (grant No. U2031144); partially supported by the Open Project Program of the Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences; and supported by the National Key R&D Program of China (No. 2021YFA1600404), the National Natural Science Foundation of China (12173082), the Yunnan Fundamental Research Projects (grant 202201AT070069), the Top-notch Young Talents Program of Yunnan Province, the Light of West China Program of the Chinese Academy of Sciences, and the International Centre of Supernovae, Yunnan Key Laboratory (No. 202302AN360001).
Abstract: BFOSC and YFOSC are the most frequently used instruments on the Xinglong 2.16 m telescope and the Lijiang 2.4 m telescope, respectively. We developed a software package named "BYSpec" (BFOSC and YFOSC Spectra Reduction Package) dedicated to automatically reducing the long-slit and echelle spectra obtained by these two instruments. The package supports bias and flat-field correction, order location, background subtraction, automatic wavelength calibration, and absolute flux calibration. Its optimal extraction method maximizes the signal-to-noise ratio and removes most of the cosmic rays imprinted in the spectra. A comparison with 1D spectra reduced with IRAF verifies the reliability of the results. This open-source software is publicly available to the community.
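Optimal extraction beats a straight aperture sum by weighting each cross-dispersion pixel by its share of the spatial profile relative to its noise. A minimal Monte Carlo sketch of the idea (Gaussian profile, constant read noise, and all numbers are illustrative assumptions, not BYSpec's implementation):

```python
import numpy as np

rng = np.random.default_rng(6)
# Known spatial profile across the slit (normalized Gaussian) and a
# constant per-pixel noise variance (read-noise dominated, assumed).
profile = np.exp(-0.5 * ((np.arange(15) - 7.0) / 2.0) ** 2)
profile /= profile.sum()
flux_true, var = 1000.0, 25.0

n_trials = 2000
simple = np.empty(n_trials)
optimal = np.empty(n_trials)
for k in range(n_trials):
    column = flux_true * profile + rng.normal(0.0, np.sqrt(var), profile.size)
    simple[k] = column.sum()                 # straight aperture sum
    w = profile / var                        # inverse-variance profile weights
    optimal[k] = (w * column).sum() / (w * profile).sum()   # unbiased estimate

snr_gain = float(simple.std() / optimal.std())
```

The same profile weighting is what makes cosmic-ray rejection natural: a pixel whose value deviates wildly from the expected profile can be flagged and excluded from the weighted sum without biasing the flux estimate.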